Richard Feynman: “… But you’ve gotta stop and think about it. About the complexity to really get the pleasure. And it’s all really there … the inconceivable nature of nature! …”
And when I read Feynman’s description of a rose — in which he explained how he could experience the fragrance and beauty of the flower as fully as anyone, but how his knowledge of physics enriched the experience enormously because he could also take in the wonder and magnificence of the underlying molecular, atomic, and subatomic processes — I was hooked for good. I wanted what Feynman described: to assess life and to experience the universe on all possible levels, not just those that happened to be accessible to our frail human senses. The search for the deepest understanding of the cosmos became my lifeblood […] Progress can be slow. Promising ideas, more often than not, lead nowhere. That’s the nature of scientific research. Yet, even during periods of minimal progress, I’ve found that the effort spent puzzling and calculating has only made me feel a closer connection to the cosmos. I’ve found that you can come to know the universe not only by resolving its mysteries, but also by immersing yourself within them. Answers are great. Answers confirmed by experiment are greater still. But even answers that are ultimately proven wrong represent the result of a deep engagement with the cosmos — an engagement that sheds intense illumination on the questions, and hence on the universe itself. Even when the rock associated with a particular scientific exploration happens to roll back to square one, we nevertheless learn something and our experience of the cosmos is enriched.
When people think of “science education”, they usually think of it in the context of high school or college. In reality, it should be thought of as lifelong, for on reflection we all realize that we never cease to educate ourselves, no matter what our trade. What life demands of us is the capacity to function effectively in a complex society. As we gain or lose knowledge, our capacities fluctuate, and we desire, and often strive, to keep them at the very top of that graph.
When it comes to shaping attitudes towards science, which is what concerns me in this post, I've noticed that the effort begins quite strongly in high school but gradually fades as students move on to college and university, even in some of the more scientific career paths. By then, I suppose, some of these things are assumed (at times, you could say, wrongly). We aren't reminded of them as frequently, and they melt into the background as we begin coping with the vagaries of grad life. By the time we are out of university, for many of us the home projects, high-school science fests and the like that once taught us scientific attitude have become a fuzzy, distant dream.
I've observed this phenomenon as a student in my own life. As med students, we are seldom reminded by professors of what constitutes scientific endeavor or ethic. Can you recall the last time you had a didactic discussion on the topic?
I came to realize this vacuum early on in med school. And a lot of the time, this status quo doesn't serve us well. Take Evidence-Based Medicine (EBM), for example. One of the reasons people make errors in interpreting and applying EBM, in my humble opinion, is precisely the naivete that such a vacuum allows to fester. What ultimately happens is that students remain weak in EBM principles, go on to become professors, cannot teach EBM to the extent they ought to, and a vicious cycle ensues in which the full impact of our progress in medicine is never realized. The same applies to how individuals, departments and institutions implement auditing, quality assurance and the like.
A post that I recently came across in the blogosphere touched on the interesting idea that, when you really think about it, most practicing physicians are ultimately technicians whose job is to fix and maintain patients (much as a mechanic oils and fixes cars). The writer opens provocatively,
Is There A Doctor In The House?
Medical doctors often like to characterize themselves as scientists, and many others in the public are happy to join them in this.
I submit, however, that such a characterization is an error.
and divides science professionals into,
SCIENTIST: One whose inquiries are directed toward the discovery of new facts.
ENGINEER: One whose inquiries are directed toward the new applications of established facts.
TECHNICIAN: One whose inquiries are directed toward the maintenance of established facts.
and then segues into why even if that’s the case, being a technician in the end has profound value.
Regardless of where you fall within this paradigm, I think it's obvious that gaining skills in one area helps you perform better in the others. So as technicians, practicing physicians will, I'm sure, find that their appraisal and implementation of EBM improves if they delve into how discoverers work and learn about the pitfalls of their trade. The same could be said of learning how inventors translate that knowledge from bench to bedside as new therapies are developed, and the caveats involved in the process.
Yet it is precisely in these respects that medical education requires urgent reform. Somehow, as if by magic, we are expected to do the work of a technician and get a grip on EBM without a solid foundation in how discoverers and inventors work.
I think it’s about time that we re-kindled the spirit of understanding scientific attitude at our higher educational institutions and in our lives (for those of us who are already out of university).
From self-study and introspection, here are some points and questions that I've noted so far, as I strive to re-invigorate the scientific spirit within me in my own way. As you reflect on them, I hope they prove useful to you in becoming a better science professional as well:
- Understand the three types of science professionals and their roles. Ask where in the spectrum you lie. What can you learn about the work professionals in the other categories do to improve how you yourself function?
- Learning about how discoverers work helps us get an idea of the pitfalls of science. Ultimately, questions are far more profound than the answers we keep coming up with. Do we actually know the answer to a question? Or is it more correct to say that we think we know the answer? What we think we know changes all the time. And this is perfectly acceptable, as long as you're engaged as a discoverer.
- What are the caveats of using language such as the phrase “laws of nature”? Are they “laws”, really? Or abstractions of even deeper rules and/or non-rules that we cannot yet touch?
- Doesn’t the language we use influence how we think?
- Will we ever know if we have finally moved beyond abstractions to deeper rules and/or non-rules? Abstractions keep shifting, sometimes in diametrically opposite directions (e.g. from Newton's absolute space-time to Einstein's relative space-time; the quirky, nutty ideas of quantum mechanics such as the dual nature of matter and the uncertainty principle; concepts of disease causation progressing from the four humours to microbes and DNA and ultimately to a multifactorial model of etiopathogenesis). Is it a bad idea to pursue abstractions in your career? Just look at string theorists; they have been doing it for a long time!
- Develop humility in approach and knowledge. Despite all the grand claims we make about our scientific “progress”, we're just a tiny speck among the billions and billions of specks in the universe, limited by our senses and the biology of which we are made. The centuries-old debate among philosophers over whether man can ever claim to have found the “ultimate truth” still rages on. And from Kurt Gödel's work we think we know that any sufficiently powerful formal system contains true statements that can never be proven within it: in other words, truths that we may never ever know of! Our understanding of the universe keeps shifting continuously, evolving as we ourselves improve (or regress, depending on your point of view) as a species. Understanding that all of this is how science works is paramount. And there's nothing wrong with that. It's just the way it is! 🙂
- Understand the overwhelming bureaucracy in science these days, but don't get side-tracked! It's far too big a load to handle on one's own, and it's one of the dangers that drive people to leave science altogether.
- Science for career’s sake is how many people get into it. Getting a paper out can be a good career move. But it’s far more fun and interesting to do science for science’s own sake, and the satisfaction you get by roaming free, untamed, and out there to do your own thing will be ever more lasting.
- Understand the peer-review process in science and its benefits and short-comings.
- Realize the extremely high failure rate in the results you obtain: over 90% by most anecdotal accounts, be that in experimental results or publications. But it's important to inculcate curiosity and keep the propensity to question alive. To discover. And to have fun in the process. In short, the right attitude, despite knowing that you're probably never going to earn a Fields Medal or a Nobel Prize! Scientists like Carl Friedrich Gauss were known to dislike publishing innumerable papers, favoring quality over quantity, quite contrary to the trends that citation metrics seem to have a hand in driving these days. It might be perfectly reasonable not to get published sometimes. Look at the lawyer-mathematician Pierre de Fermat, of Fermat's Last Theorem fame. He kept notes and wrote letters but rarely, if ever, published in journals. He never did publish a proof of his Last Theorem, famously claiming that the margin of the book he was annotating was too small to contain it. The problem outlived him to become one of the most profound mysteries in mathematics, solved only some 358 years later by Andrew Wiles. The important thing to realize is that Fermat loved what he did, and did not judge himself by how many gazillion papers he had to his name.
- Getting published does have a sensible purpose though. The general principle is: the more peer review, the better. But that peer review does not necessarily have to take the form of hundreds of thousands of journal papers. There's freedom in how you go about getting it, if you get creative. And yes, sometimes peer review fails to serve its purpose, due to egos and politics. The famous mathematician Évariste Galois was so fed up with it that he chose to publish much of his work privately. And the rest, as they say, is history.
- Making rigorous strides depends crucially on a solid grounding in math, probability and logic. What are the pitfalls of hypothesis testing? What is randomness and what does it mean? When do we know that something is truly random as opposed to pseudo-random? If we conclude that something is truly random, how can we ever be sure of it? What can we learn from how randomness is interpreted in inflationary cosmology, where there's “jitter” over quantum distances that fades over larger ones (cf. inhomogeneities in space)? Are there caveats when you build models or conceptions on one or another definition of randomness? How important is mathematics to biology, and vice versa? There's value in gaining these skills for biologists. Check out this great paper [1] and my own posts here and here. Also see the following lecture, which stresses the importance of teaching probability concepts for today's world and its problems:
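To make one pitfall of hypothesis testing concrete, here's a minimal sketch in Python (my own toy illustration, not from any of the linked material): even when absolutely nothing is going on, a 5% significance threshold guarantees a steady trickle of “significant” results from pure noise.

```python
import math
import random

random.seed(42)

def p_value_fair_coin(n, heads):
    # Two-sided p-value for "is this coin fair?", using the normal
    # approximation: under H0, heads ~ N(n/2, n/4).
    z = abs(heads - n / 2) / math.sqrt(n / 4)
    return math.erfc(z / math.sqrt(2))  # = 2 * (1 - Phi(z))

# 1000 experiments, each 100 flips of a genuinely fair coin.
trials, flips = 1000, 100
false_positives = sum(
    p_value_fair_coin(flips, sum(random.random() < 0.5 for _ in range(flips))) < 0.05
    for _ in range(trials)
)

# Expect roughly 5% "significant" results despite a true null hypothesis.
print(false_positives / trials)
```

Run enough null experiments and some will always cross the threshold; this is one reason multiple comparisons and publication bias matter so much.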
- Developing collaborative skills helps. Lateral reading, attending seminars and discussions at various departments can help spark new ideas and perspectives. In Surely You’re Joking Mr. Feynman!, the famous scientist mentions how he always loved to dabble in other fields, attending random conferences, even once working on ribosomes! It was the pleasure of finding things out that mattered! 🙂
- Reading habits are particularly important in this respect. Diversify what you read. Focus on the science rather than the dreary politics of science. It’s a lot more fun! Learn the value of learning-by-self and taking interest in new things.
- Like it or not, it’s true that unchecked capitalism can ruin balanced lives, often rewarding workaholic self-destructive behavior. Learning to diversify interests helps take off the pressure and keeps you grounded in reality and connected to the majestic nature of the stuff that’s out there to explore.
- The rush that comes from all this exploration has the potential to lead to unethical behavior. It's important not to lose sight of the sanctity of life and of our surroundings. Remember all the gory examples that WW2 gave rise to, from the Nazi doctors to the scientists whose work ultimately enabled the loss of life we now remember simply as “Hiroshima and Nagasaki”. Here's where diversifying interests also helps. Think how a nuclear scientist's perspective on his work could change if he spent a little time taking an interest in wildlife and the environment. Also, check this.
- As you diversify, try seeing science in everything. E.g., when you think about photography, think not just about the art but about the nature of what you're shooting, the wonders of the human eye (the arrangement of rods and cones, the consequences of the eyeball being round, its tonal range compared to spectral cameras), the math of perspective, the math of symmetry, and so on.
- Just like setting photography assignments helps to ignite the creative spark in you, set projects and goals in every avenue that you diversify into. There’s no hurry. Take it one step at a time. And enjoy the process of discovery!
- How we study the scientific process/method should be integral to how we think about education. A good analogy, though a sad one, is how biology is taught at schools in relation to conservation. Very few teachers or schools go out of their way to interweave solutions for sustainable living and wildlife conservation into the material they cover, even though they will happily get into the nitty-gritty of taxonomy, morphology, and the rest. You'll find paragraphs upon paragraphs on the latter but not the former. That isn't the model to replicate, IMHO! There has to be a balance. Our education should constantly remind us of what constitutes proper scientific ethic, and never let it fade into the background.
- The current corporate-driven, public-interest research model is a mixed bag. Science shouldn't, in essence, be something for the privileged, or be monopolized in the hands of a few. Good ideas can get dropped if they don't make business sense. Understand public and private funding models and their respective benefits and shortcomings. In the end, realize that there are so many scientific questions out there to explore that there's enough to fill everybody's plate! It's not the end of the world if your ideas or projects don't receive the funding you desire; funding is ultimately pretty arbitrary 🙂 ! Find creative ways to modify your project or set more achievable goals. The other danger in monetizing scientific progress is the potential to inculcate an attitude of science for money. Doing science for the joy of it is much more satisfying than doing it for material gain, IMHO. But different people have different preferences; it's striking a balance that counts.
- The business model of science leads us into the whole arena of patent wars and intellectual-property issues. IMHO there's much value in a free-culture attitude to knowledge, as in the open-access and open-source movements. Imagine what the world would be like if Gandhi had patented Satyagraha, demanding licensing fees or other forms of bondage! 🙂
- It’s important to pursue science projects and conduct fairs and workshops even at the university level (just as much as it is emphasized in high school; I would say to an even greater degree actually). Just to keep the process of discovery and scientific spirit vibrant and alive, if for no other reason. Also, the more these activities reflect the inter-relationship between the three categories of science professionals and their work, the better. Institutions should recognize the need to encourage these activities for curricular credit, even if that means cutting down on other academic burdens. IMHO, on balance, the small sacrifice is worth it.
- Peer-review mechanisms currently reward originality. But it's just as important to reward repeatability/reproducibility, and to publish statistically insignificant (negative) findings. This not only helps reduce bias in the published record, it also helps keep the scientific community motivated in the face of a high experimental failure rate.
- Students should learn the art of benchmarking progress on a smaller scale, i.e. in the experiments, projects, etc. that they do. In the grand scheme of things however, we should realize that we may never be able to see humongous shifts in how we are doing in our lifetimes! 🙂
- A lot of what happens at Ivy League universities can be classified as voodoo and marketing. So don't fret if you can't get into your dream university. The ability to learn lies within, and if appropriately tapped and channeled it can accomplish great things regardless of where you end up studying. Ivy League graduates form a wide spectrum, with a surprising number who could easily be regarded as brain-dead. IMHO, what can be achieved depends far more on the person than on the institution he or she attends. Where there's a will, there's a way! 🙂 Remember that some of science's most famous stalwarts, like Michael Faraday and Srinivasa Ramanujan, were largely self-taught!
- Understand the value of computing in science. Not only has this aspect been neglected at institutes (especially in biology and medicine), it's fast becoming indispensable because of the sheer volume of data one has to sift and process these days. I've recently written about bioinformatics and computer programming here and here.
- It’s important to develop a level of honesty and integrity that can withstand the onward thrust of cargo-cult science.
- Learn to choose wisely who your mentors are. Factor in student-friendliness, the time they can spend with you, and what motivates them to pursue science.
- I usually find myself repelled by demagoguery. But if you must, choose wisely who your scientific heroes are. Are they friendly to other beings and the environment? You’d be surprised as to how many evil scientists there can be out there! 🙂
I’m sure there are many many points that I have missed and questions that I’ve left untouched. I’ll stop here though and add new stuff as and when it occurs to me later. Send me your comments, corrections and feedback and I’ll put them up here!
I have academic commitments headed my way and will be cutting down on my blogular activity for a while. But don’t worry, not for long! 🙂
I’d like to end now, by quoting one of my favorite photographers, George Steinmetz:
George Steinmetz: “… I find that there is always more to explore, to question and, ultimately, to understand …”
[1] Bialek, W., & Botstein, D. (2004). Introductory Science and Mathematics Education for 21st-Century Biologists. Science, 303(5659), 788-790. doi:10.1126/science.1095480
Copyright Firas MR. All Rights Reserved.
“A mote of dust, suspended in a sunbeam.”
Noted mathematician, Timothy Gowers, talks about the importance of math
I've often written about mathematics before. As much as math helps us better understand our world (modern medicine's recent strides owe a lot to applied math, for example), it also tells us how severely limited man's common thinking is.
Humans, and yes some animals too, are born with or soon develop an innate ability to understand numbers. Yet, just like the animals', our proficiency with numbers seems to stop short of anything beyond our immediate activities of daily living (ADL) and survival. Because we are a higher form of being (or allegedly so, depending on your point of view), our ADLs are a lot more sophisticated than those of, say, canaries or hamsters, and consequently you can expect to see slightly more refined arithmetic from us. But fundamentally we share this important trait: the ability to work with numbers from an early stage. A man with a family knows almost by instinct that having two kids to look after means breakfast, lunch and dinner times two in terms of putting food on the table. He would have to buy two sets of clothes for his kids. A kid soon learns that he has two parents. And so on. It's almost natural. And when someone can't manage simple counting or arithmetic, we know that something might be wrong. In Medicine we have a term for this: acalculia, which often indicates the presence of a neuropsychiatric disorder.
It's easy for 'normal' people to do 2 + 2 in their heads. Two oranges AND two oranges make a TOTAL of four oranges. This basic stuff helps us get by day-to-day. But how many people can wrap their heads around 1 divided by 0? If you went to school, sure, your teachers must have hammered the answer into you: infinity (more carefully, the division itself is undefined; it's the value of 1/x that grows without bound as x approaches 0). But how do you visualize it? Yes, I know it's possible. But it takes unusual work. I think you can see my point, even with this simple example. And we haven't even begun to speak about probability, wave functions, symmetries, infinite kinds of infinities, multiple space dimensions, time's arrow, quantum mechanics, the Higgs field or any of that stuff yet!
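A few lines of Python (my own toy illustration) unpack the “1/0 is infinity” shorthand: the quotient 1/x blows up as x shrinks towards zero, while the division at exactly zero remains undefined.

```python
# Watch 1/x grow without bound as x approaches 0 from above.
for x in [1.0, 0.1, 0.001, 1e-6, 1e-12]:
    print(f"1/{x} = {1 / x}")

# At exactly zero, the operation is undefined; Python raises an error
# rather than answer "infinity".
try:
    1 / 0
except ZeroDivisionError as err:
    print("1/0 ->", err)
```

The limit diverges, but no number (not even a floating-point infinity) is ever actually the value of 1/0; that distinction is exactly the kind of thing our intuition glosses over.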
As a species, it's obvious that we aren't at all good at math. It's almost as if we construct our views of the universe through a tunnel vision that serves us in our day-to-day tasks, but fails otherwise.
We tend to think of math as an ability, when really it should be thought of as a sensory organ, something as vital to understanding our surroundings as our eyes, ears, noses, tongues and skin. And despite lacking this sense, we go about living as though we somehow understand everything; as though we are aware of what there is to be aware of. This can lead to trouble down the road. I've talked before about numerous PhDs failing at the Monty Hall paradox. But a recent talk I watched touched on something with far graver consequences: people being wrongfully convicted because of stunted interpretation of DNA and fingerprint evidence by none other than “expert” witnesses. In other words, serious life-and-death issues. So much for our expertise as a species, eh?!
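The Monty Hall result is easy to check for yourself. Here's a minimal simulation sketch in Python (my own toy code, not from the talk): always staying wins about one third of the time, always switching about two thirds.

```python
import random

random.seed(0)

def play(switch, doors=3):
    car = random.randrange(doors)
    pick = random.randrange(doors)
    # The host opens a door that is neither the pick nor the car.
    opened = next(d for d in range(doors) if d != pick and d != car)
    if switch:
        # Switch to the remaining unopened door.
        pick = next(d for d in range(doors) if d != pick and d != opened)
    return pick == car

n = 100_000
stay_wins = sum(play(switch=False) for _ in range(n)) / n
switch_wins = sum(play(switch=True) for _ in range(n)) / n
print(stay_wins, switch_wins)
```

Ten minutes with a simulation like this settles what pages of verbal argument often fail to; that, too, is math working as a sensory organ.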
How the human mind struggles with math!
We recently also learned that the hullabaloo over pandemic H1N1 influenza had a lot to do with our naive understanding of math, the pitfalls of corporate-driven public-interest research notwithstanding.
Anyhow, one of my main feelings is that honing one's math not only helps us survive better, it can also teach us about our place in the universe, because we can then begin to use it fully as a sensory organ in its own right. Which is why many pure scientists have argued that doing math for math's own sake can not only be great fun (if done the right way, of course :-P) but should also be considered necessary: such research has the potential to reveal entirely new vistas that enchant us and surprise us at the same time (take Cantor's work on infinity, for example). For in the end, discovery, really, is far more enthralling than invention.
UPDATE 1: Check out the Khan Academy for a virtually A-Z education on math — and all of it for free! This is especially a great resource for those of us who can’t even recall principles of addition, subtraction, etc. let alone calculus or any of the more advanced stuff.
How Small Man Really Is
(An important note: to view this piece in its proper form, readers will need to download this font and install it on their systems. It is a font designed specifically to be easily readable on computer screens.)
Greetings, friends,
A thought occurred to me just now. We humans think of ourselves as ‘Ashraf-ul-Makhluqat’, the noblest of creations, and certainly that is what we are taught religiously as well. Often we grow quite proud on account of it, forgetting that this title was not bestowed on us for nothing. Do we truly deserve it?
Truth be told, we can be called deserving of it only when our thought and our deeds clearly bear proof of it. Yet it seems to me that most of us neither realize this nor take any interest in it. We are so lost and absorbed in living our day-to-day lives that we forget to play the role of ‘Ashraf-ul-Makhluqat’, and forget to attend to the things that, given a little time, would at least let us attempt that role.
Yesterday I watched a talk by the famous photographer Joel Sartore. In it he explained that the pace at which we are destroying the natural environment is many times greater than the pace at which we are learning about new species of living animals and plants. How astonishing that is. Who knows how many of nature's things man will live out his days without ever seeing or contemplating. He also noted that within the next ten years, fifty percent of the world's amphibian species are likely to disappear. That is no small matter!
In the medical world, too, we have only recently begun to understand genetic codes, and new ways of thinking about our diseases are only now taking shape. Who knows what progress lies ahead and what new things we will discover.
How little we know about the living world; do we even realize it?
The famous photographer Joel Sartore's superb talk on protecting the natural environment.
This magnificent film by the renowned photographer Yann Arthus-Bertrand shows us what we are putting at stake. Watch the full film here.
The world of physics, too, teaches us about our surroundings. Consider: whatever the event or object, it can be thought of in terms of a probability wave. This is a remarkable idea, because the human mind does not work that way. It refuses to accept that a thing could exist in more than one place at the same instant. Yet experiments have shown this to be possible (such as Wheeler's Experiment), and scholars still debate the point. And yes, experiments have also established that two separate objects, no matter how great the distance between them, can sometimes share a kind of connection known as quantum entanglement. The mind refuses to accept it, but it is true. There are many other fascinating ideas as well, such as multiple-space-dimensions, time dilation, wormholes and so on. What's more, we can survey these things from home. Some dark night, turn your gaze to the stars and consider how long their light must have taken to reach your eyes. Are the stars you see still there after all that time, or have they perished? Scanning the sky may sound daunting, but distant objects like the Orion Nebula are right there in front of you. Just look!
How little we know about the physical universe; do we even realize it?
An excellent film on physics from PBS NOVA
In the world of mathematics we have recently learned that there are kinds of infinity. Indeed, there are infinite kinds of infinity! One infinity can be larger or smaller than another. Consider two circles, A & B, one large and one small; that is, one circumference is larger than the other. But isn't it true that every circumference contains infinitely many sides? So the infinite sides of the larger circle outnumber the infinite sides of the smaller one; don't they? How fascinating. And there are who knows how many such things we still have no inkling of today.
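The claim that there are infinitely many sizes of infinity is usually credited to Cantor; a compact way to state his theorem (my own summary, not from the films linked here) is:

```latex
% Cantor's theorem: for any set S there is no surjection f : S -> P(S).
% Proof sketch: given any f, the "diagonal" set D = \{ x \in S : x \notin f(x) \}
% cannot equal f(d) for any d, since d \in D \iff d \notin f(d).
|S| < |\mathcal{P}(S)|
\quad\Longrightarrow\quad
|\mathbb{N}| < |\mathcal{P}(\mathbb{N})| < |\mathcal{P}(\mathcal{P}(\mathbb{N}))| < \cdots
```

Taking power sets again and again yields an endless tower of strictly larger infinities.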
How little we know about mathematics; do we even realize it?
The BBC's magnificent film, The Story of Maths
Marcus du Sautoy explains the importance of symmetry
The day before yesterday I watched another talk, in which the speaker raised the point that every two weeks a person dies who takes their language and its literature with them out of this world. That person was the sole remaining everyday speaker of the language, and with their passing their descendants lose their connection to it forever, and with them so does everyone else in the world. Wisdom accumulated over centuries, bound up in that language, effectively vanishes for good.
We sit by while centuries of knowledge slip through our fingers; do we even realize it?
Every two weeks, a language passes from this world
Truth be told, the answer to these questions is usually no.
And rather than reflect on such things, man is ever ready to squander his energy and time on things like fighting and waging war on one another, or, intoxicated by immoral economic supremacy, crushing one another or tripping one another up, and so on.
Is this the mark of a wise creation? By behaving this way, are we not turning our faces away from our role as ‘Ashraf-ul-Makhluqat’? Sometimes it seems this is how matters will remain until the very end; and so they will, until we feel the need to set our life's priorities right.
From this point another thought came to mind. Who knows how much of human intellect depends on past experience; that is, how much does the output of intelligence depend on the input of the past?
If a man were ignorant of his own past, of his surroundings and of his condition and standing, would it be far-fetched to expect him to waste his time on petty, disagreeable things and forget to play his true role? On reflection, we can find this phenomenon in artificial intelligence programming as well. My conjecture is that many AI systems work this way: they must contain code that helps a robot gain a sense of the outside world, and the moment it senses a change in its environment, it changes its behavior. Perhaps we are much the same; our consciousness and senses simply need further awakening.
I never imagined I would write such a long philosophical piece; how the time flew while writing! So, friends, that's all for today. Until next time. Every so often we must take time out of this fast-paced life and think about fulfilling our role: do we truly deserve the title ‘Ashraf-ul-Makhluqat’ or not? Do we understand our Lord and His universe or not? And how do we wish to spend our time?
Before you go, here's a delightful English song on the same theme:
☮ … The sky calls to us,
If we do not destroy ourselves,
We will one day venture to the stars … ☮
A Narrative History of BSD, by Dr. Kirk McKusick (Courtesy: bsdconferences channel @ Youtube)
Oh Lord, won’t you buy me a 4BSD?
My friends all got sources, so why can’t I see?
Come all you moby hackers, come sing it out with me:
To hell with the lawyers from AT&T!
— a random, hilarious fortune cookie touching on the origins of the FreeBSD project
Another quick post about tech stuff today. Someday I’ll delve into FreeBSD in a lot more detail. But for now, a brief rundown of why I personally think FreeBSD is one of the best toys around to play with today:
- Great documentation! Aside from the FreeBSD Handbook, there are two other books that I think do a phenomenal job of teaching not just the way things are done in the BSD world, but also UNIX philosophy in general: Michael Lucas's 'Absolute FreeBSD' and Greg Lehey's 'The Complete FreeBSD'. My all-time favorite tech book is currently 'The Complete FreeBSD'. Note the emphasis on 'all-time'; that kind of praise doesn't come easily from someone who isn't a professional techie. Although Greg 'Groggy' Lehey (as he's popularly known) hasn't covered the latest version of FreeBSD, a lot of the knowledge you gain from reading his book is quite transferable. The book also teaches you how computing all began: from the origins of the word 'terminal', to the Hayes command set (he even teaches you some basic commands to talk directly to your modem!), to how the Internet came to be shaped by TCP/IP and BIND, and so on. Go check it out for free here, and listen to Lehey and Lucas being interviewed by BSDTalk here and here. If you've ever dabbled in the Linux world, you'll soon realize that FreeBSD's approach of consolidating, streamlining and simplifying documentation is a breath of fresh air! Oh, and by the way, Dru Lavigne, another famous personality in the BSD world, has a great talk on the similarities and differences between BSD and Linux here.
- Another incredible boon is the hardware compatibility list (a.k.a. the 'Hardware Notes') that comes with every release. It's jaw-droppingly convenient to be presented with a list of all known chips/circuit boards, and the drivers you'll need to get them working, organized so neatly right on the main website! Again, something that will definitely blow you away if you're coming from the Linux world. In fact, when anybody asks me what hardware I recommend for good open-source support (i.e. cross-compatibility across the major operating systems), I usually turn to this excellent list. It's a great shopper's guide! 🙂
- In my experience, it's a lot easier to grasp fundamental concepts about the way computers work by reading about FreeBSD than by reading books about Linux. In fact Arch Linux, a great Linux distribution that I recommend if you want to explore how Linux works, borrows a lot from the way FreeBSD does things (its /etc/rc.conf file, for example) as part of its KISS (Keep It Simple, Stupid) philosophy.
More on FreeBSD later! That does it for today! Cheers! 🙂
Copyright © Firas MR. All rights reserved.
Powered by ScribeFire.
Before I begin today’s discussion (since it concerns another book), a quick plug for Steve McCurry, whose photography I deeply admire and whose recent photo-essays on the subject of reading, are especially inspirational and worth checking out. I quote:
“Reading is a means of thinking with another person’s mind; it forces you to stretch your own.” — Charles Scribner
Susan Sontag said: “The camera makes everyone a tourist in other people’s reality.” The same can be said for reading books.
Every once in a while, I receive feedback from readers about how much they appreciate some of my writing on non-clinical/non-medical subjects. Sometimes the subject matter concerns books or web resources that I've recently read. Occasionally, I also like taking notes as I read this material. And often, friends, family and colleagues ask me questions on topics that I've either read a book about or made notes on. Note-taking is a good habit as you grow your comprehension of things. In my opinion, it also helps you skeletonize reading material – sort of like building a quick 'Table of Contents' – that you can use to build your knowledge base as you assimilate more and more.
If you’ve ever visited a college bookstore in India, you’ll find dozens and dozens of what are popularly referred to as “guides” or “guidebooks”. These contain summaries and notes on all kinds of subjects – from medicine to engineering and beyond. They help students:
- Cut through the verbosity of their main coursebooks (often written in English more befitting the Middle Ages) and focus on the skeleton of the material
- Cram before exams
I tend to think of my notes and summaries of recently-read books as guidebooks: anchor points that I (and often family or friends) can come back to later, sometimes when I've long forgotten much of the material!
I write this summary in that spirit. So, with all of that behind us, let's begin.
I stumbled upon an enticing little book recently called 'Learning the BASH shell', by Cameron Newham & Bill Rosenblatt. Being the technophile that I am, I just couldn't resist taking a peek.
I've always been fascinated by the innards of computers – from how they're made and assembled to how they're programmed and used. My first real foray began with learning some of the fundamentals of DOS and BASIC on an old 286 (I think) as a 7th grader. Those were the days of pizza-box-style CPU cases, monochrome monitors with a switch that turned the text green, hard disks in the MB range, RAM measured in KB, and 3.5-inch floppies that everyone thought were cool. Oh boy, I still remember the way people used to go gaga over 'double-sided', 'high-density', 'pre-formatted' and the like! As I witnessed the emergence of CDs, then DVDs, and now SSDs and portable HDs, I got my hands dirty on the 386, the 486, the Pentium 1, the Pentium 3, the Pentium 4 (still working!) and my current main workstation, a Core 2 Duo. Boy, have I come a long way! Over the years I've read a number of books on computer hardware (this one and this one recently – more on them in a future post) and on software applications and operating systems (such as this one on GIMP, this one on GPG, this one, this one and this one on Linux, and this one and this one on FreeBSD – again, more on them later!). But there was always one cranny that seemed far too daunting to approach. Yup, programming. Utterly jargoned, the world of modern programming has seemed esoteric and complicated to me since the old days, when BASIC and dBASE could fill your plate. It doesn't help when you've lost >95% of your memory of BASIC, either. Ever since reading about computational biology, or bioinformatics (see my summary of a book on the topic here), I've been convinced that getting at least a superficial handle on computer programming concepts can mean a lot in terms of a competitive edge if you ever contemplate being in the research world.
This interplay between technology and biology and the level to which our research has evolved over the past few decades was further reinforced by something I read recently from an interview of Kary Mullis, the inventor of PCR. He eventually won the Nobel Prize for his work:
What I do personally is the research, which I can do from home because of the Internet, which pleases me immensely. I don’t need to go to a library; I don’t need to even talk to people face to face.
There are now whole books and articles geared towards programming and biology. I recommend the great introductory essay, 'Why Biologists Want to Program Computers', by author James Tisdall.
"Learning the BASH shell" is a fascinating newbie-friendly introduction to the world of programming that assumes only the most rudimentary familiarity with how computers work or with programming in general. It certainly helps if you have a working understanding of Linux or any one of the Unix operating system flavors, but if you're on Windows you can get by using Cygwin. I've been using Linux for the last couple of years (originally beginning with Ubuntu 6.06, then Arch Linux and Debian, Debian being my current favorite), so this background certainly helped me grasp some of the core concepts much faster.
So what exactly is programming anyway? Well, think of programming as a means of talking to your computer to carry out tasks. Deep down, computers understand nothing but the binary number system (e.g. 'copy this file from here to there' translates into gibberish like …010001100001111000100110…). Not something most humans would find even remotely appealing (apparently some geeks' favorite pastime is reverse-engineering human-friendly language from binary!). Now, most of us are familiar with using a mouse to point-and-click our way to getting tasks done. But sometimes it becomes necessary to speak to our computers in more direct terms. This ultimately comes down to entering a 'programming environment', typing words in a special syntax (depending on the programming language you use), saving those words in a file, and then translating the file and the words it contains into language the computer can understand (binary). The computer then executes tasks according to the words you typed. Most languages can broadly be divided into:
- Compiler-based: words in the programming language are converted into binary by a program called a 'compiler'. The resulting binary file can then be run independently (e.g. the C programming language).
- Interpreter-based: words in the programming language are translated into binary on the fly by an intermediary program called an 'interpreter'. Because of the additional resources required to run the interpreter, it can sometimes take a while before your computer understands what exactly it needs to do (e.g. the Perl or Python programming languages).
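As a quick illustration of the difference, here's a small shell session. The file names are made up for the demo, and the compiled half is guarded in case no C compiler happens to be installed:

```shell
# Interpreted: the bash interpreter translates the script on the fly, every run.
echo 'echo hello from an interpreter' > hello.sh
bash hello.sh

# Compiled: a compiler translates the source once into a standalone binary
# (guarded, in case no C compiler is available on this machine).
if command -v cc >/dev/null 2>&1; then
    printf '#include <stdio.h>\nint main(void){ puts("hello from a binary"); return 0; }\n' > hello.c
    cc hello.c -o hello   # translate once...
    ./hello               # ...then run the binary directly, no interpreter needed
fi
```

Notice that hello.sh needs bash around every time it runs, while the compiled hello runs on its own.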
What is BASH?
BASH is first and foremost a ‘shell’. If you’ve ever opened up a Command-Prompt or CLI (Command Line Interface) on Windows (Start Menu > Accessories > Command Prompt), then you’ve seen what a shell looks like. Something that provides a text interface to communicate with the innards of your operating system. We’re used to doing stuff the GUI way (Graphical User Interface), using attractive buttons, windows and graphics. Think of the shell as just an alternative means to talk to your computer. Phone-line vs. paper-mail, if that metaphor helps.
Alright, so we get that BASH provides us with an interface. But what else does it do? Well, BASH is also an interpreted programming language! That's amazing, because it lets you use your shell to create programs for repetitive or complicated multi-step tasks. A little segue into Unix philosophy bears merit here. Unix-derived operating systems, unlike others, stress breaking complicated tasks into small pieces, each handled by a program that specializes in that one component of the task. sort is a Unix program that sorts text. cut snips a chunk of text from a larger whole. grep is used to find text. sed is used to replace text. The find program is used to find files and directories. And so on. If you need to find a given file, then look for certain text in it, yank out a portion of it, replace part of that chunk, and then sort it in descending order, all you do is combine find, grep, sed, cut and sort using the proper syntax. But what if you didn't really want to replace text? Then you simply omit sed from the workflow. See, that's the power of Unix-based OSes like Linux or FreeBSD: flexibility.
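That find-grep-sed-cut-sort workflow can be sketched as a shell pipeline. The sample log file and its contents below are invented purely for illustration:

```shell
# A tiny sample log, invented for the demo:
mkdir -p demo
printf 'app:WARN:error in module\napp:INFO:all good\n' > demo/app.log

# find the files, grep the lines mentioning "error", cut out field 2,
# sed-replace WARN with ERROR, and sort -- each tool doing one small job:
find demo -name '*.log' -print0 \
  | xargs -0 grep -h 'error' \
  | cut -d: -f2 \
  | sed 's/WARN/ERROR/' \
  | sort
```

Dropping the sed stage from the pipeline simply leaves the text unreplaced — exactly the kind of flexibility described above.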
The BASH programming language takes simple text files as its input. An interpreter called bash then translates the words (commands, etc.) into machine-readable code. It's really as simple as that. Because BASH embraces the Unix philosophy, it assumes you'll use the various Unix-type programs to get things done. So at the end of the day, a BASH program looks a lot like:
execute the Unix program date
assign the output of date to the variable x
if x = 8 AM
then execute these Unix programs in this order (find, grep, sed, cut, sort, etc.)
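In real bash syntax, that pseudocode might look something like the sketch below; the 8 AM check is kept from the pseudocode, while the chores themselves are placeholders:

```shell
#!/bin/bash
# Execute the Unix program date and assign its output (the current hour) to a variable.
hour=$(date +%H)

# Flow control: if it's 8 AM, run the morning chores.
if [ "$hour" = "08" ]; then
    echo "8 AM -- running the morning chores..."
    # e.g. find ... | grep ... | sed ... | cut ... | sort
else
    echo "Not 8 AM (it's ${hour}:00) -- nothing to do."
fi
```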
Basic Elements of Programming
In general, programming consists of breaking down complicated tasks into bits using unambiguous language in a standard syntax.
The fundamental idea (using BASH as an example) is to:
- Construct variables.
- Manipulate variables. Add, subtract, change their text content, etc.
- Use Conditions such as if/then (referred to in technobabble as “Flow Control”)
- Execute Unix programs based on said Conditions
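A minimal bash sketch of those four elements (the variable names and the condition are, of course, arbitrary):

```shell
# 1. Construct a variable.
greeting="hello"

# 2. Manipulate variables: change their text content, do some arithmetic.
greeting="${greeting}, world"   # append text
count=2
count=$((count + 3))            # add: count is now 5

# 3 & 4. Flow control: execute a Unix program (tr) based on a condition.
if [ "$count" -eq 5 ]; then
    echo "$greeting" | tr 'a-z' 'A-Z'   # prints HELLO, WORLD
fi
```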
All it takes to get going is learning the syntax of framing your thoughts. And for some languages this can get hairy.
This explains why some of the most popular programming languages out there try to emulate human language as much as possible in their syntax. And why a popular language such as Perl was in fact developed by a linguist!
This was just a brief, high-level introduction to basic concepts in programming. Do grab yourself a copy of "Learning the BASH shell" and dive into it with the aforementioned framework in mind. Before you know it, you'll start putting two and two together and be on your way to developing your own nifty programs!
I’m going to end for today with some of the additional excellent learning resources that I’m currently exploring to take my quest further:
- Steve Parker’s BASH tutorial (extremely easy to follow along)
- Greg’s BASH Guide (another one recommended for absolute noobs)
- Learning to Program Using Python – A Tutorial for Hobbyists, Self-Starters, and All Who Want to Learn the Art of Computer Programming by Alan Gauld
- How to think like a Computer Scientist – Learning with Python by Jeffrey Elkner, Allen B. Downey, and Chris Meyers
UPDATE 1: If you’re looking for a programming language to begin with and have come down to either Perl or Python, but are finding it difficult to choose one over the other, then I think you’ll find the following article by the famous Open Source Software advocate, Eric S. Raymond, a resourceful read: Why Python?
UPDATE 2: A number of resourceful, science-minded people at SciPy conduct workshops aimed at introducing Python and its applications in science. They have a great collection of introductory videos on Python programming concepts & syntax here. Another group, called FOSSEE, has a number of workshop videos introducing Python programming here. They also have a screencast series on the subject here.
UPDATE 3: AcademicEarth.org has quite a number of useful lecture series and Open Courseware material on learning programming and basic Computer Science concepts. Check out the MIT lecture, “Introduction to Computer Science and Programming” which is specifically designed for students with little to no programming experience. The lecture focuses on Python.
Copyright Firas MR. All rights reserved.
- Download the Stream Player plugin as a zip. Extract it locally. Rename the player.swf file to player-swf.jpg
- Upload player-swf.jpg to your WordPress.com Media Library. Don’t worry, WordPress.com will not complain since it thinks it’s being given a JPG file!
- Next insert the gigya shortcode as explained at Panos’ website. I inserted the following between square brackets, [ ] :
gigya src="https://mydominanthemisphere.files.wordpress.com/2010/11/player-swf.jpg" width="512" wmode="transparent" allowFullScreen="true" quality="high" flashvars="file=http://ia311014.us.archive.org/1/items/scipy09_introTutorialDay1_1/scipy09_introTutorialDay1_1_512kb.mp4&image=http://ia311014.us.archive.org/1/items/scipy09_introTutorialDay1_1/scipy09_introTutorialDay1_1.thumbs/scipy09_introTutorialDay1_1_000180.jpg&provider=http"
The flashvars are separated by ampersands, as in flashvars="file=MOVIE URL HERE&image=IMAGE URL HERE". The provider=http flashvar states that we would like to enable skipping within the video stream.
Only we truly know the language that bears the name Urdu, Dagh; it is our tongue whose fame resounds throughout the world. ~ Dagh
(An important note: to view this post properly, you will need to download this font and install it on your system. It is a font designed especially for easy reading on computer screens.)
I hope you don't hold too many grudges over not having heard from me for quite a while. The truth is that, as always, studies and other academic commitments have kept me rather busy.
I had always wished to write on this blog in Urdu someday. It is, after all, my mother tongue, and somehow, somewhere along the way, my bond with this beautiful language began to loosen. Perhaps the fault lies with my scientific world, which these days insists on English. And as far as newspapers and the news go, it never struck me that the Urdu world had anything especially unique to offer. Now I realize how naive that thinking was. Over the past few weeks I have come across several fascinating articles of a kind you would be hard pressed to find in the English-speaking world. You could say I am only now discovering the pleasure of knowing this language, and I feel quite grateful for it.
I have no particular topic in mind today. I just want to mention that there are plenty of helpful sites for writing Urdu on the Internet, whether you work with Linux, BSD and FOSS, or with Windows. Some of the best I've found:
- If you feel your Urdu vocabulary is weak, this site will help: http://www.urduenglishdictionary.org
- If you're on Windows, do try the Google Transliteration IME keyboard. For now it is only available for Windows: http://www.google.com/ime/transliteration
- Download Urdu fonts and use them in OpenOffice, Firefox, etc. Some fonts are tied to Windows-specific programs and won't run on Linux, BSD, etc. The best fonts for Windows are available at http://www.crulp.org . If you're on a Linux flavor like Debian, use apt-get. Third-party fonts such as CRULP's can be installed on your system following this recipe: http://wiki.archlinux.org/index.php/Fonts . Note that just as different English fonts serve different purposes, Urdu too has different fonts written in different calligraphic styles, such as Nastaliq, Naskh and so on, and a style that is appropriate in one place may be inappropriate in another. There are some excellent articles on this at http://salpat.uchicago.edu/index.php/salpat/article/view/33/29 and http://en.wikipedia.org/wiki/Islamic_calligraphy
- On Linux, BSD, etc. you'll find input frameworks like SCIM and IBus, through which you can use transliteration keyboards: http://wiki.debian.org/I18n/ibus , http://beeznest.wordpress.com/2005/12/16/howto-install-japanese-input-on-debian-sarge-using-scim/ . To write in Urdu you will need to install the m17n packages. And don't forget that you'll also have to install the Urdu locales on your system, especially the UTF-8 ones.
- To install the Urdu dictionary for Firefox, first install the Nightly Tester Tools addon and then install the Urdu Dictionary addon.
- On Debian and the like, Firefox displays Urdu correctly only after some additional configuration. By default, Debian's Firefox has the Pango font rendering engine disabled, which is why Urdu letters don't render properly. The recipe for bringing Pango back is here: http://ubuntu.sabza.org/2006/08/18/firefox-for-linux-urdu-font-rendering
- I also ran into this problem with Firefox and Debian, though I haven't found a solution to it yet.
- The Urdu dictionary for OpenOffice is available here: http://extensions.services.openoffice.org/en/project/dict-ur . After installing it, go to Tools > Options > Language Settings and tick 'Enabled for complex text layout'. Urdu isn't in the default language list, so leave it set to Hindi. What should happen is that when you begin typing in Urdu, OpenOffice automatically recognizes the document's language as Urdu and indicates this in the bottom toolbar. In my experience this doesn't happen on Debian: you have to type a few Urdu words first and then set the language via the bottom toolbar. Also, since Hindi is the default CTL language, a Hindi font such as Mangal is selected automatically when you start typing Urdu. So keep that in mind and, when typing Urdu, don't forget to change your font to Naskh, Nastaliq or the like.
So that's all for today. I hope to meet you readers again. Until then, farewell!
Copyright Firas MR. All Rights Reserved.
I've not yet had the chance to delve into the bureaucracy of academia in science, having relegated it to future reading and follow-up. Some interesting material that I've put on my to-read list for future review:
Academic medicine: a guide for clinicians
By Robert B. Taylor
Advice for a Young Investigator
By Santiago Ramón y Cajal, Neely Swanson, Larry W. Swanson
Do let me know if there are any others that you've found worth a look.
In the meantime, I just caught the following incisive read on the topic via a trackback to my blog from a generous reader:
Writing about the odious tentacles that young academics have to maneuver against, author Peter Lawrence of Cambridge (UK) says that “the granting system turns young scientists into bureaucrats and then betrays them”.
He then goes on to describe in detail, with testimonies from scientists, how and why exactly that's the case. He concludes that the status quo not only fundamentally perverts freedom in scientific pursuit, but also causes unnecessary waste, sometimes to the detriment of people's careers and livelihoods despite their best efforts to stay dedicated to the pursuit of scientific knowledge. And how this often leads die-hard researchers to drop out of research altogether!
Some noteworthy excerpts (Creative Commons Attribution License):
“The problem is, over and over again, that many very creative young people, who have demonstrated their creativity, can’t figure out what the system wants of them—which hoops should they jump through? By the time many young people figure out the system, they are so much a part of it, so obsessed with keeping their grants, that their imagination and instincts have been so muted (or corrupted) that their best work is already behind them. This is made much worse by the US system in which assistant professors in medical schools will soon have to raise their own salaries. Who would dare to pursue risky ideas under these circumstances? Who could dare change their research field, ever?”—Ted Cox, Edwin Grant Conklin Professor of Biology, Director of the Program on Biophysics, Princeton University
the present funding system in science eats its own seed corn. To expect a young scientist to recruit and train students and postdocs as well as producing and publishing new and original work within two years (in order to fuel the next grant application) is preposterous. It is neither right nor sensible to ask scientists to become astrologists and predict precisely the path their research will follow—and then to judge them on how persuasively they can put over this fiction. It takes far too long to write a grant because the requirements are so complex and demanding. Applications have become so detailed and so technical that trying to select the best proposals has become a dark art. For postdoctoral fellowships, there are so many arcane and restrictive rules that applicants frequently find themselves to be of the wrong nationality, in the wrong lab, too young, or too old. Young scientists who make the career mistake of concentrating on their research may easily miss the deadline for the only grant they might have won.
After more than 40 years of full-time research in developmental biology and genetics, I wrote my first grant and showed it to those experienced in grantsmanship. They advised me my application would not succeed. I had explained that we didn’t know what experiments might deliver, and had acknowledged the technical problems that beset research and the possibility that competitors might solve problems before we did. My advisors said these admissions made the project look precarious and would sink the application. I was counselled to produce a detailed, but straightforward, program that seemed realistic—no matter if it were science fiction. I had not mentioned any direct application of our work: we were told a plausible application should be found or created. I was also advised not to put our very best ideas into the application as it would be seen by competitors—it would be safer to keep those ideas secret.
The peculiar demands of our granting system have favoured an upper class of skilled scientists who know how to raise money for a big group. They have mastered a glass bead game that rewards not only quality and honesty, but also salesmanship and networking. A large group is the secret because applications are currently judged in a way that makes it almost immaterial how many of that group fail, so long as two or three do well. Data from these successful underlings can be cleverly packaged to produce a flow of papers—essential to generate an overlapping portfolio of grants to avoid gaps in funding.
Thus, large groups can appear effective even when they are neither efficient nor innovative. Also, large groups breed a surplus of PhD students and postdocs that flood the market; many boost the careers of their supervisors while their own plans to continue in research are doomed from the outset. The system also helps larger groups outcompete smaller groups, like those headed by younger scientists such as K. It is no wonder that the average age of grant recipients continues to rise. Even worse, sustained success is most likely when risky and original topics are avoided and projects tailored to fit prevailing fashions—a fact that sticks a knife into the back of true research. As Sydney Brenner has said, “Innovation comes only from an assault on the unknown”.
How did all this come about? Perhaps because the selection process is influenced by two sets of people who see things differently. The first are the granting organisations whose employees are charged to spend the money wisely and who believe that the more detailed and complex the applications are, the more accurately they will be judged and compared. Over the years, the application forms have become encrusted with extra requirements.
Universities have whole departments devoted to filling in the financial sections of these forms. Liaison between the scientists and these departments and between the scientists and employees of the granting agencies has become more and more Kafkaesque.
The second set of people are the reviewers and the committee, usually busy scientists who themselves spend much time writing grants. They try to do their best as fast as they can. Generally, each reviewer reads just one or two applications and is asked to give each a semiquantitative rating (“outstanding,” “nationally competitive,” etc.). Any such rating must be whimsical because each reviewer sees few grants. It is particularly difficult to rank strongly original grants; for no one will know their chances of success. The committee are usually presented with only the applications that have received uniformly positive reviews—perhaps favouring conventional applications that upset no one. The committee might have 30 grants to place in order of priority, which is vital, as only the top few can be funded. I wonder if the semiquantitative and rather spurious ratings help make this ordering just. I also suspect any gain in accuracy of assessment due to the detail provided in the applications does not justify the time it takes scientists to produce that detail.
At the moment, young people need a paper as a ticket for the next step, and we should therefore give deserving, but unlucky, students another chance. One way would be to put more emphasis on open interviews (with presentation by the candidate and questions from the audience) and references. Not objective? No, but only false objectivity is offered by evaluating real people using unreal calculations with numbers of papers, citations, and journal impact factors. These calculations have not only demoralised and demotivated the scientific community, they have also redirected our research and vitiated its purpose.
Reading the piece, one can’t help but get the feeling that the current paradigm – “dark art” as the author puts it – is a lot like lobbying in politics! It isn’t enough for someone to have an interest in pursuing a research career. Being successful at it requires an in-depth understanding of a lot of the red-tape involved. Something that is such a fundamental aspect of academic life and yet that isn’t usually brought up – during career guidance talks, assessments of research aptitude, recruitment or what have you.
Do give the entire article a read. It’s worth it!
That does it for today. Until we meet again, cheers!
Copyright © Firas MR. All rights reserved.