Intellectual Masturbation
I have come to the conclusion that academia is nothing but intellectual masturbation. I have never seen people who are as full of themselves as academics. They espouse unnecessarily complex theories and procedures for the sole reason of making non-academics look starstruck when dazzled by their supposed brilliance. I'm currently reading part of a PhD thesis on software engineering, and the things suggested in it might be applicable to 1% of development processes, and only in cases where one is dealing with completely new concepts in computer science.
There are several more or less formal ways of developing software. Some are better than others, but what they all have in common is that they are extremely convoluted, and they require books upon books to explain what is a small sub-part of the entire methodology. It's not unheard of to have 900-page books on "realizing part A and part B of our amazing 59-part model".
Maybe I'm taking the carpentry approach to software development here, but please, people, it's not as difficult as you make it out to be. Granted, writing good software is not necessarily simple, but coming up with a methodology for handling everything around the actual writing of the source code is not as hard as people would have you believe.
The problem, I think, is that people are paid by the letter to write books on methodologies. I don't doubt that we need some research in the field, but to be quite frank, methodologies for developing things are, just like design patterns, revealed almost automatically after you have worked in the industry for some time. What happens then is that academics come along, look at these methodologies that have evolved in companies over years of working in the field, lump a bunch of them together, and try to extract some common features of the successful ones. So far so good, to some extent, but then they add their own rationale for these concepts, and proceed to bolt on their own parts of the methodology, to give it their personal touch. The long-term goal of this is name recognition. Now, I know that having a good reputation is something we all want, but there are good and bad ways of going about it.
I can't say anything about other fields of study, since I have only been in the academic world of software development, but I think that most academic research on methodologies, i.e. the meta-layer about getting work done, is pure intellectual masturbation, and it encumbers the whole software development profession with things that might otherwise have become clear using nothing but common sense and experience.
It's fine when teachers teach students about some well-known methodology that is well rooted in reality. That way the students get a head start when they go out into business life (which is the whole point of going to school). What is wrong is when a teacher thinks, "hey, I have a personal opinion about this little thing I consider a flaw in a methodology, so I'm throwing all other information out the window, creating my own methodology, and teaching that to my students." I'm all for teaching students as much as possible, but we need to keep our eye on the ball here. Consolidate knowledge, look for the things that are actually common between methodologies, and teach people that. We don't need 10 methods that kinda-sorta work; we need 2-3 that actually work, all the time.
I guess academia is about finding new things and doing research into areas where business can't afford to go, or hasn't thought of, but that doesn't give academia a license to go on some wild goose chase. When it comes to problems like a new algorithm, finding a new element for the periodic table, or figuring out how to synthesize some new drug, I'm all for it. What I don't like is the meta-reasoning about all these things. At some point the reasoning about the reasoning about the reasoning of something has to stop. How many meta layers can we have before it all turns into philosophy?
I want more concreteness in the things that are taught to students. Let students feel that the time spent in the classroom actually gives them something of value, not just something they might be able to use should they go down a certain path of research in academia. Again, this is very much a carpenter's approach to school, but isn't that what engineering is supposed to be about? I realize software is a young field compared to building bridges or doing mathematics, but I also know that the field develops much faster than other fields have in the past.
I'm rambling, but in summary, less intellectual masturbation in academia (at least regarding what is actually taught to students), and more concrete and correct knowledge.
Feel free to challenge me on this opinion.
Labels: computers, engineering, science, stupid, university