When I was in high school (late 90s), they had kind of a weird curriculum for teaching programming. We had AP classes for Computer Science A/AB, which you had to be at least a sophomore to take. There was also a class called 'Computer Programming' that anyone could take, which I did as a freshman, although the students in the class were actually pretty evenly distributed between all four grades.
Basically we were taught QBASIC and given assignments. The first few weeks seemed pretty mundane, as the OP describes, getting into conditionals, loops, etc., but after we learned those, my teacher told us to make something like a 'Choose Your Own Adventure' text-based game. I remember loving that assignment and even compiling it as an EXE and sending it to my friends. That pattern continued through the whole year: learn some new programming concept (arrays, functions, etc.), then make some sort of game as an assignment. We had the usual "write a program to display all the factors of a number" assignments too, but I just remember loving the game projects. I didn't know anything about Big-O or AVL trees or whatever, just that I could create cool stuff on a computer.
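To give a flavor of the kind of thing we built, here's a minimal sketch of that sort of assignment: a table of scenes and a loop reading choices. (This is Python rather than the original QBASIC, and the scenes here are invented, but the structure is the same.)

```python
# A minimal 'Choose Your Own Adventure' skeleton: each scene is a
# description plus a mapping from the player's choice to the next scene.
scenes = {
    "start": ("You wake up in a dark cave. Go LEFT or RIGHT?",
              {"left": "pit", "right": "tunnel"}),
    "pit": ("You fall into a pit. The end.", {}),
    "tunnel": ("You crawl through a tunnel and see daylight. You win!", {}),
}

scene = "start"
while True:
    text, choices = scenes[scene]
    print(text)
    if not choices:  # a scene with no choices ends the game
        break
    answer = input("> ").strip().lower()
    scene = choices.get(answer, scene)  # unknown input repeats the scene
```

Nothing fancy, but adding a new room or ending was just adding an entry to the table, which is exactly what made it fun to keep extending.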
In 10th grade I took Computer Science A, and about half the class hadn't taken Computer Programming already. The material was a lot drier, obviously, and I remember a lot of those students switching out. I saw the same thing my freshman year of college -- no 'Choose Your Own Adventure' games as assignments, just grueling midterms on polymorphism and inheritance.
So this post really resonated with me, because my thoughts have echoed it for a while -- why isn't software engineering taught as a discipline that lets you implement and create, since that's exactly what it is?
> So this post really resonated with me, because my thoughts have echoed it for a while -- why isn't software engineering taught as a discipline that lets you implement and create, since that's exactly what it is?
Because most colleges teach "computer science", not "software engineering". Computer science is primarily about abstract concepts, not necessarily about doing anything worthwhile with them. (This is similar to asking, "Why doesn't physics teach me how to build X?" Because that's mechanical engineering!) Admittedly, there's hopefully some occasion to apply the concepts rather than just memorize them, but the focus should be on the science, not the engineering.
In my curriculum, software engineering was one of the possible senior classes (alongside OSes, compilers, etc., almost all focused on application).
This is a great question, and the answer is that universities serve their own institutional needs, not those of business, or of society, or even of students. I remember the shock when I was in college trying to learn Japanese and I asked why, after the first couple of years of language, all the Japanese classes switched to literature or obscure linguistics. I told them that I wanted classes in advanced language that would teach me how to live and work at a professional level in Japan: present in a business meeting, read a contract, etc. I claimed that going from basic language to advanced, modern, professional Japanese was a lot higher priority than a switch to archaic, literary Japanese or obscure linguistic analysis. They reacted with outrage, claiming that if they were to do what I was suggesting, they "wouldn't have any academic credibility at all!"
Ah, that's what it's about. Not my needs but theirs: not real-world skill but academic credibility.
Of course it would be what I needed if, and only if, I intended to become an academic in the field myself, but that is what universities are: farm leagues for finding and developing future professors.
Since professors are only a small portion of the workforce, most people weren't even supposed to attend universities. Most people were supposed to enroll in other educational programs (tech schools, art/cooking/etc. academies, apprenticeships, and so on) if they intended to work for a living.
But these days no employer wants you without a university degree, so universities can go on serving themselves and you'll pay for it anyway, because the end result for non-professors is a general-purpose "degree" credential that employers use as a proxy for generic employment qualification.
I'm not saying that computer science isn't important, just that it is a higher priority for academia than for business, and academia serves academia.
True. And CS is still one of the more practical degrees. But in the end, faculty are judged on their research and on producing PhDs who cite their research, so post-undergrad employment isn't a big concern.
What you're looking for is a 2-year tech degree in programming. Universities aren't about churning out factory workers; they're about education. That's why, when you go to a good college or university, you'll be studying history, languages, physics, etc. alongside your major. It's not just about being a good cog; it's about being a well-educated person.
I'm not convinced of this at all (this is the first time "citation please" popped into my mind, and I hate that response :). Of course, there are some problems that have a strong need for theoretical background, but those positions tend to require a graduate degree.
On the other hand, I've seen self-taught engineers who have far more passion for developing software, which seems (in my experience) to correlate more strongly with productivity.
I have a CS degree from an institution with an extremely strong theoretical computer science department. I value my degree and the knowledge I gained to get it, but, beyond getting me in front of hiring managers early in my career, I don't think it particularly made me a better developer. What it gave me was a little understanding of algorithms and data structures (much of the need for which is obviated by today's VMs and libraries) and some understanding of what is really going on under the hood (again, not as helpful in the VM world).
For me, a degree is a nice-to-have, but you won't get hired if you can't convince me you are a good learner.
I'm not talking about self-taught vs. college educated. Computer science can be self-taught. You can buy all the books at a regular bookstore and you don't need special lab equipment or anything. If you're a self-taught developer with a passion for the profession and you're a good learner, you'll probably end up learning some CS along the way.
I still don't see a correlation (we certainly agree on whether a particular slip of paper is useful), so let's turn it around: what is it about computer science -- in the realm of Big-O, lambda calculus, Turing machines, discrete mathematics, and such (obviously I've left a bunch out and intentionally started with the more theoretical side; feel free to ground me in somewhat more practical CS) -- that you think makes for better developers?
I totally agree that an attitude of self-teaching makes better developers. There is a lot in the application of CS, but I would call that Software Engineering.
Well, there's breadth of expertise. Part of CS is systems, which includes networking, operating systems, compilers, and so forth. If you're building any of those things, or leaning heavily on them, you want to hire people who are grounded in those fields. The same goes for things like AI, machine learning, data mining, and so forth.
More generally, having seen more kinds of software broadens one's way of approaching programming problems, so even if you don't directly use anything you saw when you studied operating systems or compilers or AI, you can borrow general approaches and ideas from those fields.
If you're interested in writing performant software, you'll care about big-O, algorithms, and data structures. Understanding algorithms and data structures enables you to intelligently choose and apply them even if you don't have to develop them from scratch.
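A toy illustration of the kind of choice that understanding enables (my own sketch; the data is made up): deduplicating with membership tests against a list is O(n^2), because every `in` check is a linear scan, while a set makes the whole pass O(n) on average.

```python
import time

def dedupe_with_list(items):
    """O(n^2): each 'in' test scans the whole list seen so far."""
    seen = []
    for item in items:
        if item not in seen:  # linear scan
            seen.append(item)
    return seen

def dedupe_with_set(items):
    """O(n): each 'in' test is an average O(1) hash lookup."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:  # hash lookup
            seen.add(item)
            result.append(item)
    return result

data = list(range(10_000)) * 2  # 20,000 items, half duplicates

start = time.perf_counter()
dedupe_with_list(data)
print(f"list: {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
dedupe_with_set(data)
print(f"set:  {time.perf_counter() - start:.3f}s")
```

Same output, wildly different running time as the input grows -- and you only know to reach for the set if you know what the list version costs.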
> Because most colleges teach "computer science", not "software engineering". Computer science is primarily about abstract concepts, not necessarily about doing anything worthwhile with them.
A computer science curriculum (even a theoretical one) need not consist only of abstract concepts. And it need not exclude software just because the purpose is a theoretical curriculum. I've always been a fan of projects whose primary purpose is to help you learn the concepts. My favorite courses have been the type where the problem sets are: "read this paper; understand it; implement the algorithms; write a report with results and discussion."
> why isn't software engineering taught as a discipline that lets you implement and create?
It is. But not in school, or at least not in most schools. Many working web programmers appear to have gotten their start by picking up PHP, HTML, CSS, and snippets of jQuery via online tutorials and a lot of tinkering.
But this process has little or nothing to do with either CS or OO. My impression is that when teachers say "introductory programming" they usually mean either "introductory computer science" or "introduction to OO in Java". I'm a much bigger fan of the former than the latter, but Java is an overdesigned and slow-to-learn path to actually implementing anything fun, and CS is not really about the pragmatics of implementation, just as mathematics is not really about accountancy or computer graphics.
This pretty much mirrors my experience in college. From 1998 to 2000 I took Computer Science (and did well) but eventually switched to MIS. Why? Because I spent two years writing terminal-based C++ apps and became so frustrated at not being able to make real-world programs that I just gave up. It was so frustrating to watch the first web bubble develop while sitting in class learning and writing something that seemed so completely different.
I don't know if it was my fault or their fault for not being able to bridge the gap between theory/learning and practical skills. Probably some of both. If my courses had somehow seemed more relevant or at least had a few web-based projects where I could see the real-world application of what I was learning, my career path would probably be very different today.
Same here. I started out as a CS major but got my degree in CIS because the courses and material were much more real-world, and more fun. I was doing practical programming in my first course, learning about web apps, creating databases, and I even learned about hardware and how to build my first computer (not in a low-level engineering sense, but piecing parts together like modern-day enthusiasts do). For someone coming from a music background with little to no serious computer experience prior to that time in my life, CIS resonated with me much more.
However, I always found myself missing the math aspects of CS, and to this day I regret not having the lower-level foundation that CS offers over CIS/MIS. In hindsight, I would have stuck with CS.
Really interesting. My first high school programming class was exactly the same, and also during the late 90s. We learned QBasic and created a choose-your-own-adventure game. (My class was actually taught by a math teacher who barely knew QBasic.) I also took the advanced classes later, which were VB and then C++, and did notice a significant drop-off. But I still enjoyed it.
Did we go to the same high school? :) I wonder how common that class structure was back then.