Personally, I could never take notes and pay attention to a lecture at the same time well enough to justify it. My notebooks were often full of doodles, with maybe the one or two genuinely useful notes that weren't already in the professor's lecture notes, presentation, or the textbook.
Of course, it might be different with a science education. I learned most of my material in the lab, where I actually got to apply things that I had heard in lecture.
You touch on a good point. Note-taking and study skills vary greatly not only with individual aptitude and learning needs, but also with the subject.
If someone needs to hamstring themselves with pen and paper because they can't focus in class when they have a laptop, I would contend that they are too immature for collegiate study (barring true attention deficit issues).
College kids? Immature? Nonsense ;P.
I think what I'm going to do in college is use a pen and paper for all my classes for the first week/2 weeks/month/whatever it takes for me to get a feel of the class and then start bringing a laptop to dick around with in the classes that are worthless. But I personally don't think I'd be responsible enough to use it for actual class notes without ending up wasting time browsing the internet during class.
As for laptop vs. desktop vs. netbook: as far as I'm concerned, netbooks are pieces of shit and can die in a fire. Desktops are too bulky and will tether you to your workstation even though in college you'll be constantly moving around. So laptops are the obvious choice. I'm probably gonna go with a Toshiba, HP, Acer, or Asus laptop. If I go cheap and look for deals, I think there are decent laptops in the $600 range.
Of course, then there was the CS department's phobia of "Foreign" operating systems. The Mac lab did not allow you to use your username/pass, favoring instead shared logins. I'm not sure if they were unable to get it working with their system, or simply unwilling. My guess is the latter, looking at the state of the Windows lab, arguably the largest travesty on campus.
Everything I ever did for CS at RIT was *NIXy. I had nothing to do with Macs, and Windows was only for gaming. The problem of crappy drivers on the Sun machines is sad indeed, but it is a problem only for a very small number of students. In almost every circumstance, SSH + X forwarding to those machines will get the job done.
RIT is all about personal responsibility. That's why the retention rate is so low. If you want to play games in the lab, and as a result you fail, that's your problem. Also, if you were playing Quake in a CS lab, people would think maybe you were working on an AI project.
L.L. Bean bags tend to suffice and not fall apart, unlike their cheaper competitors.
I received an L.L. Bean book bag as a gift when I graduated high school. While it is not the most stylish bag ever made, it still looks brand new after nine years of use (we currently use it as our puppy bag when we travel).
Those bags still put all the weight on one's shoulder. I had a "good" bag, but it was still a lot of weight on one shoulder, even with some of it distributed across my back.
Everything I ever did for CS at RIT was *NIXy. I had nothing to do with Macs, and Windows was only for gaming.
But is it somehow to a student's advantage, particularly in a school that places such emphasis on vocational placement, to have never received any formal instruction in Windows programming of any kind? All four of the co-ops I did used Windows as the primary development platform. I anticipate this format is similar for most corporate environments.
Those bags still put all the weight on one's shoulder. I had a "good" bag, but it was still a lot of weight on one shoulder, even with some of it distributed across my back.
You obviously did not look correctly. There is a cross stabilization strap, which counters the weight on your shoulder, allowing it to be distributed evenly throughout your upper body. While on, it causes no excess strain on your shoulder. If you really want to argue this, go ask Rym or Scott on the matter.
Timbuk2 does, in fact, sell backpacks, including laptop backpacks; they're just very expensive.
You can buy it from secondary sources for cheaper. And believe me, you will get what you pay for. You can bitch and complain all you want about your back hurting, but if you stopped going to the movies, buying booze/cigarettes/whatever, and actually started saving money, then your problems would be fixed. It's about how you prioritize your money. If you cannot save up 80 bucks to get a messenger bag that, in theory, could last you a lifetime, then I feel for you trying to succeed in this world.
While at first it may look like a hefty price tag for a book bag, you should look at it as an investment. You can buy a mediocre bag every 2 or 3 years for the rest of your life and spend 20-30 bucks a pop, or spend 80-100 once and never need to buy another bag again.
Everything I ever did for CS at RIT was *NIXy. I had nothing to do with Macs, and Windows was only for gaming.
But is it somehow to a student's advantage, particularly in a school that places such emphasis on vocational placement, to have never received any formal instruction in Windows programming of any kind? All four of the co-ops I did used Windows as the primary development platform. I anticipate this format is similar for most corporate environments.
This is a good point. Windows is still dominant in the business world, and there are many careers to be had working with the Microsoft platforms. Having no exposure to it in college can be a disadvantage if those are the kinds of jobs you are looking at.
However, a proper CS education is not designed to teach you any specific platform or toolset. It is designed to teach you the fundamentals such that you will be able to learn and develop for any platform with relative ease. You yourself are the example. With no formal training in Windows, you were still able to use it effectively in many jobs.
Now, because the goal is to teach the fundamentals of software, I would argue that UNIX is the only environment in which the fundamentals can be learned. Think back to operating systems class. You learn about things like memory management. These are things that are fundamental to all operating systems, and different algorithms can be mathematically proven to be better or worse. UNIX is the operating system that does all the things in the "right way" from a computer science perspective. Process scheduling in particular has been done "the right way" in UNIX for a very long time, but even the improved schedulers in recent versions of Windows are still funky and weird.
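(As a rough illustration of the kind of OS-class exercise being talked about here, this is a tiny FIFO page-replacement simulator in C that just counts faults for a made-up reference string; the frame count and numbers are arbitrary, it's only a sketch.)

```c
/* Rough sketch: count page faults under a FIFO page-replacement policy.
 * The reference string and frame count are made up for illustration. */
#include <stdio.h>

#define NFRAMES 3

int main(void) {
    int refs[] = {1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5};
    int nrefs = sizeof(refs) / sizeof(refs[0]);
    int frames[NFRAMES];
    int next = 0, used = 0, faults = 0;

    for (int i = 0; i < nrefs; i++) {
        int hit = 0;
        for (int j = 0; j < used; j++)
            if (frames[j] == refs[i]) { hit = 1; break; }
        if (!hit) {
            faults++;
            if (used < NFRAMES) {
                frames[used++] = refs[i];       /* fill an empty frame */
            } else {
                frames[next] = refs[i];         /* evict the oldest page (FIFO) */
                next = (next + 1) % NFRAMES;
            }
        }
    }
    printf("%d faults for %d references\n", faults, nrefs);
    return 0;
}
```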
And for programming, any sort of Microsoft development platform is always weird and non-standard in some way. How can you learn fundamentals of object oriented programming when all this arcane Microsoft-centric shit is getting in the way? Not to mention the fact that documentation is much less available for closed platforms than for open ones.
So yes, you are right. From a vocational perspective, no experience with Microsoft development can be a pain right out of the box. The department should perhaps offer a few elective courses on it for those who are interested. However, the primary goal is to learn fundamentals of computer science and programming languages. Since Microsoft is the perfect example of completely ignoring said fundamentals, it is a very poor environment on which to learn them. This is why UNIX is the correct choice.
However, a proper CS education is not designed to teach you any specific platform or toolset.
And yet, Unix is nothing but a toolset, surrounding a standardized C-based API. I attest that learning how to use makefiles (a required bit of learning in CS4) is just as important as learning how to customize the build process of MSVS.
How can you learn fundamentals of object oriented programming when all this arcane Microsoft-centric shit is getting in the way?
Forgive me, Scott, but Windows MFC is a standardized C++ toolset, retaining both source and binary compatibility back to Windows 95. Unix continues to be C-based.
EDIT: MFC has since been supplanted by .NET, a well-designed API emphasizing proper OOP principles.
To say that teaching Unix somehow helps one learn OOP is laughable; even GTK+ is written in "object oriented C", which relies on dumb casts of void* to get work done under the hood. Qt is OOP, and the community on the whole balks at it. Every major library that you will do work with this year in Unix will be written in C. All of your "unix system calls" are C based. None of this helps a student learn OOP.
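(To make "object oriented C" concrete, here's a generic sketch of the pattern: structs, function pointers, and unchecked casts standing in for classes, virtual methods, and inheritance. It is not actual GTK+/GObject code, just the same flavor.)

```c
/* Generic sketch of "object oriented C": structs, function pointers,
 * and unchecked pointer casts standing in for classes and inheritance.
 * Not real GTK+/GObject code, just an illustration of the pattern. */
#include <stdio.h>
#include <stdlib.h>

typedef struct Shape {
    double (*area)(void *self);   /* "virtual method" as a function pointer */
} Shape;

typedef struct Circle {
    Shape base;                   /* "inheritance" by embedding the parent */
    double radius;
} Circle;

static double circle_area(void *self) {
    Circle *c = (Circle *)self;   /* dumb cast; nothing checks it's really a Circle */
    return 3.14159265358979 * c->radius * c->radius;
}

static Shape *circle_new(double radius) {
    Circle *c = malloc(sizeof *c);
    c->base.area = circle_area;
    c->radius = radius;
    return (Shape *)c;            /* "upcast" to the base struct */
}

int main(void) {
    Shape *s = circle_new(2.0);
    printf("area = %f\n", s->area(s));
    free(s);
    return 0;
}
```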
In fact, the only time we used any Unix calls was in OS1, where we had to learn the "unix fundamentals". Could this somehow not have been learned in a Cygwin shell?
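(For reference, a minimal sketch of the sort of "unix fundamentals" exercise in question: fork a child process and wait for it. These are plain POSIX calls, which Cygwin does implement, so the same file builds there too.)

```c
/* Minimal sketch of the classic OS1-style exercise: fork a child,
 * have it do some "work", and wait for it in the parent.
 * Uses only POSIX calls, so it also builds under Cygwin. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    pid_t pid = fork();
    if (pid < 0) {
        perror("fork");
        return EXIT_FAILURE;
    }
    if (pid == 0) {                        /* child */
        printf("child:  pid %d\n", (int)getpid());
        exit(42);
    }
    int status = 0;                        /* parent */
    waitpid(pid, &status, 0);
    if (WIFEXITED(status))
        printf("parent: child %d exited with %d\n",
               (int)pid, WEXITSTATUS(status));
    return 0;
}
```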
And to say unix is really standardized is a bit of a laugh; most configure scripts are thousands and thousands of lines long, just to detect the non-standard bits around every unix distribution.
Finally: nothing says "arcane" like using printf in the year 2009 when type-safe printing has been available in the C++ STL since 1994.
UNIX is the operating system that does all the things in the "right way" from a computer science perspective. Process scheduling in particular has been done "the right way" in UNIX for a very long time, but even the improved schedulers in recent versions of Windows are still funky and weird.
Windows NT has used kernel-based preemption for quite some time, the same preemption that was added only in Linux 2.6. This is a reach at best. Also: academic exercises are operating system neutral.
Are there any engineers here who can chime in? I'm pretty sure I'm going into Mechanical Engineering, and I've been considering a MacBook Pro 15" with all the fixins. It's light, and heavy on power for modeling and drafting; however, if there's a straight-up Windows machine that's still ultralight, cheaper, and better, please clue me in; I'd be Boot Camping on the Mac to get my Windows stuff done. I'd also like to be able to use it for gaming (on an external monitor, of course).
First of all, when I was in school, there was no .NET yet.
Let me give a real world example of why you use UNIX to learn programming, and not Windows.
On any *NIX machine you can use a text editor to write a helloworld.c in just a few lines. You then use whatever C compiler you have, usually gcc, and it will create an executable that just works, easily handles standard input and standard output, and interacts with a host of command-line tools.
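(Concretely, something like the sketch below: it prints a line, then echoes standard input to standard output, so it drops straight into pipes and the rest of the command-line tools. The filename and build command are just examples.)

```c
/* helloworld.c -- the whole "environment" is a text editor and a compiler.
 * Build: gcc helloworld.c -o helloworld
 * Run:   ./helloworld        (type some lines, or pipe something in) */
#include <stdio.h>

int main(void) {
    int c;
    printf("hello, world\n");
    while ((c = getchar()) != EOF)   /* echo stdin to stdout */
        putchar(c);
    return 0;
}
```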
With Windows you can try to use Visual Studio. The problem that I've always had with Visual Studio, and other IDEs of that nature, is that there are millions of GUI options all over the place that can prevent your code from working. I shit you not, I've written helloworld.c in Visual Studio and had it give me compiler errors because some GUI option in some menu somewhere was not set correctly. That gets in the way of learning; it doesn't help it. To have a proper learning environment there needs to be nothing between you, your code, and it running. That way you can always be sure it's your code that's wrong, and not something else.
Sure, you could use Cygwin on Windows, or even try to install gcc on Windows and use the command prompt. But be honest, Cygwin is poop. Its C compiler is non-standard and fucked up with the MinGW shit. Have you ever tried to actually download an open source program and compile it in Cygwin? It's a nightmare.
UNIX is the environment in which you can get your hands dirty. Windows just gets in the way. Think of it this way: why do they make you learn addition and subtraction when there are calculators? Should they? This is why you should learn printf, pointers, and malloc. Teaching only high level programming is fine for a software engineer, but not for a computer scientist. UNIX is the place to learn those things.
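(A minimal sketch of what the printf/pointers/malloc trio looks like in practice; the array and values here are arbitrary.)

```c
/* Tiny sketch of printf, pointers, and malloc: allocate an array on the
 * heap, fill it through a pointer, print it, and free it. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    size_t n = 5;
    int *nums = malloc(n * sizeof *nums);    /* heap allocation */
    if (nums == NULL) {
        fprintf(stderr, "out of memory\n");
        return EXIT_FAILURE;
    }
    for (size_t i = 0; i < n; i++)
        *(nums + i) = (int)(i * i);          /* pointer arithmetic */
    for (size_t i = 0; i < n; i++)
        printf("nums[%zu] = %d\n", i, nums[i]);
    free(nums);                              /* you clean up after yourself */
    return 0;
}
```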
Now, let me step to the side a bit and say a few other things.
First of all, I think there is a lot to be said for virtualization. Because of things like VirtualBox, I really don't have much of a need to install Linux directly on any hardware. My desktop runs Vista, but I do my development in many different virtual machines. It's better for programming in about a million ways, and it should definitely be utilized in computer science education. Heck, a Mac is actually a good choice for this, because you can get Mac, Windows, and Linux all going at once.
There are things out there like Smalltalk Squeak. Squeak is a virtual machine environment written entirely in Smalltalk, from top to bottom. It runs identically on all major operating systems, and some others. It's probably the best platform for learning object oriented programming that exists. I believe that the OLPC even uses it. If you haven't tried it, you should. It's free, and it does amazing stuff as soon as you start it up. Big points for instant gratification.
For many reasons, a university should avoid teaching proprietary things. I'm sure Microsoft would really love it if all the students learned Visual C#, but is that really right? Not only might that force students to spend money on Microsoft products, it opens the question of whether an academic institution has perhaps crossed some ethical boundaries in its ties to a for-profit corporation. Computer science education should be as open source and free as possible. While it is permissible for students to learn a wide variety of tools, even if some are proprietary, the concentration of the education should be on things that are true across the board.
Also, I think that regardless of what languages and platforms are used, there needs to be more concentration on the web. In all my days at RIT, I never once learned anything about the web in a formal class. I knew HTML and CSS in high school. I did some MySQL in a database class, but some other team member did the PHP parts. Making native applications is a great way to learn the aforementioned fundamentals, but if you want to talk about learning things that are vocationally relevant, the web is where it is at. My job is as a web developer, and I learned everything I know about web development outside of formal education. Granted, the formal education gave me the lower level knowledge that made learning the web much easier.
Lastly, I think there's an underlying philosophical problem in computer science education. Do you teach classical computer science, Turing machines and such, or do you teach vocationally relevant skills? Do you go for the academics and learn fancy algorithms, or is everyone going to go get developer jobs? You seem to be espousing the latter. I take sort of an in-between approach. I think that learning the seemingly irrelevant computer science bits puts you in a position such that all the real-world application stuff becomes trivial. Once you have a firm grasp of the core concepts of computing, you have everything you need to tackle everyday development tasks, even if they are using languages and tools you have no experience with.
.NET was released in 2002, and the betas were available while you were still in high school...
Brand new technology like that has no place in an academic environment. Sure, .NET is here now, but what if it wasn't? When people are paying for an expensive university, it can't be teaching something that isn't proven yet.
You also dodged the point about everything under the hood in *NIX being an unstandardized, archaic mess. ^_~
What's an archaic mess? Sure, it's old, but it's not a mess. It's straightforward, if not obvious. Everything is a file. The interface may be archaic for a modern desktop user, but not for a developer.
I think the point I'm making is that UNIX is made by programmers for programmers. When you learn to think in a UNIX way, you learn to think in a computer science way. You can learn computer science on another platform, but everything has to be done with all sorts of tools you install that merely run on top of the OS. Whereas with UNIX, the operating system itself is very much integrated with the toolset.
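(A small sketch of the "everything is a file" idea: the same open/read/close calls work on a regular file, a /proc entry, or a device node, with no special API for each. The paths below are Linux-specific examples and may not exist everywhere.)

```c
/* Sketch of "everything is a file": the same open/read/close calls work
 * whether the path is a regular file, a /proc entry, or a device node.
 * The paths used here are Linux-specific examples. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

static void dump(const char *path) {
    char buf[256];
    int fd = open(path, O_RDONLY);
    if (fd < 0) {
        perror(path);
        return;
    }
    ssize_t n = read(fd, buf, sizeof buf - 1);
    if (n > 0) {
        buf[n] = '\0';
        printf("%s: %s\n", path, buf);
    }
    close(fd);
}

int main(void) {
    dump("/etc/hostname");   /* ordinary file */
    dump("/proc/uptime");    /* kernel state exposed as a file */
    dump("/dev/urandom");    /* device node; output is raw bytes */
    return 0;
}
```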
I used it at RIT in class. They told us (rightly) that it was the future of Windows development.
Not so rightly. Microsoft is notorious for constantly changing its APIs: COM, OLE, MFC, Direct*, .NET. They constantly changed this stuff over and over again. Anyone who thought .NET had any more staying power than anything that had come before was not looking at history. And even now, .NET really isn't so big because nobody is making desktop applications. It wasn't until very recently that Microsoft made any headway into the web, with ASP.NET MVC. And even with desktop applications, things like Adobe Air are the new hotness. That doesn't even mention the fact that Mono has sort of stolen the .NET show away from Microsoft.
Do you teach classical computer science, Turing machines and such, or do you teach vocationally relevant skills?
And here we cut to the core of my problem with the way RIT does business. RIT is a vocationally-oriented school, without a doubt. The emphasis on co-ops, job placement ... it was designed to put people in jobs. The CS department, willfully or unknowingly, has shifted towards pure academia, and I believe this decision has been for the worse. Yes, I agree, MSVS can be a pain sometimes, but to be sending kids out into the world with formal training in Rational Rose and Emacs alongside their "vocational degree" rather than Eclipse and MSVS is practically criminal in the year 2009.
A brief aside: CS majors have to take Software Engineering 1, which forces you to learn Eclipse. It is a 10 week course, and in a normal CS curriculum it is the first and last time you see it.
Not only that, but I believe the CS department enforces the exact kind of tunnel vision that I find particularly counter-productive. Teaching students that Unix is the only way to go and every other system is intrinsically inferior is detrimental at best and deceptive at worst. There should be well rounded discussions of the pros and cons of every system, as well as at least some exposure to the practices of other operating systems. Yes, I am proposing that OS1 include a brief tour of WinAPI as well as a brief tour of Unix calls. Yes, I think students should be made to do a project in Objective-C, probably in PLC. Maybe PLC could be taught exclusively on OSX; it certainly wouldn't be a bad idea.
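(For a sense of what such a side-by-side tour could look like, here's the same trivial task, writing one line to a file, sketched in both the Win32 and POSIX styles; the #ifdef just picks whichever applies at compile time.)

```c
/* Sketch of the same trivial task in both APIs: write one line to a file.
 * On Windows it uses Win32 CreateFile/WriteFile; elsewhere, POSIX open/write. */
#include <string.h>

#ifdef _WIN32
#include <windows.h>

int main(void) {
    const char *msg = "hello from Win32\n";
    DWORD written = 0;
    HANDLE h = CreateFileA("out.txt", GENERIC_WRITE, 0, NULL,
                           CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return 1;
    WriteFile(h, msg, (DWORD)strlen(msg), &written, NULL);
    CloseHandle(h);
    return 0;
}

#else
#include <fcntl.h>
#include <unistd.h>

int main(void) {
    const char *msg = "hello from POSIX\n";
    int fd = open("out.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0)
        return 1;
    write(fd, msg, strlen(msg));
    close(fd);
    return 0;
}
#endif
```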
Either way, in this day and age where interoperability is becoming increasingly important, the RIT CS department's blinders-on view of the world is hurting nobody but its students.
And even with desktop applications, things like Adobe Air are the new hotness. That doesn't even mention the fact that Mono has sort of stolen the .NET show away from Microsoft.
Yeah, you hit two very important points. The web, AIR, Mono ... all it shows is that people are looking more towards cross-platform solutions, and not something that will bind them simply to Windows, or anywhere else for that matter. Now all Adobe needs to do is get their shit together a little bit and all will be right as rain.
The problem with focusing the core of education on specific technologies as opposed to the academic concepts behind them is that they will be outdated within a couple years. While it may be easier for students to get jobs right out of the gate in their first year or two after graduation, it will be increasingly harder for them to keep up with new concepts. However, focusing on the core concepts of computing will allow students to teach themselves the languages and technologies more efficiently and more quickly than someone taught only specific development environments/languages.
I'm not a master of any specific language or environment; however, I have a very strong foundation in computing concepts and algorithms. I can learn new languages in a couple of weeks at most because the concepts carry over. Leave the "Best Enterprise Solution in today's Emerging Market" shit for the corporate world, teach how to prove NP-Completeness in CS courses.
Leave the "Best Enterprise Solution in today's Emerging Market" shit for the corporate world, teach how to prove NP-Completeness in CS courses.
And in a purely academic environment I would agree with you 100%. RIT, however, is far from a purely academic environment, despite what the CS department might think. Rather than viewing cross-platform experience as a hindrance, I see it as an opportunity to teach students several implementations of the same ideas, while exposing the strengths and weaknesses of each. This is particularly true in classes like Operating Systems and Programming Language Concepts (both required).
This is coupled with the fact that there is only one required theory course at RIT: Computer Science Theory, which is presumably the first and last exposure you will have to computational complexity in your time there.