Apple Thinks We Live in a "Post-PC World"
A BBC article brought to my attention that Apple is holding its own developers conference -- at the same time as E3, which I find peculiar -- and has declared that, because they have sold so many iOS devices and are about to roll out iCloud, the PC is, as Kenshiro would say, already dead.
My opinion is in line with Scott Meyer of Basic Instructions: 95% of computers are already running Windows, and an enormous number of programs are still Windows-only. iCloud could hurt M$'s market share, but they're not in danger.
Comments
Life finds a way.
In my case, I didn't get a computer until the end of 6th grade and only because I had spent so much time in the computer lab at school (6th grade was the first year I was in a school that had anything resembling a "computer lab") that my parents finally thought it would be worthwhile to get one for the home. However, I always was fascinated by them and had been begging and pleading my parents for years to get one (now, why they didn't was partially my fault for insisting on various crappy ones, but that's another story). I often checked out books on programming from the library and tried to get "computer time" any time I had access to a friend's or relative's computer. Guess what -- despite my relatively late start (I hadn't been programming since kindergarten, like some folks here have), I still managed to become a programmer and I like to think I'm a pretty good one at that.
Lately, there has been a trend where people develop technologies that profess to scale automagically while keeping things easy for the user. Take Heroku for example. Someone who only learns Ruby on Rails, but doesn't learn anything else, can use Heroku to get hosted. That allows you to get a web site up without understanding SQL, caching, UNIX, or any other part of the system besides your Ruby code. While on the one hand it seems great that a developer can concentrate fully on the one thing they care about without reinventing wheels, I think it's actually a huge problem. With any code, it's the entire computer that matters. Everything from the transistors up to the GUI is important. Not having enough people who understand the entire system is already causing us trouble in the form of things that can't stay up, or have major security holes. It's only going to get worse.
You write some HTML and JavaScript without understanding the underlying HTTP server layer. Result is security problems.
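To make that concrete, here is a minimal sketch (Python standard library only, with a made-up "search" page, not anything from the original post) of the classic mistake: echoing user input straight back into the HTML without escaping it.

```python
# Minimal sketch of a reflected-XSS bug and its fix; names are hypothetical.
from html import escape
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class SearchHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query).get("q", [""])[0]

        # Vulnerable: the raw query lands in the page, so a request like
        # /?q=<script>alert(1)</script> runs script in every visitor's browser.
        # body = f"<p>Results for {query}</p>"

        # Safer: escape anything that came from the client before it hits HTML.
        body = f"<p>Results for {escape(query)}</p>"

        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), SearchHandler).serve_forever()
```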
So you learn the HTTP server layer and how to write a web application, but you never learn SQL. Result is awful performance problems.
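The performance problem this tends to produce is the classic N+1 query pattern: the application loops in code and issues one query per row instead of letting the database do a join. A rough sketch with sqlite3 and a hypothetical users/posts schema:

```python
# Sketch of the N+1 query anti-pattern vs. a single join (hypothetical schema).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, user_id INTEGER, title TEXT);
    INSERT INTO users VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO posts VALUES (1, 1, 'hello'), (2, 1, 'again'), (3, 2, 'hi');
""")

# N+1: one query for the users, then one more query per user.
for user_id, name in conn.execute("SELECT id, name FROM users").fetchall():
    posts = conn.execute(
        "SELECT title FROM posts WHERE user_id = ?", (user_id,)
    ).fetchall()
    print(name, [title for (title,) in posts])

# Same data in a single query: let the database do the join.
rows = conn.execute("""
    SELECT u.name, p.title
    FROM users u JOIN posts p ON p.user_id = u.id
    ORDER BY u.name
""").fetchall()
print(rows)
```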
So you learn SQL and databases. You normalize the database, speed up all the queries and reduce the number of queries. You still have performance issues, wtf?
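"Speeding up all the queries" usually comes down to reading the query plan and adding indexes where the database is scanning. A small sqlite3 sketch, again with a hypothetical posts table:

```python
# Sketch: use the query planner to spot a table scan, then add an index.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, user_id INTEGER, title TEXT)")

query = "SELECT title FROM posts WHERE user_id = ?"

# Before: SQLite reports a full table scan for this query.
print(conn.execute("EXPLAIN QUERY PLAN " + query, (1,)).fetchall())

# Add an index on the column we filter by.
conn.execute("CREATE INDEX idx_posts_user_id ON posts (user_id)")

# After: the plan becomes a search using the new index.
print(conn.execute("EXPLAIN QUERY PLAN " + query, (1,)).fetchall())
```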
So you notice that for some reason your web application written in high level language X is causing Apache processes to grow until they use up all the memory on the machine, to the point of swapping. Now you have to learn about the memory allocation going on below your high level language and inside the Apache web server.
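One common way this happens, whatever the framework, is an unbounded in-process cache: every long-lived worker process accumulates entries until the machine starts swapping. A rough Python sketch of the pattern and a bounded alternative (the cache and function names are made up):

```python
# Sketch: an in-process cache that grows without bound vs. a bounded one.
import functools

# Grows forever: every distinct key adds an entry that is never evicted,
# so each long-lived worker process's memory keeps climbing.
_report_cache = {}

def render_report_unbounded(key):
    if key not in _report_cache:
        _report_cache[key] = "x" * 100_000  # stand-in for expensive work
    return _report_cache[key]

# Bounded: lru_cache evicts least-recently-used entries once maxsize is hit,
# so a worker's memory use levels off instead of growing with traffic.
@functools.lru_cache(maxsize=1024)
def render_report_bounded(key):
    return "x" * 100_000  # stand-in for expensive work

if __name__ == "__main__":
    for i in range(10_000):
        render_report_unbounded(i)   # cache now holds 10,000 large entries
        render_report_bounded(i)     # cache never holds more than 1,024
    print(len(_report_cache), render_report_bounded.cache_info().currsize)
```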
And while you're learning about Apache you have to learn about UNIX and file systems, and you're basically at the bottom turtle. All from just a simple little app with HTML and JavaScript.
Edit: I guess what I'm getting at is similar to what happens with piracy. Someone who pirates a song was never going to buy it, no matter what. Someone who writes Heroku apps was never going to write anything in C anyway. It's not a loss for the hardcore programming community; it's the gain of someone interested who wouldn't have touched it at all before.
Didn't the guys who made Twitter start with RoR and find out, the hard way, that running a site as huge as Twitter has become on RoR is impossible? To be completely honest, IIRC, they were basically the first RoR site that got that large, so they couldn't really have predicted its ability to scale.
However, I wouldn't call a garbage collected language a "better tool" than a non-garbage collected language. The two are different tools despite being similar on the surface -- kind of like the difference between a flat head and Phillips screwdriver. I mean, just because you can use a flat head screwdriver with Phillips screws doesn't mean it's the best tool for the job. Similarly, just because you can use a non-garbage collected language to do a job that can be done with a garbage collected one doesn't mean it's the best language for the job either. On the flip side, a Phillips screwdriver is useless for flat head screws and good luck trying to write an operating system kernel in a garbage collected language (granted, it may be possible depending on the language, but not in the context of today's commonly used ones like Java, Python, Ruby, etc.). Also, just like I'd expect a carpenter to be proficient with both types of screwdrivers, I'd expect any computer programmer to be reasonably proficient with garbage collected and non-garbage collected languages, even if his/her programming specialty of choice typically uses one type of language over the other.
One of the most fun classes I had in college involved copious amounts of assembly programming. Now, I had lots of fun doing assembly programming, but that doesn't mean I'd like to write anything consisting of thousands of lines of code in it.
The point is the choice exists. Barriers to entry are falling constantly, but the skill cap remains as high as it ever was. As for Sony, I think the problem is not bad carpenters, but the fact that instead of good ones they hired little sisters.
Like you said, it appears as if Sony's problem is that they didn't hire good "carpenters."
While it's a bit of hyperbole, I honestly think that this is a possible reality with computers and technology in the future.
And wow, I'm really glad I'm not Andrew's Grandma.
The other analogy that they've made (though not this time around) is that the traditional PC/Mac is equivalent to a truck whereas a tablet like the iPad is equivalent to a regular car. We'll always need/have "trucks" for the folks who need/want them (programmers, designers, digital artists, enthusiasts, gamers, etc.), but the average person who just wants to check their mail and shop online only needs a "car" to do what they want and therefore more people will be owning "cars" instead of "trucks" going forward.
It's an interesting statement in that something like the iPad would be perfect for my mom, who basically only uses her computer for web surfing, email, playing solitaire, and the occasional YouTube video of Brazilian soap operas. A full blown PC, whether Windows or Mac, is overkill and overly complex for her case (as the frequent support calls I get from her would indicate). Presumably many other computer users are in a similar camp. Remember, the folks on this forum do not reflect the "average" computer user.
That's also the point of iCloud -- Apple is going to give free cloud syncing and storage (5 GB or so, I think, depending on what you're storing) so that all your devices aren't physically tied together, whether they're desktop computers, laptops, tablets, or smartphones. They'll also be "equal partners," if you will, although for now the "PC Mothership" will remain the primary storage location for data over and above the 5 GB they give you (for example, most of your photos will remain stored on your PC).
Now, I don't want to sound like I'm blindly parroting what Apple's saying as if it's gospel. I'm not -- I'm somewhat skeptical that this strategy will work. However, I can see where they're coming from and many of their statements do have merit.
Possible counterexample: Library of Alexandria? I still don't think this can happen. Everything is documented. Our knowledge is redundantly distributed around the globe. It would take a catastrophe to completely destroy our knowledge of low-level computing. If that occurred, I think we have bigger problems.
I do agree with you that PCs are heavily used in most offices, but then the standard office isn't one of Apple's core markets. Their core markets are consumers first and "creative" businesses such as graphic design, video editing, etc. second. Apple's statement had nothing to do with the use of the PC in the business realm -- again, "trucks" vs "cars" here.
As far as use in academia, we'll have to wait and see what happens due to this being such a new market. Tablets that don't suck have only been around for about 2 years, whereas the requirement for a student to have a laptop in academia has been around for at least 5 years (and not all colleges require one, for that matter). In addition, the requirement for having a laptop seems to be a blanket statement for everyone regardless of major -- i.e., it doesn't matter if you're majoring in English literature or Nuclear Physics, you all need to have a laptop. However, what an English lit major would need on a laptop (probably just a word processor in addition to the usual internet software) differs greatly from what a Nuclear Physics major would need (the usual internet software plus a technically-oriented word processor, some fancy mathematical software like Mathematica, maybe some programming tools, etc.). From the college's perspective, it makes more sense just to require some "standard" configuration as opposed to providing a list of different systems depending on major. As I said earlier, if all I needed to do in college was word process, web surf, and email, an iPad with Pages probably would've done me just fine. If I needed something that required specialty software, like science, math, or engineering classes, then a more general purpose laptop or desktop would be appropriate.