
Apple Thinks We Live in a "Post-PC World"

edited June 2011 in Technology
A certain BBC article has brought to my attention that Apple is holding its own developers conference -- at the same time as E3, which I find peculiar -- and has said that because they have sold so many iOS products and are about to roll out iCloud, the PC is, as Kenshiro would say, already dead.

My opinion is in line with Scott Meyer of Basic Instructions: 95% of computers are already running Windows, and an unbearable number of programs are still Windows-only. iCloud could hurt M$'s market share, but they're not in danger.

Comments

  • Well, until PC gaming dies, the PC reigns supreme for me :-p
  • The only thing the iCloud will bring is a new service for hackers to try and hack.
  • The only thing the iCloud will bring is a new service for hackers to try and hack.
    Like I said, I can't wait for the hack that tricks iMatch into matching every song ever.
  • The PC is not dead, but we are definitely at the point right now where you can officially hand grandma an iPad and never have to worry about her again. Wireless software updating, syncing, and backups take care of every conceivable reason why she would need to maintain a PC. I'll never get rid of mine.
  • As a developer, I'll always need some sort of PC. I'm just worried about the future of developers. How can one learn to write software if all you've ever used is iOS/Google Cloud applications? At some point, we'll have the computer literate and those who have the computer gnosis.
  • As a developer, I'll always need some sort of PC. I'm just worried about the future of developers. How can one learn to write software if all you've ever used is iOS/Google Cloud applications? At some point, we'll have the computer literate and those who have the computer gnosis.
    We got an Internet at work. He's eager to learn, but he had pretty much never programmed anything before college. I programmed in kindergarten. If we'd had a Macintosh instead of an Apple // at that time, I probably wouldn't be able to program today.
  • We got an Internet at work.
    We also have an Internet.
  • We got an Internet at work.
    We also have an Internet.
    I've got 10 Internets.
  • I'm just worried about the future of developers. How can one learn to write software if all you've ever used is GUI-based OSes? At some point, we'll have the computer literate and those who have the computer gnosis.
    I'm just worried about the future of developers. How can one learn to write software if all you've ever used is automatic memory allocation? At some point, we'll have the computer literate and those who have the computer gnosis.
    I'm just worried about the future of developers. How can one learn to write software if all you've ever used is high-level languages? At some point, we'll have the computer literate and those who have the computer gnosis.
    [image]
    Life finds a way.
  • edited June 2011
    [image]
    Life finds a way.
    Want to know why Sony's servers are always fucked? Because they think that knowing a little HTML and JS is enough to write a secure website. Eventually, the number of layers of abstraction will break the camel's back. If you think a stack overflow is just that website you go to for answers to your jQuery questions, your system is fucked.
    Post edited by Andrew on
  • edited June 2011
    As a developer, I'll always need some sort of PC. I'm just worried about the future of developers. How can one learn to write software if all you've ever used is iOS/Google Cloud applications? At some point, we'll have the computer literate and those who have the computer gnosis.
    The kids who want PCs will get PCs somehow. The idea of every home having some sort of computer has only been around since the early to mid 90's -- before then, PCs in the home were kind of a rarity. Yet kids who really wanted computers often did get them and those kids often became programmers. Hell, there are programmers who have been around since before there were home PCs. If you have interest in programming, you'll find a way to get the stuff you need to program.

    In my case, I didn't get a computer until the end of 6th grade, and only because I had spent so much time in the computer lab at school (6th grade was the first year I was in a school that had anything resembling a "computer lab") that my parents finally thought it would be worthwhile to get one for the home. However, I was always fascinated by them and had been begging and pleading with my parents for years to get one (now, why they didn't was partially my fault for insisting on various crappy ones, but that's another story). I often checked out books on programming from the library and tried to get "computer time" any time I had access to a friend's or relative's computer. Guess what -- despite my relatively late start (I hadn't been programming since kindergarten, like some folks here have), I still managed to become a programmer, and I like to think I'm a pretty good one at that.
    Post edited by Dragonmaster Lou on
  • Want to know why Sony's servers are always fucked? Because they think that knowing a little HTML and JS is enough to write a secure website. Eventually, the number of layers of abstraction will break the camel's back. If you think a stack overflow is just that website you go to for answers to your jQuery questions, your system is fucked.
    Yup. See, most people who code never actually work on anything serious or big. There are very few big things and lots of little things. It's very easy to make a little thing that works. Anyone can read a few tutorials and make an app or site that appears to work. Even though it may appear to work, there is guaranteed to be a gigantic pile of flaws that are completely invisible to anyone without lots of advanced knowledge. Those problems may never rear their ugly heads if you are so small that your software is hardly ever used. Once your software starts to get used by lots of people in the real world for serious work, it will fail catastrophically.

    Lately, there has been a trend where people develop technologies that profess to scale automagically while keeping things easy for the user. Take Heroku, for example. Someone who only learns Ruby on Rails, and doesn't learn anything else, can use Heroku to get hosted. That allows you to get a web site up without understanding SQL, caching, UNIX, or any other part of the system besides your Ruby code. While on the one hand it seems great that a developer is able to concentrate fully on the one thing they care about without reinventing wheels, the fact is that it's a huge problem. With any code, it's the entire computer that matters. Everything from the transistors up to the GUI is important. Not having enough people who understand the entire system is already causing us trouble in the form of things that can't stay up, or have major security holes. It's only going to get worse.
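
    To make the "invisible flaws" point concrete, here's the kind of thing I mean -- a made-up sketch of my own (hypothetical table and function names), not anybody's real code:

        import sqlite3

        # Tutorial-grade lookup: passes every demo its author ever ran.
        def find_user_unsafe(db, username):
            # String-built SQL -- an invisible flaw until someone sends
            # username = "x' OR '1'='1" and gets the whole table back.
            query = "SELECT id, name FROM users WHERE name = '%s'" % username
            return db.execute(query).fetchall()

        # The fix is one line, but you only know to write it if you understand
        # the SQL layer sitting underneath your app code.
        def find_user_safe(db, username):
            return db.execute(
                "SELECT id, name FROM users WHERE name = ?", (username,)
            ).fetchall()

        if __name__ == "__main__":
            db = sqlite3.connect(":memory:")
            db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
            db.executemany("INSERT INTO users (name) VALUES (?)",
                           [("alice",), ("bob",)])
            print(find_user_unsafe(db, "x' OR '1'='1"))  # dumps every row
            print(find_user_safe(db, "x' OR '1'='1"))    # returns nothing

    Both versions "work" in the happy path; only one of them survives contact with the real world.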
  • edited June 2011
    With any code, it's the entire computer that matters. Everything from the transistors up to the GUI is important. Not having enough people who understand the entire system is already causing us trouble in the form of things that can't stay up, or have major security holes. It's only going to get worse.
    It's funny you should mention that. It reminds me of arguments I had back when my old college's CS department went from primarily using C and C++ in their non-intro classes to pretty much 100% Java, except for the operating systems and graphics classes, which remained in C and C++. I argued that getting rid of the requirement that every student take at least one class where they do projects of significant size in a non-garbage collected language (typically the "large-scale" software engineering class required of all CS majors -- originally it was done in C++, but it switched to Java some time after I graduated) was actually hurting their CS education. I got countered by folks claiming that it's not really necessary to deal with manual memory allocation anymore unless you're going into operating systems or high-performance graphics, and that the "introductory assembly language and computer architecture" class would be good enough for everyone else. Now, that architecture class was a really good class for what it was intended to teach (I was an undergrad TA for it my last semester in college), but it certainly did not compare to a class where you write something significant in a lower-level language. The most complicated programs you wrote in that class were essentially a simple binary tree in MIPS assembly and a fairly simple compiler (a language with a simplified grammar, no optimization, and only MIPS assembly output that you could feed to the same assembler we used for the rest of the class), which you had your choice of writing in either C++ or Java -- and pretty much everyone did it in Java (I was the only person I knew of who did it in C++, since I didn't know Java all that well at the time -- long story).
    Post edited by Dragonmaster Lou on
  • It's turtles all the way down.

    You write some HTML and JavaScript without understanding the underlying HTTP server layer. Result is security problems.
    So you learn the HTTP server layer and how to write a web application, but you never learn SQL. Result is awful performance problems.
    So you learn SQL and databases. You normalize the database, speed up all the queries and reduce the number of queries. You still have performance issues, wtf?
    So you notice that for some reason your web application, written in high-level language X, is causing Apache processes to grow until they use up all the memory on the machine to the point of swapping. Now you have to learn about the memory allocation going on below your high-level language and the Apache web server.
    And while you're learning about Apache you have to learn about UNIX and file systems, and you're basically at the bottom turtle. All from just a simple little app with HTML and JavaScript.
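
    Just to hang some code on the SQL turtle above -- a hypothetical sketch with made-up tables, not any real app:

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
            CREATE TABLE posts    (id INTEGER PRIMARY KEY, title TEXT);
            CREATE TABLE comments (id INTEGER PRIMARY KEY, post_id INTEGER, body TEXT);
        """)
        db.executemany("INSERT INTO posts (title) VALUES (?)",
                       [("post %d" % i,) for i in range(1000)])
        db.executemany("INSERT INTO comments (post_id, body) VALUES (?, ?)",
                       [(i % 1000 + 1, "hi") for i in range(5000)])

        # What the framework tutorial teaches: loop in app code. Looks clean,
        # but issues 1 + 1000 queries per page load (the classic N+1 problem).
        def comment_counts_naive():
            counts = {}
            for (post_id,) in db.execute("SELECT id FROM posts"):
                (n,) = db.execute("SELECT COUNT(*) FROM comments WHERE post_id = ?",
                                  (post_id,)).fetchone()
                counts[post_id] = n
            return counts

        # What you write once you actually learn SQL: one query with a JOIN.
        def comment_counts_sql():
            return dict(db.execute("""
                SELECT posts.id, COUNT(comments.id)
                FROM posts LEFT JOIN comments ON comments.post_id = posts.id
                GROUP BY posts.id
            """))

    Both functions return the same answer; the first one just quietly hammers the database a thousand times per request, and nobody notices until real traffic shows up.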
  • Starfox and Andrew have officially made my day a bright and happy place.
  • edited June 2011
    See, most people who code never actually work on anything serious or big.
    Has it ever been different? Back in the 80's, how many people were working on gcc, and how many people were making pong clones in BASIC? For every Doom II there are hundreds (thousands?) of text adventures. Maybe we should go back to punch cards so we know how to program the "right" way. I don't buy all this gloom and doom. Having better tools at your disposal is not a bad thing.

    Edit: I guess what I'm getting at is similar to what happens with piracy. Someone pirates a song; they were never going to buy it, no matter what. Someone writes Heroku apps; they weren't ever going to be writing anything in C anyway. It's not a loss for the hardcore programming community, it's the gain of someone interested who wouldn't have even touched it before.
    Post edited by Starfox on
  • See, most people who code never actually work on anything serious or big.
    Has it ever been different? Back in the 80's, how many people were working on gcc, and how many people were making pong clones in BASIC? For every Doom II there are hundreds (thousands?) of text adventures. Maybe we should go back to punch cards so we know how to program the "right" way. I don't buy all this gloom and doom. Having better tools at your disposal is not a bad thing.
    Better tools are all well and good, but I agree with Scott that you need a deeper understanding, or else you will blindly stumble into some pitfall.

    Didn't the guys who made Twitter start with RoR and find out, the hard way, that running a site as huge as Twitter has become on RoR is impossible? To be completely honest, IIRC, they were basically the first RoR site that got that large, so they couldn't really have predicted its ability to scale.
  • Maybe we should go back to punch cards so we know how to program the "right" way. I don't buy all this gloom and doom. Having better tools at your disposal is not a bad thing.
    Depends on what you mean by "better tools." If you mean things like better text editors, code indexers, build systems, compilers, debuggers, and so on, yeah, those are always good.

    However, I wouldn't call a garbage collected language a "better tool" than a non-garbage collected language. The two are different tools despite being similar on the surface -- kind of like the difference between a flat head and Phillips screwdriver. I mean, just because you can use a flat head screwdriver with Phillips screws doesn't mean it's the best tool for the job. Similarly, just because you can use a non-garbage collected language to do a job that can be done with a garbage collected one doesn't mean it's the best language for the job either. On the flip side, a Phillips screwdriver is useless for flat head screws and good luck trying to write an operating system kernel in a garbage collected language (granted, it may be possible depending on the language, but not in the context of today's commonly used ones like Java, Python, Ruby, etc.). Also, just like I'd expect a carpenter to be proficient with both types of screwdrivers, I'd expect any computer programmer to be reasonably proficient with garbage collected and non-garbage collected languages, even if his/her programming specialty of choice typically uses one type of language over the other.

    One of the most fun classes I had in college involved copious amounts of assembly programming. Now, I had lots of fun doing assembly programming, but that doesn't mean I'd like to write anything consisting of thousands of lines of code in it.
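
    And just so it's clear that the garbage collector doesn't make the machine underneath go away, here's a toy sketch in Python (my own hypothetical example, not from any real codebase): a "handy" module-level cache that keeps every result reachable forever, so the process grows until it swaps -- collector or no collector, exactly like the Apache example a few posts up.

        from collections import OrderedDict

        _cache = {}

        def render_page(request_id, payload):
            # Never evicts anything: GC can't reclaim what's still referenced,
            # so this dict grows for the life of the process.
            if request_id not in _cache:
                _cache[request_id] = "<html>%s</html>" % payload
            return _cache[request_id]

        # Once you know the problem exists, bounding the cache is easy.
        _bounded = OrderedDict()
        MAX_ENTRIES = 1024

        def render_page_bounded(request_id, payload):
            if request_id in _bounded:
                _bounded.move_to_end(request_id)   # mark as recently used
                return _bounded[request_id]
            result = "<html>%s</html>" % payload
            _bounded[request_id] = result
            if len(_bounded) > MAX_ENTRIES:
                _bounded.popitem(last=False)       # evict the oldest entry
            return result

    Knowing when to reach for a fix like that is exactly the "understand the whole system" proficiency I'm talking about, whether or not your day-to-day language has a garbage collector.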
  • I wouldn't call a garbage collected language a "better tool" than a non-garbage collected language.
    [...]
    Also, just like I'd expect a carpenter to be proficient with both types of screwdrivers, I'd expect any computer programmer to be reasonably proficient with garbage collected and non-garbage collected languages, even if his/her programming specialty of choice typically uses one type of language over the other.
    Maybe I should have said "more" instead of "better". While you would want a carpenter to be able to use both types of screwdrivers, you wouldn't necessarily expect your little sister to be able to use both, especially if she's never carpented before.

    The point is the choice exists. Barriers to entry are falling constantly, but the skill cap remains as high as it ever was. As for Sony, I think the problem is not bad carpenters, but the fact that instead of good ones they hired little sisters.
  • The point is the choice exists. Barriers to entry are falling constantly, but the skill cap remains as high as it ever was. As for Sony, I think the problem is not bad carpenters, but the fact that instead of good ones they hired little sisters.
    The point is that there are almost no good carpenters. Even worse, it takes one to know one. You can't tell a good carpenter from a bad one unless you are a good one yourself. If you're a non-technical businessman, you have to trust that your top guy is good based on resume and faith. At the same time, good tech people can't stand to work with bad tech people. Thus, they quit companies with morons and they fire the morons from the companies they stay at. Companies, or at least departments, tend to either be all good or all bad in terms of tech skill. Google and Facebook are examples of all good. The places where Rym works tend to be all bad.
  • The places where Rym works tend to be all bad.
    That being the mythical "enterprise" environment, not any particular place. ;^)
  • Maybe I should have said "more" instead of "better". While you would want a carpenter to be able to use both types of screwdrivers, you wouldn't necessarily expect your little sister to be able to use both, especially if she's never carpented before.

    The point is the choice exists. Barriers to entry are falling constantly, but the skill cap remains as high as it ever was. As for Sony, I think the problem is not bad carpenters, but the fact that instead of good ones they hired little sisters.
    Ah, I agree, choice is good. I wouldn't want a beginning programmer to wrestle with C or C++, obviously -- I'd want her to start out with one of the easier tools, and garbage collected languages tend to be easier to learn than non-garbage collected ones. However, if she were serious about programming, I would want her to eventually move up to the more difficult-to-learn tools, and I wouldn't trust her to do anything significant until she had shown reasonable proficiency with them, even if it's perfectly acceptable to do the task with an easier-to-use tool.

    Like you said, it appears as if Sony's problem is that they didn't hire good "carpenters."
  • Maybe I should have said "more" instead of "better". While you would want a carpenter to be able to use both types of screwdrivers, you wouldn't necessarily expect your little sister to be able to use both, especially if she's never carpented before.

    The point is the choice exists. Barriers to entry are falling constantly, but the skill cap remains as high as it ever was. As for Sony, I think the problem is not bad carpenters, but the fact that instead of good ones they hired little sisters.
    Ok, let's say we have some carpenter. He's got a hammer and a saw. Boy, he can make some awesome furniture, but it sure does take him a long time. But the carpenter is smart. So he designs the table saw and the nail gun. Things sure do go a lot quicker. His son picks up woodworking and decides to follow in his father's footsteps. He too can make fine quality goods, but the nail gun and table saw are too slow and require too much work. So he decides to make more tools to assist in his job. And so the cycle continues until, way down the line, the carpenters are just putting designs into carpentry machines which do everything. But there is a problem: the carpentry machines aren't very good at the details and often make mistakes. They make pretty furniture, but whenever someone sits down on a piece, it breaks! Oh noes! And to make matters even worse, the current carpenters don't know how to use a hammer and saw to fix the problems. They can't even repair the broken furniture. In fact, they don't even know the first thing about being a carpenter. So what can they do? The art of carpentry has been lost!

    While it's a bit of hyperbole, I honestly think that this is a possible reality with computers and technology in the future.
  • edited June 2011
    the PC is, as Kenshiro would say, already dead.
    I LOL'd. I guess Apple doesn't realize that their computers are PCs as well. I think they mean Windows. Also, wow, that sounds really dumb.

    And wow, I'm really glad I'm not Andrew's Grandma. ;)
    Post edited by Nuri on
  • I LOL'd. I guess Apple doesn't realize that their computers are PCs as well. I think they mean Windows. Also, wow, that sounds really dumb.

    And wow, I'm really glad I'm not Andrew's Grandma. ;)
    Actually, if you saw Apple's presentation, they do include their computers in the definition of a "PC" -- not just Windows machines. Their whole point, rightly or wrongly, is that the central position played by the PC, whether of the Windows or Mac flavor, in the "digital life" of the average person is already dead (or at least dying). The cloud is the new central point with everyone syncing all their gizmos with the cloud. In addition, they believe that many people no longer need nor want a PC to accomplish what they need to do in their "digital lives" -- a smartphone or tablet is all they ever need and hence all they ever want.

    The other analogy that they've made (though not this time around) is that the traditional PC/Mac is equivalent to a truck whereas a tablet like the iPad is equivalent to a regular car. We'll always need/have "trucks" for the folks who need/want them (programmers, designers, digital artists, enthusiasts, gamers, etc.), but the average person who just wants to check their mail and shop online only needs a "car" to do what they want and therefore more people will be owning "cars" instead of "trucks" going forward.

    It's an interesting statement in that something like the iPad would be perfect for my mom, who basically only uses her computer for web surfing, email, playing solitaire, and the occasional YouTube video of Brazilian soap operas. A full-blown PC, whether Windows or Mac, is overkill and overly complex for her case (as the frequent support calls I get from her would indicate). Presumably many other computer users are in a similar camp. Remember, the folks on this forum do not reflect the "average" computer user.
  • In addition, they believe that many people no longer need nor want a PC to accomplish what they need to do in their "digital lives" -- a smartphone or tablet is all they ever need and hence all they ever want.
    They do believe this, but I think it is their mistake. I think people do want all sorts of devices, they just don't want any particular device to be dependent on any other device. Just because I don't want my iPhone to need my PC doesn't mean I don't want my PC. I just don't want devices to be tied to each other for many reasons.
  • In addition, they believe that many people no longer need nor want a PC to accomplish what they need to do in their "digital lives" -- a smartphone or tablet is all they ever need and hence all they ever want.
    They do believe this, but I think it is their mistake. I think people do want all sorts of devices, they just don't want any particular device to be dependent on any other device. Just because I don't want my iPhone to need my PC doesn't mean I don't want my PC. I just don't want devices to be tied to each other for many reasons.
    Yes, but you (and I) are not the typical person they are talking about. They're not talking about programmers or PC gamers. They're talking about people like my mom. I agree with you that in my case, yeah, I'd want to keep my PC and my laptop in addition to any smartphones, tablets, and so on I may own. That's because I need them all for the stuff I want to do with computers. However, if all I did was surf the web, read email, and play solitaire, an iPad might be all I need and want.

    Despite the fact that it doesn't run a traditional PC OS and doesn't have a keyboard and mouse (or equivalent), the iPad is, technically, a full-fledged PC that happens to be in a tablet form factor. You can even use an external Bluetooth keyboard with it if you do enough typing that the virtual keyboard would be painful and/or tedious. For that matter, even the most basic model iPad has more CPU, RAM, and storage space than the PC I used in college, on which I did everything from word processing and coding to hardcore math with Maple and Matlab. If you look at the apps available for the iPad, there are word processors, spreadsheets, personal databases, photo editors, and just about every other thing you'd need to do what the vast majority of PC users do on their PCs. Again, looking back at my college experience, if I didn't care about gaming and only needed a computer for communications and typing up term papers (i.e., I was pretty much your standard liberal arts major), an iPad probably would've been all I needed, provided they had printing working (which supposedly they are working on).

    That's also the point of iCloud -- Apple is going to give free cloud syncing and storage (5 GB or so, I think, depending on what you're storing) so that all your devices aren't physically tied together, whether they're desktop computers, laptops, tablets, or smartphones. They'll also be "equal partners" if you will, although for now the "PC Mothership" shall remain the primary storage location for data over and above the 5 GB they give you (for example, most of your photos shall remain stored on your PC).

    Now, I don't want to sound like I'm blindly parroting what Apple's saying as if it's gospel. I'm not -- I'm somewhat skeptical that this strategy will work. However, I can see where they're coming from and many of their statements do have merit.
  • Their whole point, rightly or wrongly, is that the central position played by the PC, whether of the Windows or Mac flavor, in the "digital life" of the average person is already dead (or at least dying).
    Ah, well, then they are just wrong. Perhaps it is the case that the central role of the PC in their sales model is dead, but not one of my gadgets is a suitable replacement for my laptop. Perhaps for personal leisure use, a tablet is fine. However, how many people only use a computer for that? We use PCs heavily in most offices. At home, if you want to do any kind of serious word processing or computation, a tablet is insufficient. Lots of students don't have an iPad because we need a laptop for school (and sometimes work); that laptop works quite well as a leisure device as well. Until there is a gadget that is awesome at word processing, PCs will maintain their hold, at least in academia. And since we seem to be pushing everyone to go to college these days, even when they shouldn't be there, academia is in no danger.
  • While it's a bit of hyperbole, I honestly think that this is a possible reality with computers and technology in the future.
    I understand the scenario you are afraid of, but I don't think it can happen. For example: we don't really need blacksmiths anymore, yet we still need (and have) metalworking capabilities. How is this different?

    Possible counterexample: Library of Alexandria? I still don't think this can happen. Everything is documented. Our knowledge is redundantly distributed around the globe. It would take a catastrophe to completely destroy our knowledge of low-level computing. If that occurred, I think we have bigger problems.
  • edited June 2011
    Ah, well, then they are just wrong. Perhaps it is the case that the central role of the PC in their sales model is dead, but not one of my gadgets is a suitable replacement for my laptop. Perhaps for personal leisure use, a tablet is fine. However, how many people only use a computer for that? We use PCs heavily in most offices. At home, if you want to do any kind of serious word processing or computation, a tablet is insufficient. Lots of students don't have an iPad because we need a laptop for school (and sometimes work); that laptop works quite well as a leisure device as well. Until there is a gadget that is awesome at word processing, PCs will maintain their hold, at least in academia. And since we seem to be pushing everyone to go to college these days, even when they shouldn't be there, academia is in no danger.
    Well, the tablet form factor is relatively new... or more specifically, tablets that don't suck are relatively new. Also, how good a word processor do you need in academia if that's all you're using your computer for? Apple sells the Pages word processor for the iPad, which supposedly has feature parity with the Mac version. Now, I've never used Pages for more than a few minutes at a time, but I imagine it would be sufficient for the vast majority of word processing jobs out there. It's certainly good enough for business proposals, software requirements documents, and the like (based on anecdotal evidence, admittedly). For the jobs where Pages won't cut it, I'm not sure I'd trust OpenOffice or Word either -- I'd want to go with LaTeX or something else designed for long, technical documents, like FrameMaker, or with something like InDesign if I need truly precise layout.

    I do agree with you that PCs are heavily used in most offices, but then the standard office isn't one of Apple's core markets. Their core markets are consumers first and "creative" businesses such as graphic design, video editing, etc. second. Apple's statement had nothing to do with the use of the PC in the business realm -- again, "trucks" vs "cars" here.

    As far as use in academia goes, we'll have to wait and see what happens, since this is such a new market. Tablets that don't suck have only been around for about 2 years, whereas the requirement for a student to have a laptop in academia has been around for at least 5 years (and not all colleges require one, for that matter). In addition, the requirement for having a laptop seems to be a blanket statement for everyone regardless of major -- i.e., it doesn't matter if you're majoring in English literature or Nuclear Physics, you all need to have a laptop. However, what an English lit major would need on a laptop (probably just a word processor in addition to the usual internet software) differs greatly from what a Nuclear Physics major would need (the usual internet software plus a technically-oriented word processor, some fancy mathematical software like Mathematica, maybe some programming tools, etc.). From the college's perspective, it makes more sense just to require some "standard" configuration as opposed to providing a list of different systems depending on major. As I said earlier, if all I needed to do in college was word process, web surf, and email, an iPad with Pages probably would've done me just fine. If I needed something that required specialty software, like science, math, or engineering classes, then a more general purpose laptop or desktop would be appropriate.
    Post edited by Dragonmaster Lou on