
So I want to get into programming.


Comments

  • Go is a major new language that is less than 20 years old.
    I wouldn't call Go major yet. It's an interesting research language, but hardly anyone outside of Rob Pike and his buddies at Google is using it heavily. That said, given that Rob Pike is one of the folks behind it and the guy's a freakin' CS genius, it does have a decent shot of catching on. Then again, he also co-created Plan 9 from Bell Labs, which was an interesting research OS that never really went anywhere, so who knows.
    Instead of focusing on the actual languages, you should look at the functionality of programming languages and how that has developed over time. The current versions of FORTRAN, C, Perl, etc. are massively different from their original incarnations.
    I wouldn't call C massively different from its original incarnation. Well, okay, it's significantly different from K&R C in many ways, but the differences between modern C and the original 1989 incarnation of ANSI C are pretty minor. Most of the differences are the introduction of a few new datatypes, the use of '//' comments, and some additions to the standard library. A 1989 ANSI C program should compile without any difficulty under a modern C compiler, and a 1989 ANSI C programmer could look at a modern C program and basically be like, "Oh, I guess they finally added a proper boolean datatype to the language. About bloody time!"
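
    For instance, here's a minimal sketch of the kind of additions in question -- every line is either plain 1989 ANSI C or one of the C99-era features just mentioned (the snippet happens to be valid C++ as well):

        #include <stdbool.h>  /* 'bool' as a proper datatype -- added in C99 */
        #include <stdint.h>   /* fixed-width integer types -- also C99 */

        int main(void)
        {
            // '//' line comments were standardized in C99.
            long long big = 1LL << 40;   /* 'long long' arrived with C99 */
            int64_t exact = 42;          /* exact-width types from <stdint.h> */
            bool ok = big > exact;
            return ok ? 0 : 1;
        }

    Strip out the C99-isms and what's left would compile, unchanged, on a 1989 compiler.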

    I won't comment on Fortran and Perl, as I have so little experience with those languages that I'm not qualified to do so.
    For example, the move from a procedural methodology to an object-oriented methodology, the introduction of interpreted vs. compiled languages, the general trend toward higher-abstraction, memory-safe, untyped languages. These are all trends that are a) more important than "what language are you using" and b) often happen to a language. Just look at how Apple has taken C++ and bolted an unholy mess of functionality onto it. They have already pretty much eliminated memory management, and I'm willing to bet that type inference (or rather, making Objective-C un/weakly typed) is next on the to-do list.
    Well, first, Apple didn't take C++ and bolt stuff onto it. They took C and bolted Smalltalk-like object orientation onto it to create Objective-C. Also, it wasn't Apple, it was a small company called Stepstone. NeXT adopted Stepstone's Objective-C as their development language of choice, and Apple inherited it when they bought NeXT in the late '90s. I will give you that they've bolted a bunch more functionality onto the language since then, however. Also, Objective-C already fully supports weak/duck typing. That's what its 'id' datatype is about. You basically have the choice of using strong typing or just having everything be of type 'id'.
  • Languages get extensions and improvements all the time. Imagine the inertia of staying with an existing codebase.
  • Languages get extensions and improvements all the time. Imagine the inertia of staying with an existing codebase.
    Indeed. This goes back even as far as the original languages, machine code and assembly. I mean, in their case, the only way to access new hardware capabilities is to actually extend the language. :)

    Nowadays, the main issues surrounding language extensions/improvements basically break down to whether a feature should be baked into the core language itself or whether it should instead be added to the language's standard libraries. The general rule of thumb I'm seeing with most of the C-derived languages is that the core language is only changed if the new functionality can't be added to the standard libraries.

    For example, C++ added a new keyword, auto, that can be used to have the compiler automatically determine a variable's type at compile time. That functionality pretty much requires changes to the core language in order to work. However, another new feature is reference-counted pointers (called "shared pointers" in the documentation) that automatically delete the object they point to when their reference count goes to zero. This is something that has existed in 3rd-party libraries (Boost, Loki, etc.) for years and can be implemented without alterations to the language itself, so it was simply added to the standard library.
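
    To make the distinction concrete, here's a minimal C++11 sketch of both features -- auto, which needed compiler support to exist at all, and shared_ptr, which is just an ordinary library class:

        #include <cstdio>
        #include <memory>   // std::shared_ptr and std::make_shared

        int main()
        {
            // 'auto' required a core-language change: the compiler itself
            // deduces that p is a std::shared_ptr<int>.
            auto p = std::make_shared<int>(42);
            {
                auto q = p;                              // reference count: 2
                std::printf("%ld\n", q.use_count());
            }                                            // q destroyed, count: 1
            std::printf("%ld\n", p.use_count());
            return 0;
        }   // count hits 0 here and the int is deleted automatically

    Nothing in the shared_ptr half needs new syntax, which is exactly why it could live in Boost for years before being standardized.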