You do not want to go there! Classical mechanics with variable particle masses is an abomination to Newton!
What do you mean? Do I have to abandon the F = m*a simplification for F = m*a + m'*v? Also, what effect would that have on the rocket equation?
Nah, those'll work all right, I just have a theorist's preference for systems that have closed solutions. If you have to break out the computers to calculate something, it's like saying "I give up" ;-).
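For what it's worth, the rocket is one of the cases where you get to keep the closed solution: Tsiolkovsky gives delta-v = v_e * ln(m0/m1), and it makes a handy check on whatever stepping scheme you end up coding. A rough sketch (all the numbers are made up; constant thrust, no gravity or drag):

    import math

    v_e  = 3000.0   # exhaust velocity, m/s (made up)
    m0   = 1000.0   # initial mass, kg (made up)
    m1   = 400.0    # dry mass, kg (made up)
    mdot = 5.0      # constant propellant flow, kg/s (made up)

    # closed-form Tsiolkovsky result
    dv_exact = v_e * math.log(m0 / m1)

    # crude Euler integration of dv/dt = v_e * mdot / m
    dt, m, v = 0.01, m0, 0.0
    while m > m1:
        v += v_e * mdot / m * dt
        m -= mdot * dt

    print(dv_exact, v)   # the two should agree to well under a percent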
Once it settled into its rhythm, it kinda made me think of a robotic drum majorette, despite it not being robotic or having anything to do with drumming or majorettes.
I have a friend who can do a Double Pendulum balance. He balances a club on his chin, and then a peacock feather on the club. Sure, peacock feathers are the easiest thing to balance, but the trick is still rock hard. The fact that he bothered learning the trick is a testament to his extreme geekiness, but then he also made a gif animator where you input juggling patterns and it outputs animations of the patterns being juggled by rockets fired into space and orbiting the earth at different speeds. Google Peter Bone if you want to see his juggling and/or programming.
Just so you know, almost nobody is actually using Python 3 for anything serious. Almost all of the modules are still only compatible with Python 2, and it's going to be a long time before Python 3 gets support from the rest of the community.
What are "modules"? As long as they leave me the basic mathematical operations I can deal without having modules.
Modules, like libraries, keep you from having to re-invent the wheel over and over again. So, for example, someone will have coded an implementation of the Runge-Kutta method in Python and put it into the SciPy module. What's more, that someone will probably have done a much better job of optimizing it than you will be able to. On the other hand, when you're starting out, inventing the wheel is what you're supposed to do.
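For instance (just a sketch, assuming you have SciPy installed -- the falling-ball problem and its constants are made up purely for illustration), letting SciPy's adaptive Runge-Kutta do the stepping looks like this:

    from scipy.integrate import solve_ivp   # adaptive Runge-Kutta (RK45) lives here

    def falling(t, y):
        # y = [height, velocity]; gravity plus quadratic drag, constants made up
        pos, vel = y
        g, k = 9.81, 0.05
        return [vel, -g - k * vel * abs(vel)]

    sol = solve_ivp(falling, (0.0, 10.0), [100.0, 0.0], method="RK45")
    print(sol.t[-1], sol.y[:, -1])   # time and [height, velocity] at the end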
EDIT: As is, Python only supplies the normal arithmetic operators (+, -, *, / and ** for powers; ^ is bitwise XOR, not exponentiation), so even for the simple Euler method you'll have to write your own derivative function unless you use NumPy / SciPy. This is not hard (e.g. here), but since you will use it a lot in your modeling, you want to make sure it is a good implementation.
EDIT2: see Wikipedia for some interesting insights into what needs to be taken into account for a good implementation of a derivative. What interests me, though, is that some of those things are clearly not possible in an interpreted language like Python. Does the interpreter somehow take care of the machine-level optimizations?
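EDIT3: To make the derivative point concrete, something like this is the basic shape of it (just a sketch of the central-difference idea, not a tuned implementation; the step-size choice is the floating-point trade-off that article goes on about):

    import math
    import sys

    def derivative(f, x):
        # Central difference.  The step is scaled by the cube root of machine
        # epsilon (and by the size of x) to balance truncation error against
        # floating-point cancellation in f(x + h) - f(x - h).
        h = sys.float_info.epsilon ** (1.0 / 3.0) * max(abs(x), 1.0)
        return (f(x + h) - f(x - h)) / (2.0 * h)

    print(derivative(math.sin, 1.0), math.cos(1.0))   # agree to roughly 10 digits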
Not perfect, and they've got a few (pretty major) gaps IMHO (C isn't only used for Unix -- just about every major operating system, including Windows and Apple's operating systems [themselves flavors of Unix], is also written in C), but it does cover things pretty well.
I personally find it kind of interesting that there haven't been any major new languages in almost 20 years. It would be really nice if someone could make a better version of JavaScript that would actually get some traction. As it is, each big web technology has had to come up with its own workarounds for JavaScript's shortcomings.
Google's trying to do that with Dart, and Microsoft is with TypeScript. We'll see how far they go.
Listing something like SQL injection as a ".NET vulnerability" doesn't really make sense. There isn't anything that can be manipulated in the library to allow SQL injection; it's just poor handling of input by the programmer.
Of course, the same could apply to buffer errors/overflows in C. It all comes down to programmers screwing up in some way or another.
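To make that concrete (a quick sketch in Python since that's the language upthread; sqlite3 is just the standard-library stand-in here, and the table and the hostile input are made up): the hole is the programmer pasting input into the SQL text, and the fix is the parameterized form the library already provides.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

    name = "alice' OR '1'='1"   # hostile "user input"

    # injectable: the input becomes part of the SQL text itself
    leaked = conn.execute(
        "SELECT email FROM users WHERE name = '%s'" % name).fetchall()

    # safe: the value is passed separately from the statement
    safe = conn.execute(
        "SELECT email FROM users WHERE name = ?", (name,)).fetchall()

    print(leaked)   # every row comes back
    print(safe)     # [] -- nobody is literally named that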
True, what I meant though is that .NET is a library of utilities whereas C is a language. It would make a bit more sense to say C# instead of .NET, but then C# is not really anything without .NET, I suppose.
Instead of focusing on the actual languages, you should look at the functionality of programming languages and how that has developed over time. The current versions of FORTRAN, C, Perl, etc. are massively different from their original incarnations.
For example: the move from procedural to object-oriented methodology, the introduction of interpreted vs. compiled languages, the general trend toward higher-abstraction, memory-safe, loosely typed languages. These are all trends that are a) more important than "what language are you using" and b) often happen to a language. Just look at how Apple has taken C and bolted an unholy mess of functionality onto it. They have already pretty much eliminated manual memory management, and I'm willing to bet that type inference (or rather making Objective-C un/weakly typed) is next on the to-do list.
The reason why no new "major" (a debatable term at best) language has come out in many years is that useful features that crop up in any of the many new small languages get replicated sooner or later in one of the older languages. If nothing else, some fan will write a wrapper for the new hot thing so lazy-ass programmers don't have to learn a new language just to do X.
The other reason for no new (major) languages is that we are a bit stuck -- in my opinion -- at the current highest level of interpreted languages, and the next logical step is a natural-language programming language, or a visual programming language like the LEGO Mindstorms language or LabVIEW but more general and versatile than those specific use cases. The hurdles to overcome in moving from what most programmers already deem "training wheel" languages like Python to something that will let your mom code by just sitting in front of the computer and methodically explaining what she wants the computer to do are immense.
Truth.
After doing some more research, I would posit that C# (2000) is the newest language of any significance. It seems like everything else since then has either been a flash in the pan (Go) or a toy for hobbyists.
For someone who barely knows anything about this stuff, it helped me understand a bit better where each one is used and for what.