If I did, I have to concur with everyone else who has found them and say it's really obvious what the url is, and you will hit yourself when you find out.
Want to see some really old shit? I might be able to find a version of my web site from high school if I check some floppy disks, but this one is from the year 1999/2000.
OMG I just realized that the code includes an RCS folder. Does anyone here even know what that is? That's what we used for source control before Git, before SVN, and before CVS. I'm just glad I wasn't coding back in the days of SCCS.
That we were made to use RCS in CS1 is just about as flabbergasting. That shit is ancient (and SVN, much less CVS, aren't actually that new.)
EDIT: Okay, some quick research has revealed the somewhat odd/fascinating fact that free/open-source version control didn't actually change for the entirety of the 1990s. SVN is actually newer than I first thought, 1.0 seems to have been released in 2004; meanwhile RCS is from the early 80s and CVS was released in 1990.
That we were made to use RCS in CS1 is just about as flabbergasting. That shit is ancient (and SVN, much less CVS, aren't actually that new.)
Uh, SVN was released in the year 2000. It came into common usage while we were in college. I remember people using it when it was bleeding edge and I gave them the same side-eye that I gave to noSQL people a few years back and to nodejs people nowadays.
Git was released in 2005, but didn't actually get used more widely until 2007, and then GitHub came along in 2008. Version control that isn't garbage is actually shockingly new. I was the early adopter who deserved the side-eye for this one. But Git solved all the problems I had with SVN, and the Linux kernel using it gave it credibility.
SVN was also weirdly unstable because the developers started with a Berkeley DB backend rather than something file-based. Hosting it on a networked filesystem was a bad idea. The DB tended to corrupt itself with no clear path to recovery, and backing up the repo was also complicated. It got less bad once they added a filesystem backend (FSFS), but it was still very weird to use. I only ever made toy repos with it.
I was the early adopter who deserved the side-eye for this one.
Maybe, but backing the horse Linus is riding and dudebro-my-shell-is-fscked-up-so-i-got-to-rm-slash-star-yo-its-straight-up-bash-time are two very different things.
At least on paper that will get the job done. I've never tried it, and don't know if it actually works. If it does, hooray! If not, I have another solution.
Let's say I'm making a game, and I have a lot of large texture files. I still put my source code in Git, because duh.
Next I put the textures in the cloud, probably Amazon S3. I put them in a directory named 1, then immediately copy directory 1 to a second directory named 2. All the work done by artists will be updates to directory 2. Directory 1 is now read-only.
Artists probably aren't working on the same exact texture file simultaneously. It's very difficult (impossible?) to resolve conflicts on binary files anyway. Once all the textures are in a stable state, I copy the entire 2 directory to another directory named 3. All new work happens in directory 3, and directory 2 is now read-only.
Now back to the source code. I have a constant somewhere in my code that looks something like this: TEXTURE_VERSION = 2. Now I can tie together the version of code I'm using to a specific set of textures. I can also easily change this setting to 3, 4, etc. to test and see if the unstable version is good to go or not.
Might need to write a tiny bit of code that will update your local machine from the remote textures. Not difficult. You were going to be doing the same thing with a git pull anyways, and a download from S3 is probably faster.
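For the curious, here's a minimal sketch of what that tiny bit of code could look like, assuming Python with boto3 and a made-up bucket name. The textures/1/, textures/2/, etc. prefixes mirror the numbered directories described above. I haven't run this against a real bucket, so treat it as an illustration of the idea rather than a finished tool.

```python
# Sketch: pull the pinned texture set down from S3 (bucket name is hypothetical).
import os
import boto3

TEXTURE_VERSION = 2          # bump to 3, 4, ... to try the unstable texture set
BUCKET = "my-game-assets"    # hypothetical bucket name
LOCAL_DIR = "assets/textures"

def sync_textures():
    """Download every texture for TEXTURE_VERSION to the local machine."""
    s3 = boto3.client("s3")
    prefix = f"textures/{TEXTURE_VERSION}/"
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=prefix):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            if key.endswith("/"):          # skip "directory" placeholder keys
                continue
            dest = os.path.join(LOCAL_DIR, os.path.relpath(key, prefix))
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            s3.download_file(BUCKET, key, dest)

if __name__ == "__main__":
    sync_textures()
```

Bumping TEXTURE_VERSION in one place and re-running the sync is all it takes to flip a checkout between the stable and unstable texture sets, which is the whole point of the numbered read-only directories.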
Comments
http://www.apreche.net/old-site/
EDIT: FOUND IT. I'm so good at backups yo.
I dare you to get this code to build and run.
https://en.wikipedia.org/wiki/Revision_Control_System
A few of you have found some of the beta episodes. But not all of them.
You'll know you have all of them if you have the following dates:
2005-10-11
2005-10-17
2005-10-18 (two versions)
2005-10-19
2005-10-20
2005-10-24
2005-10-25
2005-10-26
The first public episode was 2005-10-31. We released it retroactively after we went live on 2005-11-01.
We think Karl Rove might get indicted for treason. Then we talk about how to commit righteous treason. Interesting in light of how Snowden went down.
EDIT: Okay, some quick research has revealed the somewhat odd/fascinating fact that free/open-source version control didn't actually change for the entirety of the 1990s. SVN is actually newer than I first thought, 1.0 seems to have been released in 2004; meanwhile RCS is from the early 80s and CVS was released in 1990.
Git was released in 2005, but didn't actually get used more widely until 2007, and then GitHub came along in 2008. Version control that isn't garbage is actually shockingly new. I was the early adopter who deserved the side-eye for this one. But Git solved all the problems I had with SVN, and the Linux kernel using it gave it credibility.
I guess RCS did have the virtue of simplicity. Looking at the options, I'm not actually sure I'd have lobbied to be taught CVS first instead.
Git is the way and the light.
Inb4 gifs
Or vidyas
and a primary source for posterity.