Majoring in a computer field
So, I was thinking lately. Say you went to college for some computer-related major. Since technology is advancing exponentially, wouldn't it be pointless? I mean, the information you paid for would be outdated in the next decade or sooner. Would it be sort of a "risky investment" to major in anything too related to current technology?
Thanks.
The information you learn is outdated by the time you graduate.

However, the core of any computer field is understanding change and gaining the ability to learn quickly. For example, in software engineering you will learn 'pseudocode', a language-independent way of building algorithms. With pseudocode you can essentially create a template that can be used to construct the algorithm in any language (C, Java, OCaml, VB, etc.).
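
For instance, here is a rough sketch of what that looks like: a pseudocode description of a linear search, followed by the same algorithm written out in C#. The example, the LinearSearch name, and the types are my own illustrative choices, not from any particular course.

// Pseudocode (language independent):
//   for each item in the list:
//       if the item equals the target:
//           return its position
//   return "not found"
//
// The same algorithm written in C#.
using System;

class PseudocodeDemo
{
    static int LinearSearch(int[] items, int target)
    {
        for (int i = 0; i < items.Length; i++)
        {
            if (items[i] == target)
                return i;   // found: return the position
        }
        return -1;          // convention for "not found"
    }

    static void Main()
    {
        int[] data = { 4, 8, 15, 16, 23, 42 };
        Console.WriteLine(LinearSearch(data, 16)); // prints 3
    }
}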

You will also learn historical algorithms and thought patterns, and then the more modern versions, leading you to understand the kinds of changes that take place.
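
As a toy illustration of that (again my own example, not anything from a syllabus), compare a hand-written bubble sort, the kind of historical algorithm you might study first, with the optimized library sort that modern C# code would actually call:

using System;

class SortDemo
{
    // Historical-style approach: a hand-written bubble sort, O(n^2).
    static void BubbleSort(int[] a)
    {
        for (int i = 0; i < a.Length - 1; i++)
            for (int j = 0; j < a.Length - 1 - i; j++)
                if (a[j] > a[j + 1])
                    (a[j], a[j + 1]) = (a[j + 1], a[j]); // swap adjacent elements
    }

    static void Main()
    {
        int[] oldWay = { 5, 1, 4, 2, 3 };
        BubbleSort(oldWay);

        int[] newWay = { 5, 1, 4, 2, 3 };
        Array.Sort(newWay); // modern code leans on the library's optimized sort

        Console.WriteLine(string.Join(", ", oldWay)); // 1, 2, 3, 4, 5
        Console.WriteLine(string.Join(", ", newWay)); // 1, 2, 3, 4, 5
    }
}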

Like any science, computer science has two sub-fields: applied and research. Those who work in research will be right at the forefront, building new languages and operating systems and constructing all kinds of magnificent things. Those who work in the applied field will be focused on writing code and building applications. Every program you see was built by those in the applied field, upon techniques created by those in the research field. (Yes, I simplified that; it is easier to explain this way.)

If you aren't a researcher, then you pretty much don't have to worry too much. I mean, C#, which I would say is the most widespread language for applications, has been around since 2001 and was adopted into the .NET framework. So if you had been a programmer using C# in, say, 2003, you would have noticed minimal changes since then: only the addition of anonymous types, shorthand syntax, lambda functions, null coalescing, and a few other nifty things that make your life easier.
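
To make those concrete, here is a small sketch of the additions mentioned above (a lambda expression, an anonymous type declared with the var shorthand, and the null-coalescing operator); the names and values are just illustrative.

using System;
using System.Linq;

class CSharpAdditionsDemo
{
    static void Main()
    {
        // Lambda expression: a small inline function.
        Func<int, int> square = x => x * x;
        Console.WriteLine(square(5)); // 25

        // Anonymous type, declared with the 'var' shorthand.
        var point = new { X = 3, Y = 4 };
        Console.WriteLine(point.X + ", " + point.Y); // 3, 4

        // Null-coalescing operator: fall back to a default when a value is null.
        string maybeName = null;
        string name = maybeName ?? "anonymous";
        Console.WriteLine(name); // anonymous

        // Lambdas also power LINQ, another post-2003 "nifty thing".
        var evens = new[] { 1, 2, 3, 4, 5 }.Where(n => n % 2 == 0);
        Console.WriteLine(string.Join(", ", evens)); // 2, 4
    }
}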

Computer science is more volatile than other science fields, but if you love change and new challenges, then it's a great field.
Isn't it exciting that in a decade you will still have new things to learn and new challenges to overcome?

Computing is invading every part of our lives; investing in knowledge of how it works is the best possible investment.
So by the time you graduate, you know enough to adapt to whatever changes? I never knew that. The classes must be pretty general, right?
Thanks.
Originally Posted by jxc1013
The classes must be pretty general, right?

No, not really, but the idea is the same in most of these courses. The way a computer works won't change completely, coding will stay more or less the same, and so on.
Indeed, outdated does not mean 'radically different'.
Understand how things work and you can understand changes.
Technology builds off of itself.

Take the first plane and modern planes: to build modern planes, we had to understand how the original one worked. You see? Understanding is key to progressing.

You must know a letter or a few to spell out a word.
The fact is, by the time you graduate from any computer-related program, the information you have learned might be (and in most cases WILL be) outdated.

Now, no matter how advanced a computer system may be, it's all built upon the same foundation. When you major in a computer field, you are taught that foundation, plus the more specific details of what you are studying (the specifics are the part that goes out of date).

It has already been explained in a post by Gorman. Simply put, though, when you get a degree in a computer field, it's more like being certified that you know the foundation of the area. C++, HTML, or JavaScript from 2004 wouldn't be drastically different or outdated even in 2008 or 2010. It's how it's APPLIED, and the areas it's applied to, that change. If you truly know your outdated information, you should be able to adapt it to modern developments.

And like Cloneone1 said, "Technology builds off of itself."
A computer back in the day had the same basic components as one today: motherboard, hard drive, processor, graphics card (hopefully...), removable media drive, etc. Those parts are outdated now, but knowing even outdated information allows you to comprehend the modern versions of the same components.
As technology advances, though, we still base things on the same concepts. The material taught may become outdated, but the way future technology is used will most likely be similar. Also, in pursuing a career in computer technology, one can take refresher courses or continuing education. Anyway, computer technology most likely won't advance too drastically unless there is some BIG discovery.