That's the reason why CS classes and degrees exist. Or at least it should be.
No company will pay you for several years just to learn complexity theory, write sorting algorithms, study schedulers, rewrite UNIX tools, understand hardware architectures, etc... They want you to be productive right away, and the shortest path is just to follow recipes with the latest framework.
That's the point of education, universities teach you what you won't learn on the job. So that when you finally get to work, you will have a better understanding of what you are doing.
The carpenter doesn't need a degree. The architect and structural engineer do.
If you plan to be a programmer all your life, and a journeyman one at that, then feel free to learn-programming-in-21-days.
If you're planning a career in software, then I think understanding the fundamentals (database normalisation, the Order of a solution, the ideas of coupling and decoupling, memory usage versus performance, the impact of CPU cache, safe multi-threading, operating system task priorities, compilers, and a good few more things that underpin the science behind building software) matters, and a degree is helpful for that.
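One of those fundamentals, the Order of a solution, is easy to illustrate with a sketch. This is a hypothetical example (not from the comment above): the same lookup problem solved in O(n) versus O(log n), with the comparisons counted so the gap is visible without any hardware in the picture.

```python
# "The Order of a solution": the same problem solved two ways can differ
# by orders of magnitude in work done, long before hardware matters.

def linear_search(items, target):
    # O(n): inspect each element in turn, counting comparisons.
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    # O(log n): requires sorted input; halve the range each step.
    comparisons = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if items[mid] == target:
            return mid, comparisons
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

data = list(range(1_000_000))
_, linear_cost = linear_search(data, 999_999)  # ~1,000,000 comparisons
_, binary_cost = binary_search(data, 999_999)  # ~20 comparisons
```

The point isn't the code, which anyone can write, it's knowing in advance which shape of solution your problem calls for.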
Not because of the books - you can get those at home - but because of the structure (i.e. the path through the curriculum), the mentorship from professors, TAs and so on, and most of all the exposure to your peers: the wonder of collectively pushing each other to explore beyond the curriculum, beyond just this week's assignment.
Most of all it teaches intellectual curiosity: that spark some people have to see something in code they don't recognise, understand the novelty of it, and go and find out about it. It's easy to keep up because you have the structural foundations to absorb new things all the time.
ChatGPT can't do the things I do, and even calling it ChatGPT is a shallow understanding of why not. Broadly speaking LLMs are a tool I can use to learn new things, which is great, it knows how to wield many kinds of hammer. But it doesn't understand the nuance and context of building a whole building. One which has never been built before.
Precisely because LLMs are good at regurgitating the past, but understand literally nothing, they only replace programmers who are good at regurgitating the past (writing code) but understand very little about what that code -means- (as distinct from what it -does-)
Of course each college is different, and YMMV, but if you have the time, the resources, and the opportunity to do a formal degree, then I think it's worth it in the long run. But like most things in life you get out what you put in. The curriculum is just the hint of a starting point for what you learn there; don't go to "be taught", go to suck the marrow from each moment, to actively "learn" from the challenges you set yourself.
The way I like to explain the difference between the two approaches, and therefore the two educational requirements, you are outlining here is the difference between being a software developer and a software architect. A software developer is focused on developing a certain feature or fixing a certain bug within the context of an architecture that's already determined; they rarely make decisions about how to structure the overall data or control flow of a system or what abstractions the system uses, instead accepting what's already there. A software architect, meanwhile, makes those broader, more abstract, philosophical decisions that form the framework within which further features are implemented.
This is not to suggest that there is a rigid distinction between the two — all software architects are going to also be software developers in the process of building an architecture, or when new architecture is not needed, and many software developers will occasionally act as software architects. Nor is this intended to suggest that the architecture of a software project must be planned out in advance with all of the concepts and data and control flow either — just that there are points in the process of developing software where these large-scale architectural decisions about what abstractions to use and the flow between them must be made. But there do seem to be two somewhat distinct "hats" people who program wear.
Does what you’re making matter to people or not? It’s like that no matter what you’re doing. Y’all can sit around here and discuss “what is a software architect” all you want.
By the time he was done formalizing what needs to be done and philosophizing about the best approach (while actively trying to prevent the engineers from hacking away), the programmers and hardware engineers delivered a production prototype that works just fine and that people want to buy.
I wouldn't say that the outcome is necessarily the same all the time, but the getting-nothing-of-any-value-done architect seems to be a common occurrence. We had one that by proxy stopped development on at least 3 projects I was supposed to be involved in because people wanted to make him part of the process, despite me having 10 or so years more experience and an evident track record of successfully greenfielding projects versus his 0. He ended up being literally the only person ever at our company to effectively get demoted.
Previously the same company had hired one of my previous coworkers as a "software/solutions architect" and he had basically the same trajectory but with him at least I suspect he was just burned out. With that said, seeing someone has the title "Software Architect" is definitely a signal to pay attention to whether this person is even remotely competent and/or produces anything of value. If they give off the impression that they are just supposed to hand off designs to someone else to implement you know you have a complete dud and a moron on your hands.
I think it's good to separate function from title here. There are indeed good architects and bad architects - may you get the opportunity to work with the good ones.
Personally I find that those who ultimately build the thing they architect are the best to work with. As they build they gain experience, they bridge the gap between theory and practice, and the feedback loop leads to better architecture and better code.
You're right, my best co-workers have been architects who were also tech leads and great ICs.
Your comment made me think how any full-time uni professor is -suspect- for not interacting with the real world. Any -good- professor should be doing research or working in the private sector half of their time, so as not to be just a bookworm.
> Y’all can sit around here and discuss, “what is a software architect” all you want.
I think you're on to something. We ought to hire an architect architect to architect the role of the architect. Maybe we can build an architect factory to abstract that away and have architects set up on demand...
I don't think it's feasible to have 1 decent developer micromanage a team down to the function and data structure definition. At that point it's faster to fire the lot and just let the one developer you have do the job alone.
You have such an idyllic view of the college environment. My experience was struggling to motivate my teammates to do anything beyond the minimum, and an endless barrage of "will this be on the exam?" questions during class.
Looking back it's no wonder I am a self-learner now.
Not all colleges are the same. I had a fantastic time at my college.
For example, in our algorithms course, we needed to implement a lot of small algorithms for various things in Java (e.g. A-star, quicksort, etc.). For each algorithm, we were provided a standard API to implement against so they could use automated marking. Well, some friends and I made our own benchmarking harness and web frontend around that API. Before the assignment was due, we would all upload our .class files (and upload test cases), and compete to see who wrote the fastest code.
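The real harness wrapped uploaded Java .class files against the course's standard API; as a rough sketch of the idea in Python, with purely illustrative names: every submission implements the same sort API, and the harness rejects wrong answers and ranks the rest by wall-clock time.

```python
import random
import time

def quicksort(xs):
    # One hypothetical submission against the agreed sort(list) -> list API.
    if len(xs) <= 1:
        return xs
    pivot = xs[len(xs) // 2]
    return (quicksort([x for x in xs if x < pivot])
            + [x for x in xs if x == pivot]
            + quicksort([x for x in xs if x > pivot]))

def builtin_sort(xs):
    # Another submission: just lean on the standard library.
    return sorted(xs)

def benchmark(submissions, test_cases):
    # Run every submission on every shared test case; assert correctness,
    # then record total elapsed time per submission.
    results = {}
    for name, fn in submissions.items():
        start = time.perf_counter()
        for case in test_cases:
            assert fn(list(case)) == sorted(case), f"{name} is wrong"
        results[name] = time.perf_counter() - start
    return sorted(results.items(), key=lambda kv: kv[1])  # fastest first

cases = [[random.randint(0, 1000) for _ in range(500)] for _ in range(20)]
leaderboard = benchmark({"quicksort": quicksort, "builtin": builtin_sort}, cases)
```

A fixed API plus shared test cases is all it takes to turn assignments into a leaderboard, which is exactly the kind of beyond-the-curriculum play the parent comment is describing.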
I think I learned at least as much from doing that as I did from the algorithms course itself.
We didn’t have that much fun in all of our classes. But I cherish a lot of memories from that time in my life. I’m really glad I went to college and wouldn’t trade it for anything.
I guess less idyllic view, and more idyllic experience. I was fortunate to find like-minded folk in my class, and fortunate enough to have faculty that allowed us mere undergrads access to equipment "beyond our pay grade".
Obviously the bulk of the class was there to pass; maybe 10% of us pushed the limits, learning the craft beyond the curriculum. We were probably 10 folk or so from a class of over 100.
It was perhaps easier in my day - we had labs with machines no person owned. So you could spend time in the lab beyond the necessary, and the others would be there too.
I imagine today there's less shared working space like that, but hopefully there are still ways to find like-minded souls.
Self-learning is great, and we all do that a lot now, but I've been fortunate to find others in my work community who still enjoy sharing, learning, and teaching. It's a bit less lonely that way, and none of us have all the answers.
In my BS I had just 1 team project, so it wasn't a big deal.
In my master I had more… and yes there were people who didn't do anything, people who did much more than me, people who were good at writing reports but needed intervention because ultimately they didn't even understand what our project was about.
I asked the professor once to dump a guy, because he said we should meet on a Sunday morning to do our assignment, then showed up 2 hours late, when I was almost done with it, and started by reading the 1st slide of the 1st lecture (with the name of the course, the email of the professor and so on).
The best college course coding projects are the solo ones. I consider myself fortunate to have gone to a school where 95% of the code projects were solo (and about half of them were in C, too!).
True, but seeing others exercise it can release it in you as well.
College does not "give" you anything. It never has. All it has to offer is opportunities for you to "take". You get out of college what you explicitly "take" from it. You can coast through by taking and passing classes, or you can actively search out every possible opportunity, stretch every boundary, suck every bit of marrow from the bone.
Ultimately college can be a time-passing exercise in fruitless make-work. Or it can be the foundation to an amazing career. Only you can determine which it is though, not the college itself.
Churning out scripters, giving them script frameworks, and building whole products and businesses on top of those products has got to be the worst approach to labor shortages ever in the history of work. Oh, the single process isn't a good load-bearing structure to rest your entire network workload on? Let's create asynchronous abstractions and pack everything in portable process isolation environments, and oh, now we need a whole distributed OS to orchestrate these processes, etc...
I've done both, degree first, then years later bootcamp, when I needed to update my skills. I liked both. IMO, people should choose what suits their needs best.
"No company will pay you for several years just to learn"
No, they never did. But to stay with the hammer metaphor - young carpenters would indeed learn on the job how to use a hammer and all the other tools. But they would not get the same pay (or even no pay, just food and housing).
In general it used to be way more common that companies invested in people's learning, expecting the payoff much later. But with high mobility nowadays, they seldom think it is worth it anymore. You teach, and then they thank you and move on.
> But with high mobility nowadays, they seldom think it is worth it anymore. You teach, and then they thank you and move on.
The high mobility part is overblown.
The biggest teaching factories are also the ones who spend exorbitant amounts on bureaucracy and accessories while gratuitously taking advantage of young people who are eager to prove themselves, have little to no responsibilities, and no understanding of the professional world. Their whole shtick is to find suckers willing to stick around afterwards, all the while taking advantage of naïve youth.
The vast majority of people don't move if you keep their pay at actual market rates. Interviewing is a chore. Moving for jobs is a chore. Most people hate getting out of their comfort zone. Yet many companies will push individuals to 'prove themselves' first, letting pay raises lag behind for several years, so individuals only get their promotion's worth of money after moving to a different company.
It's the companies that have optimized for this behavior and decided internal promotions should be few and unrewarding. Not the other way around. God forbid they reap what they sow.
> But with high mobility nowadays, they seldom think it is worth it anymore. You teach, and then they thank you and move on.
You mean you tell them you will teach them, require them to have the skills to begin with anyway, work them like any other employee but at a fraction of the cost while promising them full employment at the end, and at the end thank them and tell them to move on?
On the other hand, young workers usually greatly overestimate their impact on getting things done and underestimate the work required to check their work. (I know I was like that.)
On the other hand, those kinds of "fake" internships are rampant here in France, especially with smaller companies or startups. From personal experience, they don't bother checking the interns' work. And how could they, when the other employees are interns too?
I used to work in France, I was employed by a French company. I do remember the contractors that were French were pretty disgruntled. But I certainly miss being over there. The work culture is so much better than in the US. The pay may be suppressed, but at least everyone gets to enjoy life. Although, at the company I worked at, the pay was not as far off from the US side as the French folks thought it was. Definitely not when you consider how much better they were treated.
Cynically it feels to me like it's just a nice way for the companies nearby to get free labor.
My assumption is that it's supposed to be an internship somewhere where our focus would be to study, and thus we shouldn't expect any benefits from it, but I've never heard of it working that way. In IT here it was always just being an unpaid junior.
A lot of people think CS is "learning how to program in xyz."
In fact programming is just a tool used to implement CS ideas and demonstrate their application in the real world.
In my degree we spent maybe the first 6 weeks on actual learning-to-program (in Turbo Pascal). Then another stint later on in Scheme. In 2nd year we did C; I don't recall what instruction we had there - maybe a week? Then 2 weeks in 3rd year where we did 10 different languages in 10 days.
Language was considered a distraction from the science part.
No, it's not reading books about swinging hammers. We were learning about physical forces on wood, and by wood, and other construction materials. We learned the difference between a spice rack, the drawer, and the 40-floor building to house the spice rack.
Sure we wielded the hammer like Thor. But the focus was on the hammered not the hammer.
I'd wager that I'm interested in a niche of CS, mostly the internet, and websites, and apps and what not. So when I tried to learn C and code unix tools I just felt miserable tbh.
Science !== building
I agree language and syntax are just distractions from the -building- part.
In software engineering there are infinite hammers, and you can build new hammers out of old hammers, btw.
During my PhD (not in CS) I met a CS PhD who was working on parallel algorithms. At the time I was struggling with large-scale simulations and HPC stuff so I got very interested. I asked him what programming languages he used.
"Oh I don't know how to code. We don't write programs in CS Theory."
This is honestly part of the reason I abhor statements around "AI/ML will make software developers irrelevant in [insert catchy timeframe]"; the complexity in many (if not most) systems is not the coding. The interaction, the boundaries between systems, the agreements between them, and the subtle nuances among; this is where Things Get Hard.
Then there's the requirements gathering that led us down this road. And the stakeholders that forgot important details. And that one team that has a hard production dependency on an obscure DB table you only keep around because it simplifies a join somewhere.
Teach students of CS the latter, and they have a much greater opportunity to be successful.
If LLMs keep improving at the current rate and get to the point where they can reduce hallucinating to a reasonable level, I don't see why they couldn't take on the challenge of abstract system architecture. It's still a problem that can be stated in natural language, which they keep getting better at 'understanding' (at least in the sense of giving more coherent answers).
> CS classes and degrees are akin to reading books about swinging hammers rather than just swinging the hammer.
Computer Science is akin to learning how to forge a hammer, what materials to use in said hammer, and then determining what size and shape is applicable for a given task.
Sometimes the specified hammer is made. Most times it is not.
"Swinging hammers" is rarely, if ever, considered.
I think the "hammer" metaphor is insufficient to cover why one would get a degree or not. I learned a number of fascinating things about computers in University that I would have never gotten on the job.
This sort of fits with your analogy. When I did my Manufacturing Engineering degree all I was interested in was CNC, CAD/CAM and rapid prototyping tools that existed before modern 3D printing.
After a career completely unrelated to manufacturing I've come back to basic metalwork as a hobby. What interests me is working with non-computer driven tools like lathes and mills.
What really surprises me is how much can be achieved just with hand tools like saws, files and chisels. I have books on filing as a way to shape metal by hand.
This is worth knowing because it's sometimes the quickest way to get something done. Especially when the first part of an alternative process would be "order tool X from the internet".
So it goes with solving problems with software. It's sometimes quicker to implement something I learnt in my second degree, in CS, than to spend time searching to see if a well-supported library covers exactly what's needed for a very specific, temporary use case.
Depends on the university. My experience was that you spent more time swinging than not, often in ways that would be applicable to actual carpentry rather than building wooden Monads nobody wants to use.
In Germany you are expected to go on a three-year apprenticeship when you want to do a job which needs practical skills.
You work part time at a company and go part time to school; combined, that's between 35 and 40 hours a week. You don't get paid that much during that time, but if someone pays for your room it's enough.
Without going that route you literally don't get a full paid job.
I agree but it's not that you won't need these things but that you don't know when or if you'll need them.
For a professional MBA who's taken over a department, it's not so much that they won't pay you to learn what's not definitely part of the job, but that they can get away without doing so. It appears, superficially, to be the low-risk option.
It's the same reason that manufacturing gets outsourced instead of invested in as a core competency that sets you apart from the competition.
To a lot of people anything that makes the numbers go in the right direction is what's important. They'll even pretend that the sole fiduciary duty is to raise short-term profits at the expense of long-term value.
When so many of the world's largest corps are now hollowed out marketing operations reselling generic products, it's hard to argue against using the latest trending framework.
Things like Javascriptin30.com (no affiliation) are plenty to learn how to code vanilla before learning why (or when) things like libraries and then frameworks may have value.