While I mostly agree with the thrust of the thesis - 80% of the job is reading bad code and unfucking it, and ChatGPT sucks in all the ways - I disagree with the conclusions.
First, gen AI shifting us towards analysing more bad code to unfuck is not a good thing. It’s quite specifically bad. We really don’t need more bad code generators. What we need are good docs; slapping genAI on as a band-aid for badly documented libraries will do more harm than good. The absolute last thing I want is genAI feeding me more bullshit to deal with.
Second, this all comes across as an industrialist view of education. I’m sure Big Tech would very much like people to just be good at fixing and maintaining their legacy software, or shipping new bland products as quickly as possible, but that’s not why we should be giving people a CS education. You already need investigation skills to debug your own code. The fact that 90% of industry work is not creative building of new amazing software doesn’t at all mean education should lean that way. 90% of industry jobs don’t require novel applications of algebra or analytical geometry either, and people have been complaining that “school teaches you useless things like algebra or trigonometry” for ages.
This infiltration of industry into academia is always a deleterious influence, and genAI is a great illustration of that. We now have Big Tech weirdos giving keynotes at CS conferences about how everyone should work in AI because it’s The Future™. Because education is perpetually underfunded, it depends heavily on industry money. But the tech industry is an infinite-growth machine; it doesn’t care about any philosophical considerations with regard to education; it doesn’t care about science in any way other than as a product to be packaged and shipped ASAP to grow revenue, no matter whether it’s actually good, useful, sustainable, or anything like that. They invested billions into growing a specialised sector of CS, novel hardware and all (see TPUs), to be able to multiply matrices really fast, and the chief uses of that are Facebook’s ad recommendation system and now ChatGPT.
This central conclusion just sucks from my perspective:
It’s how human programmers, increasingly, add value.
“Figure out why the code we already have isn’t doing the thing, or is doing the weird thing, and how to bring the code more into line with the things we want it to do.”
While yes, this is why even a “run-of-the-mill” job as a programmer is not likely to be outsourced to an ML model, that’s definitely not what we should aspire for the added value to be. People add value because they are creative builders! You don’t need a higher education to be able to patch up garbage codebases all week, the same way you don’t need any algebra or trigonometry to work at a random paper-pushing job. What you do need it for is to become the person who writes that code in the first place. There’s a reason these are Computer Science programmes and not “Programming @ Big Tech” programmes.
It didn’t read to me like she was a fan of this shit at all, but was despairing of it and looking for ways to teach actual competence despite it.
I’m probably projecting the baggage of dozens of conversations with people who unironically argue that a university CS programme should prepare you for working in industry as a programmer, but that’s because I can’t really discern the author’s perspective on this from the text.
In either case,
to teach actual competence despite it
I think my point is that a “competent programmer” as viewed by the industry is a vastly different thing from a “competent computer scientist” in a philosophical sense. Computer science really struggles with this because many things require being both a good engineer and a good scientist. For an analogy, an electrical engineer and a physicist specialising in electrical circuits are two vastly different professions, and you don’t need to know what an electron is to do the former. Whereas in computer science, like, you can’t build a compiler without knowing your shit around both software engineering and theoretical concepts.
Let me also add that I don’t think I’ve ever written a post where I’d more like people to come and disagree with me. I might very well be talking some bullshit based on vibes here, since all of this is basically vibes from mingling with both industry and academia people…
I mean, that’s a problem with the field. Most of your work will be dredging the maintenance sewers, but also you will need to know the computer science, at least to be able to spot an O(n^2) in the wild.
(the sweet spot of algorithmic complexity: easy enough to get away with when n is small that you fill your codebase with them, and certain to fuck you up the moment n gets large)
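To put a concrete face on it, here’s a minimal, hypothetical Python sketch (the function names and sizes are made up for illustration): a dedup routine that checks membership against a list is the classic accidental O(n^2). It looks fine on small test inputs and quietly eats your runtime once n grows; the set-based version is the O(n) fix.

```python
import time

def dedupe_quadratic(items):
    """Deduplicate while preserving order, using a list for lookups."""
    seen = []
    for x in items:
        # `x not in seen` scans the whole list: O(n) per check, O(n^2) overall.
        if x not in seen:
            seen.append(x)
    return seen

def dedupe_linear(items):
    """Same result, but set membership is O(1) on average, so the pass is O(n)."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

if __name__ == "__main__":
    data = list(range(10_000)) * 2  # 20k items; bump this up to watch the gap explode
    for fn in (dedupe_quadratic, dedupe_linear):
        start = time.perf_counter()
        fn(data)
        print(f"{fn.__name__}: {time.perf_counter() - start:.3f}s")
```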
If you keep in mind the students’ original angst (“I have to learn how to use LLMs or I’ll get left behind”), they themselves have a vocational understanding of their degree. And it is sensible to address those concerns practically (though, as stated in another comment, I don’t believe in accepting the default use of generative tools).
On a more philosophical note, I think STEM fields (and really any general well-rounded education) would benefit from delving (!) deeper into library science/archival science/philosophy and their application to history, and coincidentally that would make a lot of people better at troubleshooting and untangling legacy code.
would benefit from delving (!) deeper into library science/archival science/philosophy and their application to history
Ooh, would you say more about this? I have opinions, but that’s because I’m a programmer now and formerly a librarian & archivist (on the digital side it’s more common to go back and forth between the two; it’s the same degree).
I’m afraid my thoughts on the matter aren’t that deep or well informed ^^.
In no particular order:
I grew up in France, and in my (probably biased) view, education there leans a bit more towards teaching “literary” subjects, even for engineering students. I think in general this does indeed develop literacy and critical thinking.
France has “professeurs documentalistes” (teacher-librarians), and we call our school libraries “Centres for Documentation and Information” from middle school up, with a few (very) introductory courses on using thesauri, bibliographies, and digital index-card tools (this may have become enshittified by the availability of Google since my time there).
I have a small Lexicography hobby.
I have a small hobby of reading old sources.
I think more “Traditional” digital search is still incredibly valuable
I think principles predating the digital age are still incredibly valuable
The way STEM fields are taught is often focused on “one correct answer”, and I don’t remember much focus being put on where sources come from, on comparing differing sources, or even any emphasis on how we can be certain a given historical source has been accurately transmitted to the present day.
I think information retrieval is a vital skill (especially with the enshittification of Google) that practitioners in all fields would benefit from being more comfortable with (though of course it’s still its own job).
I think software engineers in particular would, during their education, be well served by practical examples of reconciling conflicting or uncertain sources, and I think history is a good lens for that (less abstract than software).
I’d be interested in your perspective!
I didn’t get the vibe she agreed with it; I got the sense she was exasperated but practical about it. Her students are career-driven, in a world that told them until two years ago that this expensive credentialing was the key to becoming Silicon Valley rich.
Separately, it’s a well-established point of concern that a computer science degree is inapplicable to the work of the vast majority of people who become working, non-academic software engineers, and that while there are valuable things an academic program could teach pre-professional developers that too few engineers understand, that’s not the focus of CS. The reality (in the US at least) is that a CS degree is sold as a vocational program by the universities, and many jobs list a CS degree as a requirement or a desired skill. The author’s students paid almost $7000 for her course alone. Whether those facts should be true is up for debate, but that’s the reality in which the author is teaching.
The author is open that she became a programmer for financial stability, which is the world most of us live in. I enjoy writing code and being creative, but I work in software development to eat.
The reality (in the US at least) is that a CS degree is sold as a vocational program by the universities, and many jobs list a CS degree as a requirement or a desired skill. The author’s students paid almost $7000 for her course alone.
Well, it’s very hard for me to have a discussion about the philosophical merits of education when the context is the USA, where education is so fundamentally fucked. It may well be that the best course of action for the well-being of students is to make sure they at least get bang for their buck, but that’s a systemic problem one level below what I’m even talking about. I don’t want to discount this as a reality for actual people on the ground - I think the correct position then is not me waxing philosophical about the contents of courses, but rather nailing everyone in the US government who stands against free public education to a fucking wall.
and many jobs list a CS degree as a requirement or a desired skill
This is, I think, a symptom of this push-and-pull between industry and academia. The industry would like a CS degree to mean that they’re getting engineers ready to patch up their legacy code, because they would much rather have the state (or, in the USA’s case, the students themselves) pay for that training than have to train their employees themselves. But I suggest that the correct default response to industry’s wants is “NO.” unless they have some really good points. Google can pay for their employees to learn C++, but they won’t pay a dime to teach you something they don’t need for their profit margins. Which is precisely the point of public education: teaching you stuff because it’s philosophically justified to have a population that knows things, not because it leads to $$$.
Yeah, that’s a huge problem with private education. If it’s expensive to the student, they want a profit. If the uni is expensive to run and privately funded, they want rich alumni. (And sadly, even in public universities in the US, the funders have a horrifically profit motivated view: the purpose of public education is to produce a highly trained body of workers. The crisis in American higher ed is deep right now; lawmakers and academic administrators fundamentally don’t believe in the humanities.)
Still, part of this is CS’s fault as a field. You mentioned to David the difference between engineering and physics, and in most places, those are different academic fields of study. Both valuable, but different. Why shouldn’t CS do the same?
I’ve found that most of the best application programmers I’ve worked with have a liberal arts background with a humanities focus, because that training leads to a more holistic view of complex systems and a better ability to work with potential user needs. For programming closer to the user in a chaotic system, that can be more useful than understanding NP-completeness and context-free grammars.
Tl;dr I think we’re violently agreeing with one another. US universities shouldn’t be so aggressively focused on turning out graduates who will become productive, rich worker bees, and using an academic field of study to do so is corrupting the academic field & not ideal for the students.
From the pov of a slightly exhausted prof who just wants a short-ish answer for her students, the conclusion sorta makes sense, I guess. The students want to convince themselves they aren’t wasting their time with genAI and she’s not in a position to convince them otherwise, so the next best thing is showing them what industrial life with genAI will be like.
“The future you’re dreaming of sucks, so get used to it.” isn’t a satisfying answer, but it’s a forced perspective.
The article certainly feels blasé ^^; I think the most objectionable part is:
Large language models shift even more of that time into investigation, because the moment the team gets a chance to build, they turn around and ask ChatGPT (or Copilot, or Devin, or Gemini) to do it. When we learn that we need to integrate with google cloud storage, or spaCy, or SQS Queue, or Firebase? Same thing: turn around and ask the LLM to draft the integration.
Now clearly (to me) the author isn’t happy about this, but I think they are giving up hope on the direction of the profession too soon. There are still plenty of people happy enough to implement things themselves.