I was talking with a friend who transitioned from marketing into cloud (AWS) and then into security, and he spends a lot of time studying to make sure he understands all the concepts that come up in technical discussions.
Curious to see what the community's opinions are. Feel free to share your initial background as well.
Basically, in-depth computer science knowledge: graph theory, automata, and aspects of systems programming.
I technically have a physics background, coupled with a bit of self-study in pure mathematics. But I feel those three areas hold me back in application (primarily in physics; I don't do real software development).
I am a self-taught hack. There is so much I am unaware of, and things that I don't know I should know. You shouldn't let me near your systems. Yet people do.
Background: about 15 years as a hobbyist, mostly as a language and OS junkie, while working a variety of trades and labour jobs. I tried to take the computer science aspect seriously.
In the early to mid-1990s I started getting work in the field while trying to get a "real" job that matched my background. I never did find that real job, because I was getting too much work helping people and small companies learn about computers and networks, then setting up and managing their systems. In fact, I gained enough of a reputation as a trainer that I actually worked as an instructor for an accredited vocational school for a few years.
Apart from a couple of "drive-bys" that I was uniquely qualified for, no real company was ever silly enough to hire me as a programmer. There were, and probably still are, vast numbers of small companies who are badly underserved. That's where I spent my energies. No real programmer would ever think my code made sense, but without me a few small companies would still be using their computer systems as glorified typewriters and filing cabinets. And no real programmer would ever work for what those small companies could afford.
What I'm missing, and now trying to address in my retirement, is a better grasp of the theoretical underpinnings, algorithms, development processes, architecture, and maybe language design. But I can feel the draw back into relapsing as a language and OS junkie, so who knows…
Credentials, that’s mostly it. I don’t have the papers that say I dedicated my time to schooling.
People will say, “that’s becoming less important, it’s how you interview and what you know.” That’s true, but only to an extent, in certain fields, and primarily in America.
Some of the articles on Hacker News don't make sense to me. I can't write a C compiler. Then again, I never had student loan debt.
Practically though, it’s moot. I took many CS and math classes at community college, but people don’t think that’s real education for some reason. I can do and understand the silly leetcode questions. I don’t think I could mathematically prove anything anymore.
I never had the money to go to university so I went and found a job instead. Learned everything I needed to from peers to specialize in Rails, then later, general web development. 13 years later, not only am I making far more money than I ever expected to, but I am very confident in my skills.
I regret nothing.
While it's not exactly what you are asking for, I can try to answer this from the PoV of someone who does have a CS degree (a bachelor's in software engineering, then a master's in game development and graphics).
What I've slowly started to realize over the (few) years working in the field since school is that the best thing college gave me (aside from friends in the industry, which is especially important in gamedev) is not some concrete in-depth skill with such-and-such a language I was taught. I honestly don't remember most of what I learned; I've already forgotten and would have to re-learn most of it, especially in the maths and algorithms department, and even though I had several in-depth courses on C, I'd probably still have to google basic syntax since I haven't used the language in years.
Because of that, I always kind of thought that school hadn't given me much: I still have to re-learn things, even the ones I passed with flying colors, and I wasn't sure whether it was worth it.
But then I started realizing something: compared to colleagues who didn't have a degree, I was usually the one coming up with solutions we could start investigating when we were dealing with a more difficult problem. And when someone needed something written in a specific language they didn't know (which happens a lot in cybersecurity), they usually came to me, and I was able to do it relatively quickly.
Why? Because while it's true that I don't remember the implementation details or syntax of most of what I was taught, I was forced to sit through hours of different algorithms and approaches for all kinds of problems, and I was forced to learn, at least for that one lesson, programming languages of very different flavours. Languages I disregarded as not relevant. What I didn't realize is that the goal wasn't to teach me the language, but to introduce me to the overall concept the language is going for. Will I ever need Lisp, Pharo or Prolog? Probably not, but now when I see a language that works like Prolog or Smalltalk, I don't have to struggle to understand what the syntax is going for, because I subconsciously recognize the concepts and can pick up any similar language without issue. I've eventually realized that thanks to college I'm not a "C# programmer" but "a programmer", and so far I haven't encountered a language in which I couldn't write anything I needed in a reasonably short amount of time and without any great struggle, because for every one of them I've already been through hours of working in a language that works in a very similar fashion.
I mean, trying to write something in a language that works like Prolog, without ever having seen it before, would probably be hell, just like it was in school. But now I'm vaguely familiar with that class of languages, and I don't have any issue with it.
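To make that a bit more concrete, here's a toy sketch of my own (plain Python, not from any actual course): the same little "find all ancestors" question written first in an imperative style and then in a declarative, rule-like style. The second shape is roughly what a Prolog clause expresses, and once you've written it in a language you already know, the real thing looks a lot less alien.

```python
# Toy example of my own: the same question in two styles.
parent = {"alice": "bob", "bob": "carol"}  # child -> parent

def ancestors_imperative(child):
    """Imperative: walk the chain with a loop and mutable state."""
    found = []
    while child in parent:
        child = parent[child]
        found.append(child)
    return found

def ancestors_declarative(child):
    """Declarative-ish: describe the relation recursively, like a logic rule."""
    if child not in parent:
        return []
    return [parent[child]] + ancestors_declarative(parent[child])

print(ancestors_imperative("alice"))   # ['bob', 'carol']
print(ancestors_declarative("alice"))  # ['bob', 'carol']
```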
And that's something that's really useful, and would be hard to pick up on your own, because I did go through a lot of vastly different languages at school, most of which I'd never touch otherwise. You would think you'll probably never need to know some obscure class of languages, until you find an RCE on an ancient server during a pentest that executes COBOL, and you really need that reverse shell.
And the same can be said for algorithms. It'd probably still take me way longer to write some of the more advanced sorting algorithms from memory than to just google them, and I'd probably end up reinventing the wheel anyway, but I still vaguely remember that they exist. Data structures are really important here, because the right one can make a huge difference, and there's such a large variety of them, each really good for some kinds of problems but not for others. The same goes for general math stuff like FFT and various compression algorithms: I remember the basic idea vaguely enough that it pops into my mind when a problem sounds similar to what it solves, so I know what to research and can rediscover whether it's really a good fit.
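As a made-up illustration of the kind of difference a data structure choice makes (this is just me, not anything from the thread): the exact same membership test in Python, backed by a list versus a set. The list has to scan every element, the set hashes, and that gap is precisely the sort of half-remembered lecture fact that resurfaces at the right moment.

```python
# Quick self-contained demo: identical `in` tests on two structures.
import time

items = list(range(1_000_000))
as_set = set(items)

def time_lookups(container, needle, repeats=200):
    start = time.perf_counter()
    for _ in range(repeats):
        _ = needle in container  # membership test
    return time.perf_counter() - start

# Worst case for the list: the needle sits at the very end.
print("list:", time_lookups(items, 999_999))
print("set: ", time_lookups(as_set, 999_999))
```

The point isn't the exact numbers on any given machine; it's that remembering the structures exist tells you what to reach for.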
And this has been especially important in rendering, because it's surprising how many game developers have no idea how rendering actually works; they never had to learn.
I've been through classes on low-level rendering, and I remember almost nothing as far as the concrete math is concerned, but I do have a lasting, if vague, memory of the steps involved and the general idea behind it. That's enough to jump into unknown shader code no one has any idea how to fix, start noticing concepts I vaguely remember being taught, and make changes where I can guess what they will impact. The same applies to UML diagrams and other bullshit: sometimes you encounter one in the wild when dealing with documentation, and it really helps that I kind of understand what it's trying to tell me.
And that's all thanks to classes I mostly considered useless, because the classes on languages I use daily didn't really teach me much I didn't already know. That's why I felt at the time that school wasn't worth it: I was learning nothing new and had to waste my time on a lot more useless bullshit that I begrudgingly passed.
Turns out the useless classes are what gave me the most, without me realizing it. They gave me a very broad overview of, and experience with, stuff I'd never have learned on my own, and that makes it vastly easier to research and learn the things I occasionally need, because I already have some basic experience with that kind of problem, even though I don't really notice it consciously.
Thank you for your detailed answer. I also had to deal with Prolog, UML and other rarely-used stuff, and my experience reflects yours.
I'm sure there are things I've missed, but nothing that's impacted my career. My resume is very "well rounded", which helps a lot. In my [completely anecdotal] experience, employers who insist on degrees tend to pay less. Feedback I've gotten from friends who went to college tends to support this theory.
Whether you actually understand what you’re doing is what’s important. It doesn’t matter if you learned at college or on the job.
deleted by creator
You can probably learn it in an afternoon if you can already code. It’s not complicated, it’s just compact.
That said I’m not sure it’s worth the effort. I use it so rarely I always have to look up the syntax again each time I want to do something.
Always keep studying and remain curious and playful. Don’t know a word? Look it up. Write it down. Try Anki and Joplin.
Don’t be afraid to stop a discussion to ask questions. In the worst case you look like a dunce a few times but in the long run you’ll know more than the ones who don’t ask. It’s best to leave your pride at the door and focus on getting things done. Realize that it’s not about the next few years but about where you’ll be in 10 years.
Find a mentor. It takes patience but there are people looking to support others. Keep the questions flowing. And it’s sad that I have to say this but always show respect.
Debugging and refactoring work will teach you a lot. You can simply ask for it. Whenever something is unclear you should take the time to understand it, look it up, and ask around. It’s just like learning any language really.
And if you're weak in some area you can always rely on others to help you. That's what teamwork is, and I feel a lot of people don't understand it. Younger people especially seem to think that they have to be perfect and independent, but actually interdependence is essential. For example, I never learned calculus, so if I need to integrate something complicated I'll find the right person, and they're always elated to show off their skills.
Books: Calculus. Databases. Your language. Git. Linux. AI. Whatever you fancy. Oxford University Press has good starter books for a variety of areas. Keep it fun for yourself; it's a process, not a goal.
All in all in my experience the difference between decent and great coders is that the great ones aren’t constantly snagged by their ego. They can take a step back from their work and feel safe in seeking out criticism. Identify with the process and not the result and you’ll do great!