Lately, I was going through the blog of a math professor whose class I took at a community college back when I was in high school. Having gone the path I did in life, I looked into his credentials and found that he completed a computer science degree sometime back in the 1970s. He had a curmudgeonly and standoffish personality, and his IT skills were nonexistent back when I took his class.

It’s fascinating to see his perspectives on computing, and how many of the things I learned in my undergraduate degree were already being taught as far back as the 1950s. It also seems like the computer science degree used to be more intertwined with its electrical engineering fraternal twin.

Although the title of this post is inherently provocative, I’m curious to hear from those of you who did computer science, electrical engineering, or similar technical degrees in decades past. Are there topics or subjects that have been phased out over the years that you think leave younger programmers/engineers ill-equipped in the modern day? What common practices were you happy to see thrown in the dumpster and kicked away forever?

The community also seems like it was significantly smaller back then and more interconnected. Was nepotism as prevalent in the technology industry then as it is today?

This is just the start of a discussion; please feel free to share your thoughts!

  • mindlight@lemm.ee · 20 points · 9 months ago

    No degree at all, but I’ve been working in IT since the ’90s.

    It’s funny that when I started in IT, everything went from centralized (mainframes and terminals) to decentralized (PCs). Then came Citrix, and everything went back towards centralized. Smartphones and apps came along, so we went decentralized again. Then the cloud came, and we essentially went centralized again.

    It’s all about trends, the pendulum swings back and forth…

  • jollyrogue@lemmy.ml · 19 points · 9 months ago

    The prevalence of FOSS is amazing.

    Linux distros, BSDs, GCC, LLVM, GNU tools… The equivalent stack in the 90s was expensive, proprietary, and rare. I was getting software from magazine CDs, and none of the expensive toolchains were showing up on them.

    Having a free DVCS in Git is also great. No manual versioning schemes anymore; just git init for a new repo. There was SVN, but it required a server.

  • darkpanda@lemmy.ca · 12 points · 9 months ago

    Donald Knuth, author of The Art of Computer Programming, basically our bible, famously doesn’t use email.

  • Cyborganism@lemmy.ca · 11 points · 9 months ago

    I did a computer science degree at the equivalent of a community college, then followed up with software engineering at an engineering university. Graduated in 2008.

    I find that software development, and IT in general, was a lot simpler back then than it is today. Nobody required any kind of certification to get a job.

    In the early 2000s, when you had a problem in your project, you really had to mess around and try things to find the solution. You couldn’t really rely so much on Stack Overflow or similar sites.

    • chilicheeselies@lemmy.world · 11 points · 9 months ago

      Things are much simpler now. Even little things. For instance, error messages. They used to be cryptic as hell, but these days there is more of an emphasis on communication.

      The only thing more complex is the volume of choice. There are just soooo many ways to do something that picking a way can be daunting. It’s led to a situation where you have to hire based on ability to learn rather than ability with a specific toolchain.

      • AggressivelyPassive@feddit.de · 15 points · 9 months ago

        I wouldn’t say that.

        Software today, in the real business world, is extremely complex, simply because of all the layers you have to understand.

        Today I have to know about Kubernetes, Helm, CI/CD, security/policy scanners, Docker, Maven, Spring, Hibernate, 200 libraries, Java itself, JVM details, databases, and a bit of JavaScript, TypeScript, npm and, while we’re at it, React. And then of course the business logic.

        I’d argue that, in today’s world, nobody actually understands their software completely. I’m not sure when exactly the shift from raw dogging assembler and counting cycles to the mess of today happened, but I’d argue software today is much, much more complex and complicated.

        • rottingleaf@lemmy.zip · 3 points · 9 months ago

          I’d argue that, in today’s world, nobody actually understands their software completely. I’m not sure when exactly the shift from raw dogging assembler and counting cycles to the mess of today happened, but I’d argue software today is much, much more complex and complicated.

          It seems from history that things nobody actually understands eventually implode and are rebuilt from scratch, Samsara wheel and all that.

          I’m waiting with anticipation for Kubernetes and the Web (as it exists) to die, die, die.

          I’m not sure when exactly the shift from raw dogging assembler and counting cycles to the mess of today happened, but I’d argue software today is much, much more complex and complicated.

          There are the concepts of layering and modularity, and at some point humanity in general mixed them up. You can treat a module like a black box in a design, but you can’t treat layers as if they were completely independent, because errors and real-life cases cross layers. I mean, the combinatorics of errors are different for module separation and for layer separation.

          Ah, many words to try to say a simple thing.

          • AggressivelyPassive@feddit.de · 1 point · 9 months ago

            I think leaky abstraction is the term you’re looking for, and I agree.

            My go-to comparison is file systems. I can call open() on a file without being bothered about almost anything behind the file descriptor. I don’t care whether it’s a ramdisk, an SSD, a regular hard drive, an SMB mount over Token Ring, or whatever. There is a well-defined interface for me to work with, and the error cases are also well defined. The complexity is hidden almost completely.
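
            Roughly what I mean, as a minimal C sketch (the filename and buffer size here are just placeholders):

            ```c
            #include <fcntl.h>
            #include <stdio.h>
            #include <unistd.h>

            int main(void)
            {
                char buf[4096];

                /* The same open()/read()/close() calls work whether "data.txt"
                 * sits on an SSD, a ramdisk or an SMB mount -- the file
                 * descriptor hides all of that. */
                int fd = open("data.txt", O_RDONLY);
                if (fd < 0) {
                    perror("open");   /* the error cases are well defined, too */
                    return 1;
                }

                ssize_t n = read(fd, buf, sizeof buf);
                if (n >= 0)
                    printf("read %zd bytes\n", n);

                close(fd);
                return 0;
            }
            ```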

            But if I want to do anything in k8s, the interface usually exposes me to everything that goes wrong under the hood and hides almost nothing at all.

            The absolute worst example of this is Helm. It adds almost nothing for 99% of the use cases except complexity, actively stands in your way many times, and the entire functionality actually used in most cases is Bash-style variable expansion.

            • rottingleaf@lemmy.zip · 1 point · 9 months ago

              I think leaky abstraction is the term you’re looking for, and I agree.

              More or less yes.

              +1 for Helm hate.

      • Cyborganism@lemmy.ca · 5 points · 9 months ago

        Things are much simpler now. Even little things. For instance, error messages. They used to be cryptic as hell, but these days there is more of an emphasis on communication.

        I don’t know. I worked with old IDEs like Turbo C, Turbo Pascal, or some of the earliest versions of Visual Studio, and I could tell from the error message in the output where the error occurred, then work with a debugger to find the true source.

        The only thing more complex is the volume of choice. There are just soooo many ways to do something that picking a way can be daunting. It’s led to a situation where you have to hire based on ability to learn rather than ability with a specific toolchain.

        That’s true, though. And also how IT has evolved into other fields, like DevOps for example. Now there’s an *Ops for almost everything, the latest being machine learning. And each has its own million ways to do things, with so many certifications.

    • Lmaydev@programming.dev · 7 points · 9 months ago

      I’m from around the same time. At my first job I had a stack of reference books for the languages and technologies I used.

      I would actually argue things are simpler now based purely on the fact that there is such a big ecosystem of libraries and services available that you don’t need to write everything yourself.

      It’s true it can be hard to find what you want, but I’ve found LLMs (at least ones that search the internet, like Copilot) are really good at pointing you to existing options.

  • dirthawker0@lemmy.world · 9 points · 9 months ago

    I think I got my CS degree too early, i.e. before the web was a thing. Basically, things have changed so much from the late 80s to now that everything except the basics is out of date. I was in the school of Math as opposed to Engineering, so we were coding in Pascal and doing simulations and stuff. I think it would have been better to learn C, though obviously that’s in hindsight. I did take a class in DBMS, which served me well some 20 years later when I became a database manager/developer, because that language did not change too much. OOP I had to learn from scratch, and it was a bit mind-blowing.

    I’ve been using Android Studio and Visual Studio Code, and it’s annoying that stuff is constantly getting updated, but also amazing that these IDEs take care of so much of that stuff for you. Even when I started coding for Android about 9-10 years ago, you had to manually download and install all these stupid packages. Now the IDE just announces it’s doing it, and you go get a cup of coffee and wait for it to finish.

  • Treczoks@lemmy.world · 7 points · 9 months ago

    I learned things like logic circuits and processor basics in my CS course at school. I built an elevator control out of flip-flops and logic gates.
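
    For a sense of that gate-level thinking, here is a rough C simulation of a single clocked D flip-flop (an illustrative sketch, not the actual elevator circuit):

    ```c
    #include <stdio.h>

    typedef struct {
        int q;        /* stored bit */
        int last_clk; /* previous clock level, used to detect a rising edge */
    } dff;

    /* On a rising clock edge the flip-flop latches d; otherwise q holds. */
    static void dff_tick(dff *ff, int clk, int d)
    {
        if (clk && !ff->last_clk)
            ff->q = d;
        ff->last_clk = clk;
    }

    int main(void)
    {
        dff stored_request = {0, 0};
        int clk[] = {0, 1, 0, 1, 0, 1};
        int d[]   = {1, 1, 0, 0, 1, 1}; /* e.g. "call button pressed" */

        for (int i = 0; i < 6; i++) {
            dff_tick(&stored_request, clk[i], d[i]);
            printf("clk=%d d=%d -> q=%d\n", clk[i], d[i], stored_request.q);
        }
        return 0;
    }
    ```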

    Today, some students have FPGA courses, and it is a total mess. If it isn’t a high-level language or web-enabled, it is simply an insurmountable mystery for most of them.

  • HarriPotero@lemmy.world · 7 points · 9 months ago

    It feels like many positions today don’t deal with anything you couldn’t learn in a 6-month boot camp aimed at a particular stack.

    I did my computer engineering degree in the early 2000s, and we still had a lot of those early-day concepts, everything from digital electronics to processor and compiler design. Lots of focus on formal methods to prove the correctness of software. Plenty of programming paradigms. None of my professors had a degree in CS; there was no CS when they were studying. They all had math degrees and a love for logic and automata theory.

    I can’t say that I’ve actively used it outside of academia, but I think it has set me up to be a lifelong quick learner of everything happening in this fast-paced field. Most roles might be working with high-level languages today, but those roles wouldn’t exist unless capable people built the compilers, drivers, and hardware.

    The field needs people who will comb through specifications instead of searching Stack Overflow to figure things out. (I guess asking ChatGPT or Copilot is the new Stack Overflow.)

    I have a guilty pleasure in old things. The Computer Chronicles has all its episodes on YouTube, and its analysis of the news in the 80s has held up remarkably well. I’ve also been reading Hollingdale’s Electronic Computers. Computers are still just the von Neumann architecture, no matter how many abstraction layers we build on top of it.

  • Digital Mark@lemmy.ml · 5 points · 9 months ago

    In the good old days, you had to learn assembly/machine language, C, and OS-level programming to get anything done. Even if you mostly worked on applications, you’d drop down and do something useful; at the time, that meant writing machine-language routines to call from BASIC. This is still a practical skill: for instance, I mostly work in Scheme, but I use the C FFI to hook into native functionality and debug in lldb.
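
    As a sense of what that looks like, here is the kind of tiny native helper you might expose through a C FFI (the file name, function name, and build line are made up for illustration; the Scheme side would load it with whatever FFI the implementation provides):

    ```c
    /* monotonic.c -- build as a shared library, e.g.
     *   cc -shared -fPIC -o libmonotonic.so monotonic.c
     * then load it from the high-level language through its FFI. */
    #include <time.h>

    /* Return a monotonic timestamp in nanoseconds -- something the OS
     * provides directly but many high-level runtimes don't expose. */
    long long monotonic_ns(void)
    {
        struct timespec ts;

        if (clock_gettime(CLOCK_MONOTONIC, &ts) != 0)
            return -1; /* signal failure to the caller */

        return (long long)ts.tv_sec * 1000000000LL + ts.tv_nsec;
    }
    ```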

    Computer Science is supposed to be more math than practical, though when I took it we also did low-level graphics (BIOS calls & framebuffers), OS implementation, and other useful skills. These days almost all CS courses are job training, with no theory and no implementation.

    Younger programmers typically have no experience below the application language (Java, C#, Python, PHP) they work in, and only those with extensive CS degrees will ever see a C compiler. Even a shell, filesystems, and simple toolchains like Make are lost arts.

    The MIT Missing Semester covers some of the mid-to-high levels of that, but there’s no real training in the digital-logic-to-OS levels.

  • code@lemmy.zip · 5 points · 9 months ago

    Being a self-taught programmer (6502 assembly on a VIC-20), all of my first few jobs were basically “here are the manuals, you’re the new (Banyan VINES, NetWare, SunOS, Oracle, SQL Server, etc.) expert.”

    My son got an infosys degree, and all his jobs required additional certs. Helping him navigate that was mind-numbing.

    What I will say: develop your network, full stop. I got every single good-fit job through references. All my shit jobs came from headhunters or cold applications.

    As far as tech goes, everything is so nuanced and interdependent today that I feel it makes everything overly complicated and harder to baseline and run, let alone debug.

    I am sooo glad I just retired (early) and don’t need to delve into current tech for anything now besides hobby stuff.

  • Bye@lemmy.world · 4 points · 9 months ago

    I went to school in the early 90s and we didn’t really have IDEs. Just text editors with regular-expression find and replace and syntax highlighting; none of this “show a semitransparent box with your function’s arguments” stuff, or linters, or whatever.

    I still don’t know how to use a modern IDE; I use Sublime Text. And I debug my code using print statements.

  • QuadratureSurfer@lemmy.world · 2 points · 9 months ago

    Computer Engineering is still a degree where you combine Computer Science courses with Electrical Engineering courses.

    You typically want to go this route if you want to be the kind of person who can design the logic for next-generation GPUs/CPUs, or if you like working where hardware meets programming.