• Lemongrab@lemmy.one
      5 months ago

      Holy Shit! I did a little research and the C language is used by SOO many open and closed source applications, there are at least some bits of C code in basically every OS, and it lowers the barrier to entry so devs don’t need to learn Java/Python/COBOL/etc!!

      • AVincentInSpace@pawb.social
        4 months ago

        Hello! It seems you know just enough to be dangerous about my specific area of expertise! Are you interested in learning a lot more in a very short time? If not, well too bad, cause I’m gonna teach it to you anyway! Here’s a massive wall of text about my personal experience with programming and specifically C. Feel free to read or not but I wanted to write this down somewhere. It’s an ADHD thing. You know how it is.

        C is old. Lotta stuff uses it. In this industry, that’s a badge of honor. Things that aren’t good don’t get to be old, and there’s a lot to be said for stuff that’s undergone as much testing as C has. What’s more impressive is that despite its age, and despite the litany of newer programming languages that have come since, it has held onto its crown as one of the most used languages out there. In fact, it’s so ubiquitous and so widely supported that it has become the programming world’s lingua franca: if two programming languages used in the same application need to talk to each other, most of the time they’ll do it by both pretending to be C in order to communicate.

        It’s also one of the most performant programming languages around, thanks to being one of the only ones that compiles directly to native machine code without any sort of runtime or virtual machine sitting between your code and the CPU, like Java or Go have. A program written in C can often process ten to a hundred times as much data in the same amount of time as an equivalent program in a language like Python with a sluggish interpreter. (Before you get excited, please note that raw bulk data processing is not the only thing computers do, and the actual applications of this performance benefit are more limited than we’d like.)

        It’s not at all a bad language to know and have in your back pocket in case you find yourself needing to do some quick, small-scale coding, work on an old legacy project, or just have fun poking around with raw pointers and data structures without any killjoy type-checking compiler to tell you no. But there’s a reason not many new projects use C anymore, and that’s because writing complex programs in C sucks.

        For a long time, we (programmers) wrote all our code in C, because we had no other choice. We have better tools than that now. We have Rust and Zig and Go and more modern tools that prevent you from making difficult-to-spot mistakes that could make your program crash, or worse, look like it’s acting fine when it’s really gone completely off the rails and you don’t realize until something seemingly completely unrelated breaks.

        In C, basically everything has to be done manually. If you want to store data outside the call stack (very common thing to do – the stack is quite limiting), you have to manually ask the operating system to give your program more memory, tell it how many bytes you want (which requires you to know how many bytes the thing you want to store is), manually check to make sure you actually got more memory and not a signal from the OS that it’s out of RAM, then don’t forget to put your own data there before you go to use it or you’ll just get whatever meaningless garbage the program that used that memory before you left behind. Also, if you forget to manually tell the operating system you’re done with the memory after you don’t need it anymore, the memory usage of your program will steadily increase as it allocates more and more memory but forgets to release it again. One other thing: be sure not to accidentally use your memory after you’ve told the OS you’re done with it, or you’ll end up corrupting God knows what because the memory allocator already gave it to something else.
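        To make that concrete, here’s a sketch (names made up by me, not from any real codebase) of what just “storing a copy of a string somewhere” looks like when every step is on you:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical example: duplicating a string the manual C way.
 * Every step described above has to be spelled out by hand. */
char *copy_name(const char *name) {
    size_t len = strlen(name) + 1;  /* you must know the size yourself (+1 for the '\0') */
    char *buf = malloc(len);        /* ask the allocator/OS for memory */
    if (buf == NULL) {              /* check: the allocation may fail */
        return NULL;
    }
    memcpy(buf, name, len);         /* put your data there, or it holds garbage */
    return buf;                     /* the caller MUST free() this exactly once,
                                       and never touch it again afterwards */
}
```

        Forget the `free()` and you have the steadily-growing memory leak described above; touch the buffer after freeing it and you’re corrupting whatever the allocator handed that memory to next.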

        Modern programming languages tend to do all of this for you, both saving time since you don’t have to do it and reducing the chance of a mistake. In languages where you can do it manually anyway, they tend to have checks in place, so the compiler can detect when you’re about to make one of these mistakes and refuse to produce an .exe file at all until you go back and fix it.

        In C, basically nothing comes built in. You want inheritance? Build it manually. Virtual function tables? Do it yourself. Dynamically sized lists? Hash maps? Basic data structures that are built-in features of all modern languages because they’re used so commonly? Sorry pal. Either build them from scratch or copy the code (usually in the form of a library) where some kind soul has already done it for you. Namespaces? Pshh, we don’t need namespaces where we’re going. Just put the name of your program or code library at the beginning of every single one of your function names so that no two functions in your entire project (ones inside third-party libraries you’re working with included!) share a name, and you’ll do just fine.
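        For instance, here’s a hedged sketch of the skeleton of a growable int list, the kind of thing that’s a one-liner in newer languages (`vec![]`, `[]`, `ArrayList`). All the names here are invented; the `myproj_` prefix stands in for C’s lack of namespaces:

```c
#include <stdlib.h>

/* Hypothetical "dynamically sized list", built from scratch. */
typedef struct {
    int    *data;
    size_t  len;
    size_t  cap;
} myproj_intvec;

/* Append one value; returns 0 on success, -1 if allocation failed. */
int myproj_intvec_push(myproj_intvec *v, int value) {
    if (v->len == v->cap) {                 /* out of room: grow it yourself */
        size_t new_cap = v->cap ? v->cap * 2 : 4;
        int *p = realloc(v->data, new_cap * sizeof *p);
        if (p == NULL) return -1;           /* and handle failure yourself */
        v->data = p;
        v->cap = new_cap;
    }
    v->data[v->len++] = value;
    return 0;
}
```

        And that’s before you write the remove, insert, clear, and free operations, each with its own chances for an off-by-one or a leak.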

        By far the most famous pain point with C, though, especially for newcomers, is that C never tells you what you’re doing wrong, because it doesn’t know. If your program has a bug and tries to access the eleventh element of an array with ten elements in it, a modern programming language will detect that, say “Hey! The code on line XYZ in file foo tried to do something silly!”, and crash immediately. C was invented before computers were powerful enough for that to be possible without noticeably slowing things down. If you try to do that in C, it will happily try to access the memory after the end of that array. If you’re lucky, the operating system will step in and say “Hey! You’re not allowed to look at that region of RAM!” and immediately terminate your program, without telling you what memory you tried to access that you weren’t supposed to or which part of your program tried to do it. (Not unless you feel like learning how to use gdb to inspect the core dump, anyway.) If you’re not lucky, your program will coincidentally have access to that memory location because it’s in use by some other part of your code, and you’ll be sat there wondering why your program that was supposed to count to ten printed out 1 2 3 4 5 6 7 8 9 10 5363947. God help you if you were doing something with those numbers besides just showing them on the screen.
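        The bounds check that modern languages perform on every single index, you have to write (and remember to write) yourself in C. A minimal sketch, with invented names:

```c
#include <stddef.h>

/* Hypothetical bounds-checked array access: what Rust, Java, Python, etc.
 * do automatically on every index, written out by hand. A plain arr[i]
 * in C performs no such check. Returns 0 on success, -1 if out of bounds. */
int checked_get(const int *arr, size_t len, size_t i, int *out) {
    if (i >= len) {
        return -1;      /* report the bug instead of reading garbage memory */
    }
    *out = arr[i];
    return 0;
}
```

        Skip the check and `arr[10]` on a ten-element array compiles without complaint; whether it crashes or silently hands you that `5363947` depends entirely on what happens to be in the adjacent memory.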

        All that is without even touching on issues like multithreaded concurrency, cross-platform compatibility, or the horrors that lie within build scripts.

        C is a good language with a rich history – that I’ll not dispute. As a testament to this, it’s one of two programming languages Linus Torvalds allows to be used to write the upstream Linux kernel, the other being Rust. (He fairly famously does not accept code written in C++.) It’s also great if you want to get a deeper understanding of how computers really work, and what goes on behind the scenes when you use one of the higher level languages like JavaScript (and perhaps how you can make your programs faster and better optimized). Definitely worth learning. What it is absolutely, unequivocally not is a viable replacement for the tools that have come after it that are orders of magnitude easier to build good software with, and while it’s worth knowing, it should not be the only language you know.

        Sorry if this was a bit rambly or unclear in places – I’m writing on my phone :P Feel free to ask questions if something I said didn’t make sense or you want to know more! :)

      • lad@programming.dev
        5 months ago

        Well, yeah, it is everywhere, but I wouldn’t say that it’s a good language to learn in 2024

    • qjkxbmwvz@startrek.website
      5 months ago

      “Please, please, ladies — one at a time. No seriously, one at a time, the global interpreter lock can’t handle more than that.”

      Only joking. Sorta…