  • litchralee@sh.itjust.works to Programmer Humor@lemmy.ml · Finally
    I once had the displeasure of facing a Bash script that was 35 lines tall but over 800 columns wide. The bulk of it was a two-dimensional array – or rather, a behemoth that behaved like an array of arrays – with way, way too many fields.

    If that wasn’t bad enough, my change to essentially rotate the table 90 degrees was rejected in code review because – and I kid you not – the diff was unreviewable in any of our tools and thus deemed too risky to merge. /facepalm

    The gall of some people.
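
    For the curious, here’s a rough sketch of what “rotating the table” buys you – transposing it so each record gets its own short line, which line-based review tools can actually diff. (Python rather than Bash, with made-up data:)

        # Hypothetical data: the original script stored one field type per row,
        # making every line hundreds of columns wide.
        wide = [
            ["host1", "host2", "host3"],             # row of hostnames
            ["10.0.0.1", "10.0.0.2", "10.0.0.3"],    # row of addresses
            ["up", "down", "up"],                    # row of statuses
        ]

        # Rotating 90 degrees: zip(*wide) transposes rows and columns, so each
        # record ("host1", "10.0.0.1", "up") becomes one short, reviewable line.
        for record in zip(*wide):
            print("\t".join(record))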


  • I guess your nephew can start studying to become a network engineer now lol

    In all seriousness, a 16-port managed switch exposes enough complexity to develop a detailed understanding of Ethernet and Layer 2 concepts, without having to commit to learning illogical CLI commands just to achieve basic functionality. 16 ports is also enough to wire up a non-trivial network, with ports to spare for exercising loop detection/protection or STP, all without consuming a lot of electricity.

    I would pair that switch with a copy of The All-New Switch Book, 2nd Edition to go over the networking theory. Yes, the book is a bit dated, but networking fundamentals have not changed that much in 15 years. Plus, it can be found cheap, or on the high seas. It’s certainly not something to read cover-to-cover, since you can skip anything about ATM networks.

    Then again, I think students might just simulate switch behaviors and topologies in something like GNS3, so no hardware needed at all.


    I was once working on an embedded system which did not have segmented/paged memory, and I had to debug an issue where memory corruption preceded an uncommanded reboot. The root cause was a for-loop gone amok, intended to walk a linked list for every member of an array of somewhat-large structs. The terminating condition was faulty, so the loop would write a garbage byte or two every few hundred bytes, run right off the end of the 32-bit address space, and wrap around to the start of memory.

    But because the loop only overwrote a few bytes and then skipped over large swaths of memory, it would keep passing through the entire address space over and over. And since the struct size wasn’t a power of two, each pass landed on different addresses, so eventually the garbage bytes would overwrite the crucial reset vector, which would finally reboot the system and end the misery.

    Because the system wasn’t fatally wounded immediately, the memory corruption remained observable right up until the system went down, spreading at a rate limited only by the CPU’s memory bandwidth. That made it truly bizarre to diagnose, as the corruption wasn’t confined to any one feature and changed every time.

    Fun times lol
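
    To make that failure mode concrete, here’s a scaled-down toy model – my reconstruction in Python, not the original embedded code – using a byte array to stand in for a flat, unsegmented address space:

        # Toy model of the runaway loop: it drops two garbage bytes every few
        # hundred bytes, wrapping around modulo the size of memory.
        MEM_SIZE = 1 << 16      # scaled-down address space for the demo
        STRIDE = 301            # struct-ish stride; not a power of two, so
                                # repeated sweeps eventually visit every address
        RESET_VECTOR = 0x0004   # hypothetical location that reboots the system
                                # when clobbered

        memory = bytearray(MEM_SIZE)
        addr = 0
        while memory[RESET_VECTOR] == 0:            # ends only by self-destruction
            memory[addr % MEM_SIZE] = 0xAB          # a garbage byte...
            memory[(addr + 1) % MEM_SIZE] = 0xCD    # ...and its neighbor
            addr += STRIDE

        print(f"reset vector hit after {addr // MEM_SIZE} full passes through memory")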


    I know this is c/programmerhumor, but I’ll take a stab at the question. If I may broaden it to cover software engineers, programmers, and (from the mainframe era) operators collectively – though I’ll still use “programmers” for brevity – then we can find examples of all sorts of other roles being taken over by computers or subsumed into a different worker’s job description. So it shouldn’t really be surprising that the job of programmer would also be partially offloaded.

    The classic example of computer-induced obsolescence is the job of typist, where a large organization would employ staff to operate typewriters, converting hand-written memos into typed documents. Helped along by the availability of word processors – no, not the software, but the standalone appliance – and then the personal computer, the expectation shifted to knowledge workers typing their own documents.

    If we look at some of the earliest analog computers, built to compute differential equations for things like weather and flow analysis, a small team of people was needed to operate the machine and interpret the results for the research staff. Nowadays, researchers are expected to crunch their own numbers, possibly aided by a statistician or data analyst, but they’re still working in R or Python themselves, as opposed to relying on a dedicated person or team to set up the analysis program.

    In that sense, the job of setting up tasks to run on a computer – that is, the old definition of “programming” the machine – has moved to the users. But alleviating the burden on programmers isn’t always going to be viewed as obsolescence. Otherwise, we’d say that tab-complete is making human-typing obsolete lol


  • I’m not any type of lawyer, especially not a copyright lawyer, though I’ve been informed that the point of having the copyright date is to mark when the work (book, website, photo, etc) was produced and when last edited. Both aspects are important, since the original date is when the copyright clock starts counting, and having it further in the past is useful to prove infringement that occurs later.

    Likewise, each update to the work confers a new copyright on just the updated parts, which starts its own clock and is again useful for prosecuting infringement.

    As a result, updating the copyright date is not an exercise in writing today’s year. Rather, it’s adding years to a list, compressing runs as needed, but never removing any years. For example, if a work was created in 2012 and updated in 2013, 2015, 2016, 2017, and 2022, the copyright date could look like:

    © 2012, 2013, 2015-2017, 2022
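
    A quick sketch of that bookkeeping in Python (my own illustration, nothing authoritative): append years, never remove them, and collapse runs of three or more consecutive years for display.

        def copyright_line(years):
            """Format a list of copyright years like the example above."""
            years = sorted(set(years))
            parts = []
            start = prev = years[0]

            def flush():
                if prev - start >= 2:   # runs of 3+ years collapse to a range
                    parts.append(f"{start}-{prev}")
                else:                   # shorter runs stay as individual years
                    parts.extend(str(y) for y in range(start, prev + 1))

            for y in years[1:]:
                if y == prev + 1:
                    prev = y
                else:
                    flush()
                    start = prev = y
            flush()
            return "© " + ", ".join(parts)

        print(copyright_line([2012, 2013, 2015, 2016, 2017, 2022]))
        # -> © 2012, 2013, 2015-2017, 2022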

    To be clear, I’m not terribly concerned with whether large, institutional copyright holders can effectively litigate their IP holdings. Rather, this is advice for small producers of works, like freelancers or folks hosting their own blogs. In the age of AI, copyright abuse against small players is rampant, and a copyright date that always shows the current year is ammunition for an AI company’s lawyer to argue that they didn’t plagiarize your work, because your work bears a date later than when they trained their models.

    Not that the copyright date is wholly dispositive, but it makes clear from the get-go when a work came under copyright protection.