25+ yr Java/JS dev
Linux novice - running Ubuntu (no Windows/Mac)

  • 0 Posts
  • 72 Comments
Cake day: October 14th, 2024


  • I was going to tell one of those “two white guys and a black guy” jokes from my youth here, with immigration in place of St. Peter administering one final test, but that’s a lot of fucking work for a joke that’s been in circulation for 50 years. So I went to look it up to copy/paste and found out the joke is based on the actual literacy tests Jim Crow laws used to keep Black people from voting.

    Anyway, the punchline is “Spell chrysanthemum.” I’m sure you can work backward to figure out the rest of the joke and how it relates.

  • Food for thought:

    1. Blocking people makes the place more pleasant for you, which makes you a more pleasant person to interact with, and the people you respond to will tend to reflect that.
    2. Blocking people denies them interaction even on their less acerbic comments, making the place less fun for people inclined to stir things up.

    Since becoming a parent, I’ve learned that sometimes self-care is caring for others, because it helps keep us from being ground down to our worst selves.

    That’s not to say more can’t be done, but don’t discount the effectiveness of starving negativity of air.

  • No. I’ve never really been in a class where someone else had a deeper understanding of the material to cheat off of. Equal sometimes, sure, but equally likely to be wrong.

    I did reach a point in math where I couldn’t go further and took that as a sign. Math is math. If I can’t do it on a test, cheating would just put me in a situation where I’m expected to do things I can’t, most likely in the next class.

  • LLMs “know” how to do these things, but when you ask them to do the thing, they vibe instead of looking at best practices and following them. I’ve worked with a few humans I could say the same thing about. I wouldn’t put any of them in charge of production code.

    You’re better off asking how a thing should be done and then doing it yourself. You can literally have an LLM write something, then ask whether what it wrote follows industry best practices, and it will tell you no. Maybe use two different chats so it doesn’t know the code is its own output.