  • While I don’t fully share the other commenter’s notion and tone, I gotta say LLMs have absolutely tanked education and science, as noted by many and as I’ve witnessed firsthand.

    I’m a young scientist working toward a PhD, and I assist with a microbiology course for undergraduates.

    The amount of AI slop in student assignments is astounding, and worst of all - they don’t see it themselves. When I check their actual knowledge, the results are devastating.

    And it’s not just undergrads - many scientific articles now show signs of AI slop too, which messes with research to a concerning degree.

    Personally, I tried using more specialized tools like Perplexity in Research mode to look for sources, but it royally messed up the citations - it pulled actual info from scientific articles, but then referenced entirely different articles that bear no relation to it.

    So, in my experience, LLMs can be useful for generating simple text or helping you tie known facts together. But as a learning tool…be careful, or rather just don’t use them for that. Classical education exists for a good reason: you learn to find factually correct and relevant information, analyze it, and keep it in your head for future reference. It takes more time, but it’s ultimately well worth it.

  • People have gotten so deep into their allegiance games that they can’t comprehend anyone standing for the truth.

    Fuck the .ml China fappers, and fuck the .world Russia-guilty-of-everything fans. You’re both equally terrible in enabling atrocities.

    As I said, some cases are confirmed, some are wild speculation. And the latter commonly get reused in later arguments as confirmations, despite being mere speculative assumptions.

    You can conjure a barrage of “something-bad” confirmations like these out of thin air - it’s a common propaganda tactic.