ChatGPT generates cancer treatment plans that are full of errors — Study finds that ChatGPT provided false information when asked to design cancer treatment plans.

Researchers at Brigham and Women’s Hospital found that cancer treatment plans generated by OpenAI’s chatbot were full of errors.

  • dx1@lemmy.world · 1 year ago

    There’s a difference between “can build” and “have built”. The basic idea is to continuously aggregate data and perform pattern analysis — essentially cognitive schema assimilation/accommodation, the same way humans do it. It’s absolutely doable, at least I think so.
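    A toy sketch of what the comment describes (my own illustration, not anything from the thread or the study): Piaget-style assimilation/accommodation over a data stream, where each “schema” is just a running mean and the distance threshold is an arbitrary choice.

    ```python
    # Toy model of schema assimilation/accommodation (illustrative only):
    # a new observation is assimilated into the nearest existing schema
    # if it fits within a threshold; otherwise accommodation creates a
    # new schema. Real systems would use far richer representations.

    class Schema:
        def __init__(self, first_obs):
            self.center = first_obs  # running mean of absorbed observations
            self.count = 1

        def distance(self, obs):
            return abs(obs - self.center)

        def assimilate(self, obs):
            # Fold the observation into the schema's running mean.
            self.count += 1
            self.center += (obs - self.center) / self.count

    def learn(observations, threshold=2.0):
        schemas = []
        for obs in observations:
            nearest = min(schemas, key=lambda s: s.distance(obs), default=None)
            if nearest is not None and nearest.distance(obs) <= threshold:
                nearest.assimilate(obs)      # assimilation: fits existing schema
            else:
                schemas.append(Schema(obs))  # accommodation: new schema
        return schemas

    stream = [1.0, 1.5, 9.0, 1.2, 9.4, 25.0]
    schemas = learn(stream)
    print([round(s.center, 2) for s in schemas])  # → [1.23, 9.2, 25.0]
    ```

    The point of the sketch is only the control flow: the model keeps updating existing structures until something no longer fits, then restructures — the continuous aggregation the comment is gesturing at.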

    • fsmacolyte@lemmy.world · 1 year ago

      I haven’t heard of cognitive schema assimilation before — that sounds interesting. It seems like it might fall prey to the same challenges symbolic AI has faced in the past, though.