• belated_frog_pants@beehaw.org · 4 months ago

      No, it's fancy autocomplete at a huge scale. Sometimes it returns correct answers.

      A search engine should take a list of websites and metadata about those websites and return results based on some ranking, the original desire being to get you what you wanted. (The current desire is just to extract as much money as possible from your hands on the keys.)
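
      The flow described above can be sketched roughly like this: build an inverted index from page text, then rank matches by how many query words each page contains. The URLs, page text, and the word-count scoring are all made up for illustration; real engines use far more signals.

      ```python
      from collections import defaultdict

      # Hypothetical pages standing in for a crawled corpus.
      pages = {
          "https://example.org/ducks": "ducks like water and bread crumbs",
          "https://example.org/bread": "how to bake bread at home",
          "https://example.org/water": "water quality reports for your city",
      }

      # Inverted index: word -> set of URLs containing it.
      index = defaultdict(set)
      for url, text in pages.items():
          for word in text.split():
              index[word].add(url)

      def search(query):
          # Score each page by how many query words it contains,
          # then return URLs sorted best-first.
          scores = defaultdict(int)
          for word in query.split():
              for url in index.get(word, set()):
                  scores[url] += 1
          return sorted(scores, key=scores.get, reverse=True)

      print(search("bread and water"))
      ```

      The point of the "original desire" framing is that the ranking function is supposed to optimize for relevance to the query, nothing else.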

    • gerryflap@feddit.nl · 4 months ago

      No. ChatGPT pulls information out of its ass, and as I read it, SearchGPT actually links to sources (while also summarizing them and pulling information out of its ass, presumably). ChatGPT “knows” things; SearchGPT should actually look stuff up and present it to you.

      • kosmoz@lemm.ee · 4 months ago

        Kagi has supported this for a while. You can end your query with a question mark to request a “quick answer” generated by an LLM, complete with sources and citations. It’s surprisingly accurate and useful!

        • gerryflap@feddit.nl · 4 months ago

          From the training dataset that was frozen years ago. It’s like knowing something instead of looking it up. It doesn’t provide sources; it just makes shit up based on what was in the (old) dataset. That’s totally different from looking up the information based on what you know and then using the new information to create an informed answer backed up by sources.
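
          The distinction being made is roughly retrieval-backed answering versus answering from memory. A minimal sketch, with `retrieve` and `summarize` as made-up stand-ins (not any real API): first fetch fresh documents matching the query, then build the answer from those documents and cite them, rather than generating from frozen training data alone.

          ```python
          def retrieve(query, corpus):
              # Pretend web search: return (url, text) pairs sharing a word
              # with the query. A real engine would rank and crawl live pages.
              words = set(query.lower().split())
              return [(url, text) for url, text in corpus.items()
                      if words & set(text.lower().split())]

          def summarize(query, hits):
              # Stand-in for the LLM step: ground the answer in retrieved
              # text and attach the URLs it came from as citations.
              if not hits:
                  return "No sources found.", []
              answer = hits[0][1]
              sources = [url for url, _ in hits]
              return answer, sources

          # Hypothetical "live web" corpus, newer than any training snapshot.
          corpus = {
              "https://news.example/launch": "the rocket launch is scheduled for friday",
              "https://blog.example/cats": "cats sleep most of every day",
          }

          answer, sources = summarize("when is the launch",
                                      retrieve("when is the launch", corpus))
          print(answer)   # grounded in a retrieved page, not model memory
          print(sources)  # every claim comes with a checkable source
          ```

          The "old dataset" failure mode is exactly what the retrieval step avoids: the answer can only come from documents fetched now, and each one is citable.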