• Clent@lemmy.world · 13 points · 6 months ago

    The model actually required for general-purpose capability likely lies beyond the petabyte range of memory.

    These models are using gigabytes, and the trend indicates the growth is exponential. A couple more gigabytes isn’t going to cut it. Adding layers cannot expand predictive capability without also increasing the error. I’m sure a proof of that will be along within the next few years.

  • azi@mander.xyz · 6 points · 6 months ago

    There’s plenty of stuff where ML algorithms are the state of the art. For example, the raw signal from nanopore DNA sequencing machines is extremely noisy, and ML basecallers clean it up with much less error than the Markov-chain methods used in previous years.

  • Buglefingers@lemmy.world · 2 points · 6 months ago

    A lot of new tech is no more efficient, or only equally efficient, at the get-go. Learning how to properly implement and utilize it is part of the process.

    Right now we are just throwing raw computing power, in ML form, at the problem. As soon as it catches on and shows a little promise in an area, we can focus and refine. Sometimes you need to use the shotgun to see the rabbits, ya know?

    • rando895@lemmygrad.ml · 2 points · 6 months ago

      Physicists abhor a black box. So long as it is an option, most will choose not to use AI to any great extent, and will chastise those who do.