• Clent@lemmy.world
    6 months ago

    The model actually required for general-purpose intelligence likely needs memory in the petabyte range or beyond.

    These models are using gigabytes, and the trend indicates it's exponential. A couple more gigabytes isn't going to cut it. Adding layers cannot expand the predictive capabilities without also increasing the error. I'm sure a proof of that will be along within the next few years.
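    A back-of-envelope sketch of the gap being described: under an assumed exponential growth rate (here a hypothetical 10x per model generation, starting from a hypothetical 100 GB model; neither figure is from the comment), gigabyte-scale memory reaches petabyte scale in only a handful of generations:

    ```python
    def generations_to_petabyte(start_gb: float, growth: float = 10.0) -> int:
        """Count growth steps until memory crosses 1 PB (1e6 GB, decimal units)."""
        gens = 0
        gb = start_gb
        while gb < 1e6:
            gb *= growth
            gens += 1
        return gens

    # Assumed starting point: a 100 GB model growing 10x per generation.
    print(generations_to_petabyte(100))  # → 4
    ```

    The point either way: if the growth needed is exponential, "a couple more gigabytes" is not a meaningful step toward it.
    
    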