• april@lemmy.world
    4 months ago

    It’s not exactly the same tech, but it’s very similar. The Transformer architecture made a big difference. Before that we mostly had LSTMs, which model sequences in a different way: they compress the whole history into a fixed-size hidden state, so tokens far back in the sequence influence the result less. A Transformer’s attention lets every position look at every earlier position directly.
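
    To make the contrast concrete, here’s a toy sketch (my own illustration, not code from any actual model): the recurrent update squeezes all history through one hidden state that fades each step, while self-attention computes a direct weighted sum over every position.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T, d = 6, 4                 # sequence length, model width
    x = rng.normal(size=(T, d))

    # LSTM-style view (simplified recurrence): information from step 0
    # must survive T state updates, so it decays on the way.
    h = np.zeros(d)
    for t in range(T):
        h = np.tanh(0.5 * h + 0.5 * x[t])   # toy update, old inputs fade

    # Attention view: every position gets a direct weighted mix of ALL
    # positions, no matter how far back they are.
    scores = x @ x.T / np.sqrt(d)                   # (T, T) similarities
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)   # softmax over sequence
    out = weights @ x                               # each row sees everything

    print(weights[-1].round(3))   # last token's attention over all 6 positions
    ```

    Even in this toy version you can see the point: the attention weights for the last token are spread over the entire sequence, whereas the recurrent path only ever sees the previous hidden state.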