LongMem is Microsoft’s answer to unlimited context length

  • The framework makes it possible to have unlimited context length.
  • An unlimited context length means a more personalized conversation with AI models.
  • It's basically the next step toward AI becoming near-human.
Is LongMem the Microsoft answer to unlimited context length?

Well, to understand what it means to have an unlimited context length, we first need to understand what context length means.

Context length refers to the number of tokens (words, symbols, and so on) a model can handle across its input and its output, including everything you type.
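
To put a rough number on it, here is a small Python sketch. It assumes OpenAI's tiktoken tokenizer is installed, and the 4,096-token limit is only an illustrative figure, not any particular model's real limit:

    import tiktoken

    # Illustrative context limit; real models each have their own (4k, 8k, 32k tokens, and so on).
    CONTEXT_LIMIT = 4096

    # cl100k_base is one of tiktoken's built-in encodings.
    encoding = tiktoken.get_encoding("cl100k_base")

    conversation = (
        "User: How do I change the default browser in Windows?\n"
        "Assistant: Open Settings > Apps > Default apps, then pick the browser you want.\n"
    )

    tokens_used = len(encoding.encode(conversation))
    print(f"{tokens_used} of {CONTEXT_LIMIT} tokens used")
    print(f"{CONTEXT_LIMIT - tokens_used} tokens left before older context starts to fall away")

Every new message spends part of that budget, which is exactly what the next example runs into.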

For example, ChatGPT permits only a limited number of tokens, which means its context length is limited as well. As soon as a conversation passes that limit, all the interaction you have had with it up to that point loses its significance; put another way, the conversation resets.

So if you start a conversation with ChatGPT on the subject of Windows, and that conversation lasts longer than the context length allows, the AI tool will lose the context and either start to deviate from the subject or reset the whole conversation.

An unlimited context length would make sure that doesn't happen: the AI model would keep answering you on the subject while learning and adapting to new information as you talk to it about Windows.

That means the AI model will also personalize the conversation according to your input, hence the need for the context to have an unlimited length.
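
To see why a fixed limit makes a long chat forget its beginning, here is a deliberately simplified sketch. It treats one word as one token and uses a tiny budget so the effect shows up immediately; dropping the oldest turns like this is just one common way a chat tool can handle an overflowing context, not necessarily how ChatGPT does it:

    # Simplified: one word counts as one token, and the budget is tiny on purpose.
    CONTEXT_BUDGET = 30

    def fit_to_context(turns, budget=CONTEXT_BUDGET):
        """Keep only the most recent turns that still fit inside the budget."""
        kept, used = [], 0
        for turn in reversed(turns):       # walk from newest to oldest
            cost = len(turn.split())
            if used + cost > budget:
                break                      # everything older than this point is forgotten
            kept.append(turn)
            used += cost
        return list(reversed(kept))

    conversation = [
        "User: I want to talk about Windows settings.",
        "Assistant: Sure, which setting do you want to change?",
        "User: The default browser, and also the lock screen picture.",
        "Assistant: Let's start with the browser: open Settings > Apps > Default apps.",
        "User: Great, now what about the lock screen?",
    ]

    for turn in fit_to_context(conversation):
        print(turn)
    # The earliest turns no longer fit, so from the model's point of view they never happened.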

How does LongMem work?

Microsoft promises just that with its new research on the LongMem framework. LongMem would enable large language models to memorize long-term context and utilize long-term memory at a reduced computational cost.

The framework consists of a frozen large language model as the memory encoder, a residual side network as the memory retriever and reader, and a cached memory bank that stores key-value pairs from past contexts.
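
As a very rough sketch of just the memory-bank-and-retrieval part (this is not Microsoft's code; the dimensions, bank size, and cosine-similarity scoring below are made up for illustration), the idea is to keep key-value pairs from past contexts around and pull back the most relevant ones for the current query:

    import numpy as np

    rng = np.random.default_rng(0)
    DIM = 64           # made-up embedding size
    BANK_SIZE = 1000   # made-up number of cached past-context entries

    # Cached memory bank: keys index past contexts, values carry their content.
    keys = rng.standard_normal((BANK_SIZE, DIM))
    values = rng.standard_normal((BANK_SIZE, DIM))

    def retrieve(query, k=4):
        """Return the k cached values whose keys are most similar to the query."""
        q = query / np.linalg.norm(query)
        key_norm = keys / np.linalg.norm(keys, axis=1, keepdims=True)
        scores = key_norm @ q                  # cosine similarity against every key
        top = np.argsort(scores)[-k:][::-1]    # indices of the k best matches
        return values[top], scores[top]

    # The retriever side would call something like this with the current context's
    # query vector; the reader side would then fuse the returned values back into
    # the frozen model's computation.
    query = rng.standard_normal(DIM)
    retrieved_values, similarities = retrieve(query)
    print(retrieved_values.shape, similarities)

In the real framework the cached keys and values come from the frozen model acting as the memory encoder rather than from random numbers, which is why the memory can grow without retraining the base model.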

In the research done by Microsoft, experiments show that LongMem outperforms baselines on long-text language modeling, long-context understanding, and memory-augmented in-context learning tasks. On top of that, its long-term memory allows it to utilize more demonstration examples for better learning.

And the good news is that LongMem will be open source, so you will be able to study it and learn how to implement the framework in your own AI model. You can check out its GitHub repository here.
