Recently we have seen a number of chatbots (ChatGPT, Bard, etc.). These are built on LLMs (large language models), which are trained on huge datasets. In this post I will discuss the methods these models follow to give personalized and accurate responses, broken into three steps:
- Information gathering: collects the user's search history, topics discussed, past queries, etc.
- Information clustering: groups related concepts, facts, and ideas into clusters
- Personalized responses: searches the information clusters to find the most relevant and helpful information for each query
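The three steps above can be sketched in plain Python. This is only an illustrative toy, not how any real chatbot works internally: the function names and the word-overlap scoring are my own assumptions.

```python
# Toy sketch of the gather -> cluster -> respond flow described above.
# All names and the token-overlap scoring rule are illustrative assumptions.

def gather(history):
    """Step 1: collect the user's past queries as lowercase token sets."""
    return [set(q.lower().split()) for q in history]

def cluster(items):
    """Step 2: group queries that share at least one token (naive clustering)."""
    clusters = []
    for tokens in items:
        for c in clusters:
            if c & tokens:        # any shared word -> fold into this cluster
                c |= tokens
                break
        else:
            clusters.append(set(tokens))
    return clusters

def respond(clusters, query):
    """Step 3: pick the cluster with the largest overlap with the new query."""
    q = set(query.lower().split())
    best = max(clusters, key=lambda c: len(c & q), default=set())
    return sorted(best)

history = ["python list sort", "sort a dict in python", "bake sourdough bread"]
clusters = cluster(gather(history))
print(respond(clusters, "how to sort tuples in python"))
```

A real system would of course use learned embeddings rather than word overlap, but the shape of the pipeline is the same.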
The information clustering process is a knowledge acquisition and adaptation process. It shares some similarities with online learning, but there are key differences.
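To make the "adaptation" part concrete, here is a minimal sketch of incremental, online-style cluster updates: each new user query either folds into an existing topic cluster or starts a new one. The token-overlap rule is an illustrative assumption, not a description of any real LLM's method.

```python
# Hypothetical incremental clustering: clusters adapt as each query arrives.

def update_clusters(clusters, query):
    tokens = set(query.lower().split())
    for c in clusters:
        if c & tokens:          # shares a word with an existing topic
            c |= tokens         # adapt the cluster in place
            return clusters
    clusters.append(tokens)     # otherwise a new topic appears
    return clusters

clusters = []
for q in ["train model", "model evaluation metrics", "plant garden"]:
    update_clusters(clusters, q)
print(len(clusters))
```

Unlike classic online learning, which updates model weights, the adaptation here happens in the knowledge store itself.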
Knowledge acquisition (KA) vs subset selection (SS):
- KA is the broader concept: it encompasses various methods and processes of gathering and accumulating new knowledge or information, including learning from experiences, interactions, readings, and other sources. SS is a narrower concept: it refers to the specific act of choosing a smaller group of items from a larger set based on certain criteria.
- KA focuses on understanding: it aims to comprehend the meaning, context, and relationships within the acquired information. SS focuses on choosing: it emphasizes the process of selection and filtering rather than understanding the chosen items.
- KA can involve actively seeking knowledge or passively absorbing it through experience, and it is typically ongoing. SS is a one-off act of choosing and does not necessarily imply ongoing learning or accumulation of knowledge.
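The contrast can be shown in a few lines of Python. This is a hypothetical sketch: the relevance scores, the threshold, and the knowledge-store shape are all assumptions made for illustration.

```python
# Illustrative contrast between subset selection and knowledge acquisition.

documents = [
    ("neural nets", 0.9),
    ("gardening tips", 0.1),
    ("transformer models", 0.8),
]

# Subset selection: pick items above a threshold -- no understanding involved.
def select_subset(docs, threshold=0.5):
    return [title for title, score in docs if score >= threshold]

# Knowledge acquisition: select, then *integrate* -- here, link each new
# fact to what is already known so relationships are preserved.
def acquire(knowledge, docs):
    for title in select_subset(docs):
        related = [k for k in knowledge if set(k.split()) & set(title.split())]
        knowledge[title] = related   # store the fact plus its connections
    return knowledge

kb = acquire({"neural architectures": []}, documents)
print(kb)
```

Note that `select_subset` only filters, while `acquire` also records how each new item relates to existing knowledge, which is the distinction the bullets above draw.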
Overlap and Differences:
- Subset selection can be part of knowledge acquisition: When you're acquiring knowledge from a large dataset, you might choose to focus on a specific subset that's relevant to your needs.
- Knowledge acquisition goes beyond subset selection: It involves not only selecting information but also interpreting, integrating, and building upon it.
- Subset selection doesn't always involve knowledge acquisition: You might select a subset of data for purely analytical purposes without aiming to gain new knowledge.
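This last point can be illustrated with a small sketch: the same selected subset can feed a purely analytical step (nothing retained) or a knowledge-building step. The sensor names and the 0.8 threshold are invented for the example.

```python
import statistics

# The same subset, used two ways.
readings = [("cpu", 0.91), ("cpu", 0.87), ("disk", 0.40), ("cpu", 0.95)]

# Subset selection alone: filter, compute a statistic, and move on.
cpu_only = [v for name, v in readings if name == "cpu"]
mean_load = statistics.mean(cpu_only)          # analysis, nothing retained

# Subset selection inside knowledge acquisition: the result is interpreted
# and stored so later reasoning can build on it.
knowledge = {"cpu_load_high": mean_load > 0.8}
print(mean_load, knowledge)
```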