The system collects users' preferences during the dialogue process and uses them for subsequent content recommendations. After launch, user activity and content click-through rates rose sharply. Judging by the data alone, the project looks like an unqualified success. But the data does not tell the whole story, because the algorithm only cares about probabilities, not about the user as a person. Intelligent dialogue and recommendation algorithms do help users find content that better matches their stated needs, but they say nothing about what the actual experience of that content will be.
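As a minimal sketch of what "only paying attention to the probability" means in practice, the ranking step of such a recommender often reduces to sorting candidates by a predicted click probability. The class and field names below are hypothetical, not the platform's actual code:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    item_id: str
    predicted_ctr: float  # model's estimated click probability

def rank_by_click_probability(candidates: list[Candidate], k: int = 10) -> list[Candidate]:
    """Rank purely by predicted click probability.

    Nothing in this objective captures whether the user will actually
    benefit from the content after clicking on it.
    """
    return sorted(candidates, key=lambda c: c.predicted_ctr, reverse=True)[:k]

if __name__ == "__main__":
    pool = [
        Candidate("course_a", predicted_ctr=0.31),
        Candidate("course_b", predicted_ctr=0.22),
        Candidate("course_c", predicted_ctr=0.08),
    ]
    for c in rank_by_click_probability(pool, k=2):
        print(c.item_id, c.predicted_ctr)
```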
I once recommended a set of video courses to a user. He watched more than half of the videos but was never really engaged, and after a week of forcing himself to keep going, he finally gave up. Measured by click-through rate, completion rate, and 7-day retention, this looks like a good case. In reality, the user has already churned and has lost his confidence in the platform. He may well tell the people around him that the courses on this platform are poor or useless.
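To make those metrics concrete, here is a rough sketch, with hypothetical log fields, of how click-through rate, completion rate, and 7-day retention might be computed from a user's event log. A user like the one described can look healthy on all three and still churn in week two:

```python
from datetime import date

# Hypothetical per-user event log: (event_type, date, payload)
events = [
    ("impression", date(2021, 3, 1), {"item": "course_1"}),
    ("click",      date(2021, 3, 1), {"item": "course_1"}),
    ("watch",      date(2021, 3, 2), {"item": "course_1", "watched_ratio": 0.6}),
    ("visit",      date(2021, 3, 8), {}),  # came back exactly 7 days later
]
signup_day = date(2021, 3, 1)

impressions = sum(1 for e in events if e[0] == "impression")
clicks = sum(1 for e in events if e[0] == "click")
ctr = clicks / impressions if impressions else 0.0

watch_ratios = [e[2]["watched_ratio"] for e in events if e[0] == "watch"]
completion_rate = sum(watch_ratios) / len(watch_ratios) if watch_ratios else 0.0

# 7-day retention here: did the user return 7 days after signing up?
retained_7d = any(e[0] == "visit" and (e[1] - signup_day).days == 7 for e in events)

print(f"CTR={ctr:.2f}, completion={completion_rate:.2f}, 7d retention={retained_7d}")
# All three numbers look acceptable, yet the user in the story still quit.
```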
Negative word of mouth can be fatal for online education, so how should this problem be solved? Many people's first reaction is: when something goes wrong, send out a questionnaire. But questionnaires are not a panacea. Recall the "prior probability" problem discussed in our previous article: if you distribute unpaid questionnaires, the response rate will be very low, the respondents are likely to be skewed toward the platform's heavy users, and the confidence you can place in the results drops accordingly.
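As a hedged illustration of that sampling problem, the toy simulation below, with made-up segment sizes and response rates, shows how an unpaid questionnaire whose respondents skew toward heavy users can report far higher satisfaction than the true population average:

```python
import random

random.seed(0)

# Hypothetical population: 20% heavy users (mostly satisfied),
# 80% light users (much less satisfied).
users = (
    [{"segment": "heavy", "satisfied": random.random() < 0.85} for _ in range(2_000)]
    + [{"segment": "light", "satisfied": random.random() < 0.45} for _ in range(8_000)]
)

# Made-up response rates for an unpaid questionnaire:
# heavy users answer far more often than light users do.
RESPONSE_RATE = {"heavy": 0.30, "light": 0.03}
respondents = [u for u in users if random.random() < RESPONSE_RATE[u["segment"]]]

true_satisfaction = sum(u["satisfied"] for u in users) / len(users)
survey_satisfaction = sum(u["satisfied"] for u in respondents) / len(respondents)

print(f"respondents: {len(respondents)} of {len(users)}")
print(f"true satisfaction:     {true_satisfaction:.2%}")
print(f"surveyed satisfaction: {survey_satisfaction:.2%}")
# The small, heavy-user-skewed sample overstates how happy users really are.
```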