If a video comes up on your feed with subtle-to-obvious clickbait, features a talking head, and was posted within the last few days or hours, there's a good chance it's AI generated.
I find this phenomenon interesting, like other past trends in content generation. People in general don't like to be fooled. So if you believe a video was created by real people and later discover it wasn't, you feel a bit foolish and maybe a bit violated. Think back to the first time you saw a picture that was unusual or stunning, thought the photographer did an admirable job capturing something so good, and then found out it was photoshopped. AI has taken that a step further, so an editor no longer needs much of a creative process to enhance images.
I'm noticing significant backlash against AI-generated content, though, and that's a good thing. YouTube is supposed to note when a video has been enhanced with AI. And I believe impersonation isn't allowed under their content policy, so you can report a video you think impersonates someone else.
They say AI will be the death of the internet as information becomes more and more generalized. But I think another factor is the concern you posted about, which will result in distrust of newer internet content. To be fair, AI-generated content can be trusted most of the time, until it can't because it's hallucinating. Anyone who implicitly trusts it is setting themselves up for failure.