>>14417819
polynomial time != "fast". We can talk about asymptotic bounds, but in practice "polynomial" vs. "exponential" doesn't tell you much on its own. Consider two algorithms: one O(n^10000000000), one O(1.001^n). Which would you pick? We've been getting along just fine without P=NP, it's not *really* as big a deal as the general population makes it out to be. And if P!=NP, our progress keeps going anyway.
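To make that concrete, here's a quick sketch (python, nothing assumed beyond the two constants above) comparing the runtimes on a log10 scale so the numbers stay printable. The "exponential" algorithm is the faster of the two until n gets somewhere past 10^14:

import math

k, c = 1e10, 1.001  # O(n^k) "polynomial" vs O(c^n) "exponential"

for n in (10, 10**6, 10**9, 10**15):
    poly = k * math.log10(n)  # log10(n^k)
    expo = n * math.log10(c)  # log10(c^n)
    winner = "c^n wins" if expo < poly else "n^k wins"
    print(f"n=10^{round(math.log10(n)):<2} n^k ~ 10^{poly:.3g}, c^n ~ 10^{expo:.3g} -> {winner}")

For every input size up to roughly 3*10^14 the polynomial algorithm loses, which is the whole point: the constants in the exponent matter more than the complexity class for real inputs.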
>>14418614
>only work with HUGE amounts of training data
negative. The famous models are often trained on huge datasets, but few-shot learning and training efficiency have exploded as a major research focus. In fact, that's the major contribution of CLIP: in the OG paper it takes a fraction of the data to match the few-shot prediction quality of prior models.
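A minimal sketch of what few-shot use of CLIP looks like in practice (a generic linear probe on frozen features, not the exact protocol from the paper; load_my_images is a hypothetical loader standing in for your own data):

import torch, clip  # pip install git+https://github.com/openai/CLIP.git
from sklearn.linear_model import LogisticRegression

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# hypothetical loader: a handful of labeled PIL images, say 16 per class
images, labels = load_my_images()

with torch.no_grad():
    feats = torch.cat([model.encode_image(preprocess(im).unsqueeze(0).to(device))
                       for im in images]).float().cpu().numpy()

# the entire "training run" is one logistic regression on frozen features
probe = LogisticRegression(max_iter=1000).fit(feats, labels)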
>>14418868
Same comment here. "You need big datasets" is a bit out of date in the ML world. We still use them because bigger = better, but they're really only required for SOTA. In my line of work I often train models on 50-1000 samples, and they perform well.
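If you want to see the small-sample claim for yourself, here's a runnable toy version (sklearn's built-in digits dataset, deliberately cut down to 100 training examples; obviously a far easier task than anything SOTA-adjacent):

from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
# keep only 100 labeled samples (10 per class) for training
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, train_size=100, stratify=y, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(f"test accuracy from 100 training samples: {clf.score(X_te, y_te):.2f}")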
reinforcement learning has advanced by leaps and bounds, and it's still on the up and up.
>A lot of companies were banking on self driving cars being available now and they're in pretty big trouble because they still haven't taken off on a large scale.
lol no one was banking on self-driving cars existing RIGHT NOW RIGHT NOW, and no one is in "big trouble". I have friends working in the self-driving industry, and everything is progressing about as they expected. Everyone knows it will take a few decades for the tech to become normalized and integrated into society.