Jeff Brandt, Editor of PinHawk's Law Technology Digest Newsletter, discusses Brian Wang's article arguing that AI models are undertrained and that we need to improve the training of AI models for the future of AI.
Training artificial intelligences is a touchy subject. On the one hand, you have the vendors sucking up every possible post, document and photo on the internet to train their AIs. On the other hand, you've got news organizations, artists and authors crying foul and suing the vendors to remove their works from the training data. On the third hand (if you are an Edosian), you've got Brian Wang writing that our AI models are severely undertrained, both in terms of data size and in terms of training iterations. He compares it to learning a new language: "If you only study for 10 minutes a day, it will take you much longer to become fluent than if you studied for 10 hours a day." With a three-handed, well-trained hat tip to Stephen Abram, read more at nextBIGfuture: AI Models Are Undertrained by 100-1000 Times - AI Will Be Better With More Training Resources
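For readers who want a feel for what "undertrained by 100-1000 times" might mean in practice, here is a minimal sketch, not taken from Wang's article, that assumes the Chinchilla-style rule of thumb of roughly 20 training tokens per model parameter for a compute-optimal run. The model size, token count, and target multiple below are all hypothetical numbers chosen for illustration.

```python
# Illustrative sketch of an "undertraining" calculation.
# Assumption: a compute-optimal model sees ~20 training tokens per parameter
# (Chinchilla-style heuristic). Numbers here are hypothetical, not from Wang.

TOKENS_PER_PARAM_OPTIMAL = 20  # assumed rule of thumb, not a measured figure

def undertraining_factor(params: float, tokens_trained: float,
                         target_multiple: float = 100.0) -> float:
    """How many times more tokens the model would need to hit
    `target_multiple` x the compute-optimal token budget."""
    optimal_tokens = params * TOKENS_PER_PARAM_OPTIMAL
    return (optimal_tokens * target_multiple) / tokens_trained

# Hypothetical example: a 70B-parameter model trained on 2 trillion tokens.
params = 70e9
tokens = 2e12
factor = undertraining_factor(params, tokens)
print(f"Reaching a 100x token budget would take {factor:.0f}x the current data.")
```

Run as written, the hypothetical 70B model would need about 70 times its current 2T-token diet to reach a 100x compute-optimal budget, which gives some intuition for the scale of training resources Wang is talking about.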
Published July 5, 2024.