The article critiques the common AI talking point that all exponentials eventually become sigmoids, arguing that while any individual technology plateaus, new breakthroughs spawn new sigmoids, so overall AI progress need not level off permanently.
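A minimal sketch of the article's core argument, under assumed parameters (the breakthrough spacing, ceilings, and function names here are hypothetical, chosen only for illustration): each technology follows a saturating logistic S-curve, yet the sum of successive S-curves with rising ceilings keeps tracking an exponential envelope.

```python
import math

def logistic(t, ceiling, midpoint, rate=1.0):
    """One technology's S-curve: near-exponential early, saturating at `ceiling`."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Hypothetical assumption: each breakthrough arrives 5 time units after the
# last and has a 10x higher ceiling, so the sum of individually plateauing
# sigmoids continues to grow roughly 10x per 5 units instead of leveling off.
def stacked_sigmoids(t, n_breakthroughs=5):
    return sum(logistic(t, ceiling=10**k, midpoint=5 * k) for k in range(n_breakthroughs))

for t in range(0, 25, 5):
    print(f"t={t:2d}  capability ~ {stacked_sigmoids(t):10.1f}")
```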
OpenAI releases an analysis demonstrating that the compute used in the largest AI training runs has grown exponentially with a 3.4-month doubling time since 2012, a 300,000x increase that vastly outpaces Moore's Law. The analysis suggests the trend will likely continue and calls for increased academic AI research funding to cope with rising computational costs.
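A quick sanity check of the headline arithmetic (the constants and variable names below are mine, not OpenAI's): a 3.4-month doubling time compounds to the reported ~300,000x in roughly five years, versus only a single-digit multiple for a Moore's Law-style 24-month doubling over the same span.

```python
import math

DOUBLING_MONTHS = 3.4  # doubling time reported in OpenAI's analysis

def growth_factor(months, doubling_months):
    """Total compute multiplier after `months` of steady exponential doubling."""
    return 2 ** (months / doubling_months)

doublings_needed = math.log2(300_000)               # ~18.2 doublings for 300,000x
months_needed = doublings_needed * DOUBLING_MONTHS  # ~62 months, ~5.2 years
print(f"{doublings_needed:.1f} doublings -> {months_needed:.0f} months "
      f"(~{months_needed / 12:.1f} years)")

# Contrast with a 24-month (Moore's Law-style) doubling over the same period:
print(f"24-month doubling over {months_needed:.0f} months: "
      f"~{growth_factor(months_needed, 24):.0f}x")
print(f"3.4-month doubling over {months_needed:.0f} months: "
      f"~{growth_factor(months_needed, DOUBLING_MONTHS):,.0f}x")
```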