I'll be honest, this guy sounds just as full of himself as the people he claims are full of themselves.

Take the claim that "there's no physical law that puts a cap on intelligence at the level of human beings." That could very well be false, considering conservation of energy. At the very least, physics can put a reasonable practical cap on intelligence, one low enough to prevent a singularity and to let humanity compete even with a smarter brain.

Secondly, the idea that ideas are created with pure processing power and can be "made inside of a box" seems odd to me. You need random variation and a wide range of data and observation to produce new ideas and technology. We, humanity, are a reproducing system that feeds information around itself as it spreads over a large range of space, so as to make a large and diverse range of observations. An AI would, at the very least, have to ride on our "backs" before it managed to supersede our collective inventive ability. In fact, a huge reason AI is booming right now is that data collection and storage boomed before it. Data, not necessarily computing speed, is what intelligence requires, and I'd bet humanity is bottlenecked not by our processing power but by our observational power and our observation-filtering power.

Imagine we wanted to optimize for speed and built a bike, then assumed these bikes would drive antelopes extinct. They wouldn't, because bikes can't eat, can't reproduce, don't survive very long, and don't deal well with sand. That's the supercomputer argument: the idea that a thing optimized for speed will beat out a thing optimized to survive, reproduce, and be fit in its environment.