The tips in your book have helped me become more productive! Before your article, I was wasting so much time figuring out what to do and how to spend my time on my projects. Thank you for your instructions on how to be organized and stay on schedule. I will be recommending your article often!



In the vein of this analysis, would shifts in the relative size of each quadrant add information? Is it necessarily indicative of future performance if, say, the A quadrant (informed + hyped folks) is larger by quantity than in previous technology releases or hype cycles? Or might those fluctuations be a quirk of the tech itself or of the culture?


"So one should expect the cost of this training (GPT-3 training budget is ~10-15 million USD) will go down significantly within the next year, 2)"

The cost of training and testing a deep learning model depends on how deep the model is, which determines how much the processor must compute, and on how many parameters it has, which translates into how much memory (RAM) is needed. It is NAIVE to think that companies like Google and Nvidia, who have spent billions to develop and manufacture hardware and infrastructure like GPUs, TPUs, and quantum computers, will stop milking these cash cows.
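To make the parameters-to-memory point concrete, here is a rough back-of-the-envelope sketch (my own illustration, not from the article; the byte counts and multipliers are simplifying assumptions):

```python
def param_memory_gb(num_params: float, bytes_per_param: int = 4) -> float:
    """Approximate memory needed just to store the weights (fp32 = 4 bytes each)."""
    return num_params * bytes_per_param / 1e9

# GPT-3 has ~175 billion parameters; in fp32 the weights alone would take:
weights_gb = param_memory_gb(175e9)  # 700.0 GB
# Training needs several times more (gradients plus optimizer state), so as a
# very rough lower bound, assuming ~4x the weight memory:
training_gb = weights_gb * 4  # 2800.0 GB
print(weights_gb, training_gb)
```

Even this crude estimate shows why a model at that scale cannot fit on a single accelerator and must be sharded across many GPUs or TPUs, which is exactly the hardware demand described above.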

Since you like to quote PEOPLE so much, I'll offer a quote from SOMEONE as well.

Ian Goodfellow

"But overall, massive amounts of computation will continue to be key for AI; whenever we make one model use less computation we’ll just want to run thousands of models in parallel to learn-to-learn them."

