About this episode
With the world constantly generating more data, unlocking the full potential of AI means a constant need for faster and more resilient hardware.

In this episode – the second in our three-part series – we explore the challenges for founders trying to build AI companies. We dive into the delta between supply and demand, whether to own or rent, where moats can be found, and even where open source comes into play.

Look out for the rest of our series, where we dive into the terminology and technology that form the backbone of AI, and how much compute truly costs!

Topics Covered:
00:00 – Supply and demand
02:44 – Competition for AI hardware
04:32 – Who gets access to the available supply
06:16 – How to select which hardware to use
08:39 – Cloud versus bringing infrastructure in house
12:43 – What role does open source play?
15:47 – Cheaper and decentralized compute
19:04 – Rebuilding the stack
20:29 – Upcoming episodes on the cost of compute

Resources:
Find Guido on LinkedIn: https://www.linkedin.com/in/appenz/
Find Guido on Twitter: https://twitter.com/appenz

Stay Updated:
Find a16z on Twitter: https://twitter.com/a16z
Find a16z on LinkedIn: https://www.linkedin.com/company/a16z
Subscribe on your favorite podcast app: https://a16z.simplecast.com/
Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.