Flying Squid to Technology@lemmy.world · English · 2 years ago
Microsoft Needs So Much Power to Train AI That It's Considering Small Nuclear Reactors (futurism.com)
380 comments · 1.15K upvotes
@frezik@midwest.social · English · 1 point · 2 years ago
Often, these models are a feedback loop. The input from one search query is itself training data that affects the result of the next query.
@FooBarrington@lemmy.world · English · 1 point · 2 years ago
Sure, but that’s not done with the kind of model this thread is about (separate training and inference). You’re talking about classical ML models with continuous updates, which you wouldn’t run on this kind of GPU infrastructure.
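The "feedback loop" model the commenters contrast with large-batch GPU training can be sketched as an online learner: each incoming example updates the model immediately, so training and inference are interleaved rather than run as separate phases. This is a minimal illustration only (a perceptron-style update on synthetic data; all names and the toy labeling rule are invented for the sketch), not a claim about how any search engine actually does it:

```python
import numpy as np

def predict(w, x):
    """Classify a query vector with the current weights."""
    return 1 if x @ w > 0 else 0

rng = np.random.default_rng(0)
w = np.zeros(4)   # model starts untrained
lr = 0.1          # learning rate

# Online loop: every "query" and its observed outcome immediately
# update the weights -- no separate training run on GPU clusters.
for _ in range(200):
    x = rng.normal(size=4)        # a new query arrives
    y = 1 if x[0] > 0 else 0      # toy ground-truth label (invented rule)
    y_hat = predict(w, x)
    w += lr * (y - y_hat) * x     # perceptron-style incremental update

# The same, continuously updated weights serve the next queries.
acc = np.mean([predict(w, x) == (1 if x[0] > 0 else 0)
               for x in rng.normal(size=(100, 4))])
print(round(float(acc), 2))
```

The contrast with the LLM setup discussed in the thread: here every prediction step and learning step share one lightweight model, whereas separate-training-and-inference systems do one massive offline training pass and then only run forward passes at query time.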