
Do you know how much GPU time training the models that run Waymo costs compared to Gemini? I'm genuinely curious; my assumption would be that Google has devoted at least as much GPU time in its datacenters to training Waymo models as to Gemini models. But if Waymo's models are significantly more efficient to train (or to run inference on), that's very interesting.


My note was specifically about operating them. For training the models, it certainly can help.



