#modal

@charles_irl: Inference isn't everything, but it does require a new stack -- not Kubernetes, not SLURM. At @modal, we dove deep to bu…

X AI KOLs Following · yesterday

Modal engineers detail their approach to achieving truly serverless GPUs for AI inference, combining cloud buffers, a custom content-addressed filesystem, and CPU/GPU checkpoint/restore to scale replicas in tens of seconds instead of minutes.
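Modal's custom filesystem is far more involved than this, but the core idea of content addressing — keying stored data by a hash of its contents so identical blobs (e.g. shared container image layers) are stored and fetched only once — can be sketched as follows. The `ContentAddressedStore` class here is a hypothetical illustration, not Modal's actual implementation:

```python
import hashlib

class ContentAddressedStore:
    """Minimal in-memory content-addressed store: blobs are keyed by the
    SHA-256 of their contents, so identical data is stored only once."""

    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        # Storing the same content twice is a no-op: dedup falls
        # out of the content-derived key.
        self._blobs.setdefault(key, data)
        return key

    def get(self, key: str) -> bytes:
        return self._blobs[key]

store = ContentAddressedStore()
k1 = store.put(b"layer: python base image")
k2 = store.put(b"layer: python base image")  # duplicate layer, same key
assert k1 == k2 and len(store._blobs) == 1
```

Because the key is derived from the data itself, two replicas that need the same image layer can share one cached copy, which is part of why content addressing helps cold starts.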


@modal: On May 30th, we're partnering with @OpenAIDevs and @AntlerGlobal to host an Autoresearch Systems Hackathon to tackle pr…

X AI KOLs Following · yesterday

Modal announces a partnership with OpenAI Devs and Antler Global to host an Autoresearch Systems Hackathon on May 30th, targeting data- and compute-intensive challenges.
