Nowadays, no cloud platform is complete without GPU support; there's simply no other way to run modern high-performance computing and machine learning workloads. These offerings usually focus on training machine learning models, but today Google is launching support for the Nvidia P4 accelerator, which is designed specifically for inference, helping developers run their existing models faster.
Beyond these machine learning workloads, Google Cloud users can also use the GPUs for remote display applications that need a fast graphics card. For this, the P4 supports Nvidia GRID, the company's system for making server-side graphics more responsive for users who log in to remote desktops.
Because the P4 comes with 8GB of GDDR5 memory and can handle up to 22 tera-operations per second for integer workloads, these cards can handle pretty much anything you throw at them. And since buying one will set you back at least $2,200, if not more, renting them by the hour may not be the worst idea.
On Google Cloud, the P4 costs $0.60 per hour with standard pricing and $0.21 per hour if you're comfortable running a preemptible GPU. That's significantly lower than Google's prices for the P100 and V100 GPUs, though those cards target different use cases.
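For a rough sense of the rent-vs-buy trade-off, a back-of-the-envelope sketch using only the figures above (and ignoring the server, power, and maintenance costs that owning the card would add):

```python
# Rough break-even estimate: how many hours of cloud rental cost as much
# as buying a P4 outright. Prices are the figures quoted in the article.
CARD_PRICE = 2200.00       # approximate retail price of a P4, USD
STANDARD_RATE = 0.60       # Google Cloud standard price, USD per hour
PREEMPTIBLE_RATE = 0.21    # Google Cloud preemptible price, USD per hour

def break_even_hours(hourly_rate: float) -> float:
    """Hours of rental that add up to the card's purchase price."""
    return CARD_PRICE / hourly_rate

for label, rate in [("standard", STANDARD_RATE),
                    ("preemptible", PREEMPTIBLE_RATE)]:
    hours = break_even_hours(rate)
    print(f"{label}: ~{hours:,.0f} hours (~{hours / 24:.0f} days of continuous use)")
```

In other words, at standard pricing you would need to run the GPU continuously for roughly five months before buying the card becomes cheaper, which is why hourly rental makes sense for bursty inference workloads.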