These days, no cloud platform is complete without GPU support; there is simply no other way to run modern high-performance computing and machine learning workloads. Frequently, the main focus of these offerings is on building machine learning models, but today, Google is launching support for the Nvidia P4 accelerator, which focuses specifically on inferencing to help developers run their existing models faster.
Apart from these machine learning workloads, Google Cloud users can also use the GPUs for running remote display applications that need a fast graphics card. To do this, the P4s support Nvidia Grid, the company's system for making server-side graphics more responsive for users who log in to remote desktops.
Because the P4s come with 8GB of GDDR5 memory and can handle up to 22 tera-operations per second for integer workloads, these cards can handle pretty much anything you throw at them. And because buying one will set you back at least $2,200, if not more, renting them by the hour may not be the worst idea.
On Google Cloud, the P4 will cost $0.60 per hour with standard pricing and $0.21 per hour if you're comfortable running a preemptible GPU. That's significantly lower than Google's prices for the P100 and V100 GPUs, though those cards target different use cases, too.
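To put those hourly rates in perspective, here is a minimal back-of-the-envelope sketch of what a month of continuous P4 usage would cost at each tier. The prices come from the article; the 730-hour month is a common cloud-billing approximation, and actual Google Cloud billing (sustained-use discounts, per-second billing, preemption interruptions) may differ.

```python
# Rough monthly-cost sketch for one P4 GPU, using the prices quoted above.
# Assumption: flat per-GPU-hour billing and a 730-hour month.
STANDARD_PER_HOUR = 0.60      # on-demand P4 price from the article
PREEMPTIBLE_PER_HOUR = 0.21   # preemptible P4 price from the article
HOURS_PER_MONTH = 730         # common cloud-billing approximation

def monthly_cost(rate_per_hour: float, hours: float = HOURS_PER_MONTH) -> float:
    """Cost of running one GPU continuously for the given number of hours."""
    return round(rate_per_hour * hours, 2)

standard = monthly_cost(STANDARD_PER_HOUR)        # 438.0
preemptible = monthly_cost(PREEMPTIBLE_PER_HOUR)  # 153.3
savings = round(standard - preemptible, 2)        # 284.7
```

At roughly $438 versus $153 per month, the preemptible tier saves about 65 percent, provided your inference workload can tolerate being interrupted.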