The Cloud Offers High-Performance Scientific Computing

    Cloud Computing | Cloud Storage - Posted on 11/09/2017 by Hind Bouzidi (Outscale)

    High-performance computing (HPC) is an indispensable tool of modern scientific research. Applying HPC to scientific data has yielded insights that would have been beyond researchers' grasp just a few years ago. Genome sequencing, bioinformatics, protein folding analysis, drug discovery, particle physics simulations and the analysis of astronomical data are just a few of the areas to which HPC has made important contributions.

    High-performance computing has traditionally been enormously expensive, requiring large clusters of power-hungry servers and armies of IT techs. That’s fine if you’re Harvard, but less so if you’re a grad student with limited funding or a scientific startup founder with capital limitations.

    The cloud is the perfect fit for scientific researchers who need HPC but haven’t got a spare HPC cluster lying around.

    Lower Capital Investment

    This is the headline feature of cloud computing. The cloud provides infrastructure as a service; users pay for the infrastructure — storage, processing, bandwidth — that they use, when they use it and no more.

    That’s great for researchers who need to test a hypothesis without blowing their funding.

    But CAPEX reduction isn’t the only or even the most significant benefit of cloud computing for scientific HPC applications.

    Reduced Infrastructure Investment

    The Square Kilometre Array is a network of 2,000 radio dishes distributed between South Africa and Australia. When it's fully operational, it will produce 1 exabyte of data each day, which needs to be filtered and analyzed. The cost of building a cluster sufficient to work with that volume of data is enormous, and it's not just the servers and the building to house them: electricity generation and the infrastructure to get electricity to the site would also be hugely expensive.
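
    To put that volume in perspective, a quick back-of-the-envelope calculation (using the decimal definition of an exabyte) shows the sustained throughput that one exabyte per day implies:

        # Back-of-the-envelope check: what sustained data rate does
        # 1 exabyte per day correspond to?
        EXABYTE = 10**18          # bytes, decimal definition
        SECONDS_PER_DAY = 86_400

        rate = EXABYTE / SECONDS_PER_DAY      # bytes per second
        print(f"{rate / 1e12:.1f} TB/s sustained")   # prints: 11.6 TB/s sustained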

    The cloud offers a simpler alternative: huge amounts of compute and storage, without having to build a multi-million-dollar support infrastructure.

    Massive Parallelization

    Many scientific computing applications depend on parallelization: running many small, independent tasks across multiple cluster nodes. Scaling is what the cloud does best. Spinning up hundreds or even thousands of virtual machines in the cloud takes a fraction of the time required to do the same with physical hardware, and you only pay for those servers while they're running.
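
    As a minimal illustration of that fan-out pattern, the Python sketch below uses only the standard library to spread many small, independent tasks across a pool of workers. On a cloud cluster the workers would be virtual machines rather than local processes, but the structure is the same; analyze_chunk is a placeholder for a real analysis step.

        # Fan-out of many small, independent tasks across a pool of workers.
        # Illustrative only: analyze_chunk stands in for a real analysis step.
        from concurrent.futures import ProcessPoolExecutor

        def analyze_chunk(chunk_id):
            # Placeholder for one small unit of scientific work.
            return chunk_id, sum(i * i for i in range(100_000))

        if __name__ == "__main__":
            chunk_ids = range(1_000)                 # many small tasks
            with ProcessPoolExecutor() as pool:      # size the pool as needed
                results = dict(pool.map(analyze_chunk, chunk_ids))
            print(f"processed {len(results)} chunks")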

    Faster Time to Deploy

    The faster data can be analyzed and hypotheses tested, the faster research can progress. Startup culture calls this rapid iteration, and it's one of the great benefits of the cloud. In legacy computing environments, it can take weeks to procure and deploy physical hardware from university IT departments. With cloud deployments, researchers can manage their own deployments from a web interface. If they need to, they can have a server running in less than a minute.
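
    By way of illustration, here is a sketch of what launching a compute instance from code (rather than from a procurement queue) can look like. It uses the AWS boto3 SDK purely as a familiar example of a cloud API; the region, image ID and instance type are placeholders, and the exact calls differ from one provider to another.

        # Illustrative sketch: provisioning a compute instance programmatically.
        # boto3 is used here only as a familiar example of a cloud SDK; the
        # region, image ID and instance type below are placeholders.
        import boto3

        ec2 = boto3.client("ec2", region_name="eu-west-1")

        response = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",   # placeholder machine image
            InstanceType="c5.large",           # placeholder instance type
            MinCount=1,
            MaxCount=1,
        )

        print("launched", response["Instances"][0]["InstanceId"])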

    The cloud provides high-performance computing for scientific applications that is faster, cheaper and more flexible than legacy IT systems.

    Image: Flickr/jurvetson

    Author: Hind Bouzidi (Outscale)

    As a communications specialist, Hind makes technical topics simple, particularly those involving new technologies.

    https://en.outscale.com