Development and Cloud Are Made for Each Other
Today, development teams are under pressure to deliver solutions on much shorter schedules while their budgets continue to shrink. The growing adoption of agile methodologies lets teams write code faster; however, fast, iterative agile practices have also created challenges in the downstream processes of building, testing, and deploying that software. Not only must the development team run these processes many times a day, it must also provision resources, lots of them and very quickly, to support those tasks.
Enter the cloud. The scalability and elasticity of cloud computing surmount many of the problems that have traditionally afflicted software development and its insatiable need for compute resources. There are three primary reasons that the software development process is tailor-made for the cloud:
- Highly variable workloads: Because each software development project likely will be in its own lifecycle state, collectively these projects will impose an ever-changing set of resource demands.
- Instant access: Even though the overall workload will vary, one thing is certain—any required resources must be available and correctly provisioned immediately.
- Need for large resources over brief periods: In general, the build-test-deploy process uses a large pool of resources (for example, for build and test tasks) for a relatively short time.
While deciding to move software development to the cloud is easy, actually making the move needs to be well thought out. Otherwise, development and IT (whether the internal team that provides resources or an external public cloud vendor) will end up frustrated, and the enterprise will not receive full value from its cloud investment.
Here are four things to keep in mind as you move toward cloud-driven development.
1. Virtualization Alone Does Not Make a Cloud
Virtualization enables the cloud, but access to virtual machines alone isn't enough. Most cloud implementations rest on two cornerstone technologies: virtualization and user self-service. Virtualization dramatically improves utilization of the underlying hardware and lets IT quickly provide standard resource templates (servers, applications, databases, and so on) to users. But it is self-service that delivers resources on demand: a user requests a new server and, voilà, a new virtual machine appears almost instantly. Users no longer wait hours or days for compute resources.
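The template-plus-self-service pattern can be sketched in a few lines. This is a minimal illustration, not any vendor's API: the template names, fields, and the `provision` function are all hypothetical stand-ins for what a real portal (backed by something like AWS EC2 or OpenStack Nova) would do.

```python
# Minimal sketch of a self-service provisioning portal.
# Templates, field names, and provision() are hypothetical.
import uuid

# Standard resource templates published by IT.
TEMPLATES = {
    "web-server": {"cpus": 2, "ram_gb": 4, "image": "ubuntu-22.04"},
    "build-agent": {"cpus": 8, "ram_gb": 16, "image": "ubuntu-22.04"},
}

def provision(template_name: str) -> dict:
    """Instantiate a virtual machine from a standard template on demand."""
    spec = TEMPLATES[template_name]
    return {"id": str(uuid.uuid4()), "state": "running", **spec}

# A user requests a build agent; the VM description comes back immediately.
vm = provision("build-agent")
print(vm["state"], vm["cpus"])
```

The key point is that the user never files a ticket: the request resolves against a pre-approved template, which is what makes instant fulfillment possible.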
2. Cloud Promotes Standardization, While Development Requires Customizations
The cloud is all about standardization. It enables IT to deliver and manage resources efficiently by standardizing the basic resource configurations as templates and making them available in a turnkey manner. This approach is attractive because it maximizes ROI on existing resources, reduces one-off procedures, and makes automated management possible (most public cloud vendors also provide standardized system and software templates). Furthermore, once the initial infrastructure templates have been established, IT can let authorized users create and use their own instances via self-service provisioning portals, reducing administrative overhead.
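Why standardization enables automated management is worth making concrete. Because every instance derives from a known template, IT can act on whole fleets at once instead of touching servers one by one. The sketch below is illustrative only; the instance records and `patch_fleet` function are invented for this example, not any vendor's schema.

```python
# Sketch: standardization makes automated management possible.
# Instance records and field names are hypothetical.
instances = [
    {"id": "vm-1", "template": "web-server", "os_patch": 3},
    {"id": "vm-2", "template": "build-agent", "os_patch": 3},
    {"id": "vm-3", "template": "web-server", "os_patch": 3},
]

def patch_fleet(instances, template, new_patch):
    """Apply an OS patch level to every instance of one standard template."""
    patched = []
    for inst in instances:
        if inst["template"] == template:
            inst = {**inst, "os_patch": new_patch}  # non-destructive update
            patched.append(inst["id"])
    return patched

# One operation covers every web server, because they share a template.
patched_ids = patch_fleet(instances, "web-server", 4)
```

With one-off, hand-built servers, this kind of fleet-wide operation is impossible; with templates, it is a loop.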
While the cloud provides standard IT compute resources, developers typically want to customize those resources to the requirements of the software production process. This may involve reconfiguring the standard IT-instantiated resources or deploying new development- or test-specific applications. Just as important, developers want these changes made automatically, without manual intervention.
Herein lies the dev-IT gap: what the cloud natively provides must be customized to meet the needs of development, and automation is the only scalable way to close that gap.
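One way to close the gap is to treat customization as a layer applied automatically on top of the IT standard, in the spirit of tools like cloud-init or configuration management. The sketch below is a simplified illustration: the template structure, package names, and `customize` function are assumptions made for this example.

```python
# Sketch: automatically layering dev/test customizations on an IT-standard
# template. Template fields and package names are hypothetical.
STANDARD_TEMPLATE = {"image": "ubuntu-22.04", "packages": ["sshd"]}

# Development-specific tooling the build-test-deploy process needs.
DEV_CUSTOMIZATION = ["jdk-17", "maven", "selenium"]

def customize(template: dict, extra_packages: list) -> dict:
    """Return a dev-ready spec: the IT standard plus development tooling."""
    custom = dict(template)  # leave the IT-owned template untouched
    custom["packages"] = template["packages"] + extra_packages
    return custom

dev_vm_spec = customize(STANDARD_TEMPLATE, DEV_CUSTOMIZATION)
```

Because the customization is code, it runs identically every time a resource is provisioned, which is exactly the no-manual-intervention behavior developers are asking for.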