For most teams today, the obvious place to develop and test apps is in the cloud, right alongside the production environment that will host each new release.

From a development perspective, building and testing in the cloud makes good sense. But from cost and security perspectives, it’s less than ideal. Before you commit to a dev/test environment hosted in the public cloud, it’s critical to consider the drawbacks.

Why cloud-based dev/test is so popular

It’s easy to understand why so many organizations run their dev/test environments in the public cloud. If they deploy production apps to the public cloud as well, developing in the same cloud means they can deploy each new release instantly, without repackaging it or doing much staging work before it moves into production.

They can also develop and test against exactly the same APIs their app will use once it is in production. This virtually eliminates the risk that a difference in API version or implementation will cause post-deployment problems that went undetected during testing. In other words, it gives them full environment parity.
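As a rough illustration, a team using the AWS SDK can keep its deployment code identical across environments and vary only the session configuration. Here is a minimal Python sketch with boto3; the profile names and bucket names are hypothetical, not part of any particular setup:

```python
import boto3

def upload_release(session: boto3.session.Session, bucket: str, artifact: str) -> None:
    """Upload a build artifact to S3 in whichever environment the session targets."""
    s3 = session.client("s3")
    s3.upload_file(artifact, bucket, "releases/app-latest.zip")

# The application code above never changes; only the session configuration does.
dev_session = boto3.session.Session(profile_name="devtest")      # hypothetical profile
prod_session = boto3.session.Session(profile_name="production")  # hypothetical profile

upload_release(dev_session, "example-devtest-artifacts", "build/app.zip")
# Once testing passes, the identical call runs against production:
# upload_release(prod_session, "example-prod-artifacts", "build/app.zip")
```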

The problem with cloud dev/test

In order to gain the advantages described above, however, development teams pay a steep price — both literally and figuratively.

Literally, they pay their cloud provider for the resources their dev/test environments consume. And although a simple dev/test setup may not consume many resources, the bigger financial risk of cloud-based dev/test is that developers tend to be messy. They may spin up resources such as databases or EC2 instances for testing, then forget to turn them off. Over time, that adds up to significant wasted spend.
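One common mitigation is a scheduled cleanup script that flags forgotten resources. Below is a rough sketch using boto3, assuming a tagging convention (an env tag set to dev or test) and an arbitrary three-day age threshold, neither of which is universal:

```python
from datetime import datetime, timedelta, timezone

import boto3

MAX_AGE = timedelta(days=3)  # assumed policy: flag dev/test instances older than three days

ec2 = boto3.client("ec2")

# Find running instances tagged as dev/test (the "env" tag is an assumed convention).
response = ec2.describe_instances(
    Filters=[
        {"Name": "tag:env", "Values": ["dev", "test"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)

now = datetime.now(timezone.utc)
for reservation in response["Reservations"]:
    for instance in reservation["Instances"]:
        age = now - instance["LaunchTime"]
        if age > MAX_AGE:
            print(f"{instance['InstanceId']} has been running for {age.days} days")
```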

Cloud dev/test environments also impose the figurative price of privacy risk. No matter how careful you are about keeping dev/test data and code secure (and, if we’re being honest, not all developers are that careful), anything hosted in the public cloud is at least indirectly exposed to the Internet. There will always be some risk of a breach or data leakage from cloud-based dev/test environments.

Getting the best of both worlds with AppScale

So, what’s preferable? Should developers do dev/test in the cloud and accept the cost and security risks that come with it? Or should they keep their work on-prem, where testing and deployment into the cloud are more complicated?

The best answer is not to make this choice at all and instead to choose a platform like AppScale. With AppScale, developers enjoy the best of both worlds: They can build a private dev/test environment on secure infrastructure, then deploy seamlessly into the public cloud once their release is ready.

AppScale makes this possible by allowing teams to build a private AWS region. An AppScale region gives workloads access to the core APIs and services of AWS, including EC2, S3, EBS and others. At the same time, it lets developers keep workloads on their own servers, which can be completely disconnected from the Internet if desired.
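Because a private region speaks the same APIs, the AWS SDKs and CLI can be pointed at it simply by overriding the service endpoint. A minimal boto3 sketch follows; the endpoint URL and region name are placeholders for illustration, not actual AppScale values:

```python
import boto3

# Placeholder endpoint for a private, AWS-compatible region; the real value
# depends on how the on-prem environment is configured.
PRIVATE_EC2_ENDPOINT = "https://ec2.cloud.internal.example.com"

# Dev/test client: the same EC2 API, served from on-prem infrastructure.
ec2_devtest = boto3.client(
    "ec2",
    endpoint_url=PRIVATE_EC2_ENDPOINT,
    region_name="private-region-1",  # placeholder region name
)

# Production client: the default public AWS endpoint, with no other code changes.
ec2_prod = boto3.client("ec2", region_name="us-east-1")

# The same call works against either client.
instances = ec2_devtest.describe_instances()
```

The application and test code stays unchanged; promoting a workload to the public cloud is largely a matter of dropping the endpoint override.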

The result is a dev/test environment that faithfully emulates the public cloud but doesn’t come with the security or cost challenges of running dev/test in the actual public cloud. Developers can build and test to their hearts’ content in a secure, private AWS region where fees aren’t based on resource consumption (AppScale bills at a flat monthly rate rather than metered usage), then move their apps into AWS itself as soon as testing is complete.

With AppScale, in other words, you don’t have to choose. Do dev/test on your own terms, without compromising on performance, security or cost.