How Companies Will Actually Use Docker

This article originally appeared on VentureBeat:
How companies will actually use Docker

Enterprises want Docker. It’s on many 2016 roadmaps and has become the tech darling of startups and financial services conglomerates alike, notwithstanding its extreme youth.

Despite common perceptions, enterprises don’t need to reach the promised land of a full “DevOps transformation” to start using Docker. They don’t need to have a microservices model or a fleet of full-stack engineers. In fact, Docker is a good fit for enterprises that are in the thick of a multi-year IT transformation and can actually help big teams implement DevOps best practices more quickly.

Hybrid cloud is the goal of nearly half of enterprises, most of which are in the process of some kind of DevOps toolchain adoption. Both are messy processes. Enterprises are hiring cloud consultants, consolidating data centers, breaking down barriers between engineering teams, and migrating new applications to Amazon Web Services (AWS) or other public clouds.

Mastering the hybrid cloud

Despite the supposed flexibility benefits of hybrid clouds, it is quite an engineering feat to manage security and scalability across multiple complex systems. The vast majority of an enterprise’s applications are burdened by internal dependencies, network complications, and huge on-premises database clusters. The idea of moving an application from one cloud to another “seamlessly” is laughable. For most enterprises, cloud bursting is a pipe dream.

This is where Docker fills a critical gap. The top reason enterprises are using Docker is to help them deploy across multiple systems, migrate applications, and remove manual reconfiguration work. Because application dependencies are built into containers, Docker containers significantly reduce interoperability concerns. Docker works equally well on bare metal servers, virtual machines, AWS instances, and so on.

As a result, applications that run well in test environments built on a public cloud instance will run exactly the same in production environments in on-premises private clouds. Applications that run on bare metal servers can also run in production on any public cloud platform.
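A minimal sketch shows why: the application's dependencies are baked into the image itself, so the image runs unchanged wherever it is pulled. The base image, packages, and app names below are illustrative, not from the article:

```dockerfile
# Illustrative only: a hypothetical Python web app.
FROM python:2.7

# Application dependencies are installed into the image at build time,
# not onto the host at deploy time.
RUN pip install flask gunicorn

COPY . /app
WORKDIR /app

# The same command runs identically on bare metal, a VM, or an AWS instance.
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "app:server"]
```

Because every dependency lives inside the image, promoting it from a public-cloud test environment to an on-premises cluster is a pull of the same artifact, not a reinstall and reconfiguration.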

Accelerating a DevOps culture

This is good news for enterprises looking to push a DevOps culture transition forward. The DevOps movement is really about moving faster and consuming fewer resources. Enabling developers to provision Docker containers, run tests against them, and deploy to production in minutes is cost-efficient and eliminates a developer’s worst enemy: manual system configuration work.
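That provision-test-deploy loop can be sketched with standard Docker CLI commands; the image tags and registry URL here are hypothetical, and the commands assume a running Docker daemon:

```shell
# Build an image from the project's Dockerfile.
docker build -t myapp:candidate .

# Run the test suite inside a throwaway container of that exact image.
docker run --rm myapp:candidate ./run_tests.sh

# If tests pass, tag and push the same artifact that will run in production.
docker tag myapp:candidate registry.example.com/myapp:release
docker push registry.example.com/myapp:release
```

The artifact that passed the tests is the same artifact that ships to production, which is what eliminates the manual system configuration step between environments.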

Docker is also a good fit for evolving enterprises because they are usually the most skittish about vendor lock-in. Container standardization makes it that much easier to move across clouds operated by multiple vendors.

The Docker team is also pushing to make the software enterprise-ready. After acquiring SocketPlane six months ago, Docker announced major upgrades in networking that allow containers to communicate across hosts. The team is working to complete a set of networking APIs that would make Docker networking enterprise-grade and guarantee application portability throughout the application life cycle.

Testing as security features mature

However, there are still some major hurdles to jump. Enterprises are rightly concerned about Docker security in hybrid environments. Containers may resemble virtualization, but they have vastly different implications for system segregation, log aggregation, and monitoring. Enterprise applications often have strict governance procedures that require extensive logging and monitoring. Quite simply, there is no mature orchestration tool that monitors security across multiple Docker clusters. Most monitoring tools on the market don’t have a view of transient instances in public clouds, let alone sub-virtual-machine entities like containers.

In the case of a security threat, Docker containers currently would require a lot of manual security patching. Docker allows you to make an update to your base image, but developers would have to manually ensure that base image is running in each container. Some form of image inheritance is necessary for Docker to be ready for a mission-critical enterprise application.
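For example, patching a vulnerable base image today looks roughly like the following; the image names are hypothetical, and every step after the pull is manual per image and per container:

```shell
# Pull the patched base image.
docker pull ubuntu:14.04

# The update does NOT propagate automatically: each downstream image
# must be explicitly rebuilt against the patched base...
docker build --pull -t myapp:patched .

# ...and each running container must be stopped and recreated by hand.
docker stop myapp && docker rm myapp
docker run -d --name myapp myapp:patched
```

Multiply this across hundreds of images and hosts and the operational burden of a single base-image patch becomes clear.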

For enterprises that require multitenancy to isolate multiple clients’ environments, Docker is simply not an option. All containers on a host share the same kernel and run in the same kernel space, which is not equivalent to separate VMs under a hypervisor. Enterprises with sophisticated backup tools may also find that Docker containers add an extra layer of complexity to shipping data on time and to the right places.
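The shared kernel is easy to demonstrate; this illustrative check (assuming a Docker host and the public `alpine` image) prints the same kernel version inside and outside the container:

```shell
# The host kernel and the "container kernel" are one and the same:
uname -r                          # host kernel version
docker run --rm alpine uname -r   # identical version from inside a container
```

Unlike a hypervisor, there is no second kernel boundary between tenants, so a kernel-level exploit in one container threatens every other container on that host.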

Docker is quite possibly the answer to enterprises’ challenges in hybrid cloud. But it is also a brand new technology without many of the orchestration or security monitoring tools that enterprises need to use Docker in production. Now is the time for enterprises to investigate Docker, try to get their app running in hybrid test environments, and get to know their pain points, but probably not the time to use Docker clusters in production.

Stephanie Tayengco leads cloud operations at Logicworks, a cloud strategy and management provider.

Logicworks is a leading provider of cloud automation and managed services for the enterprise. We offer a wide range of compliant and secure solutions to support high-availability infrastructure and disaster recovery services for some of the world’s most respected brands. Contact Us to learn how we can help with your DevOps transformation.
