Welcome back to our blog! Here we are with the updates from Day 2 @ DockerCon Europe.
The Day 2 general session was also presented by Ben Golub, Docker CEO, who started with a recap of the previous day's keynote, explaining the Power of AND. He showed, for example, that rapid innovation must be accompanied by quality, simplicity by scale, usability by security, and so on.
Then he went through some lessons learned on the path to production:
- custom scripts rarely scale
- developers do not adopt locked-down platforms
- end-to-end matters for both development and operations
- build management & orchestration enables portability
Ben then talked about the “Containers as a Service” platform. The related workflow is composed of build (Docker Toolbox), ship (Registry Service), and run (Control Plane - Tutum).
He especially highlighted that run is called out as a Control Plane, and that Tutum is becoming the run building block for Docker.
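To make the build-ship-run pipeline concrete, here is what the three stages look like in plain Docker CLI commands (the image name myorg/myapp is our own invention for illustration):

```
docker build -t myorg/myapp .         # build: produce an image from a Dockerfile
docker push myorg/myapp               # ship: publish the image to a registry
docker run -d -p 80:80 myorg/myapp    # run: start a container from that image
```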
He then underlined that 20% of the content pulled from Docker Hub consists of “official images”, and asked: “what about all the others?”. The answer was Project Nautilus, which was brought out to address the other 80%.
He then showed the output of a Project Nautilus scan on screen. The project's goals are:
- scale up security posture assessment
- proactively notify users of new vulnerabilities in existing code
- provide end users with visibility into the security posture of images.
Ben then started talking about Docker Automated Builds. One quarter of all repositories on Docker Hub are created through the automated build system, which is now processing over 60,000 automated builds per week; moreover, there has been 300% growth since January 2015. Docker announced a major upgrade to its Autobuild system, which features dynamic matching of git branches and tags, a faster and more secure build infrastructure, and user control over build parallelism.
Automated Builds 2.0 is a re-architecture of the system meant to address build time and quality issues.
The new build system executes as many builds in parallel as you have private repositories, and guarantees each build a clean environment.
Dynamic Matching is the other new feature: it allows for variable-based builds and gives the system more flexibility over time.
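We didn't note the exact rule syntax from the slides, so the following is only our sketch of what dynamic matching enables: one rule applied to every matching git tag, instead of a separate rule per tag (the regex, the derived tag scheme, and the myorg/myapp image name are all our illustration):

```
# Hypothetical Autobuild rule:
#   source:      git tags matching /^v([0-9.]+)$/
#   Docker tag:  release-<captured version>
#
# The equivalent, done by hand with git and the Docker CLI:
git tag v1.2.0 && git push --tags            # the new tag triggers the build
docker build -t myorg/myapp:release-1.2.0 .  # Autobuild derives this tag for you
docker push myorg/myapp:release-1.2.0
```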
Ben then introduced the next theme, the run phase, and called Borja Burgos, Tutum Cloud founder, and Matt Soldo, Senior Director of Product Management at Docker, on stage.
They started with an overview of Tutum Cloud, a platform for taking code to production rapidly. The upgrades to the Docker Hub Autobuild service, used together with Tutum, lead to an end-to-end Containers as a Service platform, all available in the cloud today.
Then it was demo time for Tutum: a really interesting demo meant to show the high availability and fault tolerance that Tutum Cloud provides for Docker applications.
You push to git, and the automated workflow takes care of everything else in the build, push, and deploy.
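The end-to-end flow they demoed looks roughly like this (the webapp service and myorg/webapp image are invented names, and the tutum CLI invocation is from memory of the Tutum docs, so treat it as approximate):

```
git push origin master          # 1. the push triggers a Docker Hub Autobuild
                                # 2. Docker Hub builds and pushes myorg/webapp:latest
tutum service redeploy webapp   # 3. Tutum rolls the service onto the new image
                                #    (Tutum can also do this automatically on push)
```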
Docker & Tutum allow to have:
- consistency between development and production
- easy of use for developers and system administrators
- ability to deploy on Tutum Cloud
- scalability and high availability
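For the last point, scaling a Tutum service up or down was a one-liner from the CLI; a minimal sketch, assuming a service named webapp and flags as we remember them from the Tutum CLI documentation:

```
# Run three containers of the same service, restarting them on failure:
tutum service run --target-num-containers 3 --autorestart ALWAYS \
  -p 80:80 --name webapp myorg/webapp:latest
tutum service scale webapp 5    # scale out to five containers when load grows
```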
Then it was the turn of the 3DExperience customer story.
The customer went on stage and talked about consistency between development and production, simplification of tooling for development and operations, the ability to deploy on their cloud, and the scalability and high availability gained by moving to Docker containers.
They then showed a video of a product called HomeByMe (an online 3D modeling tool for home improvement and planning) fully running on the new system; the system went from concept to production in less than one year.
After the customer story, Scott Johnston, SVP of Product at Docker, went on stage.
He started by asking for a show of hands from those who can’t put data in the cloud, or can’t put control planes in the cloud.
Then he talked about production in the cloud, affirming that it’s not for everyone, due to compliance and security constraints.
At this point he mentioned an Adrian Cockcroft quote: “Speed is the market share”.
He said that developers will always find a way to go fast, because that is their job, but it’s important to achieve agility and portability WITH control. And this starts at the application level: which images to trust? Who signed an image, and when? How to automate?
To support this, Docker Content Trust and Docker Trusted Registry are now integrated with each other.
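Docker Content Trust is already part of the standard Docker client (since Docker 1.8), and turning it on is a single environment variable; the image name below is, again, just an example:

```
export DOCKER_CONTENT_TRUST=1     # opt in to content trust for this shell session
docker pull myorg/myapp:latest    # the pull now fails unless the tag is signed
docker push myorg/myapp:latest    # the push signs the tag with your keys
```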
He then went in depth on the run aspect and the control plane, and made an announcement: the public beta availability of Docker Universal Control Plane.
Docker's goal is to enable both developers and operations teams to develop, deploy, and manage dockerized applications in production. This solution, along with Docker Trusted Registry, enables customers to build their own Containers as a Service platform to deliver secure and manageable content on a programmable infrastructure to developers, through a self-service experience.
Docker Universal Control Plane started as “Project Orca”, an on-premises solution for IT operations to have operational control without compromising developer agility and portability for applications running both in the cloud and on-premises.
Now it has become an integrated stack for application deployment that allows self-service app deploys & updates, provisioning & configuration of heterogeneous clusters, LDAP/AD integration with Docker Trusted Registry, native Docker APIs and CLI, and monitoring and logging. In this way it completes the Containers as a Service picture.
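“Native Docker APIs and CLI” means your existing docker client should talk to the cluster as if it were a single engine. A minimal sketch, assuming a UCP controller reachable at ucp.example.com (the hostname is invented; the environment variables are the standard Docker client ones):

```
export DOCKER_HOST=tcp://ucp.example.com:443   # point the CLI at the control plane
export DOCKER_TLS_VERIFY=1                     # authenticate with TLS client certs
export DOCKER_CERT_PATH=~/ucp-bundle           # certificates downloaded from UCP
docker info                                    # same commands, now cluster-wide
docker ps
```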
A demo of Docker Universal Control Plane then showed all of these features in practice.
After the morning keynote, we also want to mention the closing demo of the same day, the final day of DockerCon Europe. It was a very interesting demo by Adrien Duermael, Software Engineer @ Docker, and Gaetan de Villèle, Software Engineer @ Docker.
The topic was Dockercraft: running Minecraft (the sandbox game of the decade) as a visual interface to Docker, with an integrated Docker management system. You can now manage Docker containers from inside the Minecraft 3D block world; not just view container status, but exercise actual, real control, such as starting and stopping containers.
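The trick is that the Minecraft server itself runs in a container with the host's Docker socket mounted inside it, so the in-game plugin can list and control the other containers. Roughly, from memory of the project README (see github.com/docker/dockercraft for the exact commands):

```
git clone https://github.com/docker/dockercraft.git
docker build -t dockercraft dockercraft
# Mount the Docker socket so the in-game plugin can query and control containers;
# 25565 is Minecraft's default server port.
docker run -d -p 25565:25565 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  --name dockercraft dockercraft
```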
They then demonstrated that, using the Oculus Rift VR headset, it’s possible to navigate inside Minecraft in full virtual reality.
Dockercraft lets you manage your containers while playing Minecraft: it’s great, it’s real, and it’s open source.
Our blog post ends here. It was a great DockerCon! See you at the next one!