Puppet and Chef have both evolved significantly since we last covered them; suffice it to say, we're long overdue in revisiting these two heavy hitters. In this article we'll take a fresh look at their core components, along with the new integrations and expansions that continue to position them as leading enterprise IT automation platforms.
Some past differentiators, like each platform's approach to configuration (declarative versus procedural) and underlying programming language, have been discussed ad nauseam. And as both solutions continue to grow more powerful and complex, those differences are in fact less relevant. For the purposes of this comparison we'll instead focus on how well they solve the IT and continuous delivery challenges faced by today's enterprises.
A Chef for All Seasons
Merely labeling a tool as a DevOps solution does not make it so. It must address contemporary IT challenges in building and managing high-velocity organizations while facilitating constant improvement and collaboration between groups. Tools, as critical agents of change, are instrumental in both managing technology and shaping culture:
“The tools we use reinforce the behavior; the behavior reinforces the tool. Thus, if you want to change your behavior, change your tools.”
– Adam Jacob, CTO, Chef
Chef extends this notion further by using martial arts, specifically kung fu, as a metaphor for DevOps. Below is a breakdown of Chef's particular school of DevOps kung fu:
Indeed, many of the tenets highlighted above (e.g., collect metrics, integrate and deliver continuously, put applications and infrastructure through the same workflow) are manifest in Chef 12.
Features and Highlights
At its core, Chef is a tool for automation, provisioning, and configuration management. The platform is made up of the following components:
Chef Server - the main hub where Chef propagates and stores system configuration information and policies (i.e., recipes and cookbooks). The Chef management console is the web user interface for Chef Server.
Chef Client - installed on every node being managed, the Chef Client performs configuration tasks on the local machine.
Workstation - allows designated workstations to author/test/maintain cookbooks and upload them to Chef Server. Workstations are also used when utilizing the Chef development kit package.
Chef Analytics - a platform that provides actions and run history, real-time reporting, and notifications around Chef automation activities.
Chef Supermarket - an open source directory of community-contributed cookbooks.
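Recipes and cookbooks, the policies mentioned above, are written in Chef's Ruby-based DSL. As a rough illustration (the package, template, and service names below are examples, not from the article), a minimal recipe might look like this:

```ruby
# recipes/default.rb -- a minimal, illustrative Chef recipe

package 'nginx'                    # install the nginx package

template '/etc/nginx/nginx.conf' do
  source 'nginx.conf.erb'          # rendered from the cookbook's templates/ directory
  notifies :reload, 'service[nginx]'
end

service 'nginx' do
  action [:enable, :start]         # enable at boot and start now
end
```

A workstation would upload the containing cookbook to the Chef Server (e.g., with `knife cookbook upload`), after which each Chef Client converges its node to the declared state.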
Introducing Chef Delivery
Traditional Puppet vs. Chef comparisons usually depict the latter as being more developer-friendly, with favorites like Chef’s Knife Plugin Architecture and the Chef Developer Kit (Chef DK) relegated mostly to developer use. At this year’s ChefConf event, a new DevOps workflow product was introduced: Chef Delivery, a set of tools that add yet more developer-friendly features like comprehensive codebase change histories, metrics, and permissions management to the platform. To read more about our experiences this year at Chef’s annual conference, check out our ChefConf 2015 debriefing.
Chef Delivery’s automated testing and continuous integration/delivery tools augment the platform with new features such as a shared workflow pipeline, collaboration capabilities, and enhanced analytics—as well as new ecosystem integrations with AWS, Azure, and Docker, to name a few. Though these enhancements are no doubt a boon to Chef’s developer community, Chef’s aspirations arguably have little to do with becoming a developer-centric automation tool and more with building a comprehensive platform for DevOps pipeline management.
Improved Security with Chef Vault
Customer and/or community customizations quite often become so widespread and integral that they find their way into bona fide product releases. This is certainly the case with Chef Vault, a project started by Nordstrom to improve upon the platform's inherent security mechanisms. Chef can natively store sensitive data (e.g., SSL certificate keys, database passwords) in encrypted "data bags"—repositories of key/value pairs—for secure and easy access. Management of these data bags, however, is a tedious and error-prone process. Chef Vault provides an additional layer of security that enables easier management of these encrypted data bags.
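Under the hood, an encrypted data bag item is essentially a value encrypted with a shared secret. The Ruby sketch below illustrates that underlying idea only; it is not Chef's actual implementation (real items also carry a format version, the IV, and in later versions an HMAC, and Chef Vault adds per-node encryption of the shared secret so it need not be distributed by hand):

```ruby
require 'openssl'
require 'json'

# Illustrative sketch of the encrypted data bag idea:
# a JSON value is AES-256-CBC encrypted with a shared secret.

secret = OpenSSL::Random.random_bytes(32)   # stands in for the data bag secret file

cipher = OpenSSL::Cipher.new('aes-256-cbc')
cipher.encrypt
cipher.key = secret
iv = cipher.random_iv                        # a fresh IV per item
ciphertext = cipher.update({ 'password' => 's3cr3t' }.to_json) + cipher.final

# Any node holding the shared secret (and IV) can recover the value.
decipher = OpenSSL::Cipher.new('aes-256-cbc')
decipher.decrypt
decipher.key = secret
decipher.iv = iv
plain = JSON.parse(decipher.update(ciphertext) + decipher.final)
```

In practice these mechanics are handled for you by `knife` and the chef-vault tooling; the point is that the encryption is only as safe as the management of that shared secret, which is exactly the gap Chef Vault addresses.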
Chef also maintains an ongoing list of security notes that provide customers with remediation guidelines for addressing security shortcomings of the platform.
Master of Puppets
As mentioned previously, Puppet is considered a more operations- and sysadmin-oriented solution when compared to Chef, though again, these role-based distinctions are becoming less relevant with each release. DevOps practitioners, developers and operations staff alike, strive to achieve optimal conditions for continuous integration/delivery. Tooling is therefore increasingly evaluated on its ability to achieve these ends effectively and efficiently in the context of an enterprise's unique needs. Nevertheless, Puppet has enjoyed significant first-mover advantages over the years, and though Chef and Puppet have been neck-and-neck market leaders since the early days of IT automation, the latter boasts a longer commercial track record and larger install base.
Currently on version 4.3, Puppet is commonly deployed in a client/server configuration with managed nodes periodically synchronizing their configurations with the server. Reporting (e.g., results from automation runs, errors/exceptions) and other information is sent by the clients back to the server for aggregate analysis and processing. The following graphic is a basic representation of Puppet’s data flow:
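That periodic client/server synchronization is driven by a handful of agent settings in puppet.conf. A brief sketch (the server name and interval here are illustrative assumptions, not values from the article):

```ini
# /etc/puppetlabs/puppet/puppet.conf -- illustrative agent settings
[agent]
server      = puppet.example.com   ; the Puppet server to synchronize with
runinterval = 30m                  ; how often the agent checks in
report      = true                 ; send run reports back for aggregate analysis
```

Each run, the agent sends its facts to the server, receives a compiled catalog, enforces it locally, and reports the results back.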
Puppet’s data flow. Source: Puppet Labs.
Features and Highlights
Puppet automation works by enforcing the desired state of an environment as defined in Puppet Manifests—files containing pre-defined information (i.e., resources) describing the state of a system. The core components that comprise Puppet are as follows:
Puppet Server - the central server that manages Puppet nodes (agents)
Puppet Agent - client software installed on managed nodes that enables communication and synchronization with the Puppet Server (historically called the Puppet master)
Puppet Enterprise Console - a web GUI for analyzing reports and controlling infrastructure resources
PuppetDB - the data storage service that collects and stores the data produced by Puppet (e.g., facts, catalogs, and reports)
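Manifests declare resources and the desired state that agents enforce. A minimal, illustrative site.pp (the node name and resources are examples, not from the article):

```puppet
# site.pp -- a minimal illustrative Puppet manifest
node 'web01.example.com' {
  package { 'ntp':
    ensure => installed,
  }

  service { 'ntpd':
    ensure  => running,
    enable  => true,
    require => Package['ntp'],   # explicit ordering: install before managing the service
  }
}
```

Unlike a script, this describes an end state; Puppet determines on each run whether anything needs to change to reach it.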
Other key components worth mentioning include MCollective, a framework for server orchestration and parallel job execution, and Hiera, a hierarchical key/value lookup utility that supplies node-specific configuration data to Puppet (keeping site-specific data out of manifests). Puppet has integrated MCollective, Hiera, and a myriad of other open source projects into its platform to provide comprehensive automation and management of mission-critical enterprise infrastructures. Many community-contributed add-ons are also available on Puppet Forge—an expansive library of open source modules for extending the platform's features and capabilities.
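Hiera's hierarchy is itself just configuration. An illustrative sketch of a Hiera 3-era setup (the paths are assumptions for the example): node-specific data files are consulted before site-wide defaults, and manifests then look values up by key rather than hard-coding them.

```yaml
# hiera.yaml -- illustrative Hiera 3-style configuration (paths are examples)
:backends:
  - yaml
:yaml:
  :datadir: /etc/puppetlabs/code/environments/%{environment}/hieradata
:hierarchy:
  - "nodes/%{::fqdn}"   # per-node overrides are checked first
  - common              # site-wide defaults are the fallback
```

A data file such as common.yaml would then map keys like `ntp::servers` to values, which classes pick up automatically via data binding or explicit lookups.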
Updates to Puppet Node Manager
Puppet’s Node Manager enables the creation of rules around node attributes, allowing for easier, more efficient node management. With Node Manager, nodes can be managed based on their job rather than their name, eliminating the need to manually classify each node. New updates include powerful provisioning capabilities for Docker containers, AWS infrastructure, and bare-metal machines.
Introducing Puppet Code Manager
Puppet has been a mainstay of the DevOps movement since its inception and continues to address the enterprise’s continuous integration/delivery requirements. The concept of “infrastructure as code” entails using software development best practices—including code review, version control, and collaborative development—to manage infrastructure configurations and provisioning details. And like Chef, Puppet’s platform has evolved in response to the growing need for a comprehensive mechanism to manage the continuous delivery pipeline.
Newly introduced in Puppet Enterprise 3.8, Puppet Code Manager provides a consistent, automated way to change, review, test, and promote Puppet code in a continuous delivery framework. Built around r10k (a general-purpose toolset for deploying Puppet environments and modules by interfacing with a version control system), Puppet Code Manager accelerates the deployment of infrastructure by rendering it a testable and programmatic process. And by enabling easy integration with Git for version control, this latest addition to the Puppet platform further blurs the line between software and infrastructure.
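r10k drives deployments from a Puppetfile, a small Ruby-syntax file that pins each module to a Forge version or Git reference, so an environment's exact contents live in version control. An illustrative sketch (the module names and Git URL here are assumptions):

```ruby
# Puppetfile -- consumed by r10k / Code Manager to deploy an environment
forge 'https://forge.puppetlabs.com'

# A Forge module pinned to a specific release
mod 'puppetlabs/ntp', '4.1.0'

# An internal module tracked by Git reference (tag, branch, or commit)
mod 'profiles',
  :git => 'https://git.example.com/puppet/profiles.git',
  :ref => 'v1.2.0'
```

Because each Git branch of the control repository maps to a Puppet environment, promoting a change from test to production becomes an ordinary, reviewable merge.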
Software Defined Networking (SDN)
SDN is a new paradigm for networking that decouples network control and forwarding from physical infrastructure, enabling agile management of network resources in rapidly changing environments. Just as cloud computing enables IT to quickly spin up compute and storage instances on-demand, SDN replaces rigid (and sometimes manual) network operations with dynamically provisioned network services and resources.
This new model for networking is right in line with Puppet’s advocacy of “infrastructure as code.” As such, the company has made significant strategic initiatives and partnerships in support of SDN. For example, Puppet Labs recently announced a partnership with Arista Networks—a leading developer of SDN switches—to provide automation support for the vendor’s SDN equipment line. This and other similar partnerships (e.g., Cumulus Networks, Dell, Cisco) will position Puppet favorably over competing vendors once SDN technologies gain widespread adoption.
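The shape of such integrations is familiar Puppet: vendor modules expose network constructs as ordinary resource types. A hypothetical sketch in the style of the netdev resource types used by several switch vendors (the exact type and parameter names vary by module and are assumptions here):

```puppet
# Illustrative only: declaring switch state the same way as server state.
netdev_vlan { 'engineering':
  ensure  => present,
  vlan_id => 100,
}

netdev_l2_interface { 'Ethernet1':
  untagged_vlan => 'engineering',   # assign the port to the VLAN above
}
```

The appeal is that network changes then flow through the same review, testing, and deployment pipeline as the rest of the infrastructure code.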
Puppet Security and Vulnerabilities
No software is without its share of vulnerabilities, and Puppet certainly has its own. The company actively maintains a repository of Puppet security disclosures, with a complete list of reported vulnerabilities available via the CVE database. As of this writing, 51 Puppet vulnerabilities have been documented, with an average severity level of medium. Chef, by contrast, doesn’t maintain a CVE database for all of its products; the only product it has issued CVEs for, Chef itself, carries 3 CVEs, all from 2012.
Chef and Puppet continue to expand their automation platforms in response to the needs of the DevOps-enabled enterprise, with new features such as Chef Delivery and the Puppet Code Manager helping to streamline the continuous integration/delivery pipeline. Both vendors are forging partnerships that may ultimately define—as Chef would put it—what school of DevOps a particular organization belongs to. Recently, the two have partnered with Microsoft to integrate their platforms with Azure, and Puppet—no stranger to being a first mover—has made key alliances with leading SDN vendors to position it favorably once the technology takes hold. So if your organization plans on adopting SDN, Puppet might be a stronger candidate in this respect.
Security is an enterprise-wide concern these days and should be taken into account when evaluating technologies. Chef has made significant strides in improving its platform’s security with Chef Vault, though its 3 published CVE vulnerabilities certainly pale in comparison to Puppet’s 51. It’s interesting that Puppet Labs rebroadcasts CVEs for vendored software such as Ruby while Chef does not, despite both products including Ruby as a core component. It’s also difficult to believe Chef hasn’t found a vulnerability in any of its software since 2012.
In short, both IT automation platforms have matured greatly as enterprise solutions. We've highlighted some of Chef and Puppet’s key attributes and benefits—selecting the right option comes down to identifying each platform’s core competencies and determining which of these fall in line with your organization’s unique needs and requirements. Regardless of which automation platform you choose, UpGuard can complement either solution to round out the DevOps toolchain with advanced vulnerability assessment and monitoring, ensuring that security—as a function of quality—is baked in at every step of the continuous delivery process.
Whether your infrastructure is traditional, virtualized, or totally in the cloud, UpGuard provides the crucial visibility and validation necessary to ensure that IT environments are secured and optimized for consistent, quality software and services delivery.