Archive for April 2016

Happy Path Chef: How to Set up VSFTPD with Chef on CentOS 6

Automated Infrastructure vs. Manual Infrastructure

There are many tutorials about how to configure a VSFTPD server by hand. However, manual infrastructure is neither easy to replicate nor easy to document. If you create a server with Chef or another infrastructure-as-code tool, you can reproduce it in a fraction of the time it would take by hand. Organizations’ infrastructure must change quickly while remaining flexible and reliable to meet market demands, so automated infrastructure is becoming a necessity.

FTP-like servers allow you to receive and view files from anywhere. This tutorial is a walkthrough of the first iteration of a (much more) secure VSFTPD server for QA testing and validation. The server’s purpose is to serve as a building block for more automated testing, and it makes any manual testing much faster and more reliable. The final version of this server should be a huge timesaver for our QA team 🙂


By the end of this tutorial, you will be able to:

  • Set up a VSFTPD server with client-side access.
  • Log in as a specific user.
  • Upload, download, and view files.

Your Toolbox

  • An FTP client application. (I like Cyberduck; the UI is nice.)
  • Vagrant and VirtualBox.
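
If you don’t already have a CentOS 6 VM handy, a minimal Vagrantfile sketch looks like the following. The box name and IP here are placeholder assumptions, not values from this tutorial; use whatever your environment specifies.

```ruby
# Minimal Vagrantfile sketch -- box name and IP are placeholders.
Vagrant.configure('2') do |config|
  config.vm.box = 'bento/centos-6.7'
  config.vm.network 'private_network', ip: '192.168.33.10'
end
```

Run vagrant up in the same directory, then vagrant ssh to reach the machine.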

Your Cookbooks

You will need the following cookbooks for this server:

Put these in your chef/cookbooks directory.

Getting Started

The first step is to get the cookbooks installed on our CentOS 6 machine.

Go to your local machine, and in your local chef directory, enter:

$ EDITOR=vim knife node edit vsftpd-demo

This will allow you to set up an initial run list to install your cookbooks.

Let’s set up an initial run_list with our two cookbooks. Set the run_list as shown below:
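
A minimal run_list containing the vsftpd recipe might look like this in the node’s JSON (if you are using a second cookbook, add its recipe here as well):

```json
{
  "run_list": [
    "recipe[vsftpd]"
  ]
}
```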



Save and quit in vim to apply the node changes.

On your virtual machine, enter:

$ sudo chef-client

Check your Progress in Cyberduck

This server functions with anonymous authentication. For security purposes, a user besides anonymous should be able to log into their home directory to view, upload, and download files. We need to adjust the vsftpd configuration to get past this insecure, anonymous version of FTP.

Configure VSFTPD with Attributes

Navigate to vsftpd/attributes/default.rb in your text editor or IDE. This is not the configuration file itself; it is where Chef gets the information to create the configuration file.

To get everything ready on the vsftpd side, set the default['vsftpd']['config'] section of the attributes/default.rb file in your cookbook like this:

default['vsftpd']['config'] = {
  'port_enable' => 'YES',
  'anonymous_enable' => 'NO',
  'local_enable' => 'YES',
  'chroot_local_user' => 'YES',
  'write_enable' => 'YES',
  'ascii_upload_enable' => 'YES',
  'ascii_download_enable' => 'YES',
  'local_umask' => '022',
  'dirmessage_enable' => 'YES',
  'connect_from_port_20' => 'YES',
  'listen' => 'YES',
  'background' => 'YES',
  'pam_service_name' => 'vsftpd',
  'userlist_enable' => 'YES',
  'tcp_wrappers' => 'YES',
  'pasv_enable' => 'YES',
  'pasv_max_port' => '50744',
  'pasv_min_port' => '50624',
  'pasv_address' => node['ipaddress']
}
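
Under the hood, the cookbook’s template flattens this hash into key=value lines in /etc/vsftpd/vsftpd.conf. A minimal sketch of that transformation in plain Ruby (an assumption for illustration; the real template may order or filter keys differently):

```ruby
# Sketch: how an attribute hash becomes vsftpd.conf lines.
# Assumption: one "key=value" line per hash entry, in insertion order.
config = {
  'anonymous_enable'  => 'NO',
  'local_enable'      => 'YES',
  'chroot_local_user' => 'YES'
}

conf_lines = config.map { |key, value| "#{key}=#{value}" }
puts conf_lines
```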


Save the file, then upload the new cookbook from inside the chef directory on your local machine:

$ knife cookbook upload vsftpd

Check Yourself in Cyberduck

Open Cyberduck and try logging in as vagrant, with the password ‘vagrant’. Make sure you set the ‘Connect’ dropdown to ‘Active’. We are making an active FTP server for the purposes of this demo. 

Now you can view, upload and download files as the vagrant user, which is a step closer to our requirements.

Create a User

Although the vagrant user is a specific user, it isn’t a secret one. The standard vagrant password is easy to find and not hard to guess. Let’s create another user with Chef.

Add the following lines to the recipes/default.rb file:

# Create the user before the directory so the owner and group exist.
user 'ftpdemo' do
  supports :manage_home => true
  comment 'FTP Demo User'
  home '/home/ftpdemo'
  shell '/bin/bash'
  password 'ftpdemopass' # note: on Linux this attribute expects a shadow hash, not plaintext
end

directory '/home/ftpdemo' do
  owner 'ftpdemo'
  group 'ftpdemo'
  mode '0755'
  action :create
end
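
The user resource’s password attribute expects a pre-hashed (shadow-style) password on Linux, so for anything beyond a demo you would generate a hash rather than store plaintext. One way, assuming openssl is available (the salt and password here are example values only):

```shell
# Generate an MD5-crypt password hash for the Chef user resource.
# 'demosalt' and 'ftpdemopass' are examples -- substitute your own.
HASH=$(openssl passwd -1 -salt demosalt ftpdemopass)
echo "$HASH"
```

Paste the resulting $1$… string into the password attribute in place of the plaintext value.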


On your local machine, run:

$ knife cookbook upload vsftpd

Run sudo chef-client again in the virtual machine.

Check Yourself, Again

Open Cyberduck and log in as user ‘ftpdemo’.

You should be able to upload, download and view files as user ftpdemo.

Check in the browser at ‘ftp://<ipaddress>’ and log in again as ftpdemo. You should see the same files. We’re wrapping things up at this point.

Create a Chef Role

Now, we want to integrate these two recipes into a role. This makes replication in the future much simpler.

Create a file in the roles directory called vsftpd-demo.rb

It should look like this (the run_list here assumes only the vsftpd recipe; add your second cookbook’s recipe if you use one):

name "vsftpd-demo"
description "role to install and configure basic VSFTPD daemon for demo"
run_list "recipe[vsftpd]"
env_run_lists(
  "_default" => []
)
Upload the new role with this command: 

$ knife role from file vsftpd-demo.rb

Edit the node again with EDITOR=vim knife node edit vsftpd-demo on the local machine.

Change the run list to:
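
A run_list that uses the new role (assuming the role name above) would look like this in the node’s JSON:

```json
{
  "run_list": [
    "role[vsftpd-demo]"
  ]
}
```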


Run sudo chef-client in the virtual machine.

Cyberduck will allow you to upload, download, and view files, while the browser should only allow you to view and download files.

Congratulations! You just Cheffed FTP!


What is DevOps?

DevOps, a portmanteau of ‘development’ and ‘operations’, is a concept that refers to anything empowering organizations to deliver high-quality products to production while maintaining flexibility within a reliable system.

Traditionally, a DevOps engineer role is seen as a blend of a Systems Administrator (Ops) and Software Developer (Dev) role. A DevOps engineer is often involved in all parts of the software / infrastructure lifecycle – from planning, to coding, to testing, to deployment, to operations.

Although DevOps is often thought of as only a movement within IT, DevOps can be applied in organizations that produce hardware, software, and hardware / software hybrids.

I think of DevOps as a process of collaboration across the traditional IT roles (Development, Operations, Quality Assurance, and Security) to deliver functional solutions that improve entire systems. Common examples would be infrastructure-as-code, monitoring systems, and automated testing processes. The DevOps movement arose from software companies of huge scale, but it is common in very small organizations as well.

Enabling & Improving World-Class Systems

World-class systems successfully implement DevOps, whether an organization calls it that or something else. A world-class system must meet three requirements:

  1. Highly scalable
    A world-class system must be prepared to scale up to billions of users, with an infrastructure that can be adjusted to support that. It doesn’t necessarily need to have billions of users today, but it must be prepared to scale that large.
  2. Provide superior value
    Every system delivers a great idea that solves some kind of problem. A world-class system delivers a great product to the end user. The product may be tangible (a package from Amazon arriving at a doorstep) or intangible (a great article from Wikipedia that helps someone learn something new).

  3. Flawless experience of the product for users / customers
    In order to reach a certain level of scalability, a system must keep its users. Every experience a user or customer has with a product must satisfy them so they come back again. This does not necessarily mean a beautiful user interface, but an experience that keeps users coming back. Reddit and Pinterest look very different, but both retain a great many users.

    No one loves using a website where they feel frustrated by constant errors, or where they cannot update their account information because services are down, again. Very few users or customers would stay loyal to a company that charges them incorrectly for services or products, or leaks their personal information.

Why DevOps?

A DevOps culture allows a system to change quickly, stay reliable, and still deliver high quality to the end user. Successful implementation of DevOps saves organizations time, money, and morale while empowering them to focus on the things that matter most to them. At Ultralinq, we are seeking to create a DevOps culture on our engineering team whilst moving towards greater efficiency, collaboration, and quality.