On Improving Certificate Strength

This semester, I’m taking EECS 388: Introduction to Computer Security.

During a recent lecture, we were introduced to SSL Labs’ SSL Server Test. While my site received an A rating, my professor’s received an A+.

Looking at the score breakdown, I learned that I was lacking in all four areas that SSL Labs grades: certificate, protocol support, key exchange, and cipher strength. I found the first category especially curious, given that I was paying for a “premium” TLS certificate. Indeed, my website had a certificate from Comodo. It cost $20 a year.

As a result of seeing that my professor had a TLS certificate from Let’s Encrypt (and also being a poor college student), I ditched my Comodo certificate for the free one. Although the change in the duration of the certificate is -50%, the change in the cost is -∞%. Not bad!

Of course, I was still hellbent on improving my site’s rating. We learned in class about HTTP Strict Transport Security (HSTS): a mechanism that tells browsers to load a website only over HTTPS, never over plain HTTP.

To be included on Google Chrome’s preload list, I went to the list’s submission site. My website wasn’t eligible for two reasons:

  1. I didn’t redirect all HTTP traffic to HTTPS.
  2. I didn’t actually provide an HSTS header.

Now, I had tried to remedy the first problem in the past by adding a .htaccess file with the following contents:

# Force HTTPS: send any plain-HTTP request to its HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

The problem that I continuously ran into was peculiar. Jekyll, the static-site generator that builds my website, was adding the standard front matter to the .htaccess file, which rendered it useless for forcing HTTPS connections. Given that front matter was being injected into the file each time the site was built, I figured there was no clear way around this. (In retrospect, of course, there was.)

By manually stripping the front matter from the .htaccess after an initial build of the site, I was able to keep the injection from happening again. It feels silly that the problem plagued me for this long, but we all look stupid from time to time.
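
For anyone who hits the same wall, I suspect the cleaner fix (an assumption on my part, based on standard Jekyll behavior rather than my own setup) is to keep a front-matter-free .htaccess in the source tree and whitelist it in _config.yml. Jekyll copies files without front matter verbatim, but it excludes dotfiles unless told otherwise:

# _config.yml: dotfiles are excluded unless whitelisted; a file with
# no front matter is copied to the built site untouched
include:
  - ".htaccess"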

Thus, the first problem was fixed. The second problem was resolved shortly after the first by adding one more line to my .htaccess:

# env=HTTPS sends the header only on secure connections; the preload
# list requires both the includeSubDomains and preload tokens
Header set Strict-Transport-Security "max-age=31415926; includeSubDomains; preload" env=HTTPS
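
To sanity-check that the header is actually being served, a few lines of Python will do the trick (the URL below is a placeholder for your own domain):

import urllib.request

# Print the HSTS header from a live HTTPS response; None means it is missing.
response = urllib.request.urlopen("https://example.com/")
print(response.headers.get("Strict-Transport-Security"))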

With these two problems finally worked out, I was eligible to be on Chrome’s preload list. I should be officially on it within a couple of weeks.

The immediate result of all these changes, you might ask? My website now has a rating of A+. While I still have improvements to make in the areas of protocol support, key exchange, and cipher strength, I can rest assured knowing that my certificate is strong.

I’m quite thankful that my hosting service, DreamHost, provides such a simple interface for installing the certificate on my website. I’m also thankful that my professor, J. Alex Halderman, helps to make Let’s Encrypt available to the masses through his work with the Internet Security Research Group (ISRG).

A List of Upcoming Projects

Let the record indicate that I have a litany of projects that I plan to work on over the next couple of months.

Webhooks and Travis CI for this website

The current deployment technique for this site, which lives in a private repository on GitHub, goes something like this:

  1. Add new file to local directory or web server
  2. Push to GitHub from local directory or web server
  3. Pull from GitHub to local directory or web server
  4. Log in to web server
  5. Deploy site

Five easy steps, sure, but much of the process can be automated.

For instance, I can use a mix of Travis CI and GitHub webhooks to get rid of steps 3 through 5. Thankfully, I have free private builds available from Travis CI through GitHub’s Student Developer Pack, so all I really need to do is some configuration. Shouldn’t be too bad.
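
A first pass at the .travis.yml might look something like this sketch; the deploy step is a placeholder (the host, user, and paths are made up), since the real thing depends on the server’s setup:

language: ruby
rvm:
  - 2.3
script:
  - bundle exec jekyll build
after_success:
  # Placeholder deploy step: sync the built site to the web server
  - rsync -az --delete _site/ deploy@example.com:/var/www/site/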

Metrics dashboard

There are two open-source projects that do not get enough love: Graphite and Grafana.

I’m going to use both to create a metrics dashboard of my personal life. It will include graphs of Hacker News karma, Stack Overflow reputation, Twitter followers, and perhaps even my GPA (future project: GPA API). By doing this, everybody can see just how lame I am across multiple Internet platforms (*cue laugh*).
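
Getting data into Graphite is refreshingly simple: its carbon daemon accepts plaintext “path value timestamp” lines over TCP. A sketch of what one collector could look like, with a made-up value standing in for a real API call:

import socket
import time

CARBON_HOST = "localhost"  # assumes carbon runs on the same machine
CARBON_PORT = 2003         # carbon's default plaintext-protocol port

def send_metric(path, value):
    # One data point in carbon's plaintext format: "path value timestamp\n"
    line = "%s %s %d\n" % (path, value, int(time.time()))
    with socket.create_connection((CARBON_HOST, CARBON_PORT)) as sock:
        sock.sendall(line.encode("ascii"))

# Made-up value; the real number would come from the Hacker News API.
send_metric("personal.hackernews.karma", 1234)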

The only other implementation detail is where I’ll run the machine. For right now, I’m using an Ubuntu droplet on DigitalOcean, and configuration is working quite nicely. The current status of the project is hosted for your viewing pleasure.

Although any seasoned devops engineer will tell you that Graphite is absolute overkill for this project, I don’t really give a damn. I want room to grow this metrics dashboard into something robust, and the best way to do that is to start with something robust.

Should the project go well, I plan to create a similar metrics dashboard for my colleagues at school. Perhaps with some spare time I can roll installation into a Docker image. Then again, chances are that one already exists.

Simple (?) Web App

Speaking of colleagues at school, a couple of content-creation tasks for my social media gig are truly in need of automation. The act of creating a yak for Hail! Mail goes something like:

  1. Think of yak
  2. Write it into email
  3. Get it approved
  4. Send it to Hail! Mail server

Were I to create a collaborative web app where I can share content with my boss for approval, steps 2 through 4 could be eliminated. All I’d need to do is type yaks into the web app. When my boss rewrites or approves a yak, the click of a button could queue it up and send it to Hail! Mail at the proper time.
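
At its heart, the app is nothing more than an approval queue. A sketch of the idea in Python (the names and fields here are assumptions, not a spec):

class Yak:
    """One piece of content moving through the approval pipeline."""

    def __init__(self, text):
        self.text = text
        self.approved = False


class YakQueue:
    """Holds pending yaks until my boss rewrites or approves them."""

    def __init__(self):
        self.pending = []

    def submit(self, text):
        # Step 1: I type a yak into the app.
        self.pending.append(Yak(text))
        return len(self.pending) - 1

    def approve(self, index, rewrite=None):
        # One click: optionally rewrite, then mark approved for queueing.
        yak = self.pending[index]
        if rewrite is not None:
            yak.text = rewrite
        yak.approved = True
        return yak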

It would make the job about ten times easier, freeing me up to focus on other things.


When complete, all of these projects will be shared on my GitHub account. I’m excited for all of the coding to come!

How I Learned to Stop Worrying and Love Debugging

When I first learned to program back in high school, the joy that came from it was the act of expression. Much like writing, programming is the act of translating thought into a specific medium. For writers and programmers alike, the emotions felt as a result of expressing oneself are gratifying.

Writers, however, cannot be told that their expressions are wrong, at least not by an inanimate object like a compiler or a test case. Although natural language processing is getting more advanced by the day, an author cannot feed their latest novel or essay into a program only for it to say “Error: you’re wrong”.

I understand that one might compare compiler errors to mistakes found by spellcheck, but writers at least have the option of ignoring spellcheck. Compilers are gatekeepers: they determine whether an individual can share their program with others. Writers don’t have such an intermediary; they can share their expressions with the world, error-ridden or not.

The process of debugging, then, is really nothing more than appeasing the inanimate objects: compilers and test cases. At least, that’s how I’ve predominantly viewed the process since high school. And let’s face it: the act of appeasing anyone or anything is not particularly pleasant.

Debugging felt even more loathsome as the stakes got higher: say, while trying to complete a project for class. “My grade is riding on this project,” I’d think to myself, “and I have to waste time debugging?!” My hate for debugging got to the point where I wrote self-deprecating tweets and asked people on Quora how to keep myself from getting sad over bugs.

What I failed to realize at the time, however, was that debugging is a natural and (gasp) welcome part of programming. Debugging is as much an act of problem solving as writing code is.

In order to love debugging, then, one must come to appreciate code not as an act of expression, but rather as an act of problem solving.

Perilously Inefficient

If computer science has taught me anything, it is that there is always a more efficient way to do something. The fact that not all algorithms are created equal supports this idea.

Since learning CS, my mind is constantly running two parallel processes:

  1. A stream of consciousness (like every other human on Earth)
  2. A stream of algorithms to determine whether I’m doing things as efficiently as possible

Generally I’m only happy when the second process gives the green light to the first process.

An example of when this didn’t happen occurred today. For an assignment in HONORS 240, I am responsible for making a visual argument about a dataset of my choosing. The dataset I decided on concerns America’s top philanthropists, as compiled by The Chronicle of Philanthropy.

In the Chronicle’s infinite wisdom, options for easily exporting the dataset are nonexistent. Therefore, I had to find a way to do it myself.

The programmer in me wanted to use BeautifulSoup, a Python package for web scraping. This, I thought, would be most efficient.
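
The plan would have gone roughly like the sketch below (the URL is a placeholder, and the code assumes the rankings live in a single HTML table):

import csv

import requests
from bs4 import BeautifulSoup

# Placeholder URL; the Chronicle's real page and markup may differ.
URL = "https://philanthropy.com/philanthropy-50"

soup = BeautifulSoup(requests.get(URL).text, "html.parser")
table = soup.find("table")  # assumes one <table> holds the rankings

with open("philanthropists.csv", "w", newline="") as out:
    writer = csv.writer(out)
    for row in table.find_all("tr"):
        cells = row.find_all(["th", "td"])
        writer.writerow(cell.get_text(strip=True) for cell in cells)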

Lo and behold, I was wrong. Those measly efficiency algorithms of mine forgot to take into account the (self-proclaimed) steep learning curve associated with BeautifulSoup. With nothing more than a cursory knowledge of HTML, I was doomed to fail with such an approach.

Thus, I was left to try something else. The most efficient way of doing it? Copy-and-pasting.

Yes, it sounds deplorable. But some beautiful behind-the-scenes technology helped me out big time. All I had to do was:

  1. Copy the complete table from the Chronicle’s web page
  2. Paste it into a Microsoft Excel workbook
  3. Remove superfluous headers
  4. Clear formatting

Excel didn’t do a wonderful job of saving the data as a CSV, and so I had to manually change a handful of entries with a text editor. The results of the work, however, aren’t all too bad.

Though I probably saved myself a ton of time going the stone-age way, I’m still rather upset that I didn’t solve my problem in a programmatic fashion. BeautifulSoup seems hard to learn, and so I backed down from a challenge for the sake of time.

I was perilously inefficient in terms of scalability, yes. But after my copy-and-paste breakthrough, the job took no more than 15 minutes.

Time and scale matter. But with a simple and small task, time matters more.

The Power of Talk

While working at Modelshop over the summer, I did a little bit of research into integrated development environments for R. Over the course of this research, I came across a tool called DataJoy, from the makers of ShareLaTeX.

What I liked about DataJoy was its ease of use: plug in an email address and password to create an account, and start writing R code immediately in a simple-to-comprehend UI. Compared to setting up an IDE like RStudio, getting started with DataJoy is so quick and painless that it’s simply too good to pass up.

That’s why I passed on information about DataJoy to my professor, Dr. Mika LaVaque-Manty. He’s teaching a class on game theory at Michigan in which I am enrolled. The syllabus noted the use of R for a variety of quantitative assignments, and so informing the professor about DataJoy seemed like a natural thing to do.

In a recent office-hours appointment, Dr. LaVaque-Manty told me that he had passed on information about DataJoy to a colleague who happens to be working as a White House fellow for data analytics. That colleague is now using it in his work.

“Congratulations, you’ve influenced work in the White House,” Dr. LaVaque-Manty said.

The whole exchange made me smile.