I was updating the Jetpack plugin in WordPress today and looked through the settings to find it has a “show related posts” feature. It’s a decent way to keep visitors on your site without adding much overhead to page size or load times, so I thought “sure, why not” and turned it on.

It worked fine out of the box, which is always nice. I went to a post to see the feature in action and one of the related posts was one from 2009 about cloud servers from Mosso.

The upside of having a blog going for nearly a decade is that there are hundreds of posts to pick from. Nearly all of them have been tagged and categorized. That’s good. So is the long tail of search: many visits to this and other sites come from users doing very specific searches.

The downside is that in the world of technology, things change fast. Posts written a year ago are sometimes out of date. New versions of software are released, new features are added to platforms, and some things go away entirely.

Take Mosso, for example. Rackspace purchased and integrated Mosso several years ago, rendering my platform review useless and outdated.

Or all the posts I did in the late aughts about Amazon Web Services pricing. You see the pattern.

Here, then, is the question. Should a blog, especially one focused on technology, leave up old posts as a historical record? Alternatively, should blogs delete/draft posts after a certain number of years, considering the speed at which that world changes?

Perhaps a disclaimer could be added to posts over a certain age, warning readers of their age and possible obsolescence.

There’s value in both approaches. It’s something I hadn’t really thought about until today. I have set a few of the very early posts I wrote here back to drafts due to errors, but the vast majority are still live.

What is your rule of thumb when it comes to leaving old blog posts online?


Ouch.

If you follow tech news, you may have heard about GitLab.com this week. Poor folks.

This week, an engineer pulled the old rm -rf mistake on a directory full of their production databases. If you don’t know what that means: in Linux, that command recursively deletes everything under the target path with no warnings or confirmation. It’s dangerous and should only be used in extreme cases. Otherwise, you get stuck like GitLab.
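This isn’t GitLab’s actual command, but here’s a minimal, safe-to-run sketch of how these disasters usually happen — an empty or mistyped variable silently widens the target — and one shell guard against it. Everything below runs inside a scratch directory in /tmp.

```shell
# A safe sketch of the variable-expansion pitfall behind many rm -rf disasters.
# (Runs entirely in /tmp; nothing outside the demo directory is touched.)
mkdir -p /tmp/rmdemo/data
DIR=""   # imagine a typo, or a config file that failed to load

# The dangerous version: rm -rf "/tmp/rmdemo/$DIR" expands to
# rm -rf "/tmp/rmdemo/" and silently deletes everything under it.

# A safer pattern: ${VAR:?} aborts instead of expanding an empty variable.
result=$( (rm -rf "/tmp/rmdemo/${DIR:?DIR is empty}") 2>/dev/null || echo refused )
echo "$result"                               # prints "refused"
ls /tmp/rmdemo/data >/dev/null && echo "data intact"
```

The `${DIR:?message}` expansion fails loudly when DIR is unset or empty, so the rm never runs — a one-character habit that would have saved more than one ops team.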

What they found as they tried to restore data is that, according to this Register article, five of their backup systems and procedures either didn’t work, weren’t being run, or had never worked correctly. That’s very bad.

Credit to GitLab, though – they have been more than transparent: posting updates on Twitter, sharing a Google Doc with progress notes, and even live-streaming the restore process on YouTube.

What can we learn from GitLab? My takeaway is to take a minute and make sure your computers and servers are backing up correctly.

Personal and Server Backups

On a personal level, I back up my laptops and desktops with Backblaze. I’ve written about them several times here, and their service was a lifesaver when my laptop was stolen last year: I was back up and running in hours and didn’t lose a byte of data. If you use my link, you get a free month of unlimited backups. At $5 a month, it’s a no-brainer.

Servers are a little different, and every IT organization treats backups differently; there are full and partial backups. In the past, I used a script that dumped my databases and web files and uploaded them to Amazon as an extra level of backup.
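A script like that can be quite small. This is a minimal sketch rather than my original script, and it assumes a MySQL database with credentials in ~/.my.cnf, the AWS CLI already configured, and placeholder paths and bucket names:

```shell
#!/bin/sh
# Hypothetical nightly backup sketch: dump the database, archive the web
# files, push both to S3, and prune old local copies.
SITE_DIR="/var/www/blog"               # assumed web root
BACKUP_DIR="/var/backups/blog"
STAMP=$(date +%Y-%m-%d)

mkdir -p "$BACKUP_DIR"

# Database dump (assumes credentials in ~/.my.cnf)
mysqldump wordpress > "$BACKUP_DIR/db-$STAMP.sql"

# Web files
tar czf "$BACKUP_DIR/files-$STAMP.tar.gz" -C "$SITE_DIR" .

# Offsite copies (bucket name is a placeholder)
aws s3 cp "$BACKUP_DIR/db-$STAMP.sql" "s3://example-blog-backups/$STAMP/"
aws s3 cp "$BACKUP_DIR/files-$STAMP.tar.gz" "s3://example-blog-backups/$STAMP/"

# Keep roughly a month of local copies
find "$BACKUP_DIR" -type f -mtime +31 -delete
```

Dropped into cron, something like this runs unattended — though, as GitLab’s week shows, the real job is periodically verifying that the backups actually restore.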

Do you run a blog on WordPress? I highly recommend you set and follow a backup schedule for your site. There are many plugins to automate this, and they can upload data to many online services, including Dropbox, Google Drive, FTP, Azure, and Amazon S3, which is what I use. One of the most popular free plugins for this is Updraft Plus.

I back up this blog nightly. I keep the last month’s worth of daily backups and then a monthly archive; I don’t have a need to go back further than that. These backups aren’t terribly large, and storing them at Amazon costs, honestly, pennies per month.

Plugins like Updraft Plus take the manual labor out of backups, and that’s good. Life happens, and the last thing on my mind when I’ve got a free minute is dropping everything to back up my blog.

Take a few minutes today, and check your backups. It’s like insurance – you have it, and hopefully you never have to use it.


In recent versions, the Chrome browser has started making it more visible to users when a site is served over SSL/HTTPS. Instead of showing just a green padlock, Google has added the word “Secure” next to it.

The bar now looks like this:

SSL site in Chrome

For regular, non-secure sites, there will continue to be an icon showing that the user can get more info about the site.

Non-SSL site in Chrome

If users click on that icon, they see this text:

What users see on non-SSL site

This small change is just the beginning. At the end of January, Chrome will start specifically marking sites served over non-secure HTTP as non-secure. Wordfence shows in this image how Chrome will flag all sites that aren’t served securely:

Non-secure site in Chrome

Wordfence released a good blog post on these changes here.

This is a good thing: serving over SSL/HTTPS not only better protects your data in transit, it also lets you, if you want, pick up some speed increases via HTTP/2.
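As a rough illustration — on nginx 1.9.5 or later with an existing TLS server block (domain and certificate paths here are hypothetical) — enabling HTTP/2 is a single extra token on the listen directive:

```nginx
server {
    listen 443 ssl http2;          # "http2" is the only change needed
    server_name example.com;

    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    # ...rest of the existing site config...
}
```

Because browsers only speak HTTP/2 over TLS, going HTTPS is effectively the prerequisite for these speed gains.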

On the downside, it may drive your campus or freelance clients to ask why their sites aren’t showing up as secure.

It will also lead some users to think that something is wrong with a site, or that their information has been compromised. We will need to communicate with those users as well.

It will be a good opportunity for us as web developers to have a conversation about basic security and why technologies like SSL are important.

Luckily, installing SSL certificates is much easier now thanks to groups like Let’s Encrypt. They’ve taken the headache out of issuing and maintaining SSL certificates. The majority of the sites I host and support serve certificates from Let’s Encrypt, including this site.
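As a sketch of how simple this has become — assuming shell access, certbot installed, and example.com standing in for a real domain — issuing and checking renewal looks roughly like this:

```shell
# Issue a certificate and configure Apache in one step
sudo certbot --apache -d example.com -d www.example.com

# Let's Encrypt certificates last 90 days; confirm renewal will work
sudo certbot renew --dry-run
```

Most certbot packages also install a cron job or systemd timer that runs the renewal automatically, which is where the “maintaining” headache used to be.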

With the pain removed, for the most part, there are fewer and fewer excuses not to serve your site over HTTPS/SSL.

The challenge that remains is that not enough shared web hosting providers offer easy and affordable SSL. Kudos to Dreamhost for being one of the largest hosts to offer free, zero-configuration SSL to their hosting clients. Let’s hope more and more companies join in.

I’m writing a longer post about this, but on the side, I run a web development and support company. We host many sites, and we are making (at least) free SSL the default for all the sites we begin hosting in 2017. We’re also retrofitting all the sites we’ve previously launched. It’s just a click of the mouse for us, so there’s no excuse not to. Add in automatic certificate renewal, and it’s dead easy for developers and hosting companies to support.

If you’re a higher ed blogger, agency, freelancer, small business, or non-profit and want inexpensive web hosting with security features like free Let’s Encrypt certificates included, contact me. I can help.