Getting creative for employee on-boarding

Working for a small, international non-profit has many challenges. One of the biggest I’ve encountered is managing all the internal details involved in on-boarding employees and helping new staff get sufficiently oriented, regardless of their location or technical competency. One tool I’ve been a fan of for years is Trello and, based on an idea they shared, a colleague and I got to work. The solution we created has been in use for almost a year and a half, with over two dozen new employees brought onboard in that time.

When a new employee is slated to start, a workflow is kicked off using an internal Trello board shared by a few different people involved in the process. It serves as a master checklist of the numerous tasks that need to be completed. Some of the steps include:

  • Confirm that an offer was accepted
  • Confirm employee’s contact info and title
  • Get a headshot and bio
  • Create email and other server accounts
  • Create orientation board and account
  • Send welcome email with getting started instructions
  • etc.

As an operations-minded person, I like to standardize whenever possible. A multi-cultural organization, however, demands acknowledging and accounting for differences in culture. The balance I shoot for is 80/20: 80% standardized and 20% contextualized. We took that into account with our solution. The bulk of the board is the same for all staff and covers the baseline common to everyone: things like our mission, vision, staff resources, and must-watch or must-read items. Folded into that are lists specific to the team they’ll be joining as well as the country in which they’re located. All combined, it’s proven to be a great resource. We also incorporate the feedback of each person who uses it so it can be improved in future iterations.

The best part is that it’s a free service, which is always a good thing when the goal is to apply every dollar possible towards our programs. Want to donate?

A replacement for CrashPlan

Last month, I wrote about how CrashPlan is discontinuing their home service and how I had to find a suitable alternative. Backup solutions aren’t the most thrilling kind of software, and investigating and evaluating the different options can hardly be considered fun. The complicating factor is that I have an atypical arrangement, or I suppose you could call it a specific set of requirements. I have a server along with other computers at home, and another server at my mother-in-law’s along with her computer. Since I’m a firm believer in the 3-2-1 backup rule, I wanted to make sure everything is suitably protected.


The Search Begins: A New Backup Solution

This past week, Code42 Software announced they are going to stop supporting their CrashPlan backup solution for personal or home use. That is unfortunate news since I’ve been a customer for over ten years and was quite satisfied with and invested in their service. While they have every right to run their business as they choose (especially if the economics don’t work in the long term), the impact on a significant portion of their user base is not one their reputation will easily recover from.

My current subscription is only good for a few more months, and I now must find an alternate solution that keeps my data sufficiently protected in a way that is still practical and reasonably priced. So far, the two leading candidates are Arq by Haystack Software and Backblaze. Neither one is a direct replacement, so the decision isn’t clear. There may also be other options that should be considered. It will likely come down to the tradeoffs I’m willing to make. Do I back up fewer computers than I have? Do I give up having a local backup as well?

Another possibility is to engineer a bespoke solution by one method or another. That would probably cover my needs better, but would also likely require more upfront effort to implement and ongoing maintenance. Clearly, the classic “build vs. buy” decision isn’t just limited to the domain of business IT.

Peay It Forward

Ok, I admit that the title’s lame, but I thought it was fun. You’re here reading this so I suppose it’s not all bad…

As one progresses through life and their career, knowledge and experience naturally accumulate. As you get exposed to new things and tackle the novel or unexpected, you generally amass a significant catalog of skills, insights, and, hopefully, wisdom. Over time, you consciously (and subconsciously) incorporate those numerous learnings into your day-to-day playbook to make yourself more efficient or effective. It happens almost automatically because nobody wants to spend more time or effort on something if they don’t have to.

Old-Fashioned Drone Video

Back in my early Sony days when I worked at RedZone Interactive, a friend and I got into RC planes. Since the rest of the studio didn’t generally roll in until about 9:00 or 10:00, we had the opportunity to take our planes out in the lot behind the office in the mornings while the winds were quite gentle. It’s a fond memory.

About the same time, I saw a wireless video camera online that was about the size of a matchbook and immediately thought of attaching it to the plane to get a cool POV video while flying. I should note that this was 2004. GoPro didn’t exist yet, let alone the whole ‘sports camera’ category. Neither did the drones that are so common nowadays. Am I a trendsetter? Not really. I just thought it was a fun idea.

Not surprisingly, it was very jerry-rigged. The camera, with its built-in transmitter, was mounted on a stick protruding from the side of the canopy and attached to a 9V battery for power. The receiver was on the ground, attached to a video camera recording the results. My friend had to spend the entire flight watching the small screen on the camera while constantly adjusting a tuning knob on the receiver to keep the signal usable. The footage is pretty poor by today’s standards, but I think it turned out pretty well, all things considered:


Fun times. I still have the plane, though I haven’t used it in years. I just may have to pull it out and see if it still works. I probably won’t bother with the camera.

Starting a new chapter

Since my life-changing event back in March, I’ve had the opportunity to meet many different people and learn about several organizations, each one different from the others. I have also spent a significant amount of time determining who I am professionally (my strengths, weaknesses, and motivations). That, combined with my belief that technology can truly make a difference in people’s lives, has helped direct my search.

Through a most interesting series of events that I’m unable to adequately explain here, I learned of an opportunity at Edify, a very special non-profit organization whose purpose is best summarized by its website: “To improve and to expand sustainable, affordable Christ-centered education in the developing world.” They achieve that by supporting private schools through small loans, training, and technology. To date, they have partnered with over 1,600 schools and have impacted the lives of over 300,000 students.

After many conversations, it became more and more clear that there was a unique match between their needs and my background and skills. I’m happy and proud to say that they have invited me to join them in their work, and I start tomorrow as their Vice President of Information Technology. I cannot convey how excited I am for this opportunity, and I look forward to making an impact in ways I surely couldn’t have previously imagined.

Having the summer off to enjoy my family was a significant blessing. Having just taken my oldest child off to college for the first time yesterday, it all couldn’t have worked out better. Yet another sign that the Lord’s plans are the best.

Watching a log file in a bash script

For the last few months, I’ve been doing some contracting, developing automation scripts in bash. It’s been a fun diversion from my job search and leverages my sysadmin background. It has also improved my command of vi and taught me several tricks in bash scripting. I wanted to share one that may be of help to others.

In the scripts that I wrote, it was necessary to kick off a long-running process and then act on entries written to a log file. I created a watcher routine to accomplish this:

01  successfulRun=0
02  keepRunning=1
03  while [ $keepRunning -eq 1 ] && read -t 3600 line; do
04      case "$line" in
05          *completion string* )
06              echo "Completed successfully. Exiting monitor."
07              successfulRun=1
08              keepRunning=0
09              ;;
10          *error string* )
11              echo "ERROR entry found in log. Exiting monitor."
12              keepRunning=0
13              ;;
14          * )
15              echo "Just another line. Monitor continuing."
16              ;;
17      esac
18  done < <(tail --pid=$$ -n0 -F "${logfile}")

It’s a general while loop, but there are some useful features. First, in line 3 is “read -t 3600” which allows the loop to break if nothing gets written to the file for an hour (3600 seconds). After the loop, if keepRunning is 1 and successfulRun is 0, I know it timed out.
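That post-loop check can be sketched in a few lines, using the same variables (the hard-coded starting values below just simulate the timed-out case):

```shell
# Simulate the state after the monitor loop exits via read's timeout:
# neither the completion string nor the error string was ever matched.
successfulRun=0
keepRunning=1

# Distinguish the three possible outcomes of the loop.
if [ "$keepRunning" -eq 1 ] && [ "$successfulRun" -eq 0 ]; then
    result="timeout"     # read -t 3600 expired with no new log lines
elif [ "$successfulRun" -eq 1 ]; then
    result="success"     # completion string was found
else
    result="error"       # error string was found
fi

echo "Monitor result: $result"
```

In the timeout and error cases, the script can then clean up and bail out instead of waiting forever on a stalled process.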

Lines 5, 10, and 14 define cases for the strings encountered. For my uses, I was looking for a success string, which meant my script could continue on. Similarly, if an error string is encountered, I exit accordingly. The last one (line 14) is the default case, which probably isn’t needed unless you want to provide feedback on progress.

The last feature is in line 18. The --pid=$$ option (a GNU tail extension) tells tail to exit when the parent script ends, which also closes the logfile. That allows for a very nice wrap-up no matter what happens. Nice, huh?

Enjoying doing the geek thing

Since I have time on my hands, I have been enjoying working on a handful of projects to scratch various ‘itches.’ Some have been long-standing items on my to-do list and others are areas of interest that would normally be relegated to the “someday/maybe” list.

A geek’s closet

All my various tech doodads and thingamabobs have been in drawers in the den or elsewhere; they were reasonably organized but still a hassle to access and dig through. My wife ran across an interesting picture on Pinterest and showed it to me, asking if I’d like to do the same with our hallway closet. Needless to say, I jumped at the chance. It also provided an opportunity to work with my son on installing the shelving.

All organized

There’s some more work still to go into it before I’m done. The two cardboard boxes need to be replaced with something better and I’m going to install some LED strips on the inside of the door frame for better lighting.

Amazon cloud

I’ve worked with Amazon Web Services (AWS) both professionally and personally, but only to a limited degree. For PlayStation, I generated various financial reports based on usage, and personally I’m using their email service to handle outbound email from my mail server.

To address the task that follows, and to satisfy my own curiosity, I spun up an instance in their Elastic Compute Cloud (EC2). EC2 is very often what is being referred to when someone uses the overused term “cloud.” It’s just a virtual machine running somewhere in one of Amazon’s datacenters. Nothing mystical, but quite convenient when you need to set up something like…

A secondary mail server

For various reasons, I really like being in charge of my own services. The web server hosting this very page you’re reading also handles my email. I hardly have much email traffic, but the server is offline from time to time, so it’s appropriate to have a secondary mail server available that can receive incoming messages and relay them when the primary server comes back online. With my newly-minted EC2 instance, I was able to get that going in short order and checked off a big to-do item.
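The moving parts of a backup MX like that are small: a second, lower-priority MX record in DNS, and a relay configuration on the secondary server. A minimal sketch, assuming Postfix on the EC2 instance and using example.com as a stand-in for the real domain:

```
; DNS: the higher MX preference (20) is tried only when the primary is down
example.com.    IN  MX  10  mail.example.com.      ; primary (home server)
example.com.    IN  MX  20  backup.example.com.    ; secondary (EC2)

# /etc/postfix/main.cf on the secondary: accept mail for the domain,
# queue it, and relay once the primary is reachable again
relay_domains = example.com
relay_recipient_maps = hash:/etc/postfix/relay_recipients
maximal_queue_lifetime = 7d
```

The relay_recipient_maps lookup matters more than it looks: without a list of valid addresses, a backup MX happily queues mail for nonexistent users and turns into a backscatter source.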

Raspberry Pi 2

For Father’s Day, my family got me a Raspberry Pi 2 to upgrade the previous model I’d been running upstairs (as a secondary DNS server). It kinda amazes me how capable a machine it is for only $35.

Monitoring

The Raspberry Pi 2 is considerably faster than the previous generation. With some computing overhead available, as well as a slightly more complicated infrastructure at home (due to the EC2 instance), I wanted to get some monitoring going. I dabbled a little with Nagios, but quickly remembered why I don’t care for it. After researching alternatives, I settled on Zabbix and just got it running this afternoon. It’ll take some time to get everything configured just right, but that’s part of the fun.

bash scripting

Due to a somewhat strange set of events, as I write this I’m making my living on a short-term contract developing automation scripts in bash (a command-line shell on UNIX/Linux systems). It’s drawing on my older sysadmin skills and has been really fun, made even better by the fact that I’m doing most of it from home via VPN. Not too shabby.

I have other projects I want to get into so I may write a follow-up with how those go. Now, back to Zabbix…


A Retina Desktop is Possible

I now have a working retina display on my Late 2012 Mac mini at work. I previously wrote about it late last year and occasionally experimented with normal HD LCDs, but really wasn’t going to be able to do anything without an UltraHD display to test with. Recently, I asked the desktop team and they happened to have one that wasn’t in use. I was able to borrow it and worked more seriously on seeing whether this was indeed possible.

The quick-and-dirty how-to can be found at the mac-pixel-clock-patch page on Google Code. You have to patch a single file to enable the higher 3840 x 2160 resolution; that, plus an UltraHD display, and you’re in business. Having a 3840 x 2160 (UHD) display render a 1920 x 1080 (HD) screen makes for a nice experience, indeed. Look at the picture on the right or the screenshots in my previous article.

For work, I got a pair of DELL UP2414Q 24″ LCDs. They’re nicely made and look quite good. I run one in landscape and the other in portrait so I can display content as appropriate (e.g. spreadsheets vs. web pages). If I were using only a single display, the story would be over. The problem is that the Intel HD 4000 video hardware in the Mac mini isn’t up to the challenge of driving two displays at that resolution. It just can’t push that many pixels out that fast (just shy of a half billion pixels per second). I would get close, but it would result in the video flickering, with pixel ‘junk’ over large portions of the screen. I could get one display looking great over Mini DisplayPort or HDMI (3840 x 2160 @ 30 FPS), but the moment I connected the second display, problems appeared. I tried customizing lower-FPS modes to reduce the total pixel clock demands, but no luck.
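That half-billion figure is easy to sanity-check with a bit of shell arithmetic (visible pixels only; the blanking intervals push the actual pixel clock even higher):

```shell
# Pixels per second for one UHD display at 30 FPS, and for two of them
per_display=$((3840 * 2160 * 30))
both_displays=$((per_display * 2))
echo "One display:  $per_display pixels/s"    # 248832000
echo "Two displays: $both_displays pixels/s"  # 497664000
```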

The DELL UP2414Q I use at work

My workaround is driving the portrait display at 1920 x 1080 (1080 x 1920, actually) over a USB-to-HDMI adapter (via DisplayLink). It’s only HD with a variable refresh rate, but it does allow me to have both displays active.

Rumors are that Apple will be revising the Mac mini next month which should improve the video hardware enough to work. We’ll see. For now, though, I’m satisfied and enjoying the experience.