Timeglider Acquisition

[Image: preceden-timeglider.png]

I’ve got some big news to share today: Preceden has acquired Timeglider, one of the other big players in the online timeline maker space.

Here’s the announcement: Preceden Acquires Timeglider.

This was my first meaningful acquisition: back in December 2017 I did buy and redirect the domain for Timerime, another timeline maker tool, but the site had shut down several months prior so it was merely a matter of buying the domain. Timeglider, on the other hand, is an active business with paying customers, recurring revenue, IP, etc – which made this quite a bit more complicated and required lawyers and paperwork and whatnot. I learned a lot, but am also very happy it’s behind me.

Now comes executing on it and recouping the cost in a hopefully reasonable amount of time. I’ll probably write more about the data and analytics piece of this in the future, but for now, I just wanted to share the news!

A Few Thoughts on Image Upload Usage at Preceden

One of Preceden’s most popular feature requests over the years has been the ability to upload images to Preceden and have those images appear on timelines.

A lot of competitors offer that functionality, but I procrastinated for almost 9 years for two reasons:

  1. It’s complex to implement, both in terms of actually handling the uploads and having them appear on the timelines.
  2. Most of the people who requested it were using Preceden for school timelines, and that segment of users tends not to upgrade at a high rate. People using Preceden for work-related project planning timelines didn’t request it much. As a result, it was never much of a priority because it likely wouldn’t move the needle on the business.

That said, since I’ve had more time to work on Preceden recently, I decided to finally do it. For handling uploads, I wound up using Filestack.com, which simplified the implementation a lot. Updating Preceden’s rendering logic took time too, but in the end it all worked out.

I recently checked on the usage stats and – not surprisingly – it’s used most heavily by people using Preceden for education:

[Image: Screen Shot 2018-11-15 at 2.27.44 PM.png]

For users that have signed up since this launched:

  • Teaching: 29% uploaded an image
  • School: 26%
  • Personal Use: 16%
  • Work: 12%

In other words, it’s used very heavily (which is great!), but not by the segment of users with the highest propensity to pay.

This dilemma comes up fairly often: do you build Feature A that will be used heavily by mostly-free users, or Feature B that will be used heavily by mostly-paying customers?

For better or worse, I never wound up focusing on one market or use case with Preceden: it’s a general purpose timeline maker that can be used for any type of timeline. As a result, though, I often get into these situations. If I were just building Preceden for project planners, I’d never implement image uploads. If I were just building it for students creating timelines for school, I’d probably have implemented it years ago.

It also comes down to goals: if my main goal is growing revenue, I probably shouldn’t work on features like this. But if I want Preceden to be the best general purpose timeline maker, then it does make sense, though there’s an opportunity cost because I’m not building features for the folks who will actually pay.

For product development I operate in the middle: I work mostly on features that will make money, but also spend some percentage of my time on features like this that will make it a better general purpose tool.

If I were to start something from scratch today, I’d probably pick a narrow niche and try to nail it. No general-purpose tools. I’d recommend that to others too.

Going broad is fun in a way too though; it just has its challenges :).

 

A New Adventure: I’m Taking the Leap to Focus on Preceden and Analytics Consulting

As many of you know, I’ve had a long-running side project called Preceden, a tool that helps people create professional-looking timelines:

[Image: preceden-2018.png]

I launched Preceden back in early 2010 while I was still a lieutenant in the Air Force and kept it running as a nights-and-weekends side project throughout my time at Automattic and Help Scout.

Over the years its revenue has slowly grown to the point where it’s a healthy little business these days. As its revenue has grown, though, the amount of time I’ve had available to put into it has dwindled to an hour or two each week. Between my work at Help Scout and my family (three kids under four now!), I’ve basically only had time to do support and not much else, despite there being so much more I want to do.

There was never going to be a perfect time to take the leap to focus on growing Preceden, but with its revenue and growth being what it is, my wife and I have decided now’s the time to do it.

For the foreseeable future I’ll be focused on growing Preceden, but also doing some analytics consulting on the side. Going all-in on Preceden was an option, but I really enjoy analytics and business intelligence work and want to continue leveling up there. I’m thrilled to have both Help Scout and Automattic as my first consulting clients.

With my hours now reduced at Help Scout, we’re looking to hire a new analytics lead. Help Scout is an incredible company and you’ll get to work with an amazing group of people who care deeply about building a business and a product that people love. As the lead analyst, your work will have a huge impact on the direction of the business. If you’re interested in this role, check out the job description here: Data Analyst at Help Scout and feel free to shoot me an email with any questions.

I have no idea how this will all play out long term, but I’m really excited to see how it goes.

Exploring your Heroku Rails app’s database using SQLPro for Postgres

In the past when I’ve wanted to explore production data for a Heroku-hosted Ruby on Rails app, I’ve primarily used heroku console and rake tasks. Each method has limitations though: heroku console makes it easy to answer simple questions about your data, but difficult to perform complicated analyses that take more than a few lines of code. Rake tasks let you perform complex analyses, but make it difficult to explore data because each time you tweak your task to do something new, you need to commit, push to production, run the task, and wait for it to execute. Neither option makes it easy to quickly explore the data.

Wouldn’t it be nice if you could quickly query your database and explore the results?

Fortunately there is a way using a combination of Heroku’s pg:pull feature and a Mac app called SQLPro for Postgres. Here’s how it works:

Step 1: Pull your production data into a local Postgres database

Heroku makes this fairly easy using the pg:pull command:

$ heroku pg:pull HEROKU_POSTGRESQL_MAGENTA mylocaldb --app sushi

Where mylocaldb is the name of a local Postgres database, sushi is the name of your Heroku app, and HEROKU_POSTGRESQL_MAGENTA is the name of your Heroku database, which you can obtain by running:

$ heroku pg:info -a sushi

If your local Postgres instance requires a user name and password, you can provide them via the command line as well:

$ PGUSER=postgres PGPASSWORD=password heroku pg:pull HEROKU_POSTGRESQL_MAGENTA mylocaldb --app sushi

In order for pg:pull to work, mylocaldb can’t already exist when you run it. To delete it beforehand, you can run:

$ dropdb mylocaldb

For my own workflow, I combine these commands into a Bash alias to make them easier to run:

alias prdb="dropdb preceden_production_copy; PGUSER=postgres PGPASSWORD=password heroku pg:pull HEROKU_POSTGRESQL_MAGENTA preceden_production_copy --app sushi"

Then I can just run prdb (my shorthand for “Preceden Database”) from the command line to drop the old copy and grab the latest production data:

$ prdb
heroku-cli: Pulling postgresql-infinite-32999 ---> preceden_production_copy
pg_dump: last built-in OID is 16383
pg_dump: reading extensions
...
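Before moving on to a GUI, a quick psql one-liner is a handy sanity check that the pull worked. This is just a sketch that assumes your app has a users table; swap in any table from your own schema:

$ psql preceden_production_copy -c "SELECT COUNT(*) FROM users;"

If that prints a row count, the local copy is ready to explore.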

Step 2: Explore the data using SQLPro for Postgres

SQLPro for Postgres is a fantastic Mac app for exploring Postgres databases. You can query the data in other ways too, but for quickly exploring, querying, and exporting the data, SQLPro for Postgres is hard to beat.

Here’s what the UI looks like along with an example query to display the first 10 people to sign up:

[Image: sqlpro-for-postgres.jpg]
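The query itself was something along these lines. It’s a sketch that assumes a users table with email and created_at columns; adjust the table and column names to match your own schema:

SELECT email, created_at
FROM users
ORDER BY created_at ASC
LIMIT 10;

From there you can tweak the query, re-run it instantly, and export the results, which is exactly the kind of quick iteration that’s painful with rake tasks.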

In future posts we’ll see how to query Postgres with R to analyze the data and gain insights about how people use our products.

If you’re interested in learning more, sign up for my new Data Science for Product Analytics newsletter to get notified when there are new posts.

Update: check out the follow-up post, How to Schedule Cloning your Heroku Postgres Database Locally.