Weekly Thoughts: Unicorn Hunting, Data Context and Reasoning Up
Here are three things that caught our eye this week:
We have been interested to follow the emerging debate about a possible bubble in late-stage private technology firms. As one company after another announces the massive implied valuation it received in recent rounds of funding, it is tempting to agree with Mark Cuban when he argues that the current technology investing environment is more extreme than the tech bubble of 2000. Indeed, headlined by celebrities like Ashton Kutcher and Snoop Dogg, venture capitalists, both aspiring and established, are tripping over each other to commit capital to the next “unicorn,” a company that will fetch a billion-dollar valuation.
More alarming, it’s not even clear that some of these companies need the money. Last week, the New York Times ran an interview with Stewart Butterfield, the CEO of Slack, a corporate messaging platform that recently raised $160 million at a valuation of $2.8 billion. Slack has existed as a company for only a year, and while its growth has been tremendous, Butterfield himself noted that the company does not have an immediate use for the capital it just raised. Instead, the company is raising money because, well, it’s just too good a deal to pass up. From the article:
“It’s pretty straightforward. I’ve been in this industry for 20 years. This is the best time to raise money ever. It might be the best time for any kind of business in any industry to raise money for all of history, like since the time of the ancient Egyptians. It’s certainly the best time for late-stage start-ups to raise money from venture capitalists since this dynamic has been around. And as a board member and a C.E.O., I have a responsibility to our employees, to our customers. And as a fiduciary, I think it would be almost imprudent for me not to accept $160 million bucks for 5-ish percent of the company when it’s offered on favorable terms.”
We will postpone the debate over whether Moses or Butterfield could secure more attractive terms on Series A financing, and simply note that elements of this story do contain worrying amounts of excess. That said, there are other compelling arguments that some companies deserve the valuations they are receiving. In fact, Ben Thompson of the Stratechery blog noted in a recent post that vastly larger addressable consumer markets, changing business models, and delayed IPOs are all potentially viable explanations for the large sums flowing into today’s brightest technology start-ups. Consider the example of a consumer-facing, web-based company looking to monetize via advertising. From the article:
“Advertising works at scale, which in today’s world means hundreds of millions of users; getting all of those users requires years of operating without revenue, which means a lot of capital. All of this is magnified for companies that operate in markets that include network effects: network effects translate into winner-take-all opportunities, which significantly increases the growth imperative, requiring, again, significant amounts of capital.”
We completely understand the value proposition of a company that gains a dominant position in a huge addressable market, so we have little doubt that a few of the currently newsworthy technology names will more than justify their valuations. However, we do question whether ALL of these companies can perform to expectations simultaneously. Much like the Paradox of Thrift, where some saving is good but large-scale collective saving isn’t, we have trouble grasping the idea that every company can be worth many millions or billions of dollars when their business models are winner-take-all by their very nature. For this reason, we are happy to chase more mundane opportunities and leave the unicorn hunting to Messrs. Kutcher and Dogg.
The NHL playoffs started last week and, to mark the occasion, we thought we’d revisit our note from last fall which highlighted the analytics movement that has taken the hockey world by storm. Stats like Corsi, Fenwick, and PDO have not only changed the way some GMs think about team building, but they are even being used by the media as predictive tools. Just last week, ESPN ran an article predicting, based on its own statistical model, that the Blackhawks would win the Stanley Cup.
We had this trend in mind when we came across a Wharton Magazine blog article about the questionable value of some currently popular hockey statistics. The goal of any good data analyst is to distinguish signal from noise. Applied to hockey, the theoretical aim is to separate teams that are actually good from those that achieve fleeting success due to luck. The problem, according to Ian Cooper, author of the blog article, is that a myopic focus on the data can lead many analysts astray if they don’t put the data in context.
Take, for example, the 1983-84 Edmonton Oilers. Though regarded by many in traditional hockey circles as the best team ever, from a pure analytics perspective the Oilers were just an average team that got lucky. The reason for this conclusion is that the Oilers got outshot over the course of the season (shot differential is critical in the analytics community as a predictor of winning), and, what’s worse, they were also running at what would now be considered an unsustainably high PDO. From another article by Cooper:
“For those who aren’t familiar with PDO, it’s simply the sum of a team’s overall shooting percentage (Sh%) and save percentage (Sv%). So, for example, a team that has a Sh% of 10% and a Sv% of 90% will have a PDO of 100. Advanced statisticians look at PDO because it helps us figure out whether a team’s winning because it’s good or lucky. The theory is a team with a ‘high’ PDO will win games in the short term, but only because a high Sh% is making up for a lack of shots, or a high Sv% is making up for too many shots against. Or it could be both. Over time, when the Sh% and Sv% settle down to ‘normal’ levels, shot differential takes over and, as noted above, teams that get outshot usually lose. As a result, many people look at a high PDO with suspicion, claiming it’s not sustainable over an entire season.”
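The PDO arithmetic Cooper describes is simple enough to sketch in a few lines of code. The function name and the sample shot totals below are our own illustrations, not figures from the article:

```python
def pdo(goals_for, shots_for, goals_against, shots_against):
    """PDO = team shooting percentage + team save percentage, on a base of 100."""
    sh_pct = 100.0 * goals_for / shots_for                 # shooting %
    sv_pct = 100.0 * (1 - goals_against / shots_against)   # save %
    return sh_pct + sv_pct

# A league-average team: 10% shooting plus 90% saves gives a PDO of 100.
print(pdo(goals_for=250, shots_for=2500,
          goals_against=250, shots_against=2500))  # 100.0
```

A team running well above 100 is, on this theory, benefiting from an unusually hot shooting or save percentage that should eventually revert, unless, as Cooper argues below, the roster itself explains the outlier.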
The problem, according to Cooper, is that this analytical approach doesn’t take into account the players on the ice. Again from the article:
“Data in the absence of context is meaningless. Now let’s add some context. The Oilers were able to run an unusually high PDO because they were stacked with players like Wayne Gretzky, Mark Messier and Jari Kurri…For example, Gretzky scored 87 goals in 74 games…largely because he rode a career high Sh% of 26.9%…[and] Kurri, meanwhile, had a Sh% of 26.8%. In no universe were these guys going to post a 10% Sh%. As a result, the Oilers could rely on a high PDO to win games. Sure they got outshot, but so what? If you’re that good at scoring goals, you don’t need as many shots to win.”
The important point according to Cooper is this: “what’s increasingly clear in hockey, as in all areas that are being transformed by big data, is that data don’t apply themselves. Concepts like repeatability, sustainability and sample size are useful, but only if you understand how and when to apply them. To do that, it takes a good qualitative understanding of whatever subject you’re studying (in my case hockey), constant attention to whether the data are relevant to the hypothesis you’re testing, and the creativity and analytical chops to interpret whatever your scatter plot, histogram or regression model might be telling you.”
We believe this is great advice. While we remain advocates of a data-driven approach, we understand that the output is only as good as the people interpreting it and that context matters a great deal. In light of these observations, it is clear the ESPN model should be treated with extreme skepticism. Go Rangers!
This week we listened to an interesting TED interview with Elon Musk posted on the Farnam Street Blog. While Musk spent time discussing his various well-known projects involving cars, rockets, and solar panels, we were most interested in his description of the framework he uses for thinking through the challenges each project presents. From the blog article:
“I do think there is a good framework for thinking. It is physics – you know the sort of first principles reasoning. … What I mean by that is boil things down to their fundamental truths and reason up from there as opposed to reasoning by analogy.
Most of our life we get through it by reasoning through analogy, which essentially means copying what other people do with slight variations. And you have to do that, otherwise mentally you wouldn’t be able to get through the day. But when you want to do something new you have to apply the physics approach. Physics has really figured out how to discover new things that are counter-intuitive, like quantum mechanics … so I think that’s an important thing to do. And then also really pay attention to negative feedback and solicit it, particularly from friends. This may sound like simple advice but hardly anyone does that and it’s incredibly helpful.”
Musk himself demonstrates the utility of this approach as he talks through the logic of each incremental step in the development of his various companies. With SpaceX, for example, space launches have historically relied on single-use rockets. From a first-principles perspective, however, the fuselage accounts for 99%+ of the launch cost. This has led Musk et al. to focus on developing a fully reusable space vehicle, an innovation that Musk suggests could “lower the cost of access to space by as much as a factor of a hundred.” The goal is certainly ambitious, but by breaking from convention and attacking the problem based on its root cost profile, Musk has identified where the focus must lie if truly revolutionary innovation is to take place.
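The first-principles arithmetic behind that claim can be sketched with back-of-the-envelope numbers. The cost split and flight count below are hypothetical illustrations of the logic, not SpaceX figures:

```python
def cost_per_launch(hardware_cost, consumables_cost, flights_per_vehicle):
    """Amortize the vehicle's hardware cost over the number of flights it makes."""
    return hardware_cost / flights_per_vehicle + consumables_cost

# Suppose hardware is 99 units of a 100-unit launch cost (99%+), and
# consumables (fuel, etc.) are the remaining 1 unit.
expendable = cost_per_launch(99.0, 1.0, flights_per_vehicle=1)     # 100.0
reusable = cost_per_launch(99.0, 1.0, flights_per_vehicle=1000)    # ~1.1
print(expendable / reusable)  # roughly a factor of 90-100
```

The point of the sketch is that when hardware dominates the cost, amortizing it over many flights is the only lever large enough to move the total by orders of magnitude; improving fuel efficiency would barely register.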
We are inspired by Musk and we hope to emulate his “reasoning up” framework in our own endeavors by establishing first principles of what we want to achieve and working from there.
Your Chenmark Capital Team