Measuring Performance with the Tech Transfer Health Index

The “tech transfer health index” is a simple but powerful technique to quantify the impact and productivity of the entire long tail curve of technologies in a university’s IP portfolio. Here’s why we should adopt it. When I worked in a university technology transfer office, we spent a lot of time pulling together performance metrics. We had 14 different reports, each with its own subtle nuances and unique methodologies. Needless to say, despite our best efforts, our metrics didn’t reconcile well over time and unintentionally gave the impression that our tech transfer office was somewhat, uh, creative in our accounting. The problem, however, wasn’t just accuracy.

Our metrics missed the mark because they didn’t reflect the whole story: we mostly counted technology activity at the head of the long tail distribution curve – the high-earning technologies, new startups, and issued patents. However, most staff time was spent managing “tail” technologies – filing provisional patents, marketing technologies, keeping on top of licensees who weren’t paying their bills, putting on events, and processing all types of agreement-related paperwork. Another limitation of our approach was that we counted all commercial licenses the same way, regardless of their associated impact or revenue (of course revenue is not a perfect proxy for impact, but lumping together anything with a signature on it created a meaningless and distorted depiction of our performance). Finally, we tallied metrics in our own idiosyncratic way that was hard to explain to outsiders, so even our AUTM metrics could not easily be compared to those from a different tech transfer office.

Enter the tech transfer health index. I got the idea to create a tech transfer health index in a conversation with a faculty friend. I was describing the university commercialization RFI responses I’ve been reading. A common theme among responding universities is their quest for performance measures that would 1) focus on more than just revenue from “big hits,” 2) better convey the activity of their entire set of active licenses, from the high earners all the way down the tail, and 3) indicate the large amount of invisible and unheralded staff time and labor that’s an essential part of marketing and managing an IP portfolio. In addition, though not mentioned by university respondents, based on my experience, effective metrics should be hard for tech transfer offices to interpret in unique ways, or unintentionally “game”; watertight metrics would increase stakeholder confidence in the TTO’s transparency.

It turns out that faculty have already found a solution. Most universities now use a performance evaluation technique called the H-index to measure the impact and productivity of their faculty’s scholarly work. The H-index is most commonly used to count the number of times a particular researcher’s papers have been cited by their peers: a researcher has an H-index of h if h of their papers have each been cited at least h times. Before the H-index, tenure committees simply tallied up the total number of citations but did not consider their value and distribution. The H-index was created in response to flaws inherent in the traditional citation-counting method. Tenure committees discovered that (like a home run “greatest hit” technology) a researcher could claim a large number of citations but not reveal that they all came from a single paper, a “one hit wonder.” Also (kind of like counting large numbers of provisional patents or low-value license paperwork), a scholar with a lot of citations could be basing her count on several papers that were cited only once or twice, a sign that while she wrote a lot of papers, none of them had a significant impact on other researchers.
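To make that definition concrete, here is a minimal sketch of the standard H-index calculation in Python; the citation counts are made up purely for illustration:

```python
def h_index(citations):
    """Return the largest h such that at least h papers
    have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank   # this paper still clears the diagonal
        else:
            break      # the rest have too few citations to count
    return h

# Hypothetical record: ten papers, one "one hit wonder" with 90 citations
print(h_index([90, 3, 2, 2, 1, 1, 1, 0, 0, 0]))  # -> 2
```

Despite 100 total citations, the H-index here is only 2, which is exactly the “one hit wonder” distortion the index was designed to expose.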

Three Different Health Indexes

The H-index can be applied to assess the health of a university IP portfolio. Calculating the tech transfer health index is easy. I’ll bet you already have data on how much revenue each patent has earned over its lifetime. Use that data for your first health index analysis to evaluate how diverse and well balanced your licensing efforts are (a short code sketch follows the steps below).

  1. Dig up the spreadsheet that lists the revenue earned by each patent (patents are a cleaner data point than technologies since they’re a finite IP unit).
  2. Rank the patents by the revenue they’ve generated over their lifetime from largest to smallest.
  3. Make a chart with the horizontal axis for patents and the vertical axis for revenue. Plot the patents by their revenue in units of $1,000. You should quickly see a long tail curve emerge.
  4. When you’re done plotting, extend a diagonal line out from the origin (where the x and y axes meet) through the points (1, $1,000), (2, $2,000) … (10, $10,000), etc. — kind of like the straight grey line in the picture above.
  5. Where your diagonal line intersects the nearest part of the curve, draw a line down to the x axis: the distance from (0,0) to where the vertical line hits the x axis is your tech transfer health index.
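If you would rather not eyeball the chart, here is a minimal sketch of the same calculation in Python; the function follows steps 2 through 5 above, and the portfolio numbers are hypothetical:

```python
def tech_transfer_health_index(revenues, unit=1_000):
    """Largest h such that at least h patents each earned
    at least h * unit dollars over their lifetimes."""
    ranked = sorted(revenues, reverse=True)       # step 2: biggest earners first
    h = 0
    for rank, revenue in enumerate(ranked, start=1):
        if revenue >= rank * unit:                # step 4: still above the diagonal
            h = rank
        else:
            break                                 # step 5: the curve dipped below the line
    return h

# Hypothetical portfolio: lifetime revenue per patent, in dollars
portfolio = [250_000, 40_000, 3_500, 2_200, 900, 400, 150, 0]
print(tech_transfer_health_index(portfolio))  # -> 3
```

In this made-up portfolio, three patents each cleared the $3,000 line, so the health index is three — the same situation as the diagram discussed next.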

For example, in the diagram above, this tech transfer office’s health index is three. So this office has three patents that each earned at least $3,000 over their lifetimes. Of course when you chart your own health index with real data, your numbers will likely be much larger.

So how are you doing?

If you chart your portfolio and discover a long tail curve that’s very steep, your office is relying on a few patents that earn most of your revenue; in other words, a low health index. You may also have a low health index if your long tail curve starts low and stays flat. A low, flat curve indicates that your tech transfer unit is licensing a large number of patents but not getting much revenue back from them. That isn’t necessarily bad (after all, getting technologies out the door and into use should be the ultimate goal), but it does suggest you may be spending a lot of time and money on paperwork. The upside of quantifying a low health index of this type is that you can prove your unit is managing a large volume of essential but unappreciated long tail-related paperwork.

You have a high health index if — like a productive and impactful researcher — your long tail curve starts high and gently curves downward. This means your office has found the right balance between impact (high earning home runs) and productivity (large numbers of low-income licenses). Congratulations!

Here’s the value of using the health index:

Rewards real tech transfer activity, not just fees: Conventional ways to increase revenue, such as charging high fees or striving for a home run license, will not improve your health index. Instead, the health index improves only with consistent, long-term licensing activity over a broad spectrum of technologies.

Promotes true economic development: Your tech transfer office will have better ammunition with which to convince university administrators that there’s value in getting and maintaining a large number of low-revenue licenses from “tail” technologies. You can now quantify more than just high-revenue licenses.

Makes it possible to compare large and small universities: Raw tallies discriminate against small universities. The tech transfer health index makes it possible to directly compare universities that have very differently sized IP portfolios.

Get credit for a well-rounded licensing portfolio: Your health index will confirm that your office is doing justice to the entire long tail curve of available technologies. You can point out that the large volume of low-earning, low-visibility patents and licenses may not earn a lot of money, but that your office is effective in meeting the essential purpose of the Bayh-Dole Act: getting technologies out the door and into use.

Versatility: The health index is versatile. Instead of patents on the horizontal axis, one could plot other finite IP assets such as technology disclosures or startups. On the vertical axis, instead of dollars, one could use other values such as the number of web hits for technology disclosures or, for university startups, capital raised (see the short example below).
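As a quick illustration of that versatility, the same sketch from earlier can be reused with different inputs; the startup figures and the $100,000 unit below are assumptions chosen only for illustration:

```python
# Hypothetical: capital raised per university startup, in dollars,
# scored in units of $100,000 instead of $1,000
startup_capital = [5_000_000, 750_000, 300_000, 120_000, 50_000, 0]
print(tech_transfer_health_index(startup_capital, unit=100_000))  # -> 3
```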

Widely applicable: The health index can scale up or down. It can be used to assess the performance of a single licensing officer, a group of universities, an entire geographical region (innovation cluster), or an industry segment such as biotech or nano-scale manufacturing.

Easy to use in public: If the names of the patents, technologies, or whatever else you’re analyzing are removed, it’s possible to publicly and safely share your unit’s health index results.

Assess internal operations: You could use the health index as an internal management tool to figure out how efficiently you’re managing various aspects of your operations.

In the unlikely event that someone were to interpret their metrics in a non-standard way, the health index would be harder to manipulate than standard, straightforward tallies of new licenses, new startups, etc. Realistically, though, no metric system is game-proof. For example, some researchers attempted to game the H-index by creating Citation Clubs, in which they set up fake “journals” with their friends and aggressively cited one another’s low-quality papers.

Consider how hard it would be to set up something like a Citation Club in a university tech transfer office. A tech transfer director, desperate to create a good impression on his higher-ups, could in theory create a “Startup Club.” He could incorporate several “fake” startups (kind of like sham journals) that are wholly owned by the university. Next, this director could “negotiate” several licensing deals with himself (kind of like having his friends cite his articles) and put himself on the startups’ boards of directors (hooray, another award on the CV!). He could assign a tech transfer office employee to be CEO of a startup (despite zero revenue and no product). Voila, in one fell swoop, this hypothetical tech transfer office could enjoy an increase in revenue, more licensed technologies, plus a few additional new startups. But realistically, no one would do this. Even in the unlikely event that someone created a “Startup Club” to improve their performance metrics, the Club would be promptly dismantled by the powers-that-be.

If you have estimated your health index, I’d love to hear how it went. Is anybody willing to share their actual data with me?

TOOL: Thanks so much to the person who created the Excel tool that calculates the health index. Some people had problems with the zip file, so I saved the tool in an older version of Excel; it now downloads as a proper file, not a zip. You can download the tool HERE. It creates the chart for you and calculates your health index.


Melba Kurman writes and speaks about innovative tech transfer from university research labs to the commercial marketplace. Melba is the president of Triple Helix Innovation, a consulting firm dedicated to improving innovation partnerships between companies and universities.


Comments

  1. asrao on February 7, 2011 at 2:39 am

    One comment on the long tail. I discussed with a few US TTOs the licensing income from emerging markets like India that is foregone due to the existing practice of licensing to larger firms without a strategy for income that arrives with a time lag. The long tail would be healthier when income from the second or third license of Patent 1 adds to the marginal income of Patent N. What is your view?

  2. jeff on February 10, 2011 at 9:19 pm

    Speaking as a 19-year economic development practitioner, I understand that this metrics measurement methodology may help with internal analysis of output and activity by university TTOs and researchers. That is important to developing a common measuring stick within the university communities. Unfortunately, those outside the university, the community folks and political leadership, are looking for new company formation and job creation metrics in their respective communities. I do not see how the metrics process described is going to be meaningful outside of the university. The example of a patent licensed to a company in India might ultimately result in the loss of jobs from the community of origin. This is not a far-fetched example.

    I think one day universities in the States will recognize that, in order to maintain community support, financial and otherwise, they will need to demonstrate that they are economic engines. I am not convinced that, at the end of the day, the metrics collection process as described will satisfy this challenge.

    Even in a global environment, economic development has a strong link to place. If these comments are off base, I apologize.

  3. Dana on February 17, 2011 at 12:07 am

    Is the only metric one uses in this method licensing income, which universities have no control over? Income is a measure of the business model of the licensees, not the university. Many universities are broadening their perspective to include licensing to non-profit organizations, licensing to developing nations, etc. It’s unclear to me how a method plotting solely income tells you anything about a particular office’s health. Perhaps I misunderstood.

    In addition, this seems to look only at patents. If you examine the value that Facebook has created, with very few patents, do we say that Facebook is bad? Again, the use & licensing of non-patented technology is growing and valued in today’s economy. How would that be included in the above?
