The Top Universities for Blockchain - Methodology

CoinDesk's 2020 university rankings rate institutions on criteria ranging from scholarly impact to employment outcomes. Here is the full story of how we did it.

Oct 6, 2020 at 8:57 p.m. UTC
Updated Mar 9, 2022 at 9:11 p.m. UTC

CoinDesk’s 2020 Blockchain University Rankings are by no means the first attempt to create such a list. To our knowledge, however, they represent the most comprehensive and nuanced consideration of universities’ impact on the blockchain field to date. As we collected data, sent out surveys and spoke with schools across the country, our primary goal was to produce a ranking using a methodology that was not only rigorous and reflective of the landscape but also externally defensible.

We want to make it clear to readers both what the ranking affirmatively measures and what it fails to capture. No ranking can reflect the full complexity and nuance of a university’s strengths and impact in a particular subfield. And the scope here is necessarily limited – something we will improve upon and expand in future years. For this year, we focused on universities in the U.S. and limited our field to the schools appearing in the top 30 in the most recent rankings published by U.S. News and World Report, Times Higher Education, the Academic Ranking of World Universities or Quacquarelli Symonds.

We believe rankings of this nature are valuable. The top schools are at the top for a reason, and a well-executed ranking provides a snapshot of the current state of the field. We are also committed to integrity and transparency throughout the ranking process (i.e., we are willing to discuss and share our data, anonymized, upon request).

To determine final scores, we looked at four primary categories: (1) an institution’s strength in research and contributions to advancing the field; (2) the existing blockchain offerings on campus; (3) employment and industry outcomes; and (4) overall academic reputation. Each category comprises multiple sub-categories, offering a holistic picture of a university’s presence in the blockchain space.
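By way of illustration, here is a minimal sketch of how four category scores might be rolled up into a composite. The weights and scores below are invented placeholders, not our actual figures; the only constraint we commit to in this piece is that reputation counts for less than the research and campus-offerings categories.

```python
# Illustrative sketch only: hypothetical weights for rolling up the four
# category scores into a composite. The real weights are not disclosed here,
# beyond reputation counting for less than categories 1 and 2.
CATEGORY_WEIGHTS = {
    "scholarly_impact": 0.30,     # category 1
    "campus_offerings": 0.30,     # category 2
    "industry_outcomes": 0.25,    # category 3
    "academic_reputation": 0.15,  # category 4 (weighted least)
}

def composite_score(category_scores: dict) -> float:
    """Weighted sum of per-category scores, each assumed to be on a 0-100 scale."""
    return sum(w * category_scores[c] for c, w in CATEGORY_WEIGHTS.items())

# Made-up numbers for a hypothetical school:
print(round(composite_score({
    "scholarly_impact": 82.0,
    "campus_offerings": 74.0,
    "industry_outcomes": 68.0,
    "academic_reputation": 90.0,
}), 1))  # -> 77.3
```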


Scholarly impact: To determine a school’s scholarly impact score, we relied on the Clarivate Web of Science, a publications and citations database. We took the total number of publications (all subjects) from each school between 2018 and 2020 and narrowed them to include only blockchain- or cryptocurrency-related papers. From this set, we generated citation reports and created subsets in which the first author of the publication was affiliated with the university in question at any point between 2017 and 2020. We also factored in the aggregate H-index of each school’s blockchain publications.
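For readers unfamiliar with the metric, the sketch below shows how an H-index is computed from a list of per-paper citation counts. The citation counts are invented; in our case the underlying data came from Web of Science.

```python
def h_index(citations: list) -> int:
    """Largest h such that at least h papers have h or more citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Invented citation counts for seven hypothetical blockchain papers -> H-index of 4.
print(h_index([25, 18, 9, 4, 3, 1, 0]))
```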

Campus blockchain offerings: To determine a school’s blockchain offerings score, we examined the existing campus infrastructure – whether or not it has a dedicated blockchain research center, any smaller blockchain-focused initiatives, student clubs and so on. We also looked at each school’s course offerings for the 2019-2020 and 2020-2021 academic years (as listed in its official catalog or, when no catalog was available, on its published schedule of classes), as well as any blockchain-specific degrees, minors or concentrations, and we considered the breadth of these class offerings.

Employment and industry outcomes: A student’s primary goal in obtaining a college education is often to secure a job in industry. To discover which schools are placing the most graduates in the blockchain field, we looked at the LinkedIn footprint of over 100 of the largest and most influential blockchain companies (as well as their 12,000+ employees). We also surveyed industry stakeholders to get a sense of how institutions are subjectively viewed on a spectrum ranging from “Elite” to “Well Below Average.” This qualitative data was normalized into a numerical scale.
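As a sketch of what that normalization can look like, the snippet below maps a five-point "Elite" to "Well Below Average" scale onto evenly spaced values and averages a school's responses. The intermediate labels, the 0-100 spacing and the responses themselves are illustrative assumptions rather than our exact scale.

```python
# Illustrative normalization of survey responses. The intermediate labels and
# the even 0-100 spacing are assumptions, not the exact scale we used.
SCALE = {
    "Elite": 100,
    "Above Average": 75,
    "Average": 50,
    "Below Average": 25,
    "Well Below Average": 0,
}

def survey_score(responses: list) -> float:
    """Average the mapped values of one school's survey responses."""
    return sum(SCALE[r] for r in responses) / len(responses)

# Invented responses for a hypothetical school -> 81.25
print(survey_score(["Elite", "Above Average", "Elite", "Average"]))
```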

Academic reputation: In a perfect world, rankings would emphasize merit, but the reality is that a university’s subjective reputation affects everything from job prospects to the availability of opportunities for current students to a school’s ability to attract prominent speakers for campus events. However, in an effort to acknowledge the subjectivity inherent in a “reputation” score, we weighted this category less than the quantitative, blockchain-specific research contributions and campus offerings (i.e. categories 1 and 2). To determine an institution’s reputation score, we looked at two items: (a) existing, overall reputation as calculated by USNWR, THE, ARWU, and QS; and (b) reputation as determined by our own qualitative surveys, which asked both practicing academics and current students to evaluate schools on a spectrum from “Elite” to “Well Below Average.” This qualitative data was normalized into a numerical scale.
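Purely as an illustration of how the two reputation inputs could be blended, the sketch below inverts a school's average position across the four external rankings into a 0-100 score and averages it with a survey score like the one computed above. The pool size, the inversion formula and the 50/50 blend are assumptions for the example, not our actual parameters.

```python
# Illustrative blend of external rankings and survey reputation. The pool size,
# the rank-inversion formula and the 50/50 split are assumptions for the example.
POOL_SIZE = 30  # the field was limited to schools ranked in a top 30

def external_reputation(ranks: dict) -> float:
    """Average a school's positions across the four publishers, inverted so rank 1 -> 100."""
    mean_rank = sum(ranks.values()) / len(ranks)
    return 100 * (POOL_SIZE - mean_rank) / (POOL_SIZE - 1)

def reputation_score(ranks: dict, survey: float) -> float:
    return 0.5 * external_reputation(ranks) + 0.5 * survey

# Invented ranks and survey score for a hypothetical school:
print(round(reputation_score({"USNWR": 5, "THE": 7, "ARWU": 12, "QS": 4}, survey=81.25), 1))
```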

There are two common threads in our methodology. First, in keeping with our goal of building the most rigorous rankings possible, we used external, quantitative data whenever such data was available, and requested individual confirmation from every institution to ensure accuracy. When we required qualitative data, we sent out surveys through all available channels (including via partnerships with external organizations, such as the Blockchain Education Network and MouseBelt University) to cast as wide a net as we could.

Second, we made every attempt to examine each data point from as many angles as possible. For instance, consider a school’s research impact. The data may show that a given university has produced 25 blockchain-related publications between the years of 2018 and 2020. But what does that actually mean? Is a school with 25 publications “worse” than a school with 30 publications? As with most subjective and complex questions, the answer is a resounding “Maybe.”

If a school has an enormous faculty and a titanic endowment, 30 publications might be a very small percentage of the institution’s overall output. It might be much more impressive (or, at least, possibly indicative of the level of blockchain focus) for a smaller school to have 25 publications (comprising a larger percentage of its overall output) than for a larger school to have 30.

On the other hand, 30 papers is still an objectively larger contribution to the body of blockchain scholarship than 25 papers. A multifaceted approach helps account for this type of incongruity by considering each data point from both a net impact standpoint and a relative standpoint, because both calculations are important.

This strategy may also be extrapolated beyond research. For instance, when trying to decide which school gives a student the best chance of getting a job in the blockchain industry, it’s important to consider both that large schools may place a larger number of graduates in blockchain (and, therefore, offer a larger alumni network, greater name recognition, etc.) and that smaller schools may place a higher percentage of their total student body (though significantly fewer graduates overall). The two metrics – raw placement numbers and placement as a percentage of the total student body – are different but equally worthy of consideration.
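One simple way to give both views weight is to normalize each metric against the cohort and average them, as in the sketch below. All figures are invented for illustration, and the equal weighting of the two views is an assumption, not our actual formula.

```python
# Illustrative blend of raw placements and placements per student. All figures
# are invented; the equal weighting of the two views is an assumption.
schools = {
    "Big State U": {"placements": 150, "students": 45_000},
    "Small Tech":  {"placements": 40,  "students": 5_000},
}

max_raw = max(s["placements"] for s in schools.values())
max_rate = max(s["placements"] / s["students"] for s in schools.values())

for name, s in schools.items():
    raw_score = s["placements"] / max_raw                       # net-impact view
    rate_score = (s["placements"] / s["students"]) / max_rate   # relative view
    print(name, round(0.5 * raw_score + 0.5 * rate_score, 3))
```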

As a final note, we’d like to address the project of creating university rankings in a general sense. In important ways, ordinal rankings are like statistics: incredibly useful for showing very specific data but also both narrow and inherently malleable. Even small changes to the methodology can have outsized effects on the final result.

To state that rankings are vulnerable to manipulation is not intended to marginalize our data or the larger project at hand; rather, we hope to draw a critical distinction between the data itself and the ranking methodology, which is little more than a (subjective) vehicle for interpretation. Our goal is to present these rankings as exactly what they are: valuable and rigorous but not indisputable. We believe we captured the current state of blockchain in higher education. But perfection is impossible. We acknowledge the limitations of any (and every) ranking.

Lastly, we would like to express our gratitude towards those who made this project possible, particularly the Blockchain Education Network and MouseBelt University. We also offer our profoundest thanks to Michael Bastedo (director of the Center for the Study of Higher and Postsecondary Education at the University of Michigan) and Nick Bowman (a professor with the College of Education at the University of Iowa), both of whose expertise in the field of university rankings was invaluable, and whose guidance was instrumental in prioritizing fairness, usefulness and objectivity in this project.

We are very willing to discuss our methodology, answer questions and address concerns. Interested readers are encouraged to contact us.


Disclosure


CoinDesk is an award-winning media outlet that covers the cryptocurrency industry. Its journalists abide by a strict set of editorial policies. In November 2023, CoinDesk was acquired by the Bullish group, owner of Bullish, a regulated, digital assets exchange. The Bullish group is majority-owned by Block.one; both companies have interests in a variety of blockchain and digital asset businesses and significant holdings of digital assets, including bitcoin. CoinDesk operates as an independent subsidiary with an editorial committee to protect journalistic independence. CoinDesk employees, including journalists, may receive options in the Bullish group as part of their compensation.

Reuben Youngblom, lecturer/researcher, Stanford University/MIT

Reuben Youngblom is a legal scholar (JD), computer scientist and a fellow at Stanford Law School’s CodeX Center for Legal Informatics, where he leads the blockchain education initiative. He is also a coordinator for the Stanford RegTrax Blockchain Regulatory Tracking Initiative and a co-host of the “Our Data” podcast.