According to Jason Owen-Smith, Executive Director of the Institute for Research on Innovation & Science (IRIS), the real reason we need research universities is about our future and the public good. Listen to his take on why research universities are a form of social insurance and valuable to our entire nation, beyond Velcro and Tang.
[00:05] ERIN KING
More people than ever are questioning the value of higher education. We are here to explore why they're right, why they're wrong and which institutions and organizations are rising to the challenge. I'm here with our Analytics Consultant, Dr. Jacob Bonne, and Jason Owen-Smith, who is the Executive Director for the Institute for Research on Innovation and Science, also known as IRIS.
So happy to have you here today, Jason.
[00:29] JASON OWEN-SMITH
Thanks very much for inviting me. It's going to be fun.
Let's start with this. Why do we need research universities?
It's a great question. It's important for an audience that doesn't spend a ton of time thinking about all of the various types of higher education institutions that are out there to understand what a research university is before you explain why they're important.
Research universities are institutions like the one where I'm a professor, the University of Michigan. The United States invests approximately $30 billion a year in academic research. More than 90 percent of that is spent on research by only about 3 percent of higher education institutions. Those are the big research universities. They're names that most of you would recognize (University of Michigan, Harvard, Stanford, Indiana University, University of Wisconsin), and they're distinguished from institutions that are much more solely focused on teaching by having a very, very big and diverse, if you will, knowledge generation infrastructure.
So why do we need those things? Why do we need these big, expensive institutions that put together education at lots of different levels with research that spans essentially the entire scope of human knowledge?
The answer, I think, is about our future.
One of the things that I think research universities do uniquely among institutions that do research and teaching in the United States is create and sustain a knowledge infrastructure that allows us as a society to react to opportunities and problems that we don't know we have yet.
There are ways for us as a society to invest in institutions that essentially serve as a form of social insurance against an uncertain future. And so when I think about research universities, I think about what we need for those institutions to continue to be flexible enough, to have enough internal knowledge and enough connections among the types of knowledge, to allow us to be fairly certain that when something crops up that we can't anticipate now (what one noted philosopher of science, Donald Rumsfeld, once called the unknown unknowns), it's the research universities that I believe are going to be the primary source of response.
[00:37] DR. JACOB BONNE
Really, as we're thinking about all of the impact that research universities are having, how are universities leveraging, or how could they leverage, administrative data to think about social and economic impacts generally?
That's great. One of the things that I spend a lot of time doing, as you know, is working with partners like Steppingblocks and others, and with administrative data that universities share with IRIS, to try to understand, explain and hopefully improve the public value of these kinds of institutions.
There are a number of things, but at base (and I'll talk in a second about what I mean when I say administrative data), what we need these administrative data for is to help us fill out an understanding of what monetary investments in research universities allow us to do.
I think, when we're talking about policy or improvements to institutional arrangements, we tend to get caught up in talking about what we spend and not about what the spending allows us to do.
My university last year spent about $1.6 billion on research. That's a great number. It's a big number. It's actually the largest of any public university in the country. But if we just talk about what we spent, the citizens of the great state of Michigan look at that number and think, wow, that's a lot of money. What do we get for that?
In order to understand what we get for it, we have to understand what the spending does. Grant money on its own doesn't have an impact. What grants do is allow people like me, faculty members who are doing research, to hire the people and buy the stuff necessary to actually get research work done.
That work produces new knowledge and trains students at multiple levels — there are undergraduates who work on my research, graduate students, postdocs — but it trains them to be the kind of people who know how to do research and know how to apply new knowledge in new settings.
Then, when that knowledge and those people leave the university and land pretty much everywhere in society, they can apply that knowledge to address problems or develop new opportunities. So the reason we need administrative data is that they let us see, at a really granular level, what research spending does.
IRIS is an IRB-approved data repository housed at the University of Michigan. We're the anchor of a consortium of major research universities who share data drawn from their administrative systems — mostly HR, sponsored projects and procurement — that lets us see who the people hired to work on a grant are, what goods and services are being bought from vendors, where those vendors are located and who they are, and what subcontracts are going out.
We then partner with organizations like Steppingblocks, the U.S. Census Bureau, van Dijk, and recently the National Center for Science and Engineering Statistics, to link those administrative data to outcome data: data about the career outcomes of students working in research (that's the work we do with Steppingblocks), data about publication and patent outcomes — the knowledge that's produced — and data about the economic impact of vendors to university research in regions, states and the nation. Together, these things allow us to support research and to do reporting that traces that whole process.
Money hits the university. Somebody hires people and buys stuff. Work gets done. Knowledge gets developed and published. People get trained. Both those things leave the university and land somewhere where they can be applied, and we can start to trace that, understand it and hopefully strengthen it.
What kind of data infrastructure would be necessary to understand, explain and improve the public value of science?
I've sketched a little bit. We've been around for seven years now, and we've been building this infrastructure out. When I talk about it, I use the idea of what I call a data mosaic. The idea is there's lots of data out there. Steppingblocks has a ton of data. Individual universities have a ton of data. Academic publishers have a ton of data. The federal government has a ton of data. State unemployment insurance agencies and higher education agencies have a ton of data. All of those types of data are valuable. But where they really shine, where they become something much more than the sum of their parts, is when all those little tiles can be arrayed in a fashion that allows us to paint a larger picture.
Think about the story I just told. More than 40 universities send us data. We take that data, protect it, and integrate it with data from more than 50 sources of information, ranging from simple things like the United States Postal Service's public translations between ZIP codes and FIPS codes and latitudes and longitudes, to extensive public and proprietary data, like the data we work with from Steppingblocks or from academic publishers, to data maintained by the Federal Statistical System.
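For data-minded readers, the kind of linkage described here — joining a university's vendor records to a ZIP-to-FIPS crosswalk so that spending can be rolled up by county or district — can be sketched in a few lines. This is a purely illustrative sketch: the records, dollar amounts and field names below are hypothetical, not IRIS's actual data or pipeline.

```python
# Illustrative sketch: one "tile" of the data mosaic — linking vendor
# spending records to counties via a ZIP-to-FIPS crosswalk.
# All records and amounts here are invented for illustration.

# A toy crosswalk from ZIP code to county FIPS code.
zip_to_fips = {
    "48104": "26161",  # an Ann Arbor, MI ZIP -> Washtenaw County
    "48201": "26163",  # a Detroit, MI ZIP -> Wayne County
}

# Toy university procurement records: vendor, ZIP, dollars spent.
vendor_spending = [
    {"vendor": "Acme Lab Supply", "zip": "48104", "amount": 12000.0},
    {"vendor": "Motor City Optics", "zip": "48201", "amount": 8500.0},
    {"vendor": "Acme Lab Supply", "zip": "48104", "amount": 3000.0},
]

def spending_by_county(records, crosswalk):
    """Aggregate vendor spending by county FIPS code."""
    totals = {}
    for rec in records:
        fips = crosswalk.get(rec["zip"])
        if fips is None:
            continue  # unmatched ZIPs would be flagged for review
        totals[fips] = totals.get(fips, 0.0) + rec["amount"]
    return totals

print(spending_by_county(vendor_spending, zip_to_fips))
# {'26161': 15000.0, '26163': 8500.0}
```

In a real pipeline the crosswalk would come from a published source and the join would feed district-level reporting; the hard part, as the interview notes, is the governance around the data, not the join itself.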
What IRIS is in the business of is being an anchor for a community of data owners, data custodians, to allow them a platform and a means to work together to make linkages across lots of different types of data to build the sort of mosaic we need to see to trace impact from an investment to its outcome.
What's key about that, I think, is that each of these types of data has different ethical, proprietary and legal restrictions on it. What's important to making the data mosaic is more than the simple technical act of bringing data in and linking it. That's hard, but we know how to do it. What's harder is building the relationships — the network of data providers, data users and researchers who will work together to figure out how to responsibly and securely make connections across the data in a way that's valuable to all its participants.
I absolutely love this idea of a data mosaic. I think that's such a neat analogy and really speaks to all of the complexity that is not only higher education but really anything in our society that involves the transfer of funds and knowledge and policies and procedures. I love that bit of symbolism there.
How do research universities do a better job of explaining their value to Congress? We've talked about how some folks are questioning the value of higher education, and from time to time that includes our legislators; so how can we do a better job of that, and how can we accomplish it through data?
It's a great question. One of the ways that IRIS helps ensure the sort of mutual value we talked about is that we use the data universities submit, with the linkages and improvements we make to it — the sort of mosaic, if you will — not just to support research but also to build reporting that flows back to our participating universities for a variety of use cases.
One of the major use cases is government relations. That happens at two levels. At the federal level, there's the legislature, the executive branch and the science agencies, for whom the question is: What's the value to the nation of our investment in research?
If you look — I think the last time I looked at this it was the 2018 or 2019 data — our country spends about $200 a year for every person (man, woman and child) in the country on academic research. That's not a ton compared to, say, the cost of healthcare or servicing the national debt or defense. But it's a significant amount, and I think what we need to do for those groups is go beyond analogies — beyond saying (this is an old example), invest in the Apollo program and we'll go to the moon, and that will be great, but you'll also get Velcro and Tang — and actually be able to nail down how that happens.
I think the other thing that's important is to be able to demonstrate impact at multiple timescales. Think about that story I painted, where money flows into a university and someone like me, lots of people like me, use it to hire people and buy stuff. When we hire people and buy stuff, there are immediate economic impacts, and we can trace those impacts out to very fine-grained geographies.
One of the immediate things we do is use that data about vendors, in concert with economic impact data we sourced from a variety of locations, to produce reports that show, by congressional district (state or federal, in a given state), how many businesses, how many jobs and how much money is flowing out of universities to organizations in that district — who the suppliers are. We have a communications infrastructure and a product we call the Impact Finder that allows comms people and government relations people who are nontechnical to dig through their own data and identify stories about that impact.
So in the near term, being able to say: Congressman So-and-so, when you're voting for this, you're not just tossing money into a university black hole. The money hits the campus, and it leaves immediately to do work; and some of it lands in your area, with your constituents and with businesses who are employing your voters. That's the short term.
In the medium term, I think, the real impact of this is in training people. When we work with partners like Steppingblocks, here's what we can see and say: I recently looked at some data for the State of Michigan. We took the University of Michigan's data and the linkages we've made, with you all, to Steppingblocks' employment data, and for a report that our upper administration gave to our elected regents, we were able to say that we've identified more than 12,000 people currently employed in the State of Michigan, across these industries, who received some training at the University of Michigan as part of a research grant.
We also, with the kind of data we work with from you, have pretty strong salary estimates. So we can say these people are working in these industries and they're relatively highly paid. They're in jobs where it's clear they're making a contribution to their employers — otherwise their employers wouldn't be paying them. You can start to talk about the value.
We tend to get stuck, when we think about the value of a degree, in thinking that it's purely private value to the individual. But really, research has shown there are three parts to the value of a degree, and we're trying to work on some more of that research. There's the value to the individual: I get a degree, I can get a better job, I make more money. That's where we tend to get stuck when we think about this, and we tend to imagine that makes it a private good — something whose cost we should push off entirely onto individuals and families.
But my education is also of value to my employer, because employers who pay you exactly what you're worth to them, or more, don't stay in business very long. So they pay me, but they also have to be able to derive some greater value from the application of my knowledge.
There's also evidence that having a more educated populace has benefits for everybody. There's some recent economic research by a guy named Enrico Moretti and a few of his collaborators, which shows that as the number of college educated people in a region increases, wages go up for everybody, even those who aren't college educated. So there's also a public good version.
What we're seeing here is the beginnings of an ability — not just for the research-engaged folks, but hopefully we'll be able to expand to others — to say: look, in the short term, research investments float all boats in a region by providing revenue to a wide range of suppliers, large and small businesses.
In the medium term, they train and educate a workforce that brings value to individual employers, to the individuals themselves and to the public.
In the long term, they develop the kinds of new knowledge and innovations that are the basis for new industries, new technologies, economic dynamism and startup companies. Google came out of an academic research grant. A large part of Apple's multi-touch screen was initially developed at the University of Delaware as part of an NSF-funded dissertation.
You can go through many, many of these examples, and if you think about it, that's where the big value lies. That's where you can say: look, the iPhone was introduced in 2007, and since that time there's been an explosion of an entire new industry. There are entire new categories of jobs, like app developer, that simply didn't exist 15 years ago. That's not down just to the academic work — obviously, there's a huge infrastructure of corporations and others doing this work — but part of what the academic research did was contribute the necessary technologies, and increasingly contribute the people who are trained to be app developers, even though we didn't know such a thing would exist 15 years ago.
So that's the language we use. And what we try to do is build the reports and the stories that can put data on that and contextualize really human examples, so we can say, X amount of vendor money flowed to these institutions.
And here's one. It's a company we've talked about before called Carlin's Creations, a small engineering firm. It was a partner — a vendor — to an NIH grant that went to a kinesiology professor, who demonstrated that if you take infants who have certain developmental disabilities, particularly Down syndrome, and you give them eight to twelve minutes of exercise on a treadmill twice a day, they learn to walk four to six months sooner than if they don't have that. And that's a huge developmental benefit for the children.
It turns out Carlin's Creations is the one that built those cute infant treadmills, and in part because of this research, and because they built these things, this small company now has a product line selling those treadmills to hospitals around the country, who rent or loan them to families whose children can benefit from them.
It's that kind of story wedded to the larger, contextual data that I think really makes the case. With Steppingblocks, we do that at the medium term level, the education and workforce level. With other partners, we do it at the economic impact, early stage level, and we're working now on developing out the longer term, harder thing, which is to trace the impact of the technologies that are discovered.
The problem is, that's longer term. So we either have to go way back in time to see them, or we have to build an infrastructure that can grow and be sustainable, so that we'll be able to trace the impact of things that are being done now.
As we talk about the value of a degree and all of these different areas that it impacts, I don't think we can have this conversation without talking about the pandemic. What has been the impact of shutting down research during the pandemic? Could the answer to this help make the case for more funding?
I think so. The impact has been varied, but there are important things going on. Early evidence suggests that certain types of research have taken much bigger hits than others — particularly laboratory research, research that requires people to be physically co-present in a small space, and research using human subjects, particularly biomedical research. Things like clinical trials and face-to-face interview-based studies have all been dramatically impacted by the pandemic.
There's some interesting stuff going on, and this is work that IRIS has done recently to try to start to understand some of the pandemic's effects. There's also a whole bunch of good survey research — not ours, but work by other groups that we rely on.
One of the interesting things that happened here is that the federal government, quite smartly I think, did the right thing and issued guidance early in the pandemic saying that federal grants can continue to pay people's salaries even if the research work isn't progressing — so you can take, say, graduate students who are paid on a research grant and can't be in the lab, and have them writing papers, for instance.
But what that means is that over nine or ten months, in these areas of research where work was shut down and new data couldn't be collected, grants continued to pay salaries while not all of the funded work could get done, which opened up what we've called the salary gap. That creates a challenge for federal funders, because it means that, absent new funds flowing into universities, agencies like the National Institutes of Health will have to decide in the future whether to use their budgets to make sure that work they already funded, which slowed down during the pandemic, gets finished, or to invest in new things.
One of the other things that's happened during the pandemic: There's been a massive reconfiguration of research in lots of places around things related to the pandemic and its outcomes. There are whole new areas of research that are opening up, some of them in the social and behavioral sciences, some of them in immunology and in biomedical science, generally related to pandemic prevention, pandemic recovery, public understanding of science, all those things. There are new opportunities that come out of this.
There's also, I think, real reason to be concerned about the people doing the science. I think the primary danger here is that there's a lot of evidence that younger, particularly female or underrepresented scientists and trainees, have taken a larger hit from the pandemic than others.
What that might mean — and we don't know yet, but it's something to be aware of — is that the future academic research workforce, and the future trained workforce that leaves with research experience to land around the economy and have the kind of impact we're talking about, has the potential to be less diverse, less robust and less effective as a result of the pandemic and its impacts than we might want.
There's a lot of really good research out there that's starting to demonstrate important things. We know, for instance, that more diverse scientific teams tend to produce more novel and higher-impact science. We also know — and this is what we've been calling the paradox of diversity — that the people who make those teams diverse tend to have worse career outcomes.
We also know that COVID-19 is putting extra pressure on those people. So the real danger here, and the real reason we want to avoid making science's recovery from COVID a zero-sum game, is both that it keeps us from investing in new opportunities and that, if we don't take special care, we may actually lose a chunk of the scientific and research workforce — the part that will make us better able to address those unknown unknowns in the future, and to produce the kind of high-impact, novel science that I think is the true benefit of the research university and the thing that keeps us poised to respond effectively to a really uncertain future.
When it comes to making the case for research universities and the need we have for them, what closing remarks do you have as we move out of this pandemic into 2022 and beyond?
I am a big believer in the value of knowing things, and of knowing things based on strong and rigorous data. I'm also a big believer in the idea that data are most useful and most valuable when you find ways to allow them to be responsibly used by the broadest and most diverse group of researchers possible.
I think that we need to build and sustain a usable, accessible, equitable data mosaic about the impact of higher education and research, and we need to train the people from across many different fields who would use it to answer questions that I can't conceptualize.
And we need to figure out how to take that information — the new research discoveries enabled by the data and the reporting — and translate it into language that can be understood by people who may not have the deep, geeky love of data that I have. That's why we need both the contextualized information — the numbers, the maps, the network diagrams — and the stories that bring those to life.
It's that kind of work that I think really requires that this mosaic, if you will, be grounded not in a single organization, or even in a small group of organizations, but in a network that makes a community of the people who use and talk about the data: people who do research with the data and people in organizations who own and develop the data. The challenge is to make sure that everybody who's participating gets some benefit, on their own terms, for their participation, and recognizes the value of the interdependent benefits — the things we could not do without the collaboration.
So building a thing, and a community, that has some of the features of the research university itself is absolutely key, honestly. It's diverse. It's integrated. It's focused on enabling a wide range of people to answer questions that we might not be able to articulate right now, but that we think will be important.