A report from Silicon Valley Bank earlier this year indicated that 25% of healthcare investment dollars are going toward companies leveraging artificial intelligence (AI). Another report suggests the global AI healthcare market could balloon to more than $148 billion by 2029. While those figures certainly require context, we can be sure about one thing: healthcare is betting big on AI.
That was also the message at the National Council of University Research Administrators’ (NCURA’s) 2nd AI Symposium earlier this year, where the top leaders and first movers in research administration gathered to share their progress since the inaugural forum, learn from one another, and discuss the future of AI in research institutions.
Here are three AI-related topics from the NCURA symposium.
1. The Big “Why”
Staff recruitment and retention is the biggest driver for research organizations that are exploring AI. It is difficult to find staff, and as administrators increasingly work remotely, salary pressures are growing (especially when you can live in Nebraska and get paid like you live in San Francisco). Add to this that many organizations are experiencing more staff retirements, and they are losing years of institutional knowledge with each worker who walks out the door.
Regardless of the reason, the result is the same: research organizations have a growing number of less experienced new recruits and little capacity or infrastructure to train them quickly. Organizations are turning to AI platforms to:
- Retain knowledge as workers retire or transition.
- Enhance training. Organizations are looking to reduce the learning curve by providing chatbots to answer questions and relying on AI assistance to create training materials.
- Increase efficiencies. Automating rote work allows organizations to do more with less.
- Create better jobs. Reducing rote work boosts morale.
2. To Buy or to Build
Deciding you need an AI solution is one thing. Determining what you need—and how you’re going to get it—is another. Many early adopters at the NCURA symposium noted that off-the-shelf solutions aren’t great—yet. And to be sure, cost is a factor.
Many organizations are opting to develop their own AI functions. It’s time-consuming, and the economies of scale aren’t always immediately identifiable. At present, lower-cost, self-built solutions often require user training (prompt engineering) to use even a chatbot optimally and to keep usage costs down. But a key benefit of building your own is the ability to harness proprietary knowledge, including the expertise of employees who have left or retired, and turn it into answers and solutions customized for the organization.
At the NCURA symposium, organizations that have built their own AI platforms said they are getting help by:
- Hiring external AI experts.
- Working with the larger organization (the university) to develop an AI office that provides assistance across the institution.
- Enlisting the expertise of internal early adopters (e.g., IT departments, students, tech champions).
3. Security Concerns
Regardless of whether the system is built or bought, security is a top concern. Organizations want to avoid releasing business-sensitive information to open systems, and no one wants their proprietary data available in the public sphere.
Getting Started
Every organization’s AI journey will be different, and there’s still a great deal of uncertainty about the future of AI in research administration. But based on what we’re hearing from the research community, there are several ways organizations can get off on the right foot.
- Coordinate with legal and IT. The organization should have policies and frameworks to mitigate risks.
- Find champions to get buy-in. Build relationships with people who want to make it happen.
- Begin with proof of concept. Starting small, promoting successes, and generating enthusiasm for AI solutions can help unlock resources for the next step.
Edited by Matt Maslin
Published September 5, 2024