Guidelines on AI use in research needed to avert 'race to bottom'

Newcastle vice-chancellor says shared approach needed to prevent 'bad practices'

October 8, 2024
Chris Day

Universities and funders need to develop a shared set of principles on the use of artificial intelligence in research to avert a "race to the bottom" where "bad practices" proliferate, according to a leading vice-chancellor.

Chris Day, the Newcastle University head who currently chairs the UK's Russell Group, told Times Higher Education's World Academic Summit that much of the focus to date had been on the use of generative AI tools in education, particularly in assessment – but that it was increasingly urgent to consider research as well.

"If individual institutions and funding bodies all come up with their own approaches, there will be confusion, and that could lead to a temptation [for] bad practices compromising the ethical use of AI," Professor Day told the event at the University of Manchester.

"Using AI in research has significant dangers, including false results, propagation of false information [and] plagiarism, and without consistent regulation there could be a potential race to the bottom as individual researchers and, potentially, institutions are tempted to seek competitive advantage by using AI.

"Without consistency in the use of AI, one institution's use of AI could be another institution's misconduct. Ideally, then, we need a shared approach that is not just national but also international, so that global researchers are all working to the same standards."

These shared standards could cover areas such as the extent to which use of generative AI is permitted in content creation and data analysis, and how its use should be cited, Professor Day said.

Such an approach could "help build the confidence necessary for more open research", Professor Day said, warning: "If AI is used very differently in different places, for example, will researchers be willing to share their work? There is clearly a need for alignment to prevent fragmentation as publishers, funders and universities begin to find new ways of using generative AI."

Professor Day suggested that, in the UK, work to create such a shared set of principles could be led by UK Research and Innovation (UKRI), the umbrella body for the country's research councils.

Kathryn Magnay, who has led UKRI's work on AI, agreed that it was important that a "coherent approach" be adopted by universities and funders, noting the significant opportunities that AI offered to make research more productive and efficient.

"Where it is used, we need to be open about its use. We need to be open about the data used and the inputs that are going into it; we need to think about levels of uncertainty. Other researchers must be able to look at the work that's been done, reproduce that, and validate the research," said Dr Magnay, deputy director for AI, digitalisation and data at the Engineering and Physical Sciences Research Council.

But she cautioned that while UKRI could lead the development of shared principles in some areas, other partners would likely have to be involved.

And Nick Fowler, chief academic officer at Elsevier – the publishing giant that convened the discussion – agreed that there was "not an existing structure" to create a shared set of principles.

He said such guidelines were necessary, warning: "There's much to lose if the system starts to function less well than [it does] today."

Noting that AI chatbots relied on high-quality content to function well, Dr Fowler said it was "important that incentives to produce high-quality content remain robust".

"Publishers consider it essential that permission is sought by AI developers for use of its copyright-protected work prior to its use; that transparency is delivered on what content is and has already been used to train generative AI; and that remuneration and attribution are provided to creators and rights holders," he said.

chris.havergal@timeshighereducation.com

Reader's comments (1)

The concerns raised are very genuine, and addressing them in good time will be necessary to reap the right benefits from AI for research.
