Faculty Member Kristi Nickodem Discusses Generative AI and Local Government

The rise of widely available generative AI tools has sparked numerous questions about how these novel technologies will impact academic and professional fields.

Over the last few years, machine learning and AI have made significant strides in capabilities and applications across various fields. At the same time, questions and concerns about how these tools are applied have arisen in institutions across the country, and local governments are no exception. Robert W. Bradshaw Jr. Distinguished Term Assistant Professor of Public Law and Government Kristi Nickodem explores government responses to technology such as generative AI in her work. “Local governments have a number of questions and concerns about generative AI,” said Nickodem. “As with any new technology, some units of local government are eager early adopters, while others are more cautious or skeptical.”

Some of the questions local government officials face include whether to ban employees from using AI tools, how to weigh productivity benefits against risks around accuracy and bias, and how to craft policies that address the legal and ethical risks of using these tools.

“I think the data privacy concerns around some generative AI tools are particularly relevant for government agencies,” said Nickodem. For example, a government employee might include sensitive or confidential citizen information in a prompt to a commercially available tool such as ChatGPT, unwittingly and unlawfully disclosing those details to a private company. In another instance, deepfakes—digitally manipulated images and videos that can appear authentic—have already been made depicting government officials and may shape public opinion based on entirely false premises.

At the same time, generative AI can be a helpful tool for improving productivity. “At a time when many local governments are dealing with workforce shortages, generative AI tools may increase employee productivity in a variety of domains, including creating presentations, transcribing meetings, drafting documents, and planning meeting agendas,” said Nickodem.

As the debate around generative AI continues to evolve, so do discussions of best practices that can help government employees and others navigate proper usage. Nickodem served on the UNC Generative AI Committee, which establishes guidelines for the responsible use of AI tools by UNC faculty and students. “Discussing the complex ethical and legal implications of this technology with colleagues across campus gave me insight into how local governments may be grappling with similar issues while trying to craft their own policies around the use of generative AI tools,” said Nickodem. For example, Nickodem suggests that public officials be cognizant that records generated through the use of AI tools, such as prompts, may be subject to public records and record retention laws in North Carolina. Governments should also remain aware of the copyright and intellectual property landscape that is emerging as artists and authors sue AI companies over the use of their work to train models. The legal uncertainty surrounding AI-generated images and text as a matter of copyright creates risks that governments should consider when creating websites, documents, or presentations.

Although the technology is continually evolving, one thing is certain: “The technology underpinning generative AI tools is going to substantially impact the way we live, work, think, and learn,” said Nickodem. “Think back to when the internet first exploded in popularity and how much it changed society in just a few decades. The recent advances in machine learning present similar opportunities and potential pitfalls for local governments.”