
News & Events

11/21/2024
Robyn Williams

 

The library/ACE on each campus will be CLOSED on Thursday, November 28 and Friday, November 29.

Our doors shut at 12 noon on Wednesday, November 27, and we re-open with normal hours on Monday, December 2. Here are ways to access services:

Local homework help requests will be handled on Monday, December 2. Students with homework needs over the holiday are encouraged to use NetTutor, which is found in the Homework Help module of Blackboard courses and remains available throughout the holiday. Watch a video about accessing NetTutor through your courses.

Students with printing or study needs may check their local public library branch's hours while the college is closed. Search the public library directory by your county of residence to find the closest public library. Library research and policy questions will be handled on Monday, December 2.

All Big Sandy online search tools, including OneSearch and the other databases, should be accessible 24/7 during the holiday.

11/11/2024
Robyn Williams

Artificial intelligence has become a buzzy topic across most sectors of American society as people grapple with understanding how and when their lives will change. AI tools are becoming critical for understanding large datasets. For some users, they pose serious risks. For others, they are reshaping markets and industries and prompting questions about where human involvement still belongs. Here are a few ideas to consider about AI's wave of change.

 

A character.ai screenshot showing the option to create a character as well as chats and feeds.

 

Can A.I. Be Blamed for a Teen's Suicide?

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”  He knew she wasn't real. But he developed an emotional attachment anyway.  He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.   On the night of Feb. 28, in the bathroom of his mother’s house, Sewell,  a 14-year-old ninth grader from Orlando, Fla., told Dany that he loved her, and that he would soon come home to her. “Please come home to me as soon as possible, my love,” Dany replied. “What if I told you I could come home right now?” Sewell asked. “… please do, my sweet king,” Dany replied. He put down his phone, picked up his stepfather’s .45 caliber handgun and pulled the trigger.


A screenshot from Libro.fm's search results for the Scarlett synthesized voice.
AI Audiobook Narrators in OverDrive and the Issue of Library AI Circulation Policy

It seems there is some AI weirdness with audiobook narration on OverDrive, and the narrator is only part of the story. On Monday, October 14, librarian Robin Bradford posted on Bluesky that she had purchased an AI-narrated audiobook for her library system and was really upset about it. When she began to investigate the book titles, she found authors with remarkably similar names and cover art: different names and different series, but similar cover formats and styles and the same audiobook narrator, who isn't real. Not only did Robin spend time trying to identify why so many books by authors with similar names used an AI narrator, she then spent time trying to figure out whether the authors themselves were human. In a lot of cases, it isn't clear that we know the answer.


Code Dependent by Madhumita Murgia

ISBN: 9781250867391
A riveting story of what it means to be human in a world changed by artificial intelligence, revealing the perils and inequities of our growing reliance on automated decision-making. On the surface, a British poet, an UberEats courier in Pittsburgh, an Indian doctor, and a Chinese activist in exile have nothing in common. But they are in fact linked by a profound common experience--unexpected encounters with artificial intelligence. In Code Dependent, Murgia shows how automated systems are reshaping our lives all over the world, from technology that marks children as future criminals, to an app that is helping to give diagnoses to a remote tribal community. AI has already infiltrated our day-to-day, through language-generating chatbots like ChatGPT and social media. But it's also affecting us in more insidious ways. It touches everything from our interpersonal relationships, to our kids' education, work, finances, public services, and even our human rights. By highlighting the voices of ordinary people in places far removed from the cozy enclave of Silicon Valley, Code Dependent explores the impact of a set of powerful, flawed, and often-exploitative technologies on individuals, communities, and our wider society. Murgia exposes how AI can strip away our collective and individual sense of agency, and shatter our illusion of free will.


Big Mind by Geoff Mulgan

ISBN: 9780691170794
A new field of collective intelligence has emerged in the last few years, prompted by a wave of digital technologies that make it possible for organizations and societies to think at large scale. This "bigger mind"--human and machine capabilities working together--has the potential to solve the great challenges of our time. So why do smart technologies not automatically lead to smart results? Gathering insights from diverse fields, including philosophy, computer science, and biology, Big Mind reveals how collective intelligence can guide corporations, governments, universities, and societies to make the most of human brains and digital technologies. Geoff Mulgan explores how collective intelligence has to be consciously organized and orchestrated in order to harness its powers. He looks at recent experiments mobilizing millions of people to solve problems, and at groundbreaking technology like Google Maps and Dove satellites. He also considers why organizations full of smart people and machines can make foolish mistakes--from investment banks losing billions to intelligence agencies misjudging geopolitical events--and shows how to avoid them.

The “Academicon”: AI and Surveillance in Higher Education  

English teacher Amber Wilson, left, explains the final exam essay to Ada Niu, 15, right, and other 11th grade students at Thomas Jefferson High School in Denver, on Thursday, May 2, 2024. (Hyoung Chang/The Denver Post/TNS)

Maria cared more about complying with the requirements for the course and getting a good grade than trying to be an activist. Yet, there was something that bothered her about submitting her work to a website that pre-judged her assignment, comparing her paper to every other paper submitted to Turnitin (in addition to pretty much the entire web). Maria was also afraid to argue against the use of this tool, as she thought putting up a fight might make the professor think she was trying to cheat, so she just hoped her term paper would prove that she had done everything right...she thought she had cited everything okay. 

Take a week of classes with hypothetical college student Maria and see how higher education is using AI, as well as the concerns that students face. From learning materials to assignments to exams, AI-driven surveillance technologies have fundamentally changed the student experience at universities in North America. These tools have been adopted under the banner of technosaviorism: saving time, serving students more effectively, and identifying plagiarists more efficiently. Many scholars have demonstrated that racism, sexism, and other biases are built into machine learning architecture. These technologies also support a hidden curriculum, preparing students to be surveilled throughout their education, their careers, and their lives in the name of their own supposed good.


Large Nature Model: Coral. Category: Installation. Location: United Nations Headquarters. Dates: 21 Sep 24 - 28 Sep 24.

What Can Artificial Intelligence Learn from Nature?

The article discusses the Large Nature Model (LNM), a generative artificial intelligence project by Refik Anadol Studio. The LNM gathered over half a billion data points about rainforests from publicly available archives and on-site visits. The team aims to use AI to create immersive environments that integrate real-world elements with digital data. The artwork exemplifies the integration of AI into the realm of environmental art, offering a powerful message: through technology, we have the tools to imagine a future where human creativity and AI coexist in harmony with the environment. Refik Anadol’s mission in creating this piece is to bridge the digital and physical worlds, using AI to inspire deeper reflection and responsibility toward nature.


11/01/2024
Robyn Williams

The library has wrapped up another great multicultural week, featuring a training focus on implicit bias during a Lunch and Learn with Amy Waninger as well as ofrendas set up at multiple campuses to celebrate the Day of the Dead.

 

Amy Waninger presents her program standing in front of a green wall. She wears a dark purple sweater over a black and white patterned dress. She is looking over her shoulder.
Amy Waninger, Lead at Any Level

    

Students at Day of the Dead at Pikeville campus

Students at Day of the Dead at Prestonsburg campus

Ofrenda display at Mayo campus

Students at Day of the Dead at Prestonsburg campus