Information pollution

I’ve been trying to be more active on Twitter, in part because a lot of open education people and librarians share experiences and thoughts there. This morning I saw a thread that a librarian I follow had retweeted, originally posted by @Viveka, that got me thinking about something that came across my desk from the ALA’s Center for the Future of Libraries in their newsletter Read for Later a couple of weeks ago. The Twitter thread was about a bot tweet that was making its way around Twitter, trying to drum up outrage about the way Disney has cast the forthcoming live action Little Mermaid movie. I won’t go into detail because I don’t want to grant this message any more attention but the gist is that it’s written to divide people and drum up racial strife while saying on the surface that it’s not about race. Which sounds like something a human might actually do — claim not to be racist but then make a point that attempts to divide people over race. But Viveka notes in the thread, “we know it’s a bot because it behaves unlike any human” and then goes on to explain the tells that make this so.

If you are not as observant as Viveka or just accept the content at face value without interrogating this tweet too carefully — and let’s face it, that’s how most tweets are read, quickly and without much thought — you might miss it. Viveka points out, “It has the right hashtags, the petition link works, call to action is clear.” Which might make it seem real enough that in the seconds it takes to skim it, most people would either ignore it, engage with it, or feel moved enough to click the petition. But it’s been posted “about every ten minutes, as a reply to other tweets mentioning the movie or the actress” (@Viveka).

A bot can do that; a human can’t. Which is why I went back and re-read the New York Times article by Cade Metz & Scott Blumenthal that appeared in Read for Later, “How AI Could Be Weaponized to Spread Disinformation.” Metz and Blumenthal write about two AI companies that are openly releasing fake news generators that are getting better and better at mimicking human writing — openly, so that researchers know what we’re up against as more and more content like the thread above proliferates. Why does it matter what people think about the casting of a Disney film? It doesn’t, but the humans behind this bot creation probably have an interest in dividing Americans over cultural issues. Perhaps so we will vote emotionally, or so we’ll be busy arguing while our government cages children, or tries to start a war somewhere, or . . . you get the idea. And AI makes it more likely that we’ll have trouble identifying what’s bot generated and what’s not.
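To make those “tells” concrete, here is a minimal, purely illustrative sketch of the kind of check Viveka is describing: flag an account that posts near-identical replies on a near-clockwork schedule (say, roughly every ten minutes). This is not Viveka’s method or any real platform’s detector; the function name, the `Tweet` structure, and every threshold below are hypothetical, invented only to show the idea.

```python
from dataclasses import dataclass
from datetime import datetime
from difflib import SequenceMatcher
from statistics import mean, pstdev

@dataclass
class Tweet:
    posted_at: datetime   # when the tweet went out
    text: str             # the tweet's wording
    is_reply: bool        # posted as a reply to someone else's tweet?

def looks_like_repost_bot(tweets, min_posts=6, min_similarity=0.9, max_jitter_seconds=120):
    """Flag the pattern described above: the same message, posted as a reply,
    over and over on a near-clockwork schedule. Thresholds are invented for
    illustration; a real detector would weigh many more signals."""
    replies = sorted((t for t in tweets if t.is_reply), key=lambda t: t.posted_at)
    if len(replies) < min_posts:
        return False

    # 1. Are the reply texts near-identical copies of one another?
    first = replies[0].text
    texts_match = all(
        SequenceMatcher(None, first, t.text).ratio() >= min_similarity
        for t in replies[1:]
    )

    # 2. Are the gaps between posts suspiciously regular (e.g. about every ten minutes)?
    gaps = [(b.posted_at - a.posted_at).total_seconds()
            for a, b in zip(replies, replies[1:])]
    clockwork = pstdev(gaps) <= max_jitter_seconds and mean(gaps) < 3600

    return texts_match and clockwork
```

A person replying by hand produces varied wording and messy timing; a script does not, which is exactly the “behaves unlike any human” point.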

So why is this a library issue? Libraries of all sorts encourage information literacy, or in the case of school and academic libraries, teach it. Information literacy is a set of skills and habits of mind that allow people to seek, evaluate, and use information effectively and responsibly. Will we be able to keep doing this work if, as Metz and Blumenthal quote OpenAI researcher Alec Radford saying, “The level of information pollution that could happen with systems like this a few years from now could just get bizarre”?

I’m not sure. Yes, we can keep teaching people to examine and consider information carefully, but we shouldn’t go so far that we convince them to trust nothing, as danah boyd cautions in her article “Did Media Literacy Backfire?”, which I re-read every few months to remind myself how hard this work is. Will the media be susceptible to information pollution in the same way social media is? Is it already, in the “balance bias” of its coverage of major issues like climate change?

There are no easy answers. I believe information literacy is a help, but knowing how to do something doesn’t mean someone will do it. Ultimately the fight against information pollution is a matter of will — we have to spend more than a few seconds scanning something online before deciding whether it’s valid or not. I’d like to think libraries have a role in encouraging that, but in the end, it’s probably up to each of us to be smart information consumers.


Two tech stories

One of the things both academic and public libraries provide is internet access; it’s hard to carry this out of the building. A few library systems lend wifi hotspots, but I recently looked into this and found that the hotspots rely on cell phone coverage, since cell providers supply them; where coverage is poor (outside of towns and cities), the wifi is poor too. Nearly all the papers students write in high schools and colleges depend on their access to search engines of various kinds, whether databases their schools subscribe to or the ubiquitous giant search engine that starts with a G. Two stories caught my eye this week, related to the need for students to do online research and the challenge of getting internet access to rural areas.

First, Pew Research Center studied public trust in algorithms. While their study focused on the kinds of algorithms that make decisions — like who should be hired (algorithms are used to screen resumes and applicants’ answers on job applications), granted parole, or offered certain financial perks — it also looked at attitudes toward the algorithms that decide what you see on social media. Majorities of people in the Pew study don’t trust algorithms, yet whenever I teach an information literacy class, I have a hard time convincing students that the results they see when they search online are delivered to them not because they’re the best sources around but because algorithms calculate what they should see, based on what else they’ve seen and clicked on.
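For students who find that claim abstract, here is a toy sketch of personalization. It is not how any real search engine ranks results; the weights, topics, and scores are invented purely to show how the same base results can be reordered by a user’s click history.

```python
def personalized_ranking(results, click_history, personalization_weight=0.5):
    """Re-rank the same base results differently for each user.

    results: list of (title, topic, base_relevance) tuples -- the "best sources" order.
    click_history: dict mapping topics to how often this user has clicked them before.
    (Toy example only: real engines use far more signals, and keep them proprietary.)
    """
    def score(result):
        _title, topic, base_relevance = result
        # Boost results whose topic this user has clicked on in the past.
        return base_relevance + personalization_weight * click_history.get(topic, 0)

    return sorted(results, key=score, reverse=True)


results = [
    ("Peer-reviewed climate study", "science", 0.9),
    ("Opinion blog post on climate", "opinion", 0.6),
    ("Viral climate meme roundup", "entertainment", 0.4),
]

# The same query, two different users, two different first results.
print(personalized_ranking(results, {"science": 2}))         # the study stays on top
print(personalized_ranking(results, {"entertainment": 3}))   # the meme roundup jumps to the top
```

The point for the classroom is that the ordering reflects the searcher’s past clicks at least as much as the sources’ quality.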

Similarly, many people I know seem to feel confident that while there are misleading or “bot” generated “news” stories online, they don’t see anything like that in their own social media feeds. And yet, what we see on social media is also shaped by proprietary, opaque algorithms controlled by a handful of tech companies. While I don’t know any students who use those sources for their academic papers, I have started asking classes where they get news and stay current, and they nearly all say social media. So I guess I need to keep talking about algorithms, because people seem properly cautious of algorithms in the abstract, even as they trust the name brands that use them to control the information they see.

The second story that caught my eye is about an obscure FCC rule that currently exacerbates unequal access to high-speed internet. Just under 1 in 5 students in rural parts of America can’t get online at home. But it turns out there is an “untapped spectrum” that, because of this little-known rule, sits available for broadband right now, and the FCC will soon decide whether it can be licensed by school districts rather than sold to internet companies. Since this spectrum was originally “reserved for educational television broadcasts in the 1960s,” it makes sense for the FCC to license it to schools and help mitigate broadband inequality. States that are hoping for access to the spectrum have plans to “broadcast wirelessly into surrounding rural communities” from existing wired school internet networks.

Would that mean fewer people in the library? It would certainly mean fewer people in the parking lot and outside the front door trying to get on the wifi. And it might mean a new role for libraries, as partners in this kind of statewide broadband program, or maybe even more appreciation for school libraries (which have suffered cuts all over the U.S.), because librarians can play a key role in teaching students the information literacy skills they will need to be good researchers.

 

 

Information literacy in real life

I’ve become a student again this fall, taking an online master’s degree program at the University of Edinburgh. Approaching research and citations (in Harvard style, something I’d never seen before) from a student viewpoint has made me feel for my information literacy students even more than I already did. It really helps to walk a mile in someone else’s shoes.

One thing I’m surprised about is that while some of my classmates cite academic sources, others — almost all of them scientists, and all of them working in jobs that require them to seek and use information — choose what I would consider weak sources, such as websites that wouldn’t pass the CRAAP test. On the plus side, I have some new examples to show colleagues in a couple of weeks when I present an introduction to information literacy to fellow administrative and academic support staff at work. But I’ve also gained a new appreciation for how people in their daily lives and work could benefit from thinking critically about how and where they find information and how reliable it is, which are the keys to information literacy.

Yes, I did pay attention during the last national election and realize that people relying on poor sources of information is nothing new. But I thought much of the “fake news” problem was related to the way news is shared and also the way it is marketed today. I’m aware of the importance of teaching undergraduates information literacy, as they are emerging adults who don’t have much experience thinking critically. I hadn’t considered that basic information literacy could be enormously beneficial to adults and to their workplaces and communities.

Public libraries are offering more “how to spot fake news” programming and resources, which is useful, but again this puts the emphasis on news as the kind of source that might be misleading or counterfactual. Perhaps this should go further. Not all adults go to college or use libraries, so who can or should teach people to find and choose better sources of information in real life — at work, in volunteer positions, or even just looking stuff up at home? I know that high schools are not all teaching this, since most of my students have never thought much about evaluating information. Should there be public service announcements? Training in workplaces? Pop-up workshops in public places, led by librarians? “How to find reliable information” handouts for every registered voter, or enclosed with every driver’s license?

What do you think?

 

 

 

A Maker Space Making a School Library

Much has been made of library maker spaces, which are part of the drive to be “relevant” by focusing on STEM. I’ve seen papers and essays galore on why STEM is Very Important for Public Libraries to Offer, but I haven’t ever read about public library patrons asking for STEM. So is that relevance, if we tell people what they want instead of the other way around? As my last post indicated, I’m not a fan of telling people, especially young people, what to focus on in their free time, hence the quotation marks around “relevant.” On the other hand there’s a chicken-and-egg factor – if we offer it, will people discover they want it? That’s probably another blog post altogether.

You can take webinars and attend conference sessions on how to make maker spaces. I’ve always looked at them as sort of amped-up craft zones, places for creating techy projects mostly for fun, albeit educational fun. But an article in a library e-newsletter caught my eye this week because it featured a space where people were making what they really needed and wanted — a user-friendly library for their school.

At PS 721K, the Brooklyn Occupational Training Center for 14- to 19-year-old special education students, shop teacher Charles Brown, who trained at the Adaptive Design Association, helped students build furniture and accessories for their newly redesigned library. Previously, it wasn’t a space students with special needs could use. Using the kind of heavy cardboard shipping boxes are made from, the students and their teacher created stools and book bins.

Photo: School Library Journal

If you look around ADA’s website you can see many more examples of adaptive furniture and kids engaging in “cardboard carpentry.” I think this is “maker” activity at its best. Seeing a problem or a lack and actually making something to fill that need seems like a much better use of time and talent than just making stuff so you can check the box on having a maker space.

That’s not to say maker spaces that simply meet the need for creative, fun activities for young people are a bad thing – if the community wants that. I just love the idea of “making” for good.