AI chatbots and art generators have been making headlines for the last few months.

OpenAI released ChatGPT, a chatbot system that can answer a broad variety of questions in an authoritative tone (although not always correctly) and that can ‘write’ text based on prompts from users.

You say, write me a love story. ChatGPT takes the many, many pieces of text it’s been trained on and uses that data to create a love story (and in the process likely commits at least a little plagiarism, if not a lot, against the many love stories it’s been trained on).

This is an impressive feat. Training a computer program to answer questions in a human-like way and to compile pieces of text so that they sound more human than machine is astounding. But spitting out responses by predicting the most statistically likely next words is not necessarily ‘writing.’ Some of it is literal plagiarism. I worry that industries that produce actual writing—like journalism, the industry I happen to be in—could see their revenue streams further eroded by this technological marvel.

To a person who writes for a living, ChatGPT’s whole process of producing text by mashing together existing text it’s been trained on sounds a lot less like ‘writing’ than like theft of intellectual work written by actual human beings. Some of those actual human beings are long dead and have work in the public domain, so it’s fair game to do whatever you wish with that work. Others of those actual human beings probably still need to pay their rent, and using a chatbot to rip off their work is still ripping off their work.

The same company is behind DALL-E, an art generator that creates images based on text prompts. Image generators like DALL-E have drawn plenty of outrage (and even some lawsuits) because of how they generate images. Again, the image generator has been trained on lots and lots of examples of existing artwork (without the artists’ permission) and responds to users’ text prompts by spitting out amalgams of that original work. It’s all very copy, paste.

Again, an AI image generator can steal work from actual living, breathing humans who need to pay rent, undermining their ability to get paying jobs by giving away AI-altered versions of their artwork for free (or for the profit of the company that runs the generator).

I’m very curious to see how those lawsuits resolve.

As an industry, journalism has been hard hit by the information age. The internet makes it easy to access all kinds of information. When events are happening, how to contact people, and the status of ongoing legislation are all easily googleable bits of information, which certainly makes my job easier. I have a much easier time fact checking, finding contacts and figuring out what the heck is going on than journalists 50 years ago. So, I would guess, do you. That’s great.

Less great is that people expect all journalism to be free and are less willing to pay for it. Journalism is not free to produce. You have to pay people to sit through three hours of a public meeting and then write an easy-to-read 400-word story about it.

Sure, you can (and, at least once, probably should) go to those public meetings yourself and see what’s happening with county, city or state business. That is what’s so wonderful about democracy: public access to institutions of power. Anyone can go to those meetings. But unless there is a topic on the agenda that people are outraged about or someone is being recognized by the government body, no one goes to those meetings except government employees, elected officials and journalists. Meetings are boring, and people have jobs and kids and Netflix shows.

It takes skill, time and effort to regularly attend and write about public meetings.

This means that in news deserts like our neighboring Catron County, information on local government is mostly disseminated by local government, if at all, making governance more opaque.

Beyond government coverage, journalism also documents business happenings, the education system, crime, accidents, sports, arts and cultural events and stories about people in your community doing awesome work.

None of these are free to produce. Paper and ink cost money. Covering, writing and posting stories online costs time and money. Even when the product is free to read (there are excellent non-profit newsrooms out there, and our for-profit newspaper website is not paywalled), it is not free to make. Everybody has bills to pay.

But more damaging than the shift in people’s willingness to pay for reliable information is how the internet has undermined the advertising model that many newspapers rely on.

Dollars that 30 or 40 years ago went to print advertising are now spent on online advertising powerhouses like Google, king of search and owner of YouTube. Online advertising makes it easier for small businesses to promote themselves cheaply, which is great. It makes it harder for newspapers to balance their budgets, which is part of why newspaper staffs have shrunk, pay for reporters has stagnated and newsrooms have shuttered. (According to Career Explorer, pay for journalists in New Mexico ranges from $11 to $35 an hour, averaging just under $18 per hour, or $37,420 annually. It’s not a lucrative field.)

But that’s the way of business and innovation, right? Things change, and so too must business models.

What irritates me is when search engines like Google take writing (not just information, but pieces of text that someone worked to make) from places like newspapers and make it more difficult for those places to generate revenue.

Online, a website visit equals revenue, because it all runs on ad dollars or paywalls.

Google has a nifty feature: you ask it a question and, before it lists all those link results, it offers an answer to your question right on the results page, along with answers to other common queries, eliminating the need to visit any other website.

Google is scraping useful text from websites, using it to benefit itself (by giving you the answer you searched for) and eliminating the potential revenue the website that actually produced that piece of writing would gain from your visit. Personally, I don’t think that’s the most ethical design. Long term, it’s a design choice that undermines the production of useful, compelling and reliable writing.

But at least Google makes the link to the website that provided the answer very visible, so maybe you’ll still click. And at least you can see clearly where the information is coming from, so you can judge its reliability.

Microsoft is planning to incorporate ChatGPT into Bing’s search function. Based on reporting in the New York Times, it sounds like the links used to generate search answers in this new model would be even less visible (little citations next to the stolen, sorry, AI-generated sentences). This could be terribly useful for users. It could also further undermine existing newspaper revenue models and make it easier for misinformation to catch fire and spread.

You can’t walk back new technology. The internet has revolutionized how we access information—democratizing who can share and generate information, which is wonderful. At the same time, online spaces are plagued by misinformation. More liars have a platform and fewer fact checkers have a paycheck.

Technology will change, and we will all adapt to it. AI chatbots and image generators are here, and they’re likely here to stay. I only hope that as we adapt, industries that try to produce reliable information can stay in business.