CORRECTION: This was a WOSU (my home station) story, not an NPR story. I will direct my complaints in the appropriate direction. Here’s a link to the WOSU audio. Original post follows.
I heard a comment on WOSU this morning that really kinda pissed me off.
* GETS ON SOAPBOX *
* PUTS ON NERD GLASSES *
The comment was the button at the end of a story about software being used to write articles for the Associated Press and other news organizations (mostly for sports stories, financial stories, and other statistics-heavy articles). The question was asked: Can robots have morals or ethics in writing news stories?
Answer: No. Robots cannot have any ethics or morals. Morals are a human thing.
Here’s what annoyed me about this answer.
1) Robots: The term "robot" or "bot" has been used colloquially to refer to any automated process, whether it's chatbots from the old IM days or algorithms like this one. As an engineer, I've always found "robot" a misleading term, because it conjures a lot of images in people's heads that have nothing to do with what you're actually talking about. Robots are hardware; we're really talking about software, and if we want to get technical, we're talking about intelligent systems.
Intelligent systems are not AI, or at least not in the sense the general public thinks of AI. Intelligent systems take a lot of forms, but basically they take in data and respond with a diagnosis, a solution, or a news story. What distinguishes intelligent systems from AI is that they're not generalized. An intelligent system can be complex, but it is essentially a bunch of algorithms designed to tackle one kind of problem: in this case, how to write informative, brief, and factually accurate news stories.
2) Ethics: To say that software doesn’t have ethics is like saying that a book doesn’t. Software is another form of human expression. It is written by a human (hey, like me), the requirements for what the software should do are all determined by humans, and it is evaluated by humans.
What are ethics anyway? Well, in this case our interviewee was referring to a code of journalistic conduct, where the important morals are objectivity, lack of prejudice, and a basic understanding of what humans find important or insensitive.
The specific example discussed was a baseball game in which a pitcher threw a team's first no-hitter in over a decade. The software wrote an article that put this information in the second paragraph. To me, that just sounds like a bad case statement, not an unethical or insensitive piece of software. The human writing the software needs to write code that looks for events we find significant (no-hitters) and for what increases their significance (time since the last no-hitter). If the significance crosses a certain threshold, it goes in paragraph one. Easy.
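To make that concrete, here's a minimal sketch of the kind of case statement I mean. The event names, weights, and threshold are all made up for illustration; they're not the actual rules of any real article-writing system.

```python
# Hypothetical significance scoring for a game event. The weights and
# threshold below are illustrative assumptions, not a real system's values.

YEARS_SINCE_LAST_WEIGHT = 0.5
LEDE_THRESHOLD = 6.0

def significance(event):
    """Score a game event; higher scores belong higher in the story."""
    score = 0.0
    if event.get("no_hitter"):
        score += 5.0
        # A long drought makes the feat more newsworthy.
        score += YEARS_SINCE_LAST_WEIGHT * event.get("years_since_last_no_hitter", 0)
    return score

def paragraph_for(event):
    """Decide where the event belongs in the article."""
    return 1 if significance(event) >= LEDE_THRESHOLD else 2

# A no-hitter after a ten-year drought lands in paragraph one.
print(paragraph_for({"no_hitter": True, "years_since_last_no_hitter": 10}))  # 1
```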
How are ethics and morals implemented in software? Complex mathematical algorithms and/or a bunch of if-then statements.
Good intelligent systems are able to start from a set of rules and modify them, learning new rules by doing. If there's human feedback on the articles produced, or some other acceptable metric that can be tracked through a website (traffic, comments), the software can determine which outputs worked better than others.
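Here's a rough sketch of what that feedback loop might look like. The rule names, the feedback signal (say, page views per article), and the learning rate are all assumptions for illustration, not any newsroom's real metrics.

```python
# Toy feedback loop: nudge the weight of the rules an article used,
# based on how that article performed relative to a baseline.

rule_weights = {"lead_with_no_hitter": 1.0, "lead_with_final_score": 1.0}

def update_weights(article_rules, feedback, baseline, learning_rate=0.1):
    """Increase weights for rules used in articles that beat the baseline,
    decrease them for rules used in articles that fell short."""
    delta = learning_rate * (feedback - baseline) / max(baseline, 1)
    for rule in article_rules:
        rule_weights[rule] += delta

# An article that led with the no-hitter drew more traffic than average,
# so that rule gets a slightly larger say next time.
update_weights(["lead_with_no_hitter"], feedback=12000, baseline=8000)
print(rule_weights)  # {'lead_with_no_hitter': 1.05, 'lead_with_final_score': 1.0}
```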
It’s an old joke among software engineers that “software can do anything.” It’s not true, except that everybody thinks it is, so we have to figure out a way to make it true. But to me, a code of journalistic ethics sounds a lot like a requirements document. A good engineer will figure out a way to take that code and write those evaluations into the decisions the software makes. He or she has ethics, therefore the software does, or at least has them implemented.
One last thing: software might actually be better at getting rid of institutional prejudice based on age, gender, skin color, etc. Even the best of us humans have to get over how we thought about things before. We have to decide we’re not going to make decisions about what we write based on any of those factors, and we still might have underlying prejudices we can’t even acknowledge. In software, you just take those evaluations out. They’re gone forever. Software can be truly impartial.
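As a toy illustration of "taking those evaluations out" (the field names here are hypothetical), excluded attributes simply never reach the decision logic:

```python
# Attributes the story-writing rules are never allowed to see.
EXCLUDED_FIELDS = {"age", "gender", "skin_color"}

def features_for_story(player_record):
    """Build the inputs the article-writing rules may use."""
    return {k: v for k, v in player_record.items() if k not in EXCLUDED_FIELDS}

record = {"name": "J. Doe", "era": 2.95, "age": 34, "gender": "M"}
print(features_for_story(record))  # {'name': 'J. Doe', 'era': 2.95}
```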
Next time you’re doing a story like this one, get an engineer in the discussion. Don’t just ask a writer. We’re easily frightened.
And lose the term “robots”.