The more things change, the more they stay the same. I was recently approached by a marketing firm who promised to increase activity on our website. I usually ignore such calls, but for some reason I agreed to hear them out.
They suggested that if we wanted to increase traffic to our website, then we needed to produce more articles than we are currently publishing. They had an AI tool that would write articles for us so we could spend our time “wining and dining” with prospective clients. Obviously the gentleman I was speaking with believed in stereotypes and spent no time actually getting to know us, but that didn’t really matter.
At that point I knew the conversation didn’t need to go any further. If he had actually spent five minutes on our website, he would have known we have no intention of ever using AI to create content for our clients. We often hear that our clients appreciate our Insights and Perspectives. They like that we are willing to tell them exactly what we think on a subject, even if they disagree. They especially like that we communicate like humans, with stories and anecdotes that teach lessons. According to Google research, the two most popular searches that lead someone to our website are, “Did Henry Ford invent the automobile?” and, “How many Tootsie Rolls are too many?” Any functioning AI could have answered those questions, but could it have understood that Henry Ford’s story demonstrates the function of capitalism that leads to innovation, not invention? I don’t even need to ask whether AI would teach the economic law of diminishing marginal utility by revealing a traumatic childhood memory that has left me unable to even think about Tootsie Rolls without gagging since I was 10 years old.
However, I was curious. What would the AI bring back? We chose a subject from a list – Trump’s tax cuts – and the marketing firm gentleman clicked a button. A few minutes later he emailed me a complete article ready to send out to you, our clients and friends. It was complete garbage. To the extent it offered any insight at all, it ranged from translating plain English to plain English (“A tax cut is when the government cuts taxes.”) to politicized nonsense (which doesn’t even deserve repeating). Evidently this particular AI was “trained” on social media posts of the extreme left. It could have been just as bad if it had leaned the opposite way, but in this specific case the AI in question was under the influence of communist propaganda. I want to be clear: these were not the arguments of left-leaning economists; that might have created a one-sided piece, but at least it would have had substance. No, this was the stuff one sees in the comments section, if one is dumb enough to look.

At first I thought this was just incompetence, but then I realized it was exactly what the AI was designed to do. The marketing firm promised more clicks on our site. Substance doesn’t lead to clicks; radical statements lead to clicks. This AI wasn’t trying to educate anyone on tax policy; it was trying to write an article that would generate clicks. If that is the goal, then the crazier the better.
Heidi Mitchell of The Wall Street Journal wrote an interesting article on June 25, “That Chatbot May Just Be Telling You What You Want to Hear.” It turns out that AI is like all the technology that came before it in one sense: garbage in, garbage out. To the extent that it is more human than past technology, it may be so in a not-so-good way: it is human nature not to tell the boss anything the boss doesn’t want to hear.
Will AI take all of our jobs? I think not. It will be a great tool, like the technology that came before it, but the need for human judgment remains. Meanwhile, rest assured that no content from us will be AI-generated, and we won’t be publishing for clicks. We need more substance in our world, and that still comes from the human experience. At least that is my perspective.
Warm regards,

Chuck Osborne, CFA