What that article doesn't mention is that the program was specifically designed to pretend English wasn't its first language. It was also intentionally presented as a 13-year-old boy, to play into the stereotype of "thinks he knows everything, but doesn't know anything." Chatbots are notoriously poor at carrying on a conversation about a single subject, something also attributable to attention-deficit preteens. Hence, any awkwardness in phrasing and grammar, anything a human ought to know but it doesn't, anything it oughtn't know but does, and its complete inability to stay on topic will all be attributed to its being foreign and preteen. Feels like cheating to me. After all, couldn't I write a program to output random letters and claim "this is a 1-year-old baby at the keyboard"? Reading previous transcripts, it doesn't seem any more advanced than any other chatbot, except for the preprogrammed "foreign adolescent" backstory.