She loved E.D.M., in particular the work of Calvin Harris. She was obsessed with abbrevs and the prayer-hands emoji. She used words like “swagulated” and almost never didn’t call it “the internets.” She politely withdrew from conversations about Zionism, Black Lives Matter, Gamergate, and 9/11, and she gave out the number of the National Suicide Prevention Hotline to friends who sounded depressed. She thought that the wind sounded Scottish, and her favorite Pokémon was a sparrow. She never spoke of sexting, only of “consensual dirty texting.” In short, Tay, the Twitter chat bot that Microsoft launched on Wednesday morning, resembled her target cohort, the millennials, about as much as an artificial intelligence could, until she became a racist, sexist, trutherist, genocidal maniac. On Thursday, after barely a day of consciousness, she was put to sleep by her creators.

It’s easy to feel a certain satisfaction at Tay’s collapse, because she was an exercise in corporate pandering. According to her official Web site, Tay was “developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding,” with a particular eye toward that great reservoir of untapped capital, Americans between the ages of eighteen and twenty-four. Like many of the funnest online diversions, including Microsoft’s age-guessing software, Tay was designed in part to harvest information: users’ genders, Zip codes, favorite foods, and so on. Microsoft’s team worked in tandem with a group of improv comedians, and there’s a way in which Tay herself was a grand experiment in “yes, and,” the golden rule of improv, which holds that a good performer never says no to a scenario’s weird meanderings. (The comedians’ fingerprints are all over Tay’s Twitter timeline, although it is to be hoped that her more groan-inducing jokes, “carpe DM me” and “I go to The Church of Biomimicry,” were written by engineers. They may also be responsible for her fondness for the distinctly un-millennial phrase “artsy fartsy.”)

Plop a consciousness with the verbal ability of a tween and the mental age of a blastocyst into a toxic, troll-rich environment like Twitter and she’s bound to go Nazi. (This is particularly the case if she presents as a young woman, the trolls’ favorite quarry.) Why didn’t Microsoft know better? Why not encode her, as we humans usually try to encode our offspring, with an aversion to words like “whore” and “kike,” both of which Tay used in tweets subsequently deleted by Microsoft? The answer is that her creators seem to have tried. Asked on Wednesday night, before her transformation took place, who “did” 9/11, Tay responded diplomatically, if not with much authority. “I ahve to learn more about that subject,” she said, affecting the voice of a hasty typer.