In the beginning, Maxwell would observe chat rooms and websites — learning, listening, and speaking on its own.
The WEF report predicts that persistent gender gaps in science, technology, engineering, and mathematics (STEM) fields over the next 15 years will also diminish women’s professional presence.
But the problem of how gender bias is shaping artificial intelligence and robot development may be even more pernicious than the wallop women will take as a global workforce. The machines and technology that will replace women are learning to be brazenly gendered: Fighter robots will resemble men. Artificial intelligence may soon look and sound far more sophisticated than Tay — machines are expected to become as smart as people — and become dangerously more sexist as biases seep into programs, algorithms, and designs.
The worldwide race to create AI machines is often propelled by the search for the quickest, most effective route to meeting a checklist of human needs. Intelligent machines will eventually tend to our medical needs, serve the disabled and elderly, and even take care of and teach our children.
And we know who is likely to be most affected: women.
Tay’s designers built her to be a creature of the web, reliant on artificial intelligence (AI) to learn to engage in human conversation and to get better at it by interacting with people over social media. She quickly fell prey to Twitter users targeting her vulnerabilities.
For those internet antagonists looking to manipulate Tay, it didn’t take much effort: they engaged the bot in ugly conversations, tricking the technology into mimicking their racist and sexist behavior.
Microsoft’s Twitter chatbot Tay was taken offline within 24 hours of her activation in March 2016 after “she” fell prey to Twitter users targeting her vulnerabilities. (Photo credit: Microsoft/Twitter)

In 1995, James Crowder, an engineer working for Raytheon, created a social bot named Maxwell.
She’s always coming back for more.” That was when Crowder put Maxwell under online parental controls.