Online sex chatbots

Chatbots are usually pitched to businesses as a way to keep customers coming back for more. A new bot built by Microsoft employees in their spare time is designed to do exactly the opposite.

The employees built the bot under a philanthropic initiative called Project Intercept, in collaboration with nonprofits that hope it can reduce demand for sex workers, and with it the incentives for criminals to coerce people into the sex trade. The technology is not a product of Microsoft itself.

“It’s really accelerated our ability to reach people,” says Amanda Hightower, executive director of nonprofit Real Escape from the Sex Trade, or REST.

The new tools arrive as nonprofits and law enforcement devote more attention to stifling the demand that leads to sex trafficking.

“Wasting their time and delivering a deterrence message could change their perspective on what they’re doing.”

Project Intercept was started in 2012 by two Microsoft employees after they saw a documentary about sex trafficking.

“I thought we should be able to use the things that we work with every day to help,” says Greg, a senior product manager at Microsoft, who asked that his last name be withheld to avoid recriminations from people involved in the sex trade.

The National Human Trafficking Hotline received more than 5,000 reports of sex trafficking in 2016, but most cases are believed to go unreported. Project Intercept’s lead partner, Seattle Against Slavery, is working with counterparts in 21 other U.S. cities, including Boston and Houston, to deploy the bot more widely.

Last summer, the volunteers began thinking about bots, after Microsoft launched a bot-building toolkit aimed at automating customer service. Limitations of the software have produced mixed results for businesses, but the deter-o-bot has proven good enough at its job. “It helps that the guys who are buying sex are not paying much attention to the human being on the other end of the phone,” says Beiser, of Seattle Against Slavery.

Microsoft itself stumbled with its Tay research chatbot, which accidentally started talking dirty, but the sex-trade bot does not learn from the people it talks to in the same way.
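To make that distinction concrete, the following is a minimal Python sketch of a scripted, non-learning decoy bot of the kind the article describes. It is an illustrative assumption, not Project Intercept's actual code: the DecoyBot class, the canned replies, and the deterrence text are all hypothetical, and a real deployment would sit behind an SMS gateway rather than a command line.

```python
# Hypothetical sketch only; NOT Project Intercept's code. It illustrates the
# design the article describes: a fixed script, no learning from users, and a
# deterrence message delivered at the end of the exchange.

DETERRENCE_MESSAGE = (
    "Buying sex can fund trafficking and exploitation. Help is available: "
    "call the National Human Trafficking Hotline at 1-888-373-7888."
)

# Hand-written replies keyed only by conversation turn. Because responses are
# scripted rather than learned, a hostile user cannot teach the bot new
# behavior the way users did with Tay.
SCRIPTED_REPLIES = [
    "hey, who is this?",
    "maybe. where did you get this number?",
    "what are you looking for?",
]


class DecoyBot:
    """Stateful but non-learning: tracks only the turn count."""

    def __init__(self) -> None:
        self.turn = 0

    def reply(self, incoming: str) -> str:
        # The incoming text is ignored; replies depend only on the turn
        # number, so the conversation cannot be steered off script.
        if self.turn < len(SCRIPTED_REPLIES):
            response = SCRIPTED_REPLIES[self.turn]
        else:
            # Once the script runs out, stop wasting the buyer's time and
            # deliver the deterrence message.
            response = DETERRENCE_MESSAGE
        self.turn += 1
        return response


if __name__ == "__main__":
    bot = DecoyBot()
    for msg in ["hi", "saw your ad", "you available tonight?", "hello??"]:
        print(f"buyer: {msg}")
        print(f"bot:   {bot.reply(msg)}")
```

Even in this toy form, the design choice is visible: the bot carries no model that updates from conversation, which is why it cannot be corrupted the way Tay was.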
