DeepText

Facebook has not been shy about its artificial intelligence (AI) ambitions, having long since announced plans to bring AI and machine learning into its core social media platform.

Now it has unveiled its next step in that direction: DeepText.

DeepText, Facebook’s newest artificial intelligence system, is a deep-learning-based text-understanding engine expected to process the textual content of several thousand posts per second in more than 20 languages.

DeepText's goal is not only to recognize the topics being discussed, but also to surface the Facebook services that relate to those topics.

For instance, someone typing the sentence "I need a ride" in the Messenger app will be prompted to use the in-app Uber or Lyft options. The challenge, of course, is recognizing the difference between that sentence and "I like riding donkeys." Similar words, certainly, but very different meanings.
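Facebook has not published how DeepText makes that distinction, so the sketch below is only a toy illustration of the intent-detection task itself, not of DeepText's deep learning architecture. It assumes scikit-learn and a handful of hand-written training sentences.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hand-written training set, purely illustrative.
texts = [
    "can anyone give me a ride to the airport",
    "need a lift downtown tonight",
    "looking for a ride home after the game",
    "anyone able to give me a ride tomorrow",
    "I like riding donkeys",
    "riding my bike through the park was fun",
    "we went horseback riding last weekend",
    "she loves riding roller coasters",
]
labels = ["ride_request"] * 4 + ["no_request"] * 4

# Unigram and bigram features give the classifier a little context
# beyond isolated words; DeepText itself uses deep neural networks.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

print(clf.predict(["I need a ride", "I like riding donkeys"]))
# On this toy data the expected output is ['ride_request' 'no_request'].
```

Even this crude model shows the shape of the problem: the system has to weigh the words around "ride," not just the word itself.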

Another possible use is for selling items (big surprise). A post that reads “I would like to sell my old bike for $200, anyone interested?” would be detected by DeepText as a sale post, and the seller would be pointed to Facebook tools that make the transaction easier.
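How DeepText would turn such a post into structured data is not public either; purely as a hypothetical complement to the sketch above, the snippet below pulls the asking price out of a sale post with a plain regular expression, the kind of detail a selling tool would want pre-filled.

```python
import re

def extract_price(post):
    """Return the first dollar amount mentioned in a post, or None.
    Hypothetical helper -- not part of any Facebook API."""
    match = re.search(r"\$\s?(\d+(?:\.\d{2})?)", post)
    return float(match.group(1)) if match else None

print(extract_price("I would like to sell my old bike for $200, anyone interested?"))  # 200.0
```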

It should be mentioned that DeepText will also analyze what users type and use that analysis to route content from authors to interested viewers more effectively. It likely means the annoyance factor of using Facebook will go up as well.

Powering DeepText

The task is actually much harder than it seems.

At first blush, all that seems to be needed is word recognition and association, and what could be simpler? But analyzing human text is far trickier than it looks: we have the benefit of more than 100,000 years of cognitive and linguistic evolution, so the semiotics of human speech come to us as second nature. A machine starts with none of that context.

To get closer to how humans read text, the AI needs to cope with things like slang and word-sense disambiguation, telling, for example, whether "ride" refers to transportation or recreation. The team behind DeepText also hopes to cut down on the preprocessing the AI needs.

Traditional natural language processing techniques require extensive preprocessing logic built on intricate engineering and language knowledge. Deep learning dispenses with much of this cumbersome business, allowing the system to learn from text with little or no preprocessing. This helps DeepText analyze multiple languages quickly, with minimal engineering effort.
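As a rough illustration of why this works, here is a minimal character-level classifier sketched in PyTorch (the CharConvClassifier name and all dimensions are made up for this example, and the model is untrained, so its scores are meaningless until it is trained on labeled posts). Raw UTF-8 bytes go straight into the network, so the same code path handles any language with no tokenization, stemming, or other language-specific preprocessing.

```python
import torch
import torch.nn as nn

class CharConvClassifier(nn.Module):
    """Toy character-level text classifier: raw bytes in, class scores out.
    Illustrative only -- not Facebook's DeepText architecture."""
    def __init__(self, num_classes, vocab_size=256, embed_dim=16, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)           # one vector per byte value
        self.conv = nn.Conv1d(embed_dim, hidden, kernel_size=5, padding=2)
        self.fc = nn.Linear(hidden, num_classes)

    def forward(self, byte_ids):                                   # byte_ids: (batch, seq_len)
        x = self.embed(byte_ids).transpose(1, 2)                   # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x))                               # (batch, hidden, seq_len)
        x = x.max(dim=2).values                                    # max-pool over time
        return self.fc(x)                                          # (batch, num_classes)

def encode(text, max_len=140):
    """Text to fixed-length byte IDs -- no tokenization or language-specific rules."""
    ids = list(text.encode("utf-8"))[:max_len]
    ids += [0] * (max_len - len(ids))
    return torch.tensor(ids, dtype=torch.long)

model = CharConvClassifier(num_classes=2)
batch = torch.stack([encode("I need a ride"), encode("Necesito un aventón")])  # any language
print(model(batch).shape)  # torch.Size([2, 2]) -- one score per class, per post
```

Because the only "preprocessing" is turning text into bytes, adding a new language mostly means adding training data for it, not building a new pipeline.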

As for the ethical and privacy issues of having an artificial intelligence constantly looking over your shoulder and reading what you type, that is a matter yet to be addressed.

