Ana Rosenstein

The Botscape

Bot fever has seemingly swept the tech world overnight. But in reality, it’s been a long time coming. Bots have been around since the early days of the Internet; they used to be called user agents, long before Kik, Telegram, and Twitter were awash with bots alerting you that your plant needed to be fed. At betaworks, our thinking around this generation of bots grew out of our interest in the notifications layer, and this new layer of interaction on the phone dovetailed with our interest in messaging. As we saw more and more time being spent in messaging channels, finding a way to integrate services and content into those channels became increasingly compelling. We saw push notifications drawing people into apps and services on their devices, and saw that as an opportunity to send people content without their ever opening, or perhaps even downloading, an app. At its core, this is the purpose of a bot. It simulates the services an app provides through a conversational, effectively UI-less, UI. Those conversations trigger notifications. Notifications draw us back into the conversation. And so the cycle begins.

Poncho was our first foray into bots. We launched Poncho as a text service that delivered daily weather alerts. To our surprise, people started talking back to Poncho. DMM came next, allowing Ryan Leslie to interact with his fans en masse. Our vision is for bots like these to exist for everything and everyone where the bot experience is tangibly better than its mobile app or web counterpart.

Next came Dexter, a toolkit to wire up web APIs with conversational interfaces. Howdy piqued our interest as we zeroed in even more closely on this rapidly emerging bot ecosystem. Howdy’s Botkit, which provides the building blocks for building Slack bots, tapped into the early adoption of bots on Slack itself.
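To make that concrete, here is a minimal sketch of a classic Botkit Slack bot. The trigger word, reply text, and environment variable name are placeholder assumptions for illustration, not anything Howdy shipped; only the library calls themselves (slackbot, spawn, startRTM, hears, reply) come from Botkit.

```typescript
// Minimal classic-Botkit Slack bot sketch; token, trigger, and reply are placeholders.
import Botkit = require("botkit");

const controller = Botkit.slackbot({ debug: false });

// Connect to Slack's real-time API using a bot token from your Slack app config.
controller.spawn({ token: process.env.SLACK_BOT_TOKEN }).startRTM();

// Reply whenever someone DMs or mentions the bot with the word "weather".
controller.hears("weather", ["direct_message", "direct_mention"], (bot, message) => {
  bot.reply(message, "Partly cloudy with a chance of notifications.");
});
```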

Last fall, we hosted the Notifications Summit at betaworks to discuss the notifications layer and the rapidly changing ecosystem around it. The conversations and panels throughout the day served as a breeding ground for our current thinking around bots. How can notifications deliver information and eliminate ever having to open an app? How can we tackle the notifications layer without even requiring someone to install an app? A text, a Slack message, and a tweet all share one vital commonality: they elicit a notification. So bots were the natural next step, as they trigger the notifications we found so intriguing.

And you probably know what came next: Botcamp.

So we’ve seen a lot of bots. We’ve talked to a lot of brilliant bots and some not so brilliant ones. But each and every conversational service we’ve interacted with has shaped our thinking on the landscape today.

Throughout the Botcamp application process, we refined the bot landscape as we saw it evolve. And it turns out, it’s not so easy to fit into a neat Lumascape or a traditional x-y chart. In middle school, you’re taught to write a paragraph using the sandwich model: topic and closing sentences are the bread, with the supporting, filler sentences in between. With bots, we wanted to think about the landscape just as simply. If botmakers and bot users (people) are the bread at either end, what is the stuff in the middle? How do we get from creation to consumption and outline the many steps in between?

We’ve iterated and reiterated on the bot landscape (the botscape) and come to the conclusion that the world of bots is perennially changing. The lines between what constitutes NLP and NLU, or AI and machine learning, are blurry and complex, but we’re constantly trying to clarify them. Today, we are publishing V1 of the botscape. It’s going to change; we know it’s incomplete. So, consider it an early attempt to map this new and exciting landscape, and then tell us what’s missing and what should be improved.

On the far left, we’ve placed the language training sets. That’s the data spun into the fabric of the bot, providing the foundation for the bot itself. The data layer can be broken down into three categories: open source, proprietary, and algorithmically derived. Open-source data is that which is publicly available. Proprietary data is the stuff generated and used by a company itself; think Slack or Twitter. There’s a ton of data generated from within Slack, which can then be used as the bedrock for many of the bots that reside within the enterprise tool itself. Algorithmically derived data comes from an alternate source; the best examples here are keyboards. Microsoft acquired SwiftKey for the natural language data, and Google, with its Gboard, likely views it similarly.

The next layer is intelligence. These are the tools that capitalize on the available data sources and, hopefully, spin them into gold. The intelligence layer encompasses the spectrum of NLP and machine learning APIs and libraries, image and speech recognition, and even analytics. We’ve included analytics in this section of the botscape because analytics allow bot creators to distinguish the good experiences users have with a bot from the bad ones, and consequently improve them.

After intelligence, we have the creation layer. There are myriad tools available to spin up a bot; think CMS for bots. Some require technical knowledge, whereas others are more rudimentary. The creation layer is where the bot building blocks, formed in the intelligence layer, are compiled to fashion a working prototype. Sometimes the bot creator builds the bot themselves using a consumer-facing tool. Other times, companies will employ an agency to spin up the bot for them; those are the companies you see listed under “agencies” within the creation layer. We’ve chosen to include payment services within the creation layer as they are often integral to the creation and usability of a bot. Payments are so inextricably linked with the creation of many conversational services that this placement felt most appropriate.

Once the bot is built, we pipe it to different places; this is the layer of the stack we’ve labeled “piping.” Piping overlaps with creation, but we’ve segmented it to draw a subtle distinction. A good example is Twilio. Twilio’s text message API allows bot makers to deploy their bots over SMS; the bot gets built in the creation layer and then piped to SMS through Twilio. Some products, such as Dexter, fall into both camps. Bots can be built in Dexter and then piped through to Slack or SMS using their respective modules.
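As a rough illustration of that hand-off, here is a minimal sketch using Twilio’s Node helper library. The message text, phone numbers, and environment variable names are placeholder assumptions; the messages.create call is the library’s standard way to send an SMS.

```typescript
// Pipe a bot's reply out over SMS via Twilio; credentials and numbers are placeholders.
import twilio from "twilio";

const client = twilio(process.env.TWILIO_ACCOUNT_SID, process.env.TWILIO_AUTH_TOKEN);

// Whatever the bot decided to say in the creation layer gets handed off here.
client.messages
  .create({
    body: "Grab an umbrella, rain starts around noon.",
    from: "+15005550006", // your Twilio-provisioned number
    to: "+15551234567",   // the end user's phone number
  })
  .then((message) => console.log(`Sent ${message.sid}`));
```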

Following the piping layer, we’ve listed a variety of sample bots. The bots shown are far from representative of the botosphere in its entirety, and we hope to expand this list as we revise the botscape. All of these bots can be found through a variety of bot discovery platforms. Some are native to messaging platforms, such as the Slack, Kik, FB, or Telegram app stores, whereas others are standalone directories such as Botlist, Botpages, Dashbot, or BotArena.

The final layer in the botscape is distribution. The distribution layer represents the scope of platforms on which people can interact with all of these conversational services. The platforms vary drastically, and the same bot can look vastly different from one platform to another. The distribution platforms, ranging from social messaging to voice-enabled to enterprise messaging products, are the places people go to interact with bots.

And there we have it. The landscape of bot creation: the turkey, lettuce, and tomatoes sandwiched between creators and consumers. But let’s not forget, this botscape is far from perfect. When we asked above for the community’s help revising the botscape, we meant it. Please leave your thoughts in the comments. Let us know what you think we got right, and more importantly, what we got totally wrong. We plan on incorporating as many of the community’s ideas as possible and releasing V2, V3, V4… or as many versions as it takes to get the whole thing right.

Thanks to Matt Hartman, John Borthwick, and Peter Rojas, as well as the Luma team and Terence Kawaja, who popularized these “scapes.”