Generative AI: Leading from the Front

The recent popularity of large language models (LLMs) has been accompanied by a surge of interest in virtual assistants. Virtual assistants are not limited to chatbots: their most common uses include completing manual tasks, answering questions, and even helping developers write code. Although virtual assistants existed long before LLMs, they are receiving renewed attention because of the natural language text generation and AI-powered reasoning capabilities these models offer.

IBM has been a recognized leader in virtual assistants and AI for many years, and has recently revamped its virtual assistant portfolio by infusing its products with LLMs. The IBM watsonx portfolio comprises the IBM products infused with generative AI capabilities.

This article provides an overview of the watsonx virtual assistant offerings:

IBM watsonx Assistant is a simple-to-deploy tool for creating enterprise-grade chatbots with conversational, natural responses. Users can design custom conversation flows with both user- and AI-generated responses. The tool can be used to build both voice and text virtual agents, and it integrates easily with cloud data sources and APIs. Users should leverage this product if they want LLMs to answer questions based on information in a knowledge store via retrieval-augmented generation (RAG).

IBM watsonx Orchestrate is an AI assistant that leverages generative AI and automation technology to streamline repetitive, tedious tasks for human resources and talent acquisition departments, increasing efficiency and delivering better business results. Orchestrate comes with prebuilt “skills” that integrate with popular tools like Salesforce, Workday, Outlook, and Gmail to automate tasks seamlessly through a no-code interface using Robotic Process Automation (RPA). Custom skills tailored to unique use cases can also be created easily through Orchestrate’s Unified Automation Builder. Users interact with Orchestrate skills through a seamless, enterprise-grade, chatbot-style interface.

IBM watsonx Code Assistant is a family of products powered by IBM Granite models engineered for code generation: watsonx Code Assistant for Ansible Lightspeed and watsonx Code Assistant for Z. watsonx Code Assistant for Ansible Lightspeed helps developers streamline IT automation by automatically generating code for Ansible playbooks. watsonx Code Assistant for Z performs application discovery and analysis, and converts legacy COBOL code to semantically equivalent Java. This helps organizations simplify mainframe application modernization and address skill shortages by migrating to a language with a significantly larger pool of talent.

**watsonx Assistant**
Virtual assistants in watsonx Assistant are built around the concept of the action. Actions are used to interpret customer responses and intents, search for and retrieve information, and apply logic to that information to steer the conversation’s flow. Users may configure their own actions from scratch or choose from pre-configured templates.
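Actions are configured in the watsonx Assistant UI rather than in code, but the underlying logic can be sketched in a few lines. The sketch below is purely illustrative: the action name, step conditions, and responses are hypothetical, not part of the product.

```python
# Hypothetical sketch of how an action's steps drive a conversation flow.
# Real watsonx Assistant actions are configured in the UI, not in code;
# the "order status" action and its steps are illustrative only.

def run_action(steps, user_input, variables):
    """Walk an action's steps, firing each step whose condition matches."""
    responses = []
    for step in steps:
        if step["condition"](user_input, variables):
            responses.append(step["respond"](variables))
    return responses

# A toy "order status" action: one step asks for an order number,
# another answers once the number has been captured as a variable.
order_status_action = [
    {
        "condition": lambda text, vars: "order" in text and "order_id" not in vars,
        "respond": lambda vars: "What is your order number?",
    },
    {
        "condition": lambda text, vars: "order_id" in vars,
        "respond": lambda vars: f"Order {vars['order_id']} is on its way.",
    },
]

print(run_action(order_status_action, "where is my order?", {}))
# -> ['What is your order number?']
```

Each step's condition inspects both the latest user input and the variables collected so far, which is the same pattern the product's step conditions follow.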

Assistants can even be configured to understand natural human voice and to generate voice responses back to customers over the phone. IBM Research has built novel large speech models based on the same Transformer architecture that powers LLMs; these models are capable of understanding and generating human speech for virtual voice agents. IBM’s large speech models are more efficient and cost-effective, and provide better performance than alternatives like Whisper, though users can still integrate such alternatives into watsonx Assistant should they so choose.

The following screen capture is an example of an action used to search Google and then feed the response into another action to process the search results data:

By setting variable values, users can add custom logic to assistant responses and conversation flows. User responses do not need to conform perfectly to pre-defined values to trigger logic. watsonx Assistant is capable of using AI to discover user intent and to learn from user behavior. Users can also provide multiple example variations of responses to better train the assistant’s AI to recognize user responses.
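To see why providing multiple example phrasings helps, consider a minimal sketch of example-based intent matching. watsonx Assistant uses trained ML models for this, not the bag-of-words overlap below; the intent names and examples are invented for illustration.

```python
# Illustrative sketch of example-based intent matching. watsonx Assistant
# uses trained ML models for this; the simple word-overlap scorer below
# only shows why more example phrasings improve recognition.

def match_intent(user_text, intents):
    """Return the intent whose example phrasing best overlaps the input."""
    words = set(user_text.lower().split())
    best, best_score = None, 0.0
    for intent, examples in intents.items():
        for ex in examples:
            ex_words = set(ex.lower().split())
            score = len(words & ex_words) / len(ex_words)
            if score > best_score:
                best, best_score = intent, score
    return best

# Hypothetical intents, each trained with several example variations.
intents = {
    "check_balance": ["what is my balance", "show my account balance"],
    "transfer_funds": ["send money", "transfer funds to savings"],
}

print(match_intent("can you show my balance", intents))  # -> check_balance
```

Adding more varied examples per intent widens the set of inputs that score well, which mirrors how extra training examples improve the real classifier.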

Users who want LLMs to generate responses can integrate them easily with watsonx Assistant. Users can configure session variables to hold the prompts to be sent to LLMs via the watsonx.ai API. These prompts can even contain other session variables representing user or other data. The following screen capture is an example action flow where a user specifies the prompt to a Llama 2 LLM and then invokes another action to send that prompt to the watsonx.ai API.
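Outside the UI, the same call can be sketched in code. The snippet below only assembles the request body; the endpoint, field names, and model ID follow the watsonx.ai text generation REST API as best understood here, and the project ID is a placeholder, so verify the details against the current documentation.

```python
# Sketch of assembling a watsonx.ai text-generation request from assistant
# session variables. Field names and the model ID reflect the watsonx.ai
# REST API as understood at the time of writing; verify against the docs.

import json

# Values that would come from assistant session variables.
session_variables = {
    "user_name": "Dana",
    "topic": "resetting a router",
}

# The prompt template itself can be stored in a session variable and can
# interpolate other session variables, as described above.
prompt = (
    f"You are a helpful support agent. The customer's name is "
    f"{session_variables['user_name']}. Briefly explain "
    f"{session_variables['topic']}."
)

payload = {
    "model_id": "meta-llama/llama-2-70b-chat",
    "input": prompt,
    "parameters": {"decoding_method": "greedy", "max_new_tokens": 200},
    "project_id": "YOUR_PROJECT_ID",  # placeholder, not a real project ID
}

# A custom extension would POST this body to the generation endpoint, e.g.
#   POST https://{region}.ml.cloud.ibm.com/ml/v1/text/generation?version=...
# with an IAM bearer token in the Authorization header.
print(json.dumps(payload, indent=2))
```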

One of the main advantages of watsonx Assistant is its native integration with Watson Discovery and Elasticsearch via watsonx Discovery to enable retrieval-augmented generation (RAG)-powered responses. RAG gives an LLM context about information stored in a knowledge base that is often too large for a model to take in through a prompt alone. With RAG, the system retrieves the parts of the source information most relevant to the user’s input and passes them, along with the user’s query, to the model. This can allow a chatbot to answer a user’s question about the operation of a device based on its manual, even if the manual is 100 pages long. The Watson Discovery search integration can even show the user where in the source documents the assistant found the information.
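The RAG flow just described can be sketched end to end. In production the retrieval step is handled by watsonx Discovery or Elasticsearch; the keyword scorer and the toy "device manual" below are stand-ins used only to illustrate retrieve-then-prompt.

```python
# Minimal sketch of the RAG pattern: retrieve the passages most relevant
# to the user's question, then prepend them to the prompt as context.
# Real deployments use watsonx Discovery / Elasticsearch for retrieval;
# the word-overlap scorer and sample manual here are illustrative only.

def retrieve(question, passages, k=2):
    """Rank manual passages by word overlap with the question."""
    q_words = set(question.lower().split())
    return sorted(
        passages,
        key=lambda p: len(q_words & set(p.lower().split())),
        reverse=True,
    )[:k]

def build_rag_prompt(question, passages):
    """Assemble a grounded prompt: retrieved context plus the question."""
    context = "\n".join(retrieve(question, passages))
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}\nAnswer:"
    )

# A toy three-passage "device manual" standing in for a 100-page document.
manual = [
    "To reset the device, hold the power button for ten seconds.",
    "The warranty covers manufacturing defects for two years.",
    "Connect the device to Wi-Fi from the settings menu.",
]

prompt = build_rag_prompt("How do I reset the device?", manual)
print(prompt)  # the reset passage ranks first and lands in the context
```

The assembled prompt would then be sent to the LLM exactly as in the watsonx.ai call described earlier, with the retrieved passages keeping the answer grounded in the source documents.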

For more information about the RAG pattern, see the watsonx Assistant documentation. For an in-depth explanation of RAG, see the IBM Developer article, “Retrieval augmented generation with large language models from watsonx.ai”.

Once assistants are configured, they can be embedded easily into websites with HTML, and into apps by following the API documentation. watsonx Assistant includes native integrations with WhatsApp and SMS via Twilio, as well as Facebook Messenger, ServiceNow, Mailchimp, Spotify, Slack, and Microsoft Teams. Using integrations like Genesys, end users can be connected with live human agents, and developers can also build their own custom integrations using the API.
