Embed analytics experiences in your applications using the Conversational Analytics API

By Google Cloud Tech

Key Concepts

  • Conversational Analytics API: A developer interface enabling natural language querying of data without SQL or dashboards.
  • Data Agent: An AI agent trained on a specific data schema, business logic, and connections.
  • System Instructions & Authored Context: Mechanisms for controlling and grounding the data agent’s responses.
  • Stateful Sessions vs. Stateless Chats: Options for managing conversation context and state.
  • Access Controls (IAM): Built-in Identity and Access Management permissions for managing data agent access and usage.
  • LLM Integration: Ability to combine the API with other Large Language Models (LLMs) like Gemini.
  • Looker MCP Server & Agent Development Kit (ADK): Tools for connecting to and developing data agents.

Introduction to the Conversational Analytics API

The Conversational Analytics API, part of the Gemini stack, allows developers to embed conversational intelligence directly into their applications, leveraging existing data sources. It provides a way to query data using natural language, eliminating the need for users to write SQL or navigate complex dashboards. This API is built on the same engine powering conversational analytics within Looker, but offers broader accessibility through a developer interface.

Core Functionality: Data Agents and Query Translation

The central component of the API is the Data Agent. This is an AI agent specifically trained to understand the structure (schema), underlying business rules, and data connections of a particular dataset. Users submit natural language queries to the API, and the Data Agent translates these queries into structured queries – typically SQL or Looker LookML – which are then executed against the data source. The API then returns both the structured data results and a natural language response.
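To make the round trip concrete, here is a minimal sketch of what one exchange with a Data Agent might look like. The field names (`user_message`, `system_message`, `generated_sql`, and so on) are illustrative assumptions, not the published API schema:

```python
# Hypothetical shape of a single Data Agent exchange (field names assumed).
user_turn = {"user_message": {"text": "Top 5 products by revenue last quarter"}}

# The agent translates the question into a structured query, executes it, and
# returns the generated query, the result rows, and a natural language summary.
agent_turn = {
    "system_message": {
        "generated_sql": (
            "SELECT product, SUM(revenue) AS revenue FROM sales "
            "GROUP BY product ORDER BY revenue DESC LIMIT 5"
        ),
        "data": {"rows": [["Widget", 120000.0]]},
        "text": "Widget led revenue last quarter with $120,000.",
    },
}
```

The key point is that the application receives both halves of the answer: the structured rows for rendering a chart or table, and the text for display in a chat surface.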

Customization and Control: System Instructions & Authored Context

Unlike a static model, the API allows for significant customization. System Instructions and Authored Context provide mechanisms to control the agent’s behavior and ensure accuracy.

  • System Instructions: Allow developers to define overarching rules for the agent, such as always filtering on specific fields.
  • Authored Context: Enables the provision of example natural language questions paired with their corresponding queries. This helps the agent learn the desired query patterns. A glossary of key terms can also be defined within the authored context to ensure consistent interpretation. Instructions on how to join tables based on schema relations can also be provided.
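The pieces above can be pictured as one context object attached to the agent. The structure below is an illustrative sketch only; the actual field names in the API may differ:

```python
# Illustrative authored context for a Data Agent (field names are assumptions).
authored_context = {
    # Overarching rule the agent must always follow.
    "system_instruction": (
        "Always filter orders to status = 'complete' unless the user "
        "explicitly asks about other statuses."
    ),
    # Example question/query pairs that teach the desired query patterns.
    "example_queries": [
        {
            "natural_language": "What were last month's total sales?",
            "sql": (
                "SELECT SUM(amount) FROM orders "
                "WHERE order_date >= DATE_TRUNC(CURRENT_DATE(), MONTH)"
            ),
        },
    ],
    # Glossary entries that pin down how business terms are interpreted.
    "glossary": {
        "ARR": "Annual recurring revenue, summed over active subscriptions.",
    },
    # Join hints derived from the schema's relations.
    "relationships": [
        "orders.customer_id joins to customers.id (many-to-one)",
    ],
}
```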

Conversation Management: Stateful vs. Stateless Approaches

The API supports two primary approaches to conversation management:

  • Stateful Sessions: Maintain context across multiple turns of a conversation by referencing an existing agent or conversation object. This is suitable for applications requiring memory of previous interactions.
  • Stateless Chats: Treat each query as independent, dynamically generating inline context for each request. This is ideal for applications where conversation history is not crucial.
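The difference between the two modes is visible in the request shape. As a hedged sketch (the field names here are assumptions for illustration): a stateful request points at a stored conversation, while a stateless request carries all of its context inline.

```python
def stateful_request(conversation: str, question: str) -> dict:
    """Reference a stored conversation so the agent keeps prior context."""
    return {
        "conversation_reference": {"conversation": conversation},
        "messages": [{"user_message": {"text": question}}],
    }

def stateless_request(datasources: dict, context: dict, question: str) -> dict:
    """Send the full inline context with every independent query."""
    return {
        "inline_context": {"datasource_references": datasources, **context},
        "messages": [{"user_message": {"text": question}}],
    }
```

Stateful sessions let follow-ups like "and by region?" resolve against earlier turns; stateless chats trade that memory for simpler, fully self-contained requests.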

Security and Access Control

The API incorporates Identity and Access Management (IAM) controls with predefined roles. These roles govern permissions for creating, sharing, and interacting with Data Agents, as well as for using the stateless chat functionality, ensuring secure access to sensitive data.

Integration with Other LLMs and Tools

The API is designed for interoperability. It can be combined with the Gemini Live API or other Large Language Models (LLMs) to facilitate complex, multi-turn reasoning across multiple data sources. The Looker MCP server and the Agent Development Kit (ADK) provide tools for connecting to other agents and extending the API’s capabilities. Users familiar with the ADK’s “Ask Data Insights” tool have already experienced the functionality of the Conversational Analytics API.

Practical Example: Python Implementation

A simple Python example demonstrates how to interact with the API. A chat message is sent via an API call, and the agent interprets the question, executes a BigQuery query, and returns a response containing both natural language and structured data. This allows the application to handle the response in a flexible manner.
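A sketch of that flow, using only the standard library, is shown below. The endpoint URL, request payload, and response fields are assumptions for illustration; consult the Conversational Analytics API reference for the actual schema, and note that authentication setup is omitted:

```python
import json
import urllib.request

# Hypothetical endpoint; verify against the official API documentation.
ENDPOINT = "https://geminidataanalytics.googleapis.com/v1beta"

def ask(project: str, agent: str, question: str, token: str) -> dict:
    """POST a natural language question to a Data Agent (illustrative sketch)."""
    payload = json.dumps({
        "data_agent_context": {"data_agent": agent},
        "messages": [{"user_message": {"text": question}}],
    }).encode()
    req = urllib.request.Request(
        f"{ENDPOINT}/projects/{project}/locations/global:chat",
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)

def split_reply(reply: dict) -> tuple[str, list]:
    """Separate the natural language answer from the structured result rows."""
    msg = reply.get("system_message", {})
    return msg.get("text", ""), msg.get("data", {}).get("rows", [])
```

Keeping the natural language text and the structured rows separate lets the application render each however it likes, for example text in a chat bubble and rows in a table or chart.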

Getting Started and Resources

Developers can begin using the API by cloning the quickstart repository, which provides ready-to-run examples in Python. Links to demos, tools repositories, and SDKs in various programming languages are available for further exploration. The API supports building Data Agents using SDKs in a language of the developer’s choice.

Conclusion

The Conversational Analytics API transforms data from static numbers into an interactive dialogue. It empowers users to ask questions about their data in natural language, unlocking insights without the need for specialized technical skills. This API represents a significant step towards making data analytics more accessible and intuitive.
