Managing Lenses SQL
One of the main benefits of Lenses is providing a secure way to query, inspect and control your data, wherever it might be.
Lenses comes with Lenses SQL, a declarative Structured Query Language (SQL) interface, supporting industry standard ANSI joins and aggregates for querying, transforming and manipulating data at rest and data in motion.
Query types
There are two types of syntax for creating queries with Lenses SQL: Snapshot SQL and Streaming SQL.
Snapshot SQL lets you create point-in-time queries that pull existing data at rest from a Kafka topic or other data sources.
Streaming SQL lets you create continuously running, real-time queries for live data manipulation. It is a good fit for Kafka-to-Kafka pipelines, and it runs and scales as a SQL Processor application.

Some configurable Admin settings for Lenses SQL queries include:
- Size read
- Query duration
- Query idle time

For more details, see the configuration settings.
Query content can also be protected by adding a quota on your Kafka cluster under the specific client ID. For more information on how to do this, see content protection.
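As a sketch, a consumer byte-rate quota can be attached to such a client ID with the standard Kafka `kafka-configs` tool. The client ID `lsql-query` and the 10 MB/s rate below are placeholders, not values from this document; use the client ID your Lenses installation is configured to use for SQL queries.

```shell
# Hypothetical example: throttle the bandwidth available to SQL queries
# by attaching a consumer byte-rate quota to their Kafka client ID.
# "lsql-query" is a placeholder client ID; adjust to your installation.
kafka-configs.sh --bootstrap-server localhost:9092 \
  --alter \
  --add-config 'consumer_byte_rate=10485760' \
  --entity-type clients \
  --entity-name lsql-query
```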
Manage queries
Recent queries
Recent queries are displayed for the current session only; they are not currently retained across sessions.
Click on the play button to run a previous query. If a query is already running, you will be asked if you want to stop it first.
View All queries
SHOW ALL QUERIES
View Running queries
You can see all running queries by Lenses users using SQL:
SHOW QUERIES
Kill Running queries
You can force-stop a query run by another user using SQL:
KILL QUERY <id>
Changing the connection
You can query data in other sources connected to Kafka by using the connection name. For example:
USE `lenses-postgresql`;
SELECT * FROM public.metrics_series_values
LIMIT 100;
or by selecting the connection in the SQL Studio.
Lenses SQL Plugins
Lenses is extendable, and the following implementations can be specified:
- SQL UDFs, User Defined Functions (UDF) that extend SQL and streaming SQL capabilities.
- Serializers/Deserializers, your own serializer and deserializer to enable observability over any data format (e.g. Protobuf or Thrift).
- Expressions, parts of a Lenses SQL query that will be evaluated to single values.
- Custom authentication, authenticate users on your own proxy and inject permissions HTTP headers.
- LDAP lookup, use multiple LDAP servers, or your own group mapping logic.
Lenses SQL reference guide
Example
Let’s walk through an example of the following steps using the Lenses CLI, assuming a Lenses Box environment and the paths configured there:
- Identify the Connect clusters that need upgrading.
- For each of them, temporarily stop all SQL Processors. Note down which processors were running so you can start them again later.
- Update the Connector plugin on each Connect cluster.
- Proceed with the upgrade to Lenses.
- Upgrade to the latest CLI.
- Once Lenses is upgraded and has successfully started, restart all processors that were previously stopped.
List existing running processors
lenses-cli processors | grep -iw 'RUNNING\|FAILED\|PENDING' | \
awk '{print "id=" $1 " cluster-name=" $15}' \
> processors.bk
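The field positions above (`$1` for the processor ID, `$15` for the cluster name) depend on the tabular layout emitted by your CLI version. One way to sanity-check the extraction is to feed the awk program a sample line; the 15-field line below is hypothetical, not real CLI output:

```shell
# Hypothetical sample line mimicking `lenses-cli processors` output:
# 15 whitespace-separated fields, with the ID first and the cluster name last.
sample='proc_1 f2 f3 f4 f5 f6 f7 f8 f9 f10 f11 f12 f13 RUNNING cluster-a'
echo "$sample" | awk '{print "id=" $1 " cluster-name=" $15}'
# → id=proc_1 cluster-name=cluster-a
```

If the printed id and cluster name don't match the columns of your actual CLI output, adjust the field numbers before running the backup step.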
Stop each running SQL Processor
while read processor; \
do lenses-cli processor stop $(echo "$processor" | \
awk '{print "--"$1 " --"$2}'); done \
< processors.bk
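Because each line of `processors.bk` already has the form `id=... cluster-name=...`, the inner awk simply prefixes each field with `--` to build CLI flags. You can dry-run this flag construction on a single line before running the loop:

```shell
# Given one backup line, print the flags the loop would pass to
# `lenses-cli processor stop` (line content here is illustrative).
line='id=proc_1 cluster-name=cluster-a'
echo "$line" | awk '{print "--"$1 " --"$2}'
# → --id=proc_1 --cluster-name=cluster-a
```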
Upgrade SQL connector plugin
Plugins are added under the plugin.path of each worker. Each plugin should be in its own subdirectory.
As an example, the Connect worker’s configuration may include the line below, which sets the plugin.path to /usr/share/connectors. This path contains a folder for the lenses-sql-streaming plugin.

plugin.path=/usr/share/connectors

Remove the /usr/share/connectors/lenses-sql-streaming folder. To install the new SQL connector, download the connector archive (lenses-sql-connect-vX.Y.Z.tar.gz) from the client area, extract it, and copy all files under the connector directory to /usr/share/connectors/lenses-sql-streaming. See the further instructions to install a new Connector plugin.

mkdir -p /usr/share/connectors/lenses-sql-streaming
tar -xzf lenses-sql-connect.tar.gz \
  -C /usr/share/connectors/lenses-sql-streaming \
  --wildcards */connector/* --strip-components=2
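If you want to verify the tar flags before touching the real plugin.path, the extraction can be rehearsed against a stand-in archive in a scratch directory. The version `v1.0.0` and the file name `plugin.jar` below are made up for the rehearsal; only the archive layout (`<top-level-dir>/connector/<files>`) mirrors the real bundle.

```shell
set -e
work=$(mktemp -d)

# Build a stand-in archive with the same layout as the real bundle:
# <top-level-dir>/connector/<jar files>
mkdir -p "$work/lenses-sql-connect-v1.0.0/connector"
touch "$work/lenses-sql-connect-v1.0.0/connector/plugin.jar"
tar -czf "$work/lenses-sql-connect.tar.gz" -C "$work" lenses-sql-connect-v1.0.0

# Same flags as the install step above, pointed at a scratch plugin dir.
mkdir -p "$work/plugins/lenses-sql-streaming"
tar -xzf "$work/lenses-sql-connect.tar.gz" \
  -C "$work/plugins/lenses-sql-streaming" \
  --wildcards '*/connector/*' --strip-components=2

# The connector files should now sit directly in the plugin folder.
ls "$work/plugins/lenses-sql-streaming"
```

Note that `--wildcards` and `--strip-components` are GNU tar options; on systems shipping BSD tar, extract the archive first and copy the `connector` directory contents manually.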
Start up the workers.
Proceed with the upgrade to Lenses and the latest CLI
Start back previously running SQL Processors
while read processor; \
do lenses-cli processor start $(echo "$processor" | \
awk '{print "--"$1 " --"$2}'); done \
< processors.bk