4.1
Plugins
Lenses is extensible, and the following plugin implementations can be provided:
Serializers/Deserializers Plug in your own serializer and deserializer to enable observability over any data format (e.g. Protobuf or Thrift)
Custom authentication Authenticate users on your own proxy and inject permissions HTTP headers. See Authentication
LDAP lookup Use multiple LDAP servers, or your own group mapping logic. See LDAP
SQL UDFs User Defined Functions (UDF) that extend SQL and streaming SQL capabilities. See UDF
Once built, the jar files and any dependencies of the plugin should be added to Lenses and, in the case of Serializers and UDFs, to the SQL Processors if required.
Adding plugins
Lenses
On startup, Lenses loads plugins from the $LENSES_HOME/plugins/ directory and from any location set in the environment variable LENSES_PLUGINS_CLASSPATH_OPTS. These locations are watched, and dropping in a new plugin will hot-reload it. For the Lenses docker image (and Helm chart) you may also use /data/plugins, which is defined as a volume.
Any first-level directories under the aforementioned paths that are detected on startup will also be monitored for new files. During startup, the list of monitored locations is printed in the logs to help confirm the setup:
...
Initializing (pre-run) Lenses
Installation directory autodetected: /opt/lenses
Current directory: /data
Logback configuration file autodetected: logback.xml
These directories will be monitored for new jar files:
- /opt/lenses/plugins
- /data/plugins
- /opt/lenses/serde
Starting application
...
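As a minimal sketch, an extra plugin location can be supplied through the LENSES_PLUGINS_CLASSPATH_OPTS environment variable mentioned above (the path here is illustrative):

```shell
# Illustrative path; point this at a directory containing your plugin jars.
# Lenses adds it to the monitored locations on startup.
export LENSES_PLUGINS_CLASSPATH_OPTS=/opt/custom-plugins
```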
Whilst all jar files may be added to the same directory (e.g. /data/plugins), it is suggested to use a directory hierarchy to make management and maintenance easier. An example hierarchy for a set of plugins:
├── security
│ └── sso_header_decoder.jar
├── serde
│ ├── protobuf_actions.jar
│ └── protobuf_clients.jar
└── udf
├── eu_vat.jar
├── reverse_geocode.jar
└── summer_sale_discount.jar
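The hierarchy above can be laid out with a few shell commands. This is a sketch that uses a temporary directory as a stand-in for /data/plugins, with empty files standing in for the actual jars:

```shell
# Stand-in for /data/plugins; first-level directories here are monitored.
BASE=$(mktemp -d)
mkdir -p "$BASE/security" "$BASE/serde" "$BASE/udf"

# Drop the plugin jars into place (empty files stand in for real jars).
# In a live setup, copying a jar in triggers a hot-reload.
touch "$BASE/security/sso_header_decoder.jar"
touch "$BASE/serde/protobuf_actions.jar" "$BASE/serde/protobuf_clients.jar"
touch "$BASE/udf/eu_vat.jar" "$BASE/udf/reverse_geocode.jar" \
      "$BASE/udf/summer_sale_discount.jar"

find "$BASE" -name '*.jar' | wc -l   # 6
```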
SQL Processors in Kubernetes
There are two ways to add custom plugins (UDFs and Serializers) to the SQL Processors: (1) serving a tar.gz archive at an HTTP(S) address, or (2) creating a custom Docker image.
Archive served via HTTP
With this method, a tar archive compressed with gzip is created containing all the plugin jars and their dependencies. The archive should then be uploaded to a web server that the SQL Processor containers can access, and its address set with the option lenses.kubernetes.processor.extra.jars.url.
Step by step:
- Create a tar.gz file that includes all required jars at its root:
tar -czf [FILENAME.tar.gz] -C /path/to/jars/ .
- Upload to a web server, e.g.
https://example.net/myfiles/FILENAME.tar.gz
- Set the option:
lenses.kubernetes.processor.extra.jars.url=https://example.net/myfiles/FILENAME.tar.gz
For the docker image, set the corresponding environment variable:
LENSES_KUBERNETES_PROCESSOR_EXTRA_JARS_URL=https://example.net/myfiles/FILENAME.tar.gz
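The archive step can be sanity-checked locally before uploading. A sketch using temporary directories and stand-in jars:

```shell
# Stand-in jars; in practice these are your plugin jars and dependencies.
JARS=$(mktemp -d)
touch "$JARS/protobuf_actions.jar" "$JARS/eu_vat.jar"

# Package everything at the archive root (-C changes into the jar directory
# first, so no leading path components end up in the archive).
OUT=$(mktemp -d)
tar -czf "$OUT/plugins.tar.gz" -C "$JARS" .

# List the contents to confirm the jars sit at the root of the archive.
tar -tzf "$OUT/plugins.tar.gz"
```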
Custom Docker image
The SQL Processors that run inside Kubernetes use the docker image lensesio-extra/sql-processor. It is possible to build a custom image with all the required jar files added under the /plugins directory, then set the lenses.kubernetes.processor.image.name and lenses.kubernetes.processor.image.tag options to point to the custom image.
Step by step:
- Create a Docker image using lensesio-extra/sql-processor:VERSION as base, adding all required jar files under /plugins:
FROM lensesio-extra/sql-processor:4.1
ADD jars/* /plugins
Then build the image:
docker build -t example/sql-processor:4.1 .
- Upload the docker image to a registry:
docker push example/sql-processor:4.1
- Set the options:
lenses.kubernetes.processor.image.name=example/sql-processor
lenses.kubernetes.processor.image.tag=4.1
For the docker image, set the corresponding environment variables:
LENSES_KUBERNETES_PROCESSOR_IMAGE_NAME=example/sql-processor
LENSES_KUBERNETES_PROCESSOR_IMAGE_TAG=4.1
SQL Processors in Kafka Connect
To add custom plugins (UDFs and Serializers) to the SQL Processor for Kafka Connect (connector), all the required jars should be added together with the connector jars.
Step by step:
- Extract the SQL Processor connector under the plugin.path of Kafka Connect on all Connect worker nodes. E.g. for plugin.path=/usr/share/java/kafka/plugins/:
mkdir -p /usr/share/java/kafka/plugins/sql-processor
tar -xzf lenses-sql-connect.tar.gz \
  -C /usr/share/java/kafka/plugins/sql-processor \
  --wildcards '*/connector/*' --strip-components=2
- Copy the custom plugins and the rest of the required jars to all Connect worker nodes:
cp /path/to/plugins/jars/* /usr/share/java/kafka/plugins/sql-processor
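The --wildcards/--strip-components extraction in the first step can be tried locally with a stand-in archive. This sketch assumes GNU tar and that the release archive contains a top-level directory with a connector/ subdirectory holding the jars:

```shell
WORK=$(mktemp -d)

# Build a stand-in release archive: <top-dir>/connector/<jars>.
mkdir -p "$WORK/lenses-sql-connect/connector"
touch "$WORK/lenses-sql-connect/connector/sql-connector.jar"
tar -czf "$WORK/lenses-sql-connect.tar.gz" -C "$WORK" lenses-sql-connect

# Stand-in for plugin.path/sql-processor on a Connect worker.
PLUGIN_DIR="$WORK/plugins/sql-processor"
mkdir -p "$PLUGIN_DIR"

# --strip-components=2 drops "<top-dir>/connector/", so jars land flat.
tar -xzf "$WORK/lenses-sql-connect.tar.gz" \
    -C "$PLUGIN_DIR" \
    --wildcards '*/connector/*' --strip-components=2

ls "$PLUGIN_DIR"
```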