docker pull lensesio/box
Lenses Box is a complete container solution for building applications against a local Apache Kafka environment running in Docker.
Lenses Box contains all components of the Apache Kafka ecosystem, CLI tools, and synthetic data streams:
1. To start with Box, get your free development license online.
2. Install Docker. This Kafka Docker container works even on a low-memory machine with 8GB of RAM.
3. Run Docker:
docker run --rm \
  -p 3030:3030 \
  --name=dev \
  --net=host \
  -e EULA="https://dl.lenses.stream/d/?id=CHECK_YOUR_EMAIL_FOR_KEY" \
  lensesio/box
Note that the value of the --name parameter can be anything you want; just remember to replace dev with your own value in the docker exec commands on this page. Kafka Connect takes a few minutes to start up, since it iterates over and loads all the available connectors.
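Because Kafka Connect needs a few minutes before everything is ready, one way to wait for the container (a sketch, assuming curl is available on the host and the -p 3030:3030 mapping above) is to poll the UI port:

```shell
# Poll the Lenses UI until it responds; the port matches -p 3030:3030.
until curl -fs http://localhost:3030 > /dev/null; do
  echo "waiting for Lenses Box to start..."
  sleep 5
done
echo "Lenses Box is up"
```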
You can also periodically upgrade to the latest versions with:
$ docker pull lensesio/box
To access the Kafka Docker web user interface, open your browser and navigate to http://localhost:3030
Login with admin / admin.
If you want to learn Kafka, or to quickly create topics, view consumer groups, or monitor Kafka, visit the user guide.
To access the various Kafka command-line tools, such as the console producer and the console consumer, from a terminal inside the Docker container, execute the following command:
docker exec -it dev bash
root@fast-data-dev / $ kafka-topics --zookeeper localhost:2181 --list
Alternatively, you can directly execute a Kafka command such as kafka-topics like this:
docker exec -it dev kafka-topics --zookeeper localhost:2181 --list
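The same docker exec pattern works for the other console tools. For example, to produce and consume messages (a sketch: the topic name demo is hypothetical, and the broker is assumed to listen on the default port 9092):

```shell
# Produce a few messages to a topic (type messages, then Ctrl-D to finish)
docker exec -it dev kafka-console-producer \
  --broker-list localhost:9092 --topic demo

# In another terminal, consume them from the beginning
docker exec -it dev kafka-console-consumer \
  --bootstrap-server localhost:9092 --topic demo --from-beginning
```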
The broker in the Kafka Docker image has broker id 101 and advertises its listener endpoint so that clients can connect.
If you run Docker on macOS or Windows, you may need to find the address of the VM running Docker and export it as the advertised listener address for the broker (On macOS it usually is 192.168.99.100). At the same time, you should give the lensesio/box image access to the VM’s network:
docker run -e EULA="CHECK_YOUR_EMAIL_FOR_KEY" \
  -e ADV_HOST="192.168.99.100" \
  --net=host --name=dev \
  lensesio/box
If you run on Linux you don't have to set ADV_HOST, but you can do something useful with it: set it to your machine's IP address, and any Kafka client on your network will be able to access Kafka.
If you decide to run Box in the cloud, you (and your whole team) will be able to access Kafka from your development machines. Remember to provide the public IP of your server as the Kafka advertised host so that your producers and consumers can reach it.
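For example, running Box on a cloud server might look like this (a sketch following the command above; 1.2.3.4 is a placeholder for your server's public IP):

```shell
# ADV_HOST must be the address your clients will use to reach the broker;
# replace 1.2.3.4 with your server's public IP.
docker run -e EULA="CHECK_YOUR_EMAIL_FOR_KEY" \
  -e ADV_HOST="1.2.3.4" \
  --net=host --name=dev \
  lensesio/box
```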
Kafka JMX metrics are enabled by default; refer to the ports reference. Once you expose the relevant port, e.g. -p 9581:9581, you can connect to JMX with:
jconsole localhost:9581
If you are using docker-machine, setting this up in the cloud, or DOCKER_HOST is a custom IP address such as 192.168.99.100, you will need to use the parameters --net=host -e ADV_HOST=192.168.99.100:
docker run --rm \
  -p 3030:3030 \
  --net=host \
  -e ADV_HOST=192.168.99.100 \
  -e EULA="https://dl.lenses.stream/d/?id=CHECK_YOUR_EMAIL_FOR_KEY" \
  lensesio/box
To persist the Kafka data between multiple executions, provide a name for your Docker instance and do not set the container to be removed automatically (omit the --rm flag). For example:
docker run \
  -p 3030:3030 \
  -e EULA="CHECK_YOUR_EMAIL_FOR_KEY" \
  --name=dev \
  lensesio/box
When you want to free up resources, just press Control-C. You then have two options: either remove the Docker container:
docker rm dev
Or use it at a later time and continue from where you left off:
docker start -a dev
The Docker container is set up to create a handful of Kafka topics and produce data to them. These producers (data generators) are enabled by default. To disable them, set the environment variable -e SAMPLEDATA=0 in the docker run command.
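For instance, a run with the generators disabled might look like this (a sketch combining the flags shown elsewhere on this page):

```shell
# SAMPLEDATA=0 disables the built-in synthetic data generators
docker run --rm -p 3030:3030 \
  -e EULA="https://dl.lenses.stream/d/?id=CHECK_YOUR_EMAIL_FOR_KEY" \
  -e SAMPLEDATA=0 \
  lensesio/box
```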
The Docker image ships with a collection of source and sink Kafka Connectors. There is one Kafka Connect worker with all the connectors pre-set on the classpath, so they are ready to use.
From the UI go to Connectors > New Connector
Follow the instructions of each connector to launch.
From the UI go to SQL Processors > New Processor
Here are a few example SQL Processors for the generated data.
SET defaults.topic.autocreate = true;

INSERT INTO position_reports_Accurate
SELECT STREAM *
FROM sea_vessel_position_reports
WHERE Accuracy = true
SET defaults.topic.autocreate = true;

INSERT INTO position_reports_latitude_filter
SELECT STREAM Speed, Heading, Latitude, Longitude, Radio
FROM sea_vessel_position_reports
WHERE Latitude > 58
SET defaults.topic.autocreate = true;

INSERT INTO position_reports_MMSI_large
SELECT STREAM *
FROM sea_vessel_position_reports
WHERE MMSI > 100000
SET defaults.topic.autocreate = true;

INSERT INTO backblaze_smart_result
SELECT STREAM (smart_1_normalized + smart_3_normalized) AS sum1_3_normalized, serial_number
FROM backblaze_smart
WHERE _key.serial_number LIKE 'Z%'
SET defaults.topic.autocreate=true;
SET auto.offset.reset='earliest';
SET commit.interval.ms='10000';

INSERT INTO cc_data_json
STORE VALUE AS JSON
SELECT STREAM * FROM cc_data;

INSERT INTO cc_payments_json
STORE VALUE AS JSON
SELECT STREAM * FROM cc_payments;

WITH tableCards AS (
  SELECT TABLE number, customerFirstName, customerLastName, blocked
  FROM cc_data_json
);

WITH joined AS (
  SELECT STREAM p.merchantId, p.creditCardId, p.currency
  FROM cc_payments_json AS p
    JOIN tableCards AS c
  WHERE c.blocked = false
);

INSERT INTO frauds_detection_5
STORE VALUE AS JSON
SELECT STREAM count(*) as attempts
FROM joined
WINDOW BY TUMBLE 5s
GROUP BY joined.merchantId;
Download your key locally and run the command:
LFILE=`cat license.json` docker run --rm -it -p 3030:3030 -e LICENSE="$LFILE" lensesio/box:latest
The container runs multiple services, so it is recommended to allocate 5GB of RAM to Docker (although it can operate with less than 4GB).
To reduce the memory footprint, you can disable some connectors and shrink the Kafka Connect heap size by adding these options to the docker run command (adjust the list to keep the connectors you need):
-e DISABLE=azure-documentdb,blockchain,bloomberg,cassandra,coap,druid,elastic,elastic5,ftp,hazelcast,hbase,influxdb,jms,kudu,mongodb,mqtt,redis,rethink,voltdb,yahoo,hdfs,jdbc,elasticsearch,s3,twitter -e CONNECT_HEAP=512m
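Putting it together, a reduced-footprint run might look like this (a sketch; the connector list below is illustrative, so disable whichever connectors you don't need):

```shell
# DISABLE removes the listed connectors; CONNECT_HEAP shrinks the
# Kafka Connect JVM heap to 512 MB.
docker run --rm -p 3030:3030 \
  -e EULA="https://dl.lenses.stream/d/?id=CHECK_YOUR_EMAIL_FOR_KEY" \
  -e DISABLE=hbase,mongodb,redis,cassandra \
  -e CONNECT_HEAP=512m \
  lensesio/box
```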
Lenses Box is optimized for development and includes a single-broker Kafka environment. However, you can connect Lenses to your existing setup. Lenses is compatible with all popular Kafka distributions and clouds.
Contact our team for help.
To make your experience better, we have pre-configured a set of synthetic data generators that provide streaming data out of the box. By default the Docker image launches the data generators; you can turn them off by setting the environment variable -e SAMPLEDATA=0 in the docker run command.
You will need a free key to operate which will need to be renewed after 6 months.
Yes. The Docker image is open source. View it on GitHub.