- A working installation of Docker and Docker Compose. A quick guide is available here.
- Navigate to the docker directory: `cd path_tracing/docker`
- Deploy the docker-compose stack: `docker-compose up -d`
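Once the stack is up, a quick sanity check can save debugging time later. The sketch below assumes the Druid router's default port 8888 is exposed by the compose file; check the actual port mapping in path_tracing/docker if it differs.

```sh
# list the services started by the compose file and their state
docker-compose ps

# the Druid web console/status endpoint is typically exposed on the router's default port 8888
# (assumption -- verify the port mapping in the compose file)
curl -s http://localhost:8888/status
```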
The docker-compose file spins up a fresh Druid install. To configure Druid as a Kafka consumer and as a query source for Turnilo, a few Druid datasources need to be configured. The JSON specifications for the datasources are available in the path_tracing/docker/druid folder.
To add a datasource from the Druid GUI, select "Load Data", then "Edit Spec", and copy/paste one of the schemas. For basic functionality, the datasources pt_probe_processed, pt_probe_global, and pt_probe_hub need to be running. For the optional IPFIX integration, the datasources pt_ipfix_processed and pt_ipfix_joined must also be added.
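As an alternative to the GUI, the same specs can be submitted through Druid's supervisor API. This is only a sketch: it assumes the Druid router is reachable on the default port 8888 and that the spec files in path_tracing/docker/druid are named after their datasources; adjust both to your setup.

```sh
cd path_tracing/docker/druid

# submit a Kafka ingestion (supervisor) spec directly to Druid
# (file name and port are assumptions -- use the actual spec file names and port mapping)
curl -X POST -H 'Content-Type: application/json' \
  -d @pt_probe_processed.json \
  http://localhost:8888/druid/indexer/v1/supervisor
```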
The docker container sets up an nfacctd listener on 192.168.0.100:4739. The setup script already configures all VPP routers to export IPFIX information to this address.
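For reference, the listener address is controlled by pmacct's standard nfacctd directives. The excerpt below is an illustrative sketch only (the Kafka broker hostname and topic name are made-up placeholders); the authoritative settings are in path_tracing/docker/pmacct/config/nfacctd.conf.

```
! illustrative excerpt only -- the real settings live in
! path_tracing/docker/pmacct/config/nfacctd.conf
nfacctd_ip: 192.168.0.100
nfacctd_port: 4739
!
! forward the collected flow records to Kafka for Druid to ingest
! (broker host and topic below are hypothetical placeholders)
plugins: kafka
kafka_broker_host: kafka
kafka_topic: pt.ipfix
```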
To deploy the nfacctd container:
- Navigate to the docker/pmacct directory: `cd path_tracing/docker/pmacct`
- Deploy the docker-compose stack: `docker-compose up -d`
- If you modify the config file (path_tracing/docker/pmacct/config/nfacctd.conf), the docker container needs to be recreated: run `docker-compose down` followed by `docker-compose up -d`.
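After deploying (or recreating) the container, you can confirm that nfacctd started and is receiving IPFIX packets by following the container logs; narrow the output to a single service using whatever service name the compose file defines.

```sh
cd path_tracing/docker/pmacct

# confirm the collector container is up, then follow its logs for incoming IPFIX traffic
docker-compose ps
docker-compose logs -f
```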