
Implementing Splunk Data Stream Processor (DSP) (ISDSP)

 

Course Content

This hands-on class covers the fundamentals of deploying a Splunk DSP cluster and designing pipelines for core use cases, including installation, source and sink configuration, pipeline design and backup, and monitoring a DSP environment.

Who should attend

This 4-day course is designed for experienced Splunk administrators who are new to Splunk DSP.

Prerequisites

Required:

Recommended:

Course Objectives

  • Introduction to Splunk Data Stream Processor
  • Deploying a DSP cluster
  • Prepping Sources and Sinks
  • Building Pipelines - Basics
  • Building Pipelines - Deep Dive
  • Working with 3rd party Data Feeds
  • Working with Metric Data
  • Monitoring DSP Environment

Outline: Implementing Splunk Data Stream Processor (DSP) (ISDSP)

Module 1 – Introduction to DSP

  • Review Splunk deployment options and challenges
  • Describe the purpose and value of Splunk DSP
  • Understand DSP concepts and terminologies

Module 2 – Deploying a DSP Cluster

  • List DSP core components and system requirements
  • Describe installation options and steps
  • Check DSP service status
  • Navigate the DSP UI
  • Use scloud

Module 3 – Prepping Sources and Sinks

  • Ingest data with DSP REST API service
  • Configure DSP source connections for Splunk data
  • Configure DSP sink connections for Splunk indexers
  • Create Splunk-to-Splunk pass-through pipelines
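A Splunk-to-Splunk pass-through pipeline of the kind built in this module can be sketched in SPL2 roughly as follows. The source and sink function names (`splunk_firehose`, `index`) are recalled from the DSP documentation and may differ between DSP versions, so treat this as an illustrative shape rather than copy-paste-ready code:

```
| from splunk_firehose()
| into index("", "main");
```

Here the pipeline reads events forwarded into DSP and writes them, unmodified, to the default "main" index on the configured Splunk indexer sink.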

Module 4 – Building Pipelines - Basics

  • Describe the basic elements of a DSP pipeline
  • Create data pipelines with the DSP canvas and SPL2
  • List DSP pipeline commands
  • Use scalar functions to convert data types and schema
  • Filter and route data to multiple sinks
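The scalar-function and filtering objectives above can be illustrated with a hedged SPL2 sketch: a `cast` converts a field to a numeric type so a `where` clause can filter on it before the events reach a sink. The function names and cast syntax are assumptions based on DSP 1.x documentation and may vary by version; routing to multiple sinks is done by branching the pipeline (for example, in the DSP canvas), which is omitted here for brevity:

```
| from splunk_firehose()
| eval status = cast(get("status"), "integer")
| where status >= 500
| into index("", "errors");
```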

Module 5 – Building Pipelines - Deep Dive

  • Manipulate pipeline data:
      • Extract
      • Transform
      • Obfuscate
      • Aggregate and conditionally trigger

Module 6 – Working with 3rd party Data Feeds

  • Read from and write data to pub-sub systems like Kafka
  • List sources supported with the collect service
  • Transform and normalize data from Kafka
  • Write to S3

Module 7 – Working with Metric Data

  • Onboard metric data into DSP
  • Transform metric data for Splunk indexers and SignalFx
  • Send metric data to Splunk indexers
  • Send metric data to Splunk SignalFx

Module 8 – Monitoring DSP Environment

  • Back up DSP pipelines
  • Monitor DSP environment
  • Describe steps to isolate DSP service issues
  • Scale DSP
  • Replace DSP master node
  • Upgrade DSP cluster

Online Training

Duration 4 days

Price
  • CAD 2,540
Classroom Training

Duration 4 days

Price
  • Canada: CAD 2,540
 
Schedule

Currently there are no training dates scheduled for this course.