Transitioning to Splunk Cloud Flashcards
What 5 things does Splunk Cloud Provide?
Hosted and supported by Splunk
Enterprise functionality on another’s machine
Reliability
Faster time to value
Cloud First Feature Releases
Can Splunk Cloud Accommodate both virtual and real infrastructure?
Yes
What two components can either be on prem or in the cloud with a cloud deployment?
-Universal Forwarder or Heavy Forwarder
-Intermediate UF/HF
What are the customer responsibilities for the cloud deployment?
-Forward the data
-Manage configs of sourcetype, index, contextual details
-Admin and coordinate changes: users, retention, configurations, needs associated with Splunk account team or PS
What are the two usage based license types a cloud customer can use?
Ingestion or Infrastructure
Describe ingest based license
-capabilities at a set cost of ingest
-no additional costs to increase resources, or search activities
Describe infrastructure/workload based license
Splunk Virtual Core (SVC) units of data processing capacity used for a mix of ingest and search
-capabilities at a set infra size
-no ingest violations
-prioritizing index or search may impact performance
What are the 7 cloud benefits?
Cloud Support and Ops Provides:
advice/troubleshooting support
Asset management and automated infra deploy
Automated processing and implementation
Regular maintenance and upgrade
Monitor/alert system health/security
IT Ops and security specialists
24/7 NOC
Does Cloud have license pooling or access through the CLI to hosted components?
No and there is SH GUI access only
Can Apps be installed without a vetting process in the Cloud?
No, apps must comply with the vetting policy
What kind of secure forwarding does Cloud offer?
Secure SSL and TLS forwarding
What are the two Cloud Experiences offered?
Classic and Victoria
Victoria - does not need hybrid search or the Inputs Data Manager; modular and scripted inputs run natively. Uses the Admin Config Service API for HEC. Has the option to install premium apps
On Prem vs Cloud access differences
Cloud:
- no CLI
- vetted and approved apps permitted
- can't send TCP/UDP directly
- Scripted alerts only supported in approved apps
- License pooling not supported
- HEC enabled on port 443
- API available through the API self-service app or cloud support
- inbound TCP protocol only with SSL connection
Do Splunk Cloud Users have access to the CLI?
No
Can Direct TCP and syslog inputs be sent directly to Cloud?
Not in Cloud
How is the HEC enabled in the Cloud?
Via the ELB on port 443
What kind of network connection is supported in the cloud?
Inbound TCP protocol only with SSL secure connection
What are the authentication options for managed splunk cloud?
Splunk Native and SAML and LDAP
What are cloud apps installed via and deployed via?
Installed via search head and deployed via management app
When can Cloud apps be installed through self service?
When they are vetted, on splunkbase, or if the customer accepts the liability
With what release of cloud are most apps self service installations
Victoria
What are the parameters of TCP connections needed for splunk cloud
TCP connections need an authorized role, secure token, credentials or certificate validation
What is a hybrid search Head?
On prem SH initiated search to Cloud, can run searches to combine data from multiple locations, blended search on prem and/or cloud indexers
-not used for premium app SH
How does Splunk Version Compatibility work for hybrid searches?
On prem SH must have same major.minor version as cloud
What are the limitations to a hybrid search topology?
Can’t search multiple cloud environments and a Cloud SH can not search on prem environments or another cloud
Can Hybrid SH perform scheduled searches?
No
With which method of searching can the search span multiple Cloud and enterprise environments?
Federated Search
Which method of searching requires special syntax of generating commands?
Federated Search
Which type of search, hybrid or federated, supports workload management
Federated
Describe Federated Search version control and Architecture
Splunk 8.2.x and greater, and supports all search tier management architecture (like clustering)
What three admin tasks occur at the source on-prem components?
Forwarding of events, input definition/parsing (on prem parsing/masking), problem isolation
Through what cloud component does the cloud admin manage knowledge objects
Via splunk cloud search head
With what issues would a customer work with Cloud Support?
Perf and avail issues
Cloud deployment issues
Config changes and maintenance
Install and manage apps
What three things does the Cloud Monitoring Console (CMC) Provide?
Monitoring and details of topology
Ingestion and Search activity of data
Orientation on overall health and performance
How does the CMC (cloud Monitoring Console) differ from the on prem monitoring console?
CMC is pre-configured except for forwarder and workload manager
What is the purpose of the Cloud Migration Assessment App for Splunk?
Deploy on Monitoring Console server or SH to perform pre-checks and guidance on migration
What does the Phased Cloud Migration consist of?
Planning, Config and Artifact Migration, Data Migration, Data Collector Migration, post-implementation checks
What is the average time for a cloud migration?
4-8 weeks
What does the planning phase of cloud migration consist of?
Assessing on prem splunk with health checks, gathering configs, recording priorities
What does the configuration phase of cloud migration consist of?
Preparing Cloud with indexes and authentications, configure cloud and IP Based access controls
What does the Artifact Migration phase of the cloud migration consist of?
Migrating search artifacts, apps, and workflows (dashboards, apps, alerts, field extractions etc)
What does the data migration phase of cloud migration consist of?
-replicate on prem data input and source types, check CIM, initiate historical data migration
-deploy credential app to forwarders, point data sources to splunk cloud, check inputs for ingest path, timestamp/linebreak/extractions
How is access to the cloud enabled?
With authentication credentials via the user interface
User account must be authenticated by splunk or external identity provider
Authorized by assignment to a splunk role(s)
What are the three ways of establishing a user account?
Native Splunk, LDAP/Active Directory, or SAML
What two files maintain the splunk access controls?
Authentication.conf
Is the user who they say they are
Authorization.conf
What resources they can access, tasks they can perform, limits are placed on them
How should customers audit or remove users?
Raise a support ticket
What capability is needed for user manager roles, and is default for sc admins?
Change_authentication capability
What authentication method is not supported by Cloud?
DUO two factor authentication
What is a good way to troubleshoot authentication issues?
Create a unique Splunk Native admin account
When can authentication replicate?
When set up on clustered search heads
When using a mix of native Splunk, LDAP, & SAML users, which will take precedence?
Splunk Native
What user role is reserved for Cloud Ops?
Admin Role
What are two additional user roles that Cloud offers?
Sc_admin and apps
What actions are Splunk Cloud Admin allowed to do and why?
edit/delete Splunk Native Users
Change time zone, and default app for LDAP/SAML users
Due to limited access in the Cloud
How are customer Identity providers connected and managed
Connected to splunk via internet and managed through splunk web
Does Splunk Cloud use existing customer configured accounts?
Yes, enforces user account and password policies, and has the ability to use local usernames and passwords in Splunk, with the option to map IdP groups to Splunk roles
What must customers do to authenticate users in Cloud using LDAP?
-maintain read only, internet accessible LDAP servers
-authenticate and authorize in splunk
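On prem, an LDAP strategy like this lives in authentication.conf (Splunk Cloud exposes the same options through Splunk Web). A sketch with a hypothetical strategy name, host, and DNs; the setting names are standard authentication.conf options:

```ini
[authentication]
authType = LDAP
authSettings = corp_ldap

# Hypothetical strategy; values are placeholders
[corp_ldap]
host = ldap.example.com
port = 636
SSLEnabled = 1
bindDN = cn=splunk_bind,ou=service,dc=example,dc=com
userBaseDN = ou=people,dc=example,dc=com
userNameAttribute = uid
groupBaseDN = ou=groups,dc=example,dc=com
groupMemberAttribute = member
```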
When does Splunk cache user data from LDAP?
The first time a user logs in AND its reloaded for subsequent logins if an update has been made
How many Identity Providers (IdP) can a customer using SAML have?
Limitation is currently 1 IdP
What type of authentication uses digitally signed XML Certificates from an IdP?
SAML
T/F when mapping SAML Groups to roles, only one group can be mapped to one role
False, multiple groups can be mapped to one role
What are the roles that users can have in splunk Cloud?
sc_admin, power, user, apps, can_delete, tokens_auth
What Cloud user role has the highest number of capabilities?
sc_admin
Which user role can add custom user roles
sc_admin
Which user role can manage apps and has some admin capabilities
Apps user role
Define the can_delete user role
Not assigned to any user role or group by default
Can use |delete command to hide data
Define the token_auth user role capabilities
Enables users to configure token based authorization
Custom user role authorization is a combination of what 5 things?
Role inheritance, capabilities, index access, restrictions and resource usage limits
T/F You can adjust the capabilities of inherited roles
False, inherited capabilities or access cannot be disabled
When creating a new user role and assigning indexes, what does selecting the ‘default’ checkbox for an index imply?
The index will be automatically searched without a user specifying “index=<index_name>”
What are role based restrictions used for when setting up a new user role?
Used to restrict the searches a role can use: can set a default time range, indexes fields to filter, field values, concatenation option, and a specific search string to filter results
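On prem these knobs live in authorize.conf (in Splunk Cloud the same options are set through the Roles UI). A sketch with a hypothetical role name and values; the setting names are standard authorize.conf options:

```ini
[role_app_analyst]
# Role inheritance
importRoles = user
# Index access: allowed indexes, and those searched without index=
srchIndexesAllowed = web;app_logs
srchIndexesDefault = web
# Restrictions: search time window limit (seconds) and a search filter
srchTimeWin = 604800
srchFilter = sourcetype=access_combined
# Resource usage limits: concurrent search jobs and search disk quota (MB)
srchJobsQuota = 10
srchDiskQuota = 500
```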
What do ‘Resources’ adjust when setting up a new user role?
Resources manage the:
-role and user search job limit
-role search time window limit
-disk space limit
Can you validate or check on user capabilities in the Cloud?
Yes, using REST API, there are searches you can run to get capability info
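For example, a search-bar check of role capabilities via the REST endpoint (endpoint path per the Splunk REST API reference):

```spl
| rest /services/authorization/roles splunk_server=local
| fields title imported_capabilities capabilities
```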
What is Workload Management?
A rule-based management to allocate compute resources (CPU and Memory) to search, index, and other user workloads
What is the benefit of workload management?
Improve performance, resource availability and productivity:
Separate data ingest from search workload,
Prioritize critical search workloads
Isolate resource heavy searches
What are workload pools?
Logical containers to which resources (CPU/memory) are assigned as part of WLM
What are workload Rules?
A user-defined set of conditions that allocate a search to a workload pool automatically or reduce the impact of expensive searches
EX of assigning pool by set criteria: role=security AND search_type=adhoc
What are workload management Admission Rules?
Filter searches automatically before execution based on user defined conditions like running searches in a certain time range and searches that use excessive resources
Why is “users unable to search” a commonly reported issue?
Unrestricted user access may tie up resources impacting access/functionality
Where might resource availability cause performance issues?
Data replication and search performance
Disk space/storage availability
What measures are taken for disaster recovery in the Cloud?
Site awareness - across 3 availability zones
Automatic Index replication of which all copies are searchable
Splunk Cloud Users should be aware of what two indexes?
“main” - the default index accepts events not assigned an index
“lastchanceindex” pre-defined in Cloud accepting events sent to a non-existent index
What two key file types are within an index?
rawdata files: raw uncompressed data
tsidx files: Time series index files pointing to raw data
Is it best practice to use index=main when searching?
No it is best practice to segregate data into separate indexes and specify your specific index when searching
What type of indexes are available?
Event indexes - unstructured data stored as separate events
Metric indexes - metric data uses less storage and system resources with increased search speed
Describe the process of new and updated indexes are deployed in Cloud.
Done through SH UI: changes transferred to Manager Node (MN) which creates a bundle push to files on indexers
How do buckets roll from hot to warm to cold in the indexes?
Roll when exceeding:
-number of buckets
-index size
-event age
Which index bucket is open for write?
Hot bucket
What 3 options are available as data moves to the frozen state?
-purge/delete default unless archive selected
-archive events unsearchable either in splunk or customer managed archive
-thawed, archive data restored
Describe Splunk Managed Archive data option
Known as Dynamic Data Active Archive: disabled by default and must be purchased in 500GB increments with set retention
Describe Customer Managed Archive data option
Known as Dynamic Data Self Storage where data is moved to customer managed AWS S3
What happens to data accessibility when it is archived?
With Active Archive it is easily restored to Splunk Cloud.
With Self Storage data is no longer accessible via Splunk Cloud. Must be re-ingested
What happens when DDAA Dynamic Data Active Archive is full?
Buckets are deleted
Which storage option is best for auditing and tracking historical data/compliance?
Dynamic Data Self Storage DDSS as it is a long term option needed for those activities
Can data be thawed back into splunk searchable indexes from dynamic data self storage DDSS?
No, it has to be thawed in customer environment and data needs to be re-ingested to be restored and searchable in splunk cloud
How long is restored dynamic data active archive (DDAA) searchable for?
Searchable within 24 hours of being reinstated and searchable for up to 30 days
In what increments can DDAA be purchased?
500GB
How can you delete an archive?
Logging a support ticket
T/F You create the splunk index prior to setting up the AWS S3 bucket name
False, bucket must be created prior to creating the self storage location in splunk
When using the CMC, what would you check indexing performance for?
-missing events
-queue fill issues
-delayed data
When using the CMC,what would you check Indexes and Storage dashboard for?
-Retention and Sizing
-indexes without events
What are several ways missing data can occur (an issue uncovered using the CMC Monitoring Indexes and Storage dashboard)
Missing data can occur if:
-ingest volume exceeds index size
-events have aged out or exceeded searchable age limit
-data is masked using the delete command
For what purpose do you check the Index Detail Dashboard of the CMC?
-ingestion issues
-isolation troubleshooting (through the host source and sourcetype views available)
How are TCP connections permitted in Splunk Cloud?
With an authorized role, secure token, credentials or certificate validation
When would you use a Test server?
-standalone dev enviro for test and POC
-replicate cloud indexes/configs to test data collection, parsing and indexed event quality
What SPlunk functions would be part of the standalone test server?
All functions in a single instance: input, parse, index, and search
What is the test server deployment strategy?
Test inputs/parsing/events in dev enviro then match prod to dev enviro including versions, apps/configs and index names
What type of data collection method monitors local hosts to gather data?
Universal forwarder
What are the key differences between a universal and heavy forwarder?
HF is a full splunk enterprise instance with license and UI, with ability to parse data prior to forwarding it, and can aggregate data from other forwarders to route elsewhere
When would you use a HF over a UF?
Limit HF to:
-mask/filter data before going to cloud
-manage modular inputs or HEC on prem
-host apps not allowed on cloud
How does a user get forwarder credential apps?
Customer stacks are commissioned with the app preconfigured with forwarding and secure connection settings; it is installed on a forwarder that outputs data to cloud
What does a customer welcome pack include?
Stack URL and Forwarder Credential App
What does the forwarder credentials app contain?
limits.conf
Data transmission limit in limits.conf
outputs.conf
Forwarder output configs
SSL secure connection setting
Why would you check the forwarder credentials app limits.conf file?
A high throughput limit in limits.conf can overwhelm an indexing tier, block network traffic, or cause events to be dropped, so check for:
Missing events, queue fill issues, delayed data
How does SSL Compression work in the forwarder credentials app?
In the outputs.conf file, it must be enabled on both sides. The compression is good for network bandwidth. It has a cpu penalty
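A sketch of the two files as they might appear in a credentials app; the group name, host, and limit values are placeholders, and the setting names are standard outputs.conf/limits.conf options:

```ini
# outputs.conf - forwarder output and secure connection settings
[tcpout:splunkcloud]
server = inputs.example.splunkcloud.com:9997
sslVerifyServerCert = true
# SSL compression: saves network bandwidth at a CPU cost; enable on both sides
useClientSSLCompression = true

# limits.conf - cap forwarder throughput in KB/s (0 = unlimited)
[thruput]
maxKBps = 256
```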
Why wouldn't you want unlimited throughput for your forwarders into cloud?
May cause higher forwarder resource usage so maintain a limit to control file monitoring and network usage
T/F UFs can access the internet without going through a firewall
False
What is an intermediate UF?
A UF or collection of UFs, that relay data to splunk; centralized forwarding of data to cloud
Why use an intermediate UF over a UF or HF?
-Limits servers with direct access to internet
-reduces overhead of updating firewall rules for each server added or removed
Why use multiple forwarder ingestion vs HF or intermediate forwarder?
-checkpoints for lossless data collection
-efficient to minimize bandwidth
-built in load balancing, optional encryption, and data compression
-supports multiple inputs and local management
What is an intermediate HF?
Full enterprise instances as a tier to parse and forward data and manage data ingestion PRIOR to indexing in cloud
Why use a Intermediate HF over a UF or intermediate UF?
HF can parse and perform indexer tasks in the customer controlled area
-parse and anonymize with or without writing and indexing events
-data can be removed prior to forwarding to Cloud
What must be considered when using an intermediate HF in terms of troubleshooting, parsing and ingesting?
-troubleshooting could be challenging because hidden intermediate modifies/parses
-parsed data not parsed again in cloud index
-ingested data may differ from original data
T/F For both structured and unstructured data, data parsing can occur on UF and HF
False, only structured data parsing can occur on both. Unstructured data can not be parsed on a UF
How can splunk Cloud get data from TCP or UDP input?
Get these inputs on a forwarder because Splunk Cloud can't accept direct network inputs or UDP traffic
T/F Best practice is to put network inputs on an intermediate forwarder
False, Do not put on intermediate. Collect on separate dedicated forwarders
Splunk merges what kind of data until it finds a timestamp by default?
UDP data
How do you get syslog data into splunk cloud?
Send to syslog collector writing to a directory structure and monitor the directory using host_segment
How does splunk cloud handle SNMP traps
Write traps to file and monitor input. Collect on prem to parse and filter before forwarding
When using the CMC Monitoring Forwarder Instance, what do you want to check for?
Connection issues: review version and compatibility
Missing/delayed events: forwarder must be connected and sending _internal data
What is the CMC Forwarders: Instance dashboard used for?
-Drill downs into performance of active/inactive forwarders over time
-Investigate perf and connectivity
What is the CMC Forwarders: Deployment dashboard used for?
-shows active/inactive forwarder connecting to splunk directly
-shows connectivity
When using the CMC Monitoring Forwarder Deployment, what do you want to check for?
-Connection and forwarding issues: check when last connected to indexers
-Missing forwarders: check forwarder status
-Checking IDM input rate
What is REST API?
An application programming interface conforming to the REST architectural style, used by vendors to expose data and internal management endpoints
What is a software messenger delivering requests to providers and returning responses to requesting Client?
API
What is REST?
Representational State Transfer - defined constraints for HTTP(s) web services and provides direct interaction with web-based clients
What are APIs used for?
Expose data and management endpoints providing ability to manipulate outputs or filter results to manage ingestion
What are some benefits of using a REST API for ingestion?
Manipulate outputs or filter results to manage ingest inflight = reduce data volume and increase ingest speed.
-use less system resources
-minimal impact on performance
What API tasks are restricted in Splunk Cloud?
Modifying client server configs/components
Restarting deployment or executing debug
What are PUSH/PULL requests when using an API?
It is how a REST Client reaches endpoints
-PUSH: source attempts to deliver data from streaming source to endpoint
-PULL: streaming or file input source where source continually publishes to an endpoint
- channel opens, Splunk connects to input and ingests the content
What API requests does splunk support?
GET, POST, DELETE
Where can Cloud REST API ingestion Apps and Add-ons be installed?
On an IDM in classic Cloud, indexers and/or SH
Can non-Splunk Cloud compliant REST API ingestion TAs be used?
Yes, if installed on an on-prem HF
What type of ingestion reduces the overhead of maintaining machines and network infrastructure?
Using REST API
What happens to on-prem API inputs.conf files when deployed in cloud?
They are encrypted and use a key for security
Describe how the API Input TA can be installed
Installed as either:
invisible app with a list of data inputs on the configure data collection page
Visible app with UI for managing and configuring inputs
What method is used to collect diagnostic data from OS commands?
Scripted Inputs
What do Scripted inputs do?
Schedule a script execution and index the output
Why use a scripted input instead of another ingest method
It can gather transient data that cannot be collected with monitor or network inputs
Ex: API, message queue, web service, custom transaction
What kind of scripts are supported with scripted inputs
Shell (.sh), Batch (.bat), PowerShell (.ps1), Python (.py)
Where can scripts be executed from?
SPLUNK_HOME/etc/apps/<app_name>/bin
SPLUNK_HOME/etc/system/bin
SPLUNK_HOME/bin/scripts
In the inputs.conf of a scripted input, what does the interval setting mean?
Interval is the time period between script executions - default is 60 seconds
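A minimal sketch of a scripted input, assuming a hypothetical script name and metric: a Python script that writes one key=value event to stdout each run, which Splunk then indexes. It would be scheduled with an inputs.conf stanza like `[script://$SPLUNK_HOME/etc/apps/my_app/bin/disk_usage.py]` with `interval = 60`.

```python
#!/usr/bin/env python3
# Hypothetical scripted input: report disk usage for a path as one event.
# Splunk runs the script on the configured interval and indexes stdout.
import shutil
import time

def disk_usage_event(path="."):
    """Format a single key=value event line for the given filesystem path."""
    usage = shutil.disk_usage(path)
    pct_used = 100.0 * usage.used / usage.total
    timestamp = time.strftime("%Y-%m-%dT%H:%M:%S")
    return (f"{timestamp} path={path} total_bytes={usage.total} "
            f"used_bytes={usage.used} pct_used={pct_used:.1f}")

if __name__ == "__main__":
    print(disk_usage_event())
```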
What is HEC?
HTTP Event Collector is a secure and scalable token based HTTP input.
Sends events to splunk without using forwarders
What method can you use to ingest info from web browsers, automation scripts or mobile apps?
HTTP Event Collector HEC
What method of ingestion can facilitate logging from distributed, multimodal and/or legacy environments?
HEC
What are some considerations when using HEC in the cloud?
-HEC enabled by default
-Can't change config files because no direct access to indexers
-Doesn't support forwarding to output groups
T/F Using HEC will increase infrastructure Overhead
False, it will reduce infra overhead
How is encryption handled when using HEC?
All data is encrypted in transit using TLS 1.2+
What port must be used when using HEC?
Port 443 and customer cannot change
What is the default max content length for HEC in the cloud?
1MB
How do you enable HEC for Kinesis Firehose or make changes to HEC ?
Through filing a support ticket
Can raw payloads be sent to HEC?
Yes, HEC allows any arbitrary payloads, not just JSON. But must use channels similar to ACK and events must be bound within a request
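The token-based event flow above can be sketched in Python; the stack URL and token are placeholder assumptions, and the payload shape follows the documented HEC event format:

```python
import json
import time

# Placeholder stack URL and token - not real values
HEC_URL = "https://example.splunkcloud.com:443/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def build_hec_event(event, sourcetype, index, epoch=None):
    """Return (headers, body) for a single HEC event POST."""
    headers = {"Authorization": f"Splunk {HEC_TOKEN}"}
    payload = {
        "time": epoch if epoch is not None else time.time(),
        "sourcetype": sourcetype,
        "index": index,
        "event": event,  # may be a string or a JSON object
    }
    return headers, json.dumps(payload)

headers, body = build_hec_event(
    {"action": "login", "user": "alice"}, sourcetype="app:auth", index="web")
# POST `body` with `headers` to HEC_URL over TLS on port 443 (e.g. with
# urllib.request); events reach Splunk without any forwarder in the path.
```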
How are private apps installed on Cloud?
Uploaded via the App manager and vetted via the appcert process
When do you contact support for Cloud app installs?
When an app on splunkbase indicates request install or multiple apps need installing
When should you get assisted installation on apps from splunkbase?
Apps for hidden components
Bulk installation or planned migration
Scheduling in specific maintenance windows
What is considered ‘unsafe practices’ when determining if an app is prohibited
Using elevated permissions
Running processes that manipulate OS, file, or security settings
What is categorized as prohibited behavior in a Cloud app?
Privilege escalations, precedence elevation, using local folder, reverse shells, splunk restart, OS manipulation
Cross site scripting dashboards
Config changes to core Splunk or underlying OS files
Manipulation of OS, Remote Shells, insecure comms and creds storage
Data exfiltration or export
What is App Inspect for a Dev Environment?
An automated vetting process that could require manual review. Offered two ways outside the UI:
-CLI: uses the “cloud” tag to validate
-API: uses the “self-service” tag to run the package toolkit; can run antivirus checks
When do you use Splunk AppInspect API?
To validate an app for Cloud prior to install or in preparation for updated settings/configs
What do you use the cURL GET command for when using the AppInspect API?
Use cURL GET with a splunk username to the Splunk AppInspect API to obtain HTTP auth token
How do you submit an app to AppInspect using the API?
First use GET request to obtain HTTP token. Then use POST request to submit app to validation endpoint.
Produces a request_id for tracking purposes
How do you perform a status check of the AppInspect API?
Send a cURL GET request for either a status check or to retrieve a validation report. Both leverage the request_id produced when app was originally submitted
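The three-step flow above, sketched as cURL calls (endpoints per the public AppInspect API docs; the username, package file, token, and request_id are placeholders):

```shell
# 1. Obtain an auth token with splunk.com credentials
curl -u my_user https://api.splunk.com/2.0/rest/login/splunk

# 2. Submit the app package for validation (returns a request_id)
curl -H "Authorization: bearer <token>" \
     -F "app_package=@my_app.tgz" \
     https://appinspect.splunk.com/v1/app/validate

# 3. Poll status / retrieve the report using the request_id
curl -H "Authorization: bearer <token>" \
     https://appinspect.splunk.com/v1/app/validate/status/<request_id>
curl -H "Authorization: bearer <token>" \
     https://appinspect.splunk.com/v1/app/report/<request_id>
```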
Do you need an updated version and build number to vet an upgraded/updated app?
Yes otherwise app could fail if build numbers are identical to previous checks
When will Splunk Cloud Classic need a restart when uploading apps?
When app contains static assets, props and transforms
Where is a rolling restart required when installing cloud apps?
Apps or configs deployed to indexers require rolling restart
How is syslog data ingested in the cloud?
Logs are collected locally, then forwarded to Splunk Cloud
What are the 2 options to collect Syslog data into the Cloud?
Sent through an intermediate tier
- Reliable delivery via forwarder
- Requires on prem syslog server for parsing and filtering
Splunk Connect for Syslog (SC4S):
- Containerized Syslog-ng server with data source library
- Filters for ID, Parse and format
- Reduces config and management of syslog servers
- Repeatable, concise and prescriptive solution
How can you get visibility where collection agents are prohibited?
Access streaming data, data off the wire
What OS is supported for collecting streaming data?
Windows, Mac, Linux
What ingest method uses rapid agentless deployment to collect real time data?
Streaming data / data off the wire
What are the 3 phases of stream data collection?
Data Collection points
Streaming data processing
Forwarding data
What is Splunk DSP, Data Stream Processor
DSP provides real time stream processing to collect, input connectors, process via DSP, and deliver data to splunk via output connectors
What is Splunk SPS, Stream Processor Service?
Cloud feature using real time stream processing to collect, process and deliver data to splunk
Flexible/scalable, using SVCs
What is IDM, Inputs Data Manager?
Single hosted data input component in Cloud Classic available for scripted and modular inputs
-IDM is not an app, it hosts input apps
How are IDM apps installed?
Via support ticket request or uploaded and added by engaging support or PS engineers
When is it best practice to use the IDM?
Use for Cloud Vendor Services data collection and install cloud based ingestion addons to the IDM
Can an IDM accept TCP/UDP inputs like syslog and inputs from HEC
No
What are some limitations of the IDM
Limited on scaling and ingest volume as well as concurrent searching (limit 10)
How do you get custom inputs on the IDM?
Create modular/scripted inputs and package configs as private app that will need to get vetted then uploaded by support/PS. Manage through IDM login
How do you get vendor inputs on the IDM?
find prebuilt apps/addons and have PS/support upload to IDM. Configure access controls and manage through IDM login
How do you parse and modify data before forwarding?
By using a Heavy Forwarder, where you can perform indexer like tasks in the customer controlled environment: parse/mask/remove data before indexing
What service is able to parse, modify, and filter data prior to writing events to disk?
Stream Processor Service: manage data ingestion prior to indexing in cloud
What main issues can impact user experience and information quality
Line breaking: lines in event exceed TRUNCATE setting
Timestamp parsing: extraction unsuccessful
Aggregation: exceeding number of lines per event set in MAX_EVENTS
What info is gathered at the input phase
Host, sourcetype, source, index
What actions occur at the parsing phase?
Line breaking, date/time extraction, event level processing, adjust meta fields
If you are changing extraction settings in sourcetype, what conf file do you need to update these changes to?
props.conf
What is an efficient way to break single line events when parsing?
Automatic line breaking is used, but it is more efficient to explicitly set SHOULD_LINEMERGE = false
What is an efficient way to break multi line events when parsing ?
While splunk will attempt to find boundaries it is more efficient to set:
BREAK_ONLY_BEFORE_DATE = true (default)
BREAK_ONLY_BEFORE = <regex>
MAX_EVENTS = 256 (default)
What can be used to more efficiently extract date/timestamp in an event?
For the timestamp set:
TIME_PREFIX = <regex>
MAX_TIMESTAMP_LOOKAHEAD = <integer>
Specify time format and time zones
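Combined into a props.conf sourcetype stanza (the sourcetype name, regexes, and time format are hypothetical; the setting names are standard props.conf options):

```ini
[my:app:log]
# Single-line events: skip line merging for efficiency
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
# Timestamp extraction: anchor the search and bound the scan
TIME_PREFIX = ^\[
MAX_TIMESTAMP_LOOKAHEAD = 25
TIME_FORMAT = %Y-%m-%d %H:%M:%S
TZ = UTC
```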
How can poor time extraction lead to missing events?
Ingested but unavailable in the specified time range
Events rolled off as they are outside retention period
Events not ingested because ‘dates’ beyond allowed range
How could you end up with duplicate events if there is a timestamp extraction issue?
Splunk assigns a timestamp from a previous event if it cant find one
What kind of data prep should be done before mass ingestion?
Evaluate event breaking and date/timestamp settings, then use a test instance on prem and cloud, then redirect to prod
What is splunk data preview used for?
Creating new sourcetypes and adjusting config settings
How can you hide or delete sensitive or identifying data prior to forwarding to Cloud?
Use an on prem Heavy Forwarder to modify _raw data
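On an on-prem HF this is commonly done with a SEDCMD in props.conf, which rewrites _raw before forwarding; a sketch with a hypothetical sourcetype and pattern:

```ini
[my:app:log]
# Mask all but the first four digits of a card-number-like pattern in _raw
SEDCMD-mask_card = s/(\d{4})-\d{4}-\d{4}-\d{4}/\1-xxxx-xxxx-xxxx/g
```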
Why should you avoid indexing ‘dirty’ data?
Minimizes delays; improves search accuracy, data quality, and ingestion time; avoids issues rendering dashboards
What can users with the can_delete capability do?
Use the |delete command to hide data from searches, but it still consumes disk space
For what problem do you NOT contact cloud support?
Resizing, License changes, purchases
For what problems should you contact cloud support?
Unable to resolve issue or perform problem isolation
Capacity or config changes
Unable to log into cloud
What is the difference between a Splunk Support engineer and a Customer Support Engineer?
Customer support may troubleshoot, submits support tickets, manages expectations/best practices
Splunk support provides solutions to product and complex issues and troubleshoots technical problems
When troubleshooting what are the 3 likely areas that search can fail?
Search request, data retrieval, and manipulation
What should you consider when you have issues with a search due to User failures?
Check user capabilities, roles, group mappings, access/resource limits
At what stages can data ingestion be disrupted?
Collection, forwarding, intermediate stage or at the indexing tier
What steps do you take if data ingestion is disrupted at the collection and forwarding stages?
Check if splunk has access to the data and find if data forwarding is configured via inputs/outputs settings
What steps do you take if data ingestion is disrupted at the forwarding stage?
Check the output, limits.conf, restricted bandwidth
What steps do you take if data ingestion is disrupted at the intermediate forwarding stages?
Is it receiving any data, confirm receive and send ports, is it parsing or indexing and parsing data?
The cloud monitoring console or CMC is preconfigured so long as customers do what?
Enable forwarders and workload management
What is Splunk Diag?
A diagnostic snapshot providing insight into an on-prem Splunk instance with current component configs and customizations
When should you run a splunk diag?
Before and after upgrades; it creates a backup of configs/settings for faster restore and easier change audits, and helps Splunk Cloud support with troubleshooting
How do you collect a diag?
Run SPLUNK_HOME/bin/splunk diag
What is Btool?
CLI troubleshooting tool used to audit configs to see what values are being used by splunk
What are the limitations of Btool?
Only shows merged on-disk configs (as of the last restart), not the settings Splunk is currently using