
SPLK-1003 Online Practice Questions and Answers

Question 4

Which network input option provides durable file-system buffering of data to mitigate data loss due to network outages and splunkd restarts?

A. diskQueueSize

B. durableQueueSize

C. persistentQueueSize

D. queueSize

Correct Answer: C

Reference: https://docs.splunk.com/Documentation/SplunkCloud/8.2.2111/Data/Usepersistentqueues
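As a sketch of how this looks in practice, persistent queuing is enabled per network input stanza in inputs.conf. The port, index name, and sizes below are hypothetical illustrations, not recommendations:

```
# inputs.conf -- hypothetical TCP input with durable on-disk buffering
[tcp://:5514]
index = network
# in-memory queue for this input
queueSize = 1MB
# on-disk persistent queue that survives network outages and splunkd restarts
persistentQueueSize = 5GB
```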

Question 5

Which option on the Add Data menu is most useful for testing data ingestion without creating inputs.conf?

A. Upload option

B. Forward option

C. Monitor option

D. Download option

Correct Answer: A

Question 6

In which scenario would a Splunk Administrator want to enable data integrity check when creating an index?

A. To ensure that hot buckets are still open for writes and have not been forced to roll to a cold state

B. To ensure that configuration files have not been tampered with for auditing and/or legal purposes

C. To ensure that user passwords have not been tampered with for auditing and/or legal purposes.

D. To ensure that data has not been tampered with for auditing and/or legal purposes

Correct Answer: D
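Data integrity checking is enabled per index in indexes.conf with the enableDataIntegrityControl setting; Splunk then computes hashes on slices of the rawdata so that later tampering can be detected. The index name and paths below are hypothetical:

```
# indexes.conf -- hypothetical index with data integrity control enabled
[audit_logs]
homePath = $SPLUNK_DB/audit_logs/db
coldPath = $SPLUNK_DB/audit_logs/colddb
thawedPath = $SPLUNK_DB/audit_logs/thaweddb
enableDataIntegrityControl = true
```

Integrity can later be verified from the CLI with `splunk check-integrity -index audit_logs`.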

Question 7

A Universal Forwarder is collecting two separate sources of data (A,B). Source A is being routed through a Heavy Forwarder and then to an indexer. Source B is being routed directly to the indexer. Both sets of data require the masking of raw text strings before being written to disk. What does the administrator need to do to ensure that the masking takes place successfully?

A. Make sure that props.conf and transforms.conf are both present on the indexer and the search head.

B. For source A, make sure that props.conf is in place on the indexer; and for source B, make sure transforms.conf is present on the Heavy Forwarder.

C. Make sure that props.conf and transforms.conf are both present on the Universal Forwarder.

D. Place both props.conf and transforms.conf on the Heavy Forwarder for source A, and place both props.conf and transforms.conf on the indexer for source B.

Correct Answer: D

The correct answer is D: place both props.conf and transforms.conf on the Heavy Forwarder for source A, and place both props.conf and transforms.conf on the indexer for source B.

According to the Splunk documentation, to mask sensitive data from raw events you use the SEDCMD attribute in the props.conf file and the REGEX attribute in the transforms.conf file. The SEDCMD attribute applies a sed expression to the raw data before indexing, while the REGEX attribute defines a regular expression to match the data to be masked. These files must be placed on the Splunk instance that parses the data, which is usually the indexer or the heavy forwarder. The universal forwarder does not parse the data, so it does not need these files.

For source A, the data is routed through a heavy forwarder, which parses the data before sending it to the indexer; therefore both props.conf and transforms.conf belong on the heavy forwarder, so that masking takes place before indexing. For source B, the data is routed directly to the indexer, which parses and indexes it; therefore both files belong on the indexer.

References: 1: Redact data from events - Splunk Documentation; 2: Where do I configure my Splunk settings? - Splunk Documentation

Question 8

Which setting allows the configuration of Splunk to allow events to span over more than one line?

A. SHOULD_LINEMERGE = true

B. BREAK_ONLY_BEFORE_DATE = true

C. BREAK_ONLY_BEFORE =

D. SHOULD_LINEMERGE = false

Correct Answer: A

The setting that allows the configuration of Splunk to allow events to span over more than one line is SHOULD_LINEMERGE. This setting determines whether consecutive lines from a single source should be concatenated into a single event. If SHOULD_LINEMERGE is set to true, Splunk will attempt to merge multiple lines into one event based on certain criteria, such as timestamps or regular expressions. Therefore, option A is the correct answer. References: Splunk Enterprise Certified Admin | Splunk, [Configure event line merging - Splunk Documentation]
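For illustration, a minimal props.conf stanza for a multiline sourcetype might look like the following (the stanza name is hypothetical; both settings are standard props.conf attributes):

```
# props.conf -- hypothetical multiline sourcetype (e.g. Java stack traces)
[my_app:java]
SHOULD_LINEMERGE = true
# start a new event only when a line begins with a timestamp
BREAK_ONLY_BEFORE_DATE = true
```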

Question 9

The volume of data from collecting log files from 50 Linux servers and 200 Windows servers will require multiple indexers. Following best practices, which types of Splunk component instances are needed?

A. Indexers, search head, universal forwarders, license master

B. Indexers, search head, deployment server, universal forwarders

C. Indexers, search head, deployment server, license master, universal forwarder

D. Indexers, search head, deployment server, license master, universal forwarder, heavy forwarder

Correct Answer: C

Indexers, search head, deployment server, license master, universal forwarder: this is the combination of Splunk component instances needed to handle the volume of data from collecting log files from 50 Linux servers and 200 Windows servers, following best practices. The roles and functions of these components are:

Indexers: the Splunk instances that index the data and make it searchable. They also perform some data processing, such as timestamp extraction, line breaking, and field extraction. Multiple indexers can be clustered together for high availability, data replication, and load balancing.

Search head: the Splunk instance that coordinates searches across the indexers and merges their results. It also provides the user interface for searching, reporting, and dashboarding, and can be clustered with other search heads for high availability, scalability, and load balancing.

Deployment server: the Splunk instance that manages configuration and app deployment for the universal forwarders. It lets the administrator centrally control inputs.conf, outputs.conf, and other configuration files for the forwarders, and distribute apps and updates to them.

License master: the Splunk instance that manages licensing for the entire deployment. It tracks license usage across all Splunk instances, enforces license limits and violations, and lets the administrator add, remove, or change licenses.

Universal forwarders: lightweight Splunk instances that collect data from various sources and forward it to the indexers or other forwarders. They do not index or parse the data, performing only minimal processing such as compression and encryption. They are installed on the Linux and Windows servers that generate the log files.
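As an illustration of how each universal forwarder is pointed at the deployment server, a minimal deploymentclient.conf on the forwarder might look like this (the host name is hypothetical; 8089 is Splunk's default management port):

```
# deploymentclient.conf on each universal forwarder
[target-broker:deploymentServer]
targetUri = deploy.example.com:8089
```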

Question 10

Which artifact is required in the request header when creating an HTTP event?

A. ackID

B. Token

C. Manifest

D. Host name

Correct Answer: B

Reference: https://docs.splunk.com/Documentation/Splunk/8.2.3/Data/FormateventsforHTTPEventCollector

When creating an HTTP event, the request header must include a token that identifies the HTTP Event Collector (HEC) endpoint. The token is a 32-character hexadecimal string that is generated when the HEC endpoint is created. The token is used to authenticate the request and route the event data to the correct index. Therefore, option B is the correct answer. References: Splunk Enterprise Certified Admin | Splunk, [About HTTP Event Collector - Splunk Documentation]
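A minimal sketch of building such a request in Python's standard library follows. The URL and token are hypothetical placeholders (a real token is generated when the HEC input is created); only the request object is constructed here, and sending it is omitted:

```python
import json
import urllib.request

# Hypothetical endpoint and token -- substitute your own HEC values.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "12345678-1234-1234-1234-123456789012"

payload = {"event": {"action": "login", "user": "alice"}, "sourcetype": "_json"}

req = urllib.request.Request(
    HEC_URL,
    data=json.dumps(payload).encode("utf-8"),
    # The Authorization header carries the HEC token; without it the
    # collector rejects the request.
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    method="POST",
)

print(req.get_header("Authorization"))
```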

Question 11

Which configuration files are used to transform raw data ingested by Splunk? (Choose all that apply.)

A. props.conf

B. inputs.conf

C. rawdata.conf

D. transforms.conf

Correct Answer: AD

https://docs.splunk.com/Documentation/Splunk/8.1.1/Knowledge/Configureadvancedextractionswithfieldtransforms

Use transformations with props.conf and transforms.conf to:

Mask or delete raw data as it is being indexed

Override sourcetype or host based upon event values

Route events to specific indexes based on event content

Prevent unwanted events from being indexed

Reference: https://docs.splunk.com/Documentation/Splunk/8.0.5/Data/Configuretimestamprecognition
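A hedged sketch of how the two files work together, here for index-time routing of matching events to another index (all stanza names, index names, and the regex are hypothetical):

```
# props.conf -- bind a transform to a sourcetype
[my_app:events]
TRANSFORMS-routing = route_errors

# transforms.conf -- route matching events to the "errors" index
[route_errors]
REGEX = level=ERROR
DEST_KEY = _MetaData:Index
FORMAT = errors
```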

Question 12

What is the correct example to redact a plain-text password from raw events?

A. in props.conf: [identity] REGEX-redact_pw = s/password=([^,|\s]+)/ ####REDACTED####/g

B. in props.conf: [identity] SEDCMD-redact_pw = s/password=([^,|\s]+)/ ####REDACTED####/g

C. in transforms.conf: [identity] SEDCMD-redact_pw = s/password=([^,|\s]+)/ ####REDACTED####/g

D. in transforms.conf: [identity] REGEX-redact_pw = s/password=([^,|\s]+)/ ####REDACTED####/g

Correct Answer: B

The correct answer is B, in props.conf:

[identity]

SEDCMD-redact_pw = s/password=([^,|\s]+)/ ####REDACTED####/g

According to the Splunk documentation, to redact sensitive data from raw events you use the SEDCMD attribute in the props.conf file. The SEDCMD attribute applies a sed expression to the raw data before indexing. The sed expression can use the s command to replace a pattern with a substitution string. For example, the expression above replaces any occurrence of password= followed by any characters up to a comma, pipe, or whitespace with ####REDACTED####. The g flag at the end means that the replacement is applied globally, not just to the first match.

Option A is incorrect because it uses the REGEX attribute instead of the SEDCMD attribute; REGEX is used to match events for a transform, not to perform a sed substitution. Option C is incorrect because it places SEDCMD in transforms.conf; SEDCMD is a props.conf setting, while transforms.conf defines transformations (using settings such as REGEX, FORMAT, and DEST_KEY) that props.conf references through a TRANSFORMS- class. Option D is incorrect because it uses both the wrong attribute and the wrong file: there is no REGEX-redact_pw attribute in transforms.conf.

References: 1: Redact data from events - Splunk Documentation
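To see what that sed expression does, here is a rough Python analogue using re.sub. The sample log line is invented, and Python's regex syntax differs slightly from sed's, so this is an approximation for illustration, not the mechanism Splunk itself uses:

```python
import re

# Invented sample event containing a plain-text password.
raw = "2023-06-01 12:00:00 user=alice password=s3cr3t, action=login"

# Rough analogue of: s/password=([^,|\s]+)/ ####REDACTED####/g
# The character class stops the match at a comma, pipe, or whitespace.
masked = re.sub(r"password=[^,|\s]+", " ####REDACTED####", raw)

print(masked)  # the password value no longer appears in the event
```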

Question 13

A security team needs to ingest a static file for a specific incident. The log file has not been collected previously and future updates to the file must not be indexed.

Which command would meet these needs?

A. splunk add oneshot /opt/incident/data.log --index incident

B. splunk edit monitor /opt/incident/data.* --index incident

C. splunk add monitor /opt/incident/data.log --index incident

D. splunk edit oneshot /opt/incident/data.* --index incident

Correct Answer: A

The correct answer is A: splunk add oneshot /opt/incident/data.log --index incident.

According to the Splunk documentation, the splunk add oneshot command adds a single file or directory to the Splunk index once and does not monitor it afterward. This is useful for ingesting static files that do not change or update. The command takes the following syntax:

splunk add oneshot <file> -index <index name>

The file parameter specifies the path to the file or directory to be indexed. The index parameter specifies the name of the index where the data will be stored. If the index does not exist, Splunk will create it automatically.

Option B is incorrect because the splunk edit monitor command modifies an existing monitor input, which is used for ingesting files or directories that change or update over time; it neither creates a new input nor stops monitoring after indexing. Option C is incorrect because the splunk add monitor command creates a new monitor input, which also continues to watch the file for changes rather than stopping after the initial indexing. Option D is incorrect because there is no splunk edit oneshot command in the Splunk CLI.

References: 1: Monitor files and directories with inputs.conf - Splunk Documentation

Exam Code: SPLK-1003
Exam Name: Splunk Enterprise Certified Admin
Last Update: Jun 13, 2025
Questions: 182
