Apache Solr Reference Guide Covering Apache Solr 6.0


| Parameter | Example Value | Default | Description |
|---|---|---|---|
| solr.hdfs.home | hdfs://host:port/path/solr | N/A | A root location in HDFS for Solr to write collection data to. Rather than specifying an HDFS location for the data directory or update log directory, use this parameter to specify one root location and have everything automatically created within this HDFS location. |
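In solrconfig.xml, solr.hdfs.home is typically set on the HdfsDirectoryFactory definition. A minimal sketch (the host, port, and path are placeholders):

```xml
<directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
  <!-- Root HDFS location; index and update log directories are created beneath it -->
  <str name="solr.hdfs.home">hdfs://host:port/path/solr</str>
</directoryFactory>
```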

Block Cache Settings

| Parameter | Default | Description |
|---|---|---|
| solr.hdfs.blockcache.enabled | true | Enable the blockcache. |
| solr.hdfs.blockcache.read.enabled | true | Enable the read cache. |
| solr.hdfs.blockcache.direct.memory.allocation | true | Enable direct memory allocation. If this is false, heap is used. |
| solr.hdfs.blockcache.slab.count | 1 | Number of memory slabs to allocate. Each slab is 128 MB in size. |
| solr.hdfs.blockcache.global | true | Enable/disable using one global cache for all SolrCores. The settings used will be from the first HdfsDirectoryFactory created. |
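These block cache parameters can be set as children of the HdfsDirectoryFactory element in solrconfig.xml (or as system properties). A sketch with illustrative values; note that with direct memory allocation enabled, the cache consumes roughly slab.count × 128 MB of off-heap memory, so the JVM's maximum direct memory size must be large enough to hold it:

```xml
<directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
  <str name="solr.hdfs.home">hdfs://host:port/path/solr</str>
  <bool name="solr.hdfs.blockcache.enabled">true</bool>
  <!-- 2 slabs x 128 MB = 256 MB of direct (off-heap) memory -->
  <int name="solr.hdfs.blockcache.slab.count">2</int>
  <bool name="solr.hdfs.blockcache.direct.memory.allocation">true</bool>
  <bool name="solr.hdfs.blockcache.global">true</bool>
</directoryFactory>
```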

NRTCachingDirectory Settings

| Parameter | Default | Description |
|---|---|---|
| solr.hdfs.nrtcachingdirectory.enable | true | Enable the use of NRTCachingDirectory. |
| solr.hdfs.nrtcachingdirectory.maxmergesizemb | 16 | NRTCachingDirectory maximum segment size for merges, in MB. |
| solr.hdfs.nrtcachingdirectory.maxcachedmb | 192 | NRTCachingDirectory maximum cache size, in MB. |
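The NRTCachingDirectory parameters follow the same pattern in solrconfig.xml; a sketch with the default values written out explicitly:

```xml
<directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
  <!-- Cache small, near-real-time segments in RAM before flushing to HDFS -->
  <bool name="solr.hdfs.nrtcachingdirectory.enable">true</bool>
  <int name="solr.hdfs.nrtcachingdirectory.maxmergesizemb">16</int>
  <int name="solr.hdfs.nrtcachingdirectory.maxcachedmb">192</int>
</directoryFactory>
```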

HDFS Client Configuration Settings

| Parameter | Default | Description |
|---|---|---|
| solr.hdfs.confdir | N/A | Pass the location of HDFS client configuration files; needed for HDFS HA, for example. |
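These settings can also be passed as system properties when starting Solr. A sketch of starting Solr in SolrCloud mode against an HA HDFS cluster (the nameservice URI and configuration path are placeholders for your environment):

```shell
bin/solr start -c \
  -Dsolr.directoryFactory=HdfsDirectoryFactory \
  -Dsolr.lock.type=hdfs \
  -Dsolr.hdfs.home=hdfs://nameservice1/solr \
  -Dsolr.hdfs.confdir=/etc/hadoop/conf
```

Pointing solr.hdfs.confdir at the Hadoop client configuration directory lets Solr resolve the HA nameservice instead of a single NameNode host.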

Kerberos Authentication Settings

Hadoop can be configured to use the Kerberos protocol to verify user identity when trying to access core services like HDFS. If your HDFS directories are protected using Kerberos, you must configure Solr's HdfsDirectoryFactory to authenticate using Kerberos in order to read and write to HDFS. To enable Kerberos

