public class CategoryAttributesStream extends TokenStream
An attribute stream built from an Iterable of CategoryAttribute. This stream should then be passed through several filters (see
CategoryTokenizer) until a token stream is produced that can be indexed by Lucene.
A CategoryAttributesStream object can be reused for producing more than one
stream. To do that, the user should cause the underlying Iterable object to
return a new Iterator, and then call reset() to allow this stream to be used again.
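The reuse pattern described above can be sketched without Lucene on the classpath. The class and method names below (IterableTokenStream, current()) are hypothetical stand-ins for CategoryAttributesStream and its CategoryAttribute handling; the point is only how reset() lets one object produce more than one stream:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Minimal, Lucene-free sketch of the reuse pattern: the stream lazily
// obtains an Iterator from the underlying Iterable, and reset() discards
// it so the next pass re-obtains a fresh one.
class IterableTokenStream {
    private final Iterable<String> categories; // stands in for Iterable<CategoryAttribute>
    private Iterator<String> iterator;         // null until the first incrementToken()
    private String current;

    IterableTokenStream(Iterable<String> categories) {
        this.categories = categories;
    }

    // Advances to the next category; returns false when the stream is exhausted.
    boolean incrementToken() {
        if (iterator == null) {
            iterator = categories.iterator();
        }
        if (iterator.hasNext()) {
            current = iterator.next();
            return true;
        }
        return false;
    }

    String current() {
        return current;
    }

    // Resets this stream to the beginning: the next incrementToken() call
    // asks the Iterable for a new Iterator.
    void reset() {
        iterator = null;
    }

    public static void main(String[] args) {
        List<String> cats = Arrays.asList("books", "music");
        IterableTokenStream stream = new IterableTokenStream(cats);
        int first = 0;
        while (stream.incrementToken()) first++;
        stream.reset();               // reuse the same stream object
        int second = 0;
        while (stream.incrementToken()) second++;
        System.out.println(first + " " + second); // prints "2 2"
    }
}
```

Note that this only works if the Iterable really returns a fresh Iterator each time; an Iterable wrapping a one-shot Iterator would yield an empty second pass.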
Method Summary

void reset(): Resets this stream to the beginning.

Methods inherited from class org.apache.lucene.util.AttributeSource:
addAttribute, addAttributeImpl, captureState, clearAttributes, cloneAttributes, copyTo, equals, getAttribute, getAttributeClassesIterator, getAttributeFactory, getAttributeImplsIterator, hasAttribute, hasAttributes, hashCode, reflectAsString, reflectWith, restoreState, toString
Field Detail

protected CategoryAttribute categoryAttribute
Constructor Detail

public CategoryAttributesStream(Iterable<CategoryAttribute> iterable)

Parameters:
iterable - an Iterable of CategoryAttribute, from which categories are taken.
Method Detail

public final boolean incrementToken() throws IOException

Consumers (i.e., IndexWriter) use this method to advance the stream to the next token. Implementing classes must implement this method and update the appropriate
AttributeImpls with the attributes of the next token.
The producer must make no assumptions about the attributes after the method
has returned: the caller may arbitrarily change them. If the producer
needs to preserve the state for subsequent calls, it can use
AttributeSource.captureState() to create a copy of the current attribute state.
This method is called for every token of a document, so an efficient
implementation is crucial for good performance. To avoid calls to
addAttribute(Class) and getAttribute(Class), references to all
AttributeImpls that this stream uses should be
retrieved during instantiation.
To ensure that filters and consumers know which attributes are available,
the attributes must be added during instantiation. Filters and consumers
are not required to check for availability of attributes in incrementToken().
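The performance advice above (look attributes up once, not per token) can be sketched with a Lucene-free analogue. CachedAttributeStream and CharTermAttr below are hypothetical stand-ins for a TokenStream and an AttributeImpl; the attribute map plays the role of the AttributeSource:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the pattern: the attribute is added and its reference cached
// during instantiation, so the per-token hot path does no map lookups and
// downstream consumers can see up front which attributes are available.
class CachedAttributeStream {
    // Mutable attribute implementation, analogous to a Lucene AttributeImpl.
    static class CharTermAttr {
        String term = "";
    }

    private final Map<Class<?>, Object> attributes = new HashMap<>();
    private final CharTermAttr termAttr;   // cached once, at instantiation
    private final List<String> tokens;
    private int pos = 0;

    CachedAttributeStream(List<String> tokens) {
        this.tokens = tokens;
        // Added during construction, never inside incrementToken().
        this.termAttr = addAttribute(new CharTermAttr());
    }

    private <T> T addAttribute(T impl) {
        attributes.put(impl.getClass(), impl);
        return impl;
    }

    // Called once per token: only the cached reference is touched.
    boolean incrementToken() {
        if (pos >= tokens.size()) return false;
        termAttr.term = tokens.get(pos++);
        return true;
    }

    String term() {
        return termAttr.term;
    }
}
```

The design choice mirrors the Javadoc contract: because every filter and consumer registers its attributes at construction time, nobody needs to re-check availability on the per-token path.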
public void reset()

Resets this stream to the beginning. TokenStream.reset() is not needed for the standard indexing process. However, if the tokens of a
TokenStream are intended to be consumed more than once, it is necessary to implement
TokenStream.reset(). Note that if your TokenStream caches tokens and feeds them back again after a reset, it is imperative that you clone the tokens when you store them away (on the first pass) as well as when you return them (on future passes after reset()).
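The clone-on-store and clone-on-replay advice above can be illustrated with a small Lucene-free sketch. CachingStream and its mutable Token are hypothetical; the sketch assumes the first pass fully consumes the source before reset() is called:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Why cloning matters: consumers may mutate the token object they are
// handed. A caching stream therefore clones when storing (so mutations on
// the first pass don't corrupt the cache) and again when replaying (so
// mutations on later passes don't either).
class CachingStream {
    static class Token {
        String text;
        Token(String text) { this.text = text; }
        Token copy() { return new Token(text); }
    }

    private final List<Token> cache = new ArrayList<>();
    private final Iterator<Token> source;
    private int replayPos = -1; // -1 means we are still on the first pass

    CachingStream(Iterator<Token> source) {
        this.source = source;
    }

    // Returns the next token, or null when the current pass is exhausted.
    Token next() {
        if (replayPos < 0) {                  // first pass: consume and cache
            if (!source.hasNext()) return null;
            Token t = source.next();
            cache.add(t.copy());              // clone when storing away
            return t;
        }
        if (replayPos >= cache.size()) return null;
        return cache.get(replayPos++).copy(); // clone when replaying
    }

    void reset() {
        replayPos = 0;                        // later passes replay the cache
    }
}
```

Without the two copy() calls, a consumer that overwrote a token's text on any pass would silently change what every subsequent pass sees.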