public abstract class TokenStream extends AttributeSource implements Closeable
A TokenStream enumerates the sequence of tokens, either from the Fields of a Document or from query text.

This is an abstract class; concrete subclasses are:

- Tokenizer, a TokenStream whose input is a Reader; and
- TokenFilter, a TokenStream whose input is another TokenStream.
The TokenStream API was introduced with Lucene 2.9. The API has moved from being Token-based to Attribute-based. While Token still exists in 2.9 as a convenience class, the preferred way to store the information of a token is to use AttributeImpls.

TokenStream now extends AttributeSource, which provides access to all of the token Attributes for the TokenStream. Note that only one instance per AttributeImpl is created and reused for every token. This approach reduces object creation and allows local caching of references to the AttributeImpls. See incrementToken() for further details.
The workflow of the new TokenStream API is as follows:

1. Instantiation of TokenStream/TokenFilters which add/get attributes to/from the AttributeSource.
2. The consumer calls reset().
3. The consumer calls incrementToken() until it returns false, consuming the attributes after each call.
4. The consumer calls end() so that any end-of-stream operations can be performed.
5. The consumer calls close() to release any resources when finished using the TokenStream.

To make sure that filters and consumers know which attributes are available, the attributes must be added during instantiation. Filters and consumers are not required to check for availability of attributes in incrementToken().
You can find some example code for the new API in the analysis package level Javadoc.
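As a self-contained illustration of the workflow and of the one-instance-per-attribute design, here is a minimal sketch using hypothetical stand-in classes (`TermAttr`, `SimpleStream` are not the real Lucene API; see the analysis package Javadoc for real examples):

```java
import java.util.List;

// Stand-in for a Lucene AttributeImpl: one mutable instance per stream.
class TermAttr {
    String term;
}

// Stand-in for a TokenStream: the single TermAttr is reused for every token.
class SimpleStream {
    final TermAttr termAttr = new TermAttr();
    private final List<String> tokens;
    private int pos = 0;

    SimpleStream(List<String> tokens) { this.tokens = tokens; }

    boolean incrementToken() {
        if (pos >= tokens.size()) return false;
        termAttr.term = tokens.get(pos++);  // updated in place, no new objects
        return true;
    }
}

public class Main {
    public static void main(String[] args) {
        SimpleStream stream = new SimpleStream(List.of("quick", "brown", "fox"));
        TermAttr term = stream.termAttr;    // reference cached once, up front
        while (stream.incrementToken()) {
            System.out.println(term.term);  // same object, new contents each call
        }
    }
}
```

The consumer caches a local reference to the attribute before iterating; every call to `incrementToken()` overwrites that same instance rather than allocating a new token object.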
Sometimes it is desirable to capture the current state of a TokenStream, e.g., for buffering purposes (see CachingTokenFilter, TeeSinkTokenFilter). For this use case, AttributeSource.captureState() and AttributeSource.restoreState(org.apache.lucene.util.AttributeSource.State) can be used.
The TokenStream API in Lucene is based on the decorator pattern. Therefore, all non-abstract subclasses must be final or have at least a final implementation of incrementToken()! This is checked when Java assertions are enabled.
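The decorator constraint can be sketched with hypothetical stand-in classes (not the real Lucene classes): a filter is itself a stream that wraps another stream, and both concrete classes are declared final so their incrementToken() cannot be overridden.

```java
// Hypothetical stand-ins, not the real Lucene API.
abstract class StreamSketch {
    String term;                       // stand-in for a shared term attribute
    abstract boolean incrementToken();
}

final class WordStream extends StreamSketch {
    private final String[] words;
    private int pos = 0;

    WordStream(String... words) { this.words = words; }

    @Override
    boolean incrementToken() {
        if (pos >= words.length) return false;
        term = words[pos++];
        return true;
    }
}

// The filter wraps another stream (decorator pattern). It is final,
// mirroring the rule that non-abstract subclasses must be final or
// have at least a final incrementToken().
final class LowerCaseFilterSketch extends StreamSketch {
    private final StreamSketch input;

    LowerCaseFilterSketch(StreamSketch input) { this.input = input; }

    @Override
    boolean incrementToken() {
        if (!input.incrementToken()) return false;
        term = input.term.toLowerCase();
        return true;
    }
}

public class Main {
    public static void main(String[] args) {
        StreamSketch s = new LowerCaseFilterSketch(new WordStream("Quick", "FOX"));
        while (s.incrementToken()) {
            System.out.println(s.term);
        }
    }
}
```

Because filters share the stream abstraction, they can be stacked arbitrarily, and the final incrementToken() guarantees a subclass cannot silently change the token-advancing contract.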
See Also: AttributeSource.AttributeFactory, AttributeSource.State
Modifier | Constructor and Description
---|---
protected | TokenStream() — A TokenStream using the default attribute factory.
protected | TokenStream(AttributeSource.AttributeFactory factory) — A TokenStream using the supplied AttributeFactory for creating new Attribute instances.
protected | TokenStream(AttributeSource input) — A TokenStream that uses the same attributes as the supplied one.
Modifier and Type | Method and Description
---|---
void | close() — Releases resources associated with this stream.
void | end() — This method is called by the consumer after the last token has been consumed, after incrementToken() returned false (using the new TokenStream API).
abstract boolean | incrementToken() — Consumers (i.e., IndexWriter) use this method to advance the stream to the next token.
void | reset() — Resets this stream to the beginning.
Methods inherited from class AttributeSource: addAttribute, addAttributeImpl, captureState, clearAttributes, cloneAttributes, copyTo, equals, getAttribute, getAttributeClassesIterator, getAttributeFactory, getAttributeImplsIterator, hasAttribute, hasAttributes, hashCode, reflectAsString, reflectWith, restoreState, toString
protected TokenStream()

A TokenStream using the default attribute factory.

protected TokenStream(AttributeSource input)

A TokenStream that uses the same attributes as the supplied one.

protected TokenStream(AttributeSource.AttributeFactory factory)

A TokenStream using the supplied AttributeFactory for creating new Attribute instances.

public abstract boolean incrementToken() throws IOException

Consumers (i.e., IndexWriter) use this method to advance the stream to the next token. Implementing classes must implement this method and update the appropriate AttributeImpls with the attributes of the next token.

The producer must make no assumptions about the attributes after the method has returned: the caller may arbitrarily change them. If the producer needs to preserve the state for subsequent calls, it can use AttributeSource.captureState() to create a copy of the current attribute state.

This method is called for every token of a document, so an efficient implementation is crucial for good performance. To avoid calls to AttributeSource.addAttribute(Class) and AttributeSource.getAttribute(Class), references to all AttributeImpls that this stream uses should be retrieved during instantiation.

To ensure that filters and consumers know which attributes are available, the attributes must be added during instantiation. Filters and consumers are not required to check for availability of attributes in incrementToken().

Throws: IOException
public void end() throws IOException

This method is called by the consumer after the last token has been consumed, after incrementToken() returned false (using the new TokenStream API). Streams implementing the old API should upgrade to use this feature.

This method can be used to perform any end-of-stream operations, such as setting the final offset of a stream. The final offset of a stream might differ from the offset of the last token, e.g. when one or more whitespace characters followed the last token and a WhitespaceTokenizer was used.

Throws: IOException
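The idea can be sketched with a hypothetical whitespace tokenizer (a stand-in class, not the real Lucene API): end() records the final offset at the input length, which can lie past the last token's end offset when trailing whitespace was skipped.

```java
// Hypothetical stand-in, not the real Lucene API.
class WhitespaceStreamSketch {
    private final String text;
    private int pos = 0;
    int startOffset, endOffset;        // stand-in for an OffsetAttribute

    WhitespaceStreamSketch(String text) { this.text = text; }

    boolean incrementToken() {
        while (pos < text.length() && text.charAt(pos) == ' ') pos++;
        if (pos >= text.length()) return false;
        startOffset = pos;
        while (pos < text.length() && text.charAt(pos) != ' ') pos++;
        endOffset = pos;
        return true;
    }

    void end() {
        // The final offset is the input length, which may differ from the
        // last token's end offset if trailing whitespace followed it.
        startOffset = endOffset = text.length();
    }
}

public class Main {
    public static void main(String[] args) {
        WhitespaceStreamSketch s = new WhitespaceStreamSketch("quick fox  ");
        while (s.incrementToken()) { }
        // the last token "fox" ends at offset 9
        s.end();
        System.out.println(s.endOffset); // 11: the trailing spaces are counted
    }
}
```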
public void reset() throws IOException

Resets this stream to the beginning. reset() is not needed for the standard indexing process. However, if the tokens of a TokenStream are intended to be consumed more than once, it is necessary to implement reset(). Note that if your TokenStream caches tokens and feeds them back again after a reset, it is imperative that you clone the tokens when you store them away (on the first pass) as well as when you return them (on future passes after reset()).

Throws: IOException
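The cloning requirement can be illustrated with hypothetical stand-in classes (not the real CachingTokenFilter): cached tokens are cloned both when stored and when replayed, because the consumer may mutate the token it is handed at any time.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-ins, not the real Lucene API.
class Token {
    String term;
    Token(String term) { this.term = term; }
    Token copy() { return new Token(term); }
}

class CachingStreamSketch {
    private final List<String> source;
    private final List<Token> cache = new ArrayList<>();
    private boolean replaying = false;
    private int pos = 0;
    Token current;                        // the token handed to the consumer

    CachingStreamSketch(List<String> source) { this.source = source; }

    boolean incrementToken() {
        if (!replaying) {
            if (pos >= source.size()) return false;
            current = new Token(source.get(pos++));
            cache.add(current.copy());    // clone when storing (first pass)
            return true;
        }
        if (pos >= cache.size()) return false;
        current = cache.get(pos++).copy(); // clone when replaying (later passes)
        return true;
    }

    void reset() {
        replaying = true;
        pos = 0;
    }
}

public class Main {
    public static void main(String[] args) {
        CachingStreamSketch s = new CachingStreamSketch(List.of("a", "b"));
        while (s.incrementToken()) {
            s.current.term = "mutated";   // the consumer may change the token...
        }
        s.reset();
        while (s.incrementToken()) {
            System.out.println(s.current.term); // ...but the cache stays intact
        }
    }
}
```

Without the two copies, a consumer mutating the shared token would corrupt the cache on the first pass, or corrupt later passes by mutating replayed tokens.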
public void close() throws IOException

Releases resources associated with this stream.

Specified by: close in interface Closeable
Throws: IOException