Class CachingTokenFilter

All Implemented Interfaces:
Closeable, AutoCloseable, Unwrappable<TokenStream>

public final class CachingTokenFilter extends TokenFilter
This class can be used if the token attributes of a TokenStream are intended to be consumed more than once. It caches all token attribute states locally in a List the first time incrementToken() is called; subsequent calls will use the cache.

Important: Like any proper TokenFilter, reset() propagates to the input, but only before incrementToken() is called for the first time. Prior to Lucene 5, it was never propagated.
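
A minimal usage sketch (assuming StandardAnalyzer is on the classpath; the field name "body" and the sample text are made up for illustration): the first pass over the filter fills the cache, and the second pass replays it without touching the wrapped stream again.

  import java.io.IOException;

  import org.apache.lucene.analysis.CachingTokenFilter;
  import org.apache.lucene.analysis.standard.StandardAnalyzer;
  import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

  public class CachingTokenFilterExample {
    public static void main(String[] args) throws IOException {
      StandardAnalyzer analyzer = new StandardAnalyzer();
      // Closing the filter chains to the wrapped stream, so try-with-resources
      // on the filter alone is enough.
      try (CachingTokenFilter cached =
          new CachingTokenFilter(analyzer.tokenStream("body", "the quick brown fox"))) {
        CharTermAttribute term = cached.addAttribute(CharTermAttribute.class);

        // First pass: reset() propagates to the input, and incrementToken()
        // reads and caches the token attribute states.
        cached.reset();
        while (cached.incrementToken()) {
          System.out.println("pass 1: " + term);
        }
        cached.end();

        // Second pass: reset() only rewinds the cache; the input is not re-read.
        cached.reset();
        while (cached.incrementToken()) {
          System.out.println("pass 2: " + term);
        }
        cached.end();
      }
      analyzer.close();
    }
  }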

  • Constructor Details

    • CachingTokenFilter

      public CachingTokenFilter(TokenStream input)
      Create a new CachingTokenFilter around input. As with any normal TokenFilter, do not call reset on the input; this filter will do so itself.
  • Method Details

    • reset

      public void reset() throws IOException
      Propagates reset() to the input if incrementToken() has not yet been called. Otherwise it rewinds the iterator to the beginning of the cached list.
      Overrides:
      reset in class TokenFilter
      Throws:
      IOException
    • incrementToken

      public final boolean incrementToken() throws IOException
      The first time this is called, it reads and caches all tokens from the input.
      Specified by:
      incrementToken in class TokenStream
      Returns:
      false for end of stream; true otherwise
      Throws:
      IOException
    • end

      public final void end()
      Description copied from class: TokenFilter
      This method is called by the consumer after the last token has been consumed, after TokenStream.incrementToken() returned false (using the new TokenStream API). Streams implementing the old API should upgrade to use this feature.

      This method can be used to perform any end-of-stream operations, such as setting the final offset of a stream. The final offset of a stream might differ from the offset of the last token, e.g. when one or more whitespace characters followed the last token and a WhitespaceTokenizer was used.

      Additionally, any skipped positions (such as those removed by a stop filter) can be applied to the position increment, as can any adjustment of other attributes for which the end-of-stream value is important.

      If you override this method, always call super.end().

      NOTE: The default implementation chains the call to the input TokenStream, so be sure to call super.end() first when overriding this method.

      Overrides:
      end in class TokenFilter
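
      As an illustration of this contract (a sketch, not part of CachingTokenFilter itself; the filter below is hypothetical and loosely modeled on Lucene's FilteringTokenFilter), a filter that drops short tokens can call super.end() first and then add the positions skipped after the last emitted token:

        import java.io.IOException;

        import org.apache.lucene.analysis.TokenFilter;
        import org.apache.lucene.analysis.TokenStream;
        import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
        import org.apache.lucene.analysis.tokenattributes.PositionIncrementAttribute;

        // Hypothetical example: drops tokens shorter than three characters and
        // accounts for the skipped positions, including those after the last kept token.
        final class DropShortTokensFilter extends TokenFilter {
          private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
          private final PositionIncrementAttribute posIncrAtt = addAttribute(PositionIncrementAttribute.class);
          private int skippedPositions;

          DropShortTokensFilter(TokenStream input) {
            super(input);
          }

          @Override
          public boolean incrementToken() throws IOException {
            skippedPositions = 0;
            while (input.incrementToken()) {
              if (termAtt.length() >= 3) {
                posIncrAtt.setPositionIncrement(posIncrAtt.getPositionIncrement() + skippedPositions);
                return true;
              }
              skippedPositions += posIncrAtt.getPositionIncrement();
            }
            return false;
          }

          @Override
          public void end() throws IOException {
            super.end(); // chains to input.end(), which sets the end-of-stream attribute values
            // apply any positions skipped after the last emitted token
            posIncrAtt.setPositionIncrement(posIncrAtt.getPositionIncrement() + skippedPositions);
          }

          @Override
          public void reset() throws IOException {
            super.reset();
            skippedPositions = 0;
          }
        }
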
    • isCached

      public boolean isCached()
      Returns true if the underlying token stream has been consumed and cached.
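
      A small self-contained sketch of the cache life-cycle (again assuming StandardAnalyzer and a made-up field name): isCached() flips to true after the first incrementToken() call, because that call reads the whole input.

        import java.io.IOException;

        import org.apache.lucene.analysis.CachingTokenFilter;
        import org.apache.lucene.analysis.standard.StandardAnalyzer;

        public class IsCachedExample {
          public static void main(String[] args) throws IOException {
            StandardAnalyzer analyzer = new StandardAnalyzer();
            try (CachingTokenFilter cached =
                new CachingTokenFilter(analyzer.tokenStream("body", "one two three"))) {
              cached.reset();
              System.out.println(cached.isCached()); // false: nothing has been consumed yet

              cached.incrementToken();               // first call reads and caches the entire input
              System.out.println(cached.isCached()); // true: the underlying stream has been consumed and cached
            }
            analyzer.close();
          }
        }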