org.apache.lucene.analysis
Class TeeTokenFilter

java.lang.Object
  extended by org.apache.lucene.util.AttributeSource
      extended by org.apache.lucene.analysis.TokenStream
          extended by org.apache.lucene.analysis.TokenFilter
              extended by org.apache.lucene.analysis.TeeTokenFilter

Deprecated. Use TeeSinkTokenFilter instead

public class TeeTokenFilter
extends TokenFilter

Works in conjunction with the SinkTokenizer to provide the ability to set aside tokens that have already been analyzed. This is useful in situations where multiple fields share many common analysis steps and then go their separate ways.

It is also useful for tasks such as entity extraction or proper-noun analysis that run as part of the analysis workflow, saving those tokens off for use in another field.

// Sinks that capture the tokens produced by the tees below.
SinkTokenizer sink1 = new SinkTokenizer();
SinkTokenizer sink2 = new SinkTokenizer();

// Each source tokenizes its reader on whitespace and tees every token into sink1 and sink2.
TokenStream source1 = new TeeTokenFilter(new TeeTokenFilter(new WhitespaceTokenizer(reader1), sink1), sink2);
TokenStream source2 = new TeeTokenFilter(new TeeTokenFilter(new WhitespaceTokenizer(reader2), sink1), sink2);

// Any of the streams can be wrapped in further analysis.
TokenStream final1 = new LowerCaseFilter(source1);
TokenStream final2 = source2;
TokenStream final3 = new EntityDetect(sink1);
TokenStream final4 = new URLDetect(sink2);

// The tee-backed fields (f1, f2) must be consumed before the sink-backed fields (f3, f4).
d.add(new Field("f1", final1));
d.add(new Field("f2", final2));
d.add(new Field("f3", final3));
d.add(new Field("f4", final4));
 
In this example, sink1 and sink2 both receive the tokens from reader1 and reader2 after whitespace tokenization. Any of these streams can then be wrapped in further analysis, and more "sources" can be inserted if desired. It is important that the tees are consumed before the sinks; in the above example, the tee field names must sort before the sink field names so that they are processed first. Note that the EntityDetect and URLDetect TokenStreams exist only for the example and are not part of Lucene.
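The consumption order can also be sketched outside of indexing. The following is a rough illustration only, assuming a java.io.StringReader as input and the old next(Token) API:

SinkTokenizer sink = new SinkTokenizer();
TokenStream tee = new TeeTokenFilter(new WhitespaceTokenizer(new StringReader("tee before sink")), sink);

// Drain the tee first; TeeTokenFilter copies each token it passes along into the sink.
final Token reusableToken = new Token();
while (tee.next(reusableToken) != null) {
  // consuming the tee is what fills the sink
}

// Only after the tee is exhausted does the sink replay the captured tokens.
for (Token nextToken = sink.next(reusableToken); nextToken != null; nextToken = sink.next(reusableToken)) {
  // process the replayed token, e.g. read nextToken.term()
}

During indexing the same rule applies: the fields wrapping the tees must be processed before the fields wrapping the sinks.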

See LUCENE-1058.

WARNING: TeeTokenFilter and SinkTokenizer only work with the old TokenStream API. If you switch to the new API, you need to use TeeSinkTokenFilter instead, which offers the same functionality.
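With the new API, the example above translates to a TeeSinkTokenFilter setup along these lines (EntityDetect and URLDetect remain illustrative only; see the TeeSinkTokenFilter javadocs for details):

TeeSinkTokenFilter source1 = new TeeSinkTokenFilter(new WhitespaceTokenizer(reader1));
TeeSinkTokenFilter.SinkTokenStream sink1 = source1.newSinkTokenStream();
TeeSinkTokenFilter.SinkTokenStream sink2 = source1.newSinkTokenStream();

// The second source feeds the same sinks, so each sink sees tokens from both readers.
TeeSinkTokenFilter source2 = new TeeSinkTokenFilter(new WhitespaceTokenizer(reader2));
source2.addSinkTokenStream(sink1);
source2.addSinkTokenStream(sink2);

TokenStream final1 = new LowerCaseFilter(source1);
TokenStream final2 = source2;
TokenStream final3 = new EntityDetect(sink1);
TokenStream final4 = new URLDetect(sink2);

d.add(new Field("f1", final1));
d.add(new Field("f2", final2));
d.add(new Field("f3", final3));
d.add(new Field("f4", final4));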

See Also:
SinkTokenizer

Nested Class Summary
 
Nested classes/interfaces inherited from class org.apache.lucene.util.AttributeSource
AttributeSource.AttributeFactory, AttributeSource.State
 
Field Summary
 
Fields inherited from class org.apache.lucene.analysis.TokenFilter
input
 
Constructor Summary
TeeTokenFilter(TokenStream input, SinkTokenizer sink)
          Deprecated.  
 
Method Summary
 Token next(Token reusableToken)
          Deprecated. Returns the next token in the stream, or null at EOS.
 
Methods inherited from class org.apache.lucene.analysis.TokenFilter
close, end, reset
 
Methods inherited from class org.apache.lucene.analysis.TokenStream
getOnlyUseNewAPI, incrementToken, next, setOnlyUseNewAPI
 
Methods inherited from class org.apache.lucene.util.AttributeSource
addAttribute, addAttributeImpl, captureState, clearAttributes, cloneAttributes, equals, getAttribute, getAttributeClassesIterator, getAttributeFactory, getAttributeImplsIterator, hasAttribute, hasAttributes, hashCode, restoreState, toString
 
Methods inherited from class java.lang.Object
clone, finalize, getClass, notify, notifyAll, wait, wait, wait
 

Constructor Detail

TeeTokenFilter

public TeeTokenFilter(TokenStream input,
                      SinkTokenizer sink)
Deprecated. 
Method Detail

next

public Token next(Token reusableToken)
           throws IOException
Deprecated. 
Description copied from class: TokenStream
Returns the next token in the stream, or null at EOS. When possible, the input Token should be used as the returned Token (this gives fastest tokenization performance), but this is not required and a new Token may be returned. Callers may re-use a single Token instance for successive calls to this method.

This implicitly defines a "contract" between consumers (callers of this method) and producers (implementations of this method that are the source for tokens):

  • A consumer must fully consume the previously returned Token before calling this method again.
  • A producer must call Token.clear() before setting the fields in it and returning it.
Also, the producer must make no assumptions about a Token after it has been returned: the caller may arbitrarily change it. If the producer needs to hold onto the Token for subsequent calls, it must clone() it before storing it. Note that a TokenFilter is considered a consumer.
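For illustration, a hypothetical filter written against this old API might honor the contract as follows (UpperCaseExampleFilter is not a real Lucene class; it simply shows the reusableToken pattern):

import java.io.IOException;
import org.apache.lucene.analysis.Token;
import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;

public final class UpperCaseExampleFilter extends TokenFilter {

  public UpperCaseExampleFilter(TokenStream input) {
    super(input);
  }

  public Token next(final Token reusableToken) throws IOException {
    assert reusableToken != null;
    // Acting as a consumer: hand the reusable Token to the producer below,
    // which clears and refills it (or returns a different Token instance).
    Token nextToken = input.next(reusableToken);
    if (nextToken == null)
      return null;  // end of stream

    // Modify the returned Token in place before handing it back; per the
    // contract the caller may change it arbitrarily afterwards, so nothing
    // is retained here without cloning.
    final char[] buffer = nextToken.termBuffer();
    final int length = nextToken.termLength();
    for (int i = 0; i < length; i++)
      buffer[i] = Character.toUpperCase(buffer[i]);
    return nextToken;
  }
}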

Overrides:
next in class TokenStream
Parameters:
reusableToken - a Token that may, but need not, be used to return the next token; this parameter should never be null (the callee is not required to check for null before using it, but it is a good idea to assert that it is not null).
Returns:
next Token in the stream or null if end-of-stream was hit
Throws:
IOException


Copyright © 2000-2010 Apache Software Foundation. All Rights Reserved.