org.apache.lucene.analysis
Class TeeTokenFilter

Deprecated. Use TeeSinkTokenFilter instead.

public class TeeTokenFilter extends TokenFilter
```java
SinkTokenizer sink1 = new SinkTokenizer();
SinkTokenizer sink2 = new SinkTokenizer();

TokenStream source1 = new TeeTokenFilter(new TeeTokenFilter(new WhitespaceTokenizer(reader1), sink1), sink2);
TokenStream source2 = new TeeTokenFilter(new TeeTokenFilter(new WhitespaceTokenizer(reader2), sink1), sink2);

TokenStream final1 = new LowerCaseFilter(source1);
TokenStream final2 = source2;
TokenStream final3 = new EntityDetect(sink1);
TokenStream final4 = new URLDetect(sink2);

d.add(new Field("f1", final1));
d.add(new Field("f2", final2));
d.add(new Field("f3", final3));
d.add(new Field("f4", final4));
```

In this example, sink1 and sink2 both receive the tokens produced from both reader1 and reader2 after whitespace tokenization. Any of these streams can be further wrapped in extra analysis, and more "sources" can be inserted if desired. It is important that the tees are consumed before the sinks: in the above example, the tee field names must sort before ("be less than") the sinks' field names, so that the tee fields are processed first.
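The tee/sink idea above can be illustrated without Lucene. The following is a hypothetical, self-contained sketch (the `Tee` and `demo` names are inventions for this example, not Lucene API): a tee wraps a token iterator and copies every token it passes through into one or more sink buffers, so further analysis stages can consume the same tokens without re-tokenizing the input.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Queue;

// Illustrative, Lucene-free sketch of the tee/sink pattern.
public class TeeSketch {
    public static final class Tee implements Iterator<String> {
        private final Iterator<String> input;
        private final List<Queue<String>> sinks;
        public Tee(Iterator<String> input, List<Queue<String>> sinks) {
            this.input = input;
            this.sinks = sinks;
        }
        public boolean hasNext() { return input.hasNext(); }
        public String next() {
            String token = input.next();
            // Copy the token into every sink before passing it downstream.
            for (Queue<String> sink : sinks) sink.add(token);
            return token;
        }
    }

    // Tokenize on whitespace, tee every token into `sink`, and lower-case
    // the primary stream (standing in for LowerCaseFilter in the example).
    public static List<String> demo(String text, Queue<String> sink) {
        Iterator<String> source = new Tee(List.of(text.split("\\s+")).iterator(), List.of(sink));
        List<String> primary = new ArrayList<>();
        while (source.hasNext()) primary.add(source.next().toLowerCase());
        return primary;
    }
}
```

As in the Lucene example, the primary ("tee") stream must be fully consumed before the sink is read, since the sink buffer is only filled as tokens flow through the tee.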
Note: the EntityDetect and URLDetect TokenStreams are only illustrative and do not currently exist in Lucene; see LUCENE-1058.
WARNING: TeeTokenFilter and SinkTokenizer only work with the old TokenStream API. If you switch to the new API, you need to use TeeSinkTokenFilter instead, which offers the same functionality.
See Also: SinkTokenizer
Nested classes inherited from class org.apache.lucene.util.AttributeSource: AttributeSource.AttributeFactory, AttributeSource.State
Fields inherited from class org.apache.lucene.analysis.TokenFilter: input
| Constructor and Description |
| --- |
| TeeTokenFilter(TokenStream input, SinkTokenizer sink) Deprecated. |
| Modifier and Type | Method and Description |
| --- | --- |
| Token | next(Token reusableToken) Deprecated. Returns the next token in the stream, or null at EOS. |
Methods inherited from class org.apache.lucene.analysis.TokenFilter: close, end, reset

Methods inherited from class org.apache.lucene.analysis.TokenStream: getOnlyUseNewAPI, incrementToken, next, setOnlyUseNewAPI

Methods inherited from class org.apache.lucene.util.AttributeSource: addAttribute, addAttributeImpl, captureState, clearAttributes, cloneAttributes, equals, getAttribute, getAttributeClassesIterator, getAttributeFactory, getAttributeImplsIterator, hasAttribute, hasAttributes, hashCode, restoreState, toString
public TeeTokenFilter(TokenStream input, SinkTokenizer sink)
public Token next(Token reusableToken) throws java.io.IOException
Returns the next token in the stream, or null at end-of-stream (EOS). This implicitly defines a "contract" between consumers (callers of this method) and producers (implementations of this method that are the source for tokens):

- A consumer must fully consume the previously returned Token before calling this method again.
- A producer must call Token.clear() before setting the fields in it and returning it.
- A producer must make no other assumptions about a Token after it has been returned: the caller may arbitrarily change it. If the producer needs to hold onto the Token for subsequent calls, it must clone() it before storing it.

Note that a TokenFilter is considered a consumer.

Overrides: next in class TokenStream

Parameters: reusableToken - a Token that may or may not be used to return; this parameter should never be null (the callee is not required to check for null before using it, but it is a good idea to assert that it is not null).

Returns: the next Token in the stream, or null if end-of-stream was hit.

Throws: java.io.IOException
Copyright © 2000-2014 Apache Software Foundation. All Rights Reserved.