org.apache.lucene.analysis.shingle
Class ShingleFilter

java.lang.Object
  extended by org.apache.lucene.analysis.TokenStream
      extended by org.apache.lucene.analysis.TokenFilter
          extended by org.apache.lucene.analysis.shingle.ShingleFilter

public class ShingleFilter
extends TokenFilter

A ShingleFilter constructs shingles (token n-grams) from a token stream. In other words, it creates combinations of tokens as a single token.

For example, the sentence "please divide this sentence into shingles" might be tokenized into shingles "please divide", "divide this", "this sentence", "sentence into", and "into shingles".

This filter handles position increments > 1 by inserting filler tokens (tokens with termtext "_"). It does not handle a position increment of 0.
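
For example, one way this filter might be wired into an analysis chain (a sketch assuming the 2.x-era WhitespaceTokenizer and the reusable-token next(Token) API documented below; the class name ShingleExample is purely illustrative):

    import java.io.IOException;
    import java.io.StringReader;
    import org.apache.lucene.analysis.Token;
    import org.apache.lucene.analysis.TokenStream;
    import org.apache.lucene.analysis.WhitespaceTokenizer;
    import org.apache.lucene.analysis.shingle.ShingleFilter;

    public class ShingleExample {
      public static void main(String[] args) throws IOException {
        // Tokenize a sentence and wrap the stream in a ShingleFilter with the
        // default maximum shingle size of 2 (bigrams).
        TokenStream stream = new WhitespaceTokenizer(
            new StringReader("please divide this sentence into shingles"));
        ShingleFilter shingles = new ShingleFilter(stream);

        // With unigrams enabled (the default), the stream interleaves the original
        // terms with bigram shingles: "please divide", "divide this", "this sentence", ...
        final Token reusableToken = new Token();
        for (Token token = shingles.next(reusableToken); token != null;
             token = shingles.next(reusableToken)) {
          System.out.println(token.term() + "  [" + token.type() + "]");
        }
        shingles.close();
      }
    }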


Field Summary
static int DEFAULT_MAX_SHINGLE_SIZE
          default maximum shingle size is 2.
static char[] FILLER_TOKEN
          filler token for when positionIncrement is more than 1
static String TOKEN_SEPARATOR
          The string to use when joining adjacent tokens to form a shingle
 
Fields inherited from class org.apache.lucene.analysis.TokenFilter
input
 
Constructor Summary
ShingleFilter(TokenStream input)
          Construct a ShingleFilter with default shingle size.
ShingleFilter(TokenStream input, int maxShingleSize)
          Constructs a ShingleFilter with the specified shingle size from the TokenStream input.
ShingleFilter(TokenStream input, String tokenType)
          Construct a ShingleFilter with the specified token type for shingle tokens.
 
Method Summary
 Token next(Token reusableToken)
          Returns the next token in the stream, or null at EOS.
 void setMaxShingleSize(int maxShingleSize)
          Set the max shingle size (default: 2)
 void setOutputUnigrams(boolean outputUnigrams)
          Shall the output stream contain the input tokens (unigrams) as well as shingles? (default: true)
 void setTokenType(String tokenType)
          Set the type of the shingle tokens produced by this filter.
 
Methods inherited from class org.apache.lucene.analysis.TokenFilter
close, reset
 
Methods inherited from class org.apache.lucene.analysis.TokenStream
next
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Field Detail

FILLER_TOKEN

public static final char[] FILLER_TOKEN
filler token for when positionIncrement is more than 1
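
For example, a stream with such a gap might be produced like this (a sketch assuming the 2.9-era StopFilter constructor that takes an enablePositionIncrements flag; other releases expose different constructors):

    // Removing "divide" with position increments enabled leaves a gap of 2,
    // so the bigram shingles include "please _" and "_ this", where "_" is
    // the FILLER_TOKEN inserted for the missing position.
    TokenStream stopped = new StopFilter(true,
        new WhitespaceTokenizer(new StringReader("please divide this sentence")),
        StopFilter.makeStopSet(new String[] { "divide" }));
    ShingleFilter shingles = new ShingleFilter(stopped);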


DEFAULT_MAX_SHINGLE_SIZE

public static final int DEFAULT_MAX_SHINGLE_SIZE
default maximum shingle size is 2.

See Also:
Constant Field Values

TOKEN_SEPARATOR

public static final String TOKEN_SEPARATOR
The string to use when joining adjacent tokens to form a shingle

See Also:
Constant Field Values
Constructor Detail

ShingleFilter

public ShingleFilter(TokenStream input,
                     int maxShingleSize)
Constructs a ShingleFilter with the specified shingle size from the TokenStream input.

Parameters:
input - input stream
maxShingleSize - maximum shingle size produced by the filter.

ShingleFilter

public ShingleFilter(TokenStream input)
Construct a ShingleFilter with default shingle size.

Parameters:
input - input stream

ShingleFilter

public ShingleFilter(TokenStream input,
                     String tokenType)
Construct a ShingleFilter with the specified token type for shingle tokens.

Parameters:
input - input stream
tokenType - token type for shingle tokens
Method Detail

setTokenType

public void setTokenType(String tokenType)
Set the type of the shingle tokens produced by this filter. (default: "shingle")

Parameters:
tokenType - token type for shingle tokens

setOutputUnigrams

public void setOutputUnigrams(boolean outputUnigrams)
Shall the output stream contain the input tokens (unigrams) as well as shingles? (default: true)

Parameters:
outputUnigrams - Whether or not the output stream shall contain the input tokens (unigrams)

setMaxShingleSize

public void setMaxShingleSize(int maxShingleSize)
Set the max shingle size (default: 2)

Parameters:
maxShingleSize - max size of output shingles
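
For example, the setters above might be combined as follows (values chosen for illustration only; they are not the defaults, and stream stands for any upstream TokenStream):

    ShingleFilter shingles = new ShingleFilter(stream);
    shingles.setMaxShingleSize(3);      // emit shingles of up to 3 tokens
    shingles.setOutputUnigrams(false);  // suppress the single-token originals
    shingles.setTokenType("shingle");   // same value as the default type

    // For the input "please divide this sentence", the stream would then contain
    // "please divide", "please divide this", "divide this", "divide this sentence",
    // and "this sentence", but none of the original unigrams.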

next

public Token next(Token reusableToken)
           throws IOException
Description copied from class: TokenStream
Returns the next token in the stream, or null at EOS. When possible, the input Token should be used as the returned Token (this gives fastest tokenization performance), but this is not required and a new Token may be returned. Callers may re-use a single Token instance for successive calls to this method.

This implicitly defines a "contract" between consumers (callers of this method) and producers (implementations of this method that are the source for tokens):

A consumer must fully consume the previously returned Token before calling this method again.
A producer must call Token.clear() before setting the fields in it and returning it.

Also, the producer must make no assumptions about a Token after it has been returned: the caller may arbitrarily change it. If the producer needs to hold onto the token for subsequent calls, it must clone() it before storing it. Note that a TokenFilter is considered a consumer.

Overrides:
next in class TokenStream
Parameters:
reusableToken - a Token that may or may not be used to return; this parameter should never be null (the callee is not required to check for null before using it, but it is a good idea to assert that it is not null.)
Returns:
next token in the stream or null if end-of-stream was hit
Throws:
IOException
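
For example, a consumer loop honoring this contract might look like the following sketch (variable names are illustrative; shingles is assumed to be a ShingleFilter or any other TokenStream):

    final Token reusableToken = new Token();
    Token held = null;
    for (Token token = shingles.next(reusableToken); token != null;
         token = shingles.next(reusableToken)) {
      // The producer may reuse and overwrite the returned Token on the next
      // call, so clone it before keeping it across iterations.
      if (held != null) {
        System.out.println(held.term() + " -> " + token.term());
      }
      held = (Token) token.clone();
    }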


Copyright © 2000-2010 Apache Software Foundation. All Rights Reserved.