Exception Fact Sheet for "solr"

The goal of an Exception Fact Sheet is to reveal the design of exception handling in an application.

--Maxence, Martin

For feedback, please contact Martin

Basic Statistics

Number of Classes 1393
W1 = Number of Domain Exception Types (Thrown or Caught) 8
Number of Domain Checked Exception Types 2
Number of Domain Runtime Exception Types 5
Number of Domain Unknown Exception Types 1
nTh = Number of Throw 1155
nThC = Number of Throw in Catch 434
Number of Catch-Rethrow (may not be correct) 38
nC = Number of Catch 848
nCTh = Number of Catch with Throw 430
Number of Empty Catch (really Empty) 28
Number of Empty Catch (with comments) 31
Number of Empty Catch 59
nM = Number of Methods 7101
nbFunctionWithCatch = Number of Methods with Catch 580 / 7101
nbFunctionWithThrow = Number of Methods with Throw 737 / 7101
nbFunctionWithThrowS = Number of Methods with ThrowS 1460 / 7101
nbFunctionTransmitting = Number of Methods with "Throws" but NO catch, NO throw (only transmitting; see the sketch after this list) 1187 / 7101
P1 = nCTh / nC 50.7% (0.507)
P2 = nbFunctionWithCatch / nM 8.2% (0.082)
P3 = nbFunctionWithThrow / nM 10.4% (0.104)
P4 = nbFunctionTransmitting / nM 16.7% (0.167)
P5 = nThC / nTh 37.6% (0.376)
R2 = nC / nTh 0.734
A1 = Number of Caught Exception Types From External Libraries 56
A2 = Number of Reused Exception Types From External Libraries (thrown from application code) 23
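
To make the method-level counts concrete, here is a minimal sketch (hypothetical code, not taken from Solr) of the categories being counted:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class MethodCategories {

  // Counted in nbFunctionWithCatch: the method contains a catch block.
  int withCatch(String s) {
    try {
      return Integer.parseInt(s);
    } catch (NumberFormatException e) {
      return 0;
    }
  }

  // Counted in nbFunctionWithThrow: the method contains a throw statement.
  void withThrow(int code) {
    if (code < 0) {
      throw new IllegalArgumentException("negative code: " + code);
    }
  }

  // Counted in nbFunctionWithThrowS and in nbFunctionTransmitting:
  // it declares "throws" but neither catches nor throws itself,
  // so it only passes exceptions through to its caller.
  long transmitting(Path p) throws IOException {
    return Files.size(p);
  }
}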

W1 is a rough estimate of the richness of the exception model. It does not take into account the inheritance relationships between domain exceptions.

Proportion P1 measures the overall exception flow. In our experience, it varies from 5% to 70%. An early-catch design generally yields a low P1; libraries that must warn clients about errors (e.g. databases) generally have a high P1.
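
As an illustration (hypothetical code, not from Solr), here is the same parse failure handled in both styles; only the first catch counts toward nCTh:

public class CatchStyles {

  // High-P1 style: warn the caller by wrapping and rethrowing.
  static long parseOrFail(String s) {
    try {
      return Long.parseLong(s);
    } catch (NumberFormatException e) {
      // a catch with a throw: counted in nCTh
      throw new IllegalStateException("bad value: " + s, e);
    }
  }

  // Early-catch style: recover locally, nothing propagates.
  static long parseOrZero(String s) {
    try {
      return Long.parseLong(s);
    } catch (NumberFormatException e) {
      // a catch without a throw: counted in nC only
      return 0L;
    }
  }
}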

Proportion P2 measures the dispersion of catch blocks in the application. In our experience, it varies from 2% to 15%. A small P2 indicates rather centralized error management.
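
A minimal sketch of what a low P2 looks like (hypothetical code): worker methods let exceptions propagate and a single entry point holds the catch block:

public class CentralizedErrorHandling {

  public static void main(String[] args) {
    try {
      parseConfig();
      openIndex();
    } catch (RuntimeException e) {
      // the one place where errors are reported: few catch blocks, low P2
      System.err.println("startup failed: " + e.getMessage());
    }
  }

  // Worker methods throw but never catch.
  static void parseConfig() {
    throw new IllegalStateException("missing configuration file");
  }

  static void openIndex() {
    // ... further work, equally free of catch blocks
  }
}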

A2 shows how many exception types from libraries (incl. the JDK) are thrown from application code. For instance, IllegalArgumentException comes from the JDK but is used in many applications.
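
A minimal sketch of such reuse (hypothetical code): the method throws a JDK type instead of defining a domain exception, so IllegalArgumentException counts toward A2:

public class PortValidator {

  static int checkPort(int port) {
    if (port < 1 || port > 65535) {
      // reusing a JDK exception type from application code
      throw new IllegalArgumentException("port out of range: " + port);
    }
    return port;
  }
}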

A1 measures the application's awareness of library exceptions. A high value of A1 means either that the application is polluted with checked exceptions, or that it is able to apply specific recovery depending on the library exception.
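
A sketch of the second case (hypothetical code): catching a specific library exception in order to apply a targeted recovery, which counts toward A1:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ConfigReader {

  // The catch names a JDK exception type and applies a recovery
  // specific to that failure: fall back to a default value.
  static String readOrDefault(Path p, String fallback) {
    try {
      return Files.readString(p);
    } catch (IOException e) {
      return fallback;
    }
  }
}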

Exception Hierarchy

Exception Map

Each exception that is used at least once in the project is a dot. An orange dot represents a domain exception defined in the application. A blue dot represents an exception defined in the JDK or in a library. The x-axis represents the number of times an exception is caught, the y-axis the number of times an exception is thrown.

Exceptions With State

State means fields: an exception "has state" when it declares fields of its own. Number of exceptions with state: 2
SolrException
package org.apache.solr.common;

import java.io.CharArrayWriter;
import java.io.PrintWriter;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.slf4j.Logger;

public class SolrException extends RuntimeException {

  /**
   * @since solr 1.2
   */
  public enum ErrorCode {
    BAD_REQUEST( 400 ),
    UNAUTHORIZED( 401 ),
    FORBIDDEN( 403 ),
    NOT_FOUND( 404 ),
    CONFLICT( 409 ),
    SERVER_ERROR( 500 ),
    SERVICE_UNAVAILABLE( 503 ),
    UNKNOWN(0);
    public final int code;
    
    private ErrorCode( int c )
    {
      code = c;
    }
    public static ErrorCode getErrorCode(int c){
      for (ErrorCode err : values()) {
        if(err.code == c) return err;
      }
      return UNKNOWN;
    }
  };

  public SolrException(ErrorCode code, String msg) {
    super(msg);
    this.code = code.code;
  }
  public SolrException(ErrorCode code, String msg, Throwable th) {
    super(msg, th);
    this.code = code.code;
  }

  public SolrException(ErrorCode code, Throwable th) {
    super(th);
    this.code = code.code;
  }
  
  // the numeric HTTP status code carried by this exception (set from ErrorCode)
  int code=0;
  public int code() { return code; }


  public void log(Logger log) { log(log,this); }
  public static void log(Logger log, Throwable e) {
    if (e instanceof SolrException
        && ((SolrException) e).code() == ErrorCode.SERVICE_UNAVAILABLE.code) {
      return;
    }
    String stackTrace = toStr(e);
    String ignore = doIgnore(e, stackTrace);
    if (ignore != null) {
      log.info(ignore);
      return;
    }
    log.error(stackTrace);

  }

  public static void log(Logger log, String msg, Throwable e) {
    if (e instanceof SolrException
        && ((SolrException) e).code() == ErrorCode.SERVICE_UNAVAILABLE.code) {
      log(log, msg);
      // note: unlike log(Logger, Throwable) above, this branch does not
      // return, so the full stack trace below is still logged
    }
    String stackTrace = msg + ':' + toStr(e);
    String ignore = doIgnore(e, stackTrace);
    if (ignore != null) {
      log.info(ignore);
      return;
    }
    log.error(stackTrace);
  }
  
  public static void log(Logger log, String msg) {
    String stackTrace = msg;
    String ignore = doIgnore(null, stackTrace);
    if (ignore != null) {
      log.info(ignore);
      return;
    }
    log.error(stackTrace);
  }

  // public String toString() { return toStr(this); }  // oops, inf loop
  @Override
  public String toString() { return super.toString(); }

  public static String toStr(Throwable e) {   
    CharArrayWriter cw = new CharArrayWriter();
    PrintWriter pw = new PrintWriter(cw);
    e.printStackTrace(pw);
    pw.flush();
    return cw.toString();

/** This doesn't work for some reason!!!!!
    StringWriter sw = new StringWriter();
    PrintWriter pw = new PrintWriter(sw);
    e.printStackTrace(pw);
    pw.flush();
    System.out.println("The STRING:" + sw.toString());
    return sw.toString();
**/
  }


  /** For test code - do not log exceptions that match any of the regular expressions in ignorePatterns */
  public static Set<String> ignorePatterns;

  /** Returns null if this exception does not match any ignore patterns, or a message string to use if it does. */
  public static String doIgnore(Throwable t, String m) {
    if (ignorePatterns == null || m == null) return null;
    if (t != null && t instanceof AssertionError) return null;

    for (String regex : ignorePatterns) {
      Pattern pattern = Pattern.compile(regex);
      Matcher matcher = pattern.matcher(m);
      
      if (matcher.find()) return "Ignoring exception matching " + regex;
    }

    return null;
  }
  
  /** Walks the cause chain of t and returns the deepest (root) cause. */
  public static Throwable getRootCause(Throwable t) {
    while (true) {
      Throwable cause = t.getCause();
      if (cause!=null) {
        t = cause;
      } else {
        break;
      }
    }
    return t;
  }

}
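
A brief usage sketch for the class above (the guard method and its parameters are hypothetical; the constructor and ErrorCode API are the ones defined in the class):

import org.apache.solr.common.SolrException;

public class SolrExceptionUsage {

  // Throw with an HTTP-style error code; the code travels with the exception.
  static void requireParam(String name, String value) {
    if (value == null) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Missing required parameter: " + name);
    }
  }

  public static void main(String[] args) {
    // Map a raw status code back to the enum; unknown codes yield UNKNOWN.
    SolrException.ErrorCode err = SolrException.ErrorCode.getErrorCode(503);
    System.out.println(err);          // SERVICE_UNAVAILABLE
    requireParam("collection", null); // throws SolrException with code 400
  }
}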
            
DataImportHandlerException
package org.apache.solr.handler.dataimport;

public class DataImportHandlerException extends RuntimeException {
  private int errCode;

  public boolean debugged = false;

  public static final int SEVERE = 500, WARN = 400, SKIP = 300, SKIP_ROW =301;

  public DataImportHandlerException(int err) {
    super();
    errCode = err;
  }

  public DataImportHandlerException(int err, String message) {
    super(message + (SolrWriter.getDocCount() == null ? "" : MSG + SolrWriter.getDocCount()));
    errCode = err;
  }

  public DataImportHandlerException(int err, String message, Throwable cause) {
    super(message + (SolrWriter.getDocCount() == null ? "" : MSG + SolrWriter.getDocCount()), cause);
    errCode = err;
  }

  public DataImportHandlerException(int err, Throwable cause) {
    super(cause);
    errCode = err;
  }

  public int getErrCode() {
    return errCode;
  }

  public static void wrapAndThrow(int err, Exception e) {
    if (e instanceof DataImportHandlerException) {
      throw (DataImportHandlerException) e;
    } else {
      throw new DataImportHandlerException(err, e);
    }
  }

  public static void wrapAndThrow(int err, Exception e, String msg) {
    if (e instanceof DataImportHandlerException) {
      throw (DataImportHandlerException) e;
    } else {
      throw new DataImportHandlerException(err, msg, e);
    }
  }


  public static final String MSG = " Processing Document # ";
}
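
A usage sketch of the wrapAndThrow idiom defined above (the failing work is hypothetical; the sketch assumes the class is on the classpath):

import org.apache.solr.handler.dataimport.DataImportHandlerException;

public class WrapAndThrowUsage {

  public static void main(String[] args) {
    try {
      Integer.parseInt("not-a-number"); // some failing piece of work
    } catch (Exception e) {
      // an existing DataImportHandlerException is rethrown unchanged;
      // any other exception is wrapped with the SEVERE code
      DataImportHandlerException.wrapAndThrow(DataImportHandlerException.SEVERE, e);
    }
  }
}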
            

Thrown Exceptions Summary

A (Domain) exception is defined in the application. A (Lib) exception is defined in the JDK or in a library. An exception can be thrown, thrown from within a catch, or declared in the signature of a method (usually for checked exceptions). Each count is followed by the corresponding code snippets from the application code.
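
Before the data, a hypothetical sketch (not from Solr) of how one exception type can contribute to each of the three columns:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ThrowSites {

  // "Declared": IOException appears in the method signature.
  static byte[] load(String name) throws IOException {
    if (name == null) {
      // "Thrown": a plain throw site.
      throw new IOException("no resource name given");
    }
    try {
      return Files.readAllBytes(Path.of(name));
    } catch (IOException e) {
      // "Thrown from Catch": rethrown (wrapped) inside a catch block.
      throw new IOException("failed to load " + name, e);
    }
  }
}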

Type | Exception | Thrown | Thrown from Catch | Declared
- | Unknown | 85 | - | -
              
//in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
throw e;

              
//in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
throw exception;

              
//in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
throw e;

              
//in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
throw e;

              
//in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
throw e;

              
//in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
throw e;

              
//in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
throw e;

              
//in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
throw s;

              
//in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
throw r;

              
//in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
//in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
//in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
//in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
//in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
//in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
//in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
//in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Unbalanced container");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err(null);

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected " + new String(arr));

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("missing exponent number");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("invalid hex digit");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Invalid character escape in string");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Leading zeros not allowed");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("expected digit after '-'");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Premature EOF");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err(null);

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected string");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected key,value separator ':'");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected ',' or '}'");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected string");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected ',' or ']'");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("type mismatch");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("type mismatch");

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Unexpected " + ev);

              
//in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Unexpected " + ev);

              
//in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
throw ExceptionUtils.wrapAsRuntimeException(e);

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
throw de;

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
throw de;

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
throw de;

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
throw de;

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
throw e;

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
throw e;

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandlerException.java
throw (DataImportHandlerException) e;

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandlerException.java
throw (DataImportHandlerException) e;

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
throw e;

              
//in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
throw e;

              
//in core/src/java/org/apache/solr/handler/SnapPuller.java
throw e;

              
//in core/src/java/org/apache/solr/handler/SnapPuller.java
throw fsyncExceptionCopy;

              
//in core/src/java/org/apache/solr/handler/SnapPuller.java
throw e;

              
//in core/src/java/org/apache/solr/handler/component/SearchHandler.java
throw (SolrException)srsp.getException();

              
//in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
throw rsp.getException();

              
//in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
throw iox;

              
//in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
throw sx;

              
//in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
throw ioe;

              
//in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
throw (RuntimeException)cause;

              
//in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
throw rsp.getException();

              
//in core/src/java/org/apache/solr/schema/IndexSchema.java
throw e;

              
//in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
throw unknownField;

              
//in core/src/java/org/apache/solr/search/QueryParsing.java
throw ioe;

              
//in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
throw ke;

              
//in core/src/java/org/apache/solr/cloud/ZkController.java
throw e;

              
//in core/src/java/org/apache/solr/cloud/ZkController.java
throw e;

              
//in core/src/java/org/apache/solr/cloud/ZkController.java
throw e;

              
//in core/src/java/org/apache/solr/cloud/LeaderElector.java
throw e;

              
//in core/src/java/org/apache/solr/cloud/LeaderElector.java
throw e;

              
//in core/src/java/org/apache/solr/update/UpdateLog.java
throw rsp.getException();

              
//in core/src/java/org/apache/solr/update/DocumentBuilder.java
throw ex;

              
//in core/src/java/org/apache/solr/core/Config.java
throw e;

              
//in core/src/java/org/apache/solr/core/Config.java
throw e;

              
//in core/src/java/org/apache/solr/core/Config.java
throw e;

              
//in core/src/java/org/apache/solr/core/Config.java
throw(e);

              
//in core/src/java/org/apache/solr/core/SolrCore.java
throw e;

              
//in core/src/java/org/apache/solr/core/SolrCore.java
throw e;

              
//in core/src/java/org/apache/solr/core/SolrCore.java
throw (SolrException)e;

              
//in core/src/java/org/apache/solr/core/SolrCore.java
throw e;

              
//in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
throw xforward;

              
//in core/src/java/org/apache/solr/util/SystemIdResolver.java
throw (IOException) (new IOException(re.getMessage()).initCause(re));

              
//in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
throw ioe;

              
//in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
throw ioe;

              
//in core/src/java/org/apache/solr/util/FileUtils.java
throw exc;

              
//in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
throw e;

              
//in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
throw e;

              
//in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
throw e;

              
//in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
throw e;

            
- | Builder | 25 | - | -
              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Unbalanced container");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err(null);

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected " + new String(arr));

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("missing exponent number");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("invalid hex digit");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Invalid character escape in string");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Leading zeros not allowed");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("expected digit after '-'");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Premature EOF");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err(null);

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected string");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected key,value separator ':'");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected ',' or '}'");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected string");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Expected ',' or ']'");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("type mismatch");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("type mismatch");

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Unexpected " + ev);

              
// in solrj/src/java/org/apache/noggit/JSONParser.java
throw err("Unexpected " + ev);

              
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
throw ExceptionUtils.wrapAsRuntimeException(e);

              
// in core/src/java/org/apache/solr/handler/component/SearchHandler.java
throw (SolrException)srsp.getException();

              
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
throw rsp.getException();

              
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
throw rsp.getException();

              
// in core/src/java/org/apache/solr/update/UpdateLog.java
throw rsp.getException();

              
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
throw (IOException) (new IOException(re.getMessage()).initCause(re));

            
- | Variable | 61 | - | -
              
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
throw e;

              
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
throw exception;

              
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
throw e;

              
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
throw e;

              
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
throw e;

              
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
throw e;

              
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
throw e;

              
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
throw s;

              
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
throw r;

              
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
throw e;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
throw de;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
throw de;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
throw de;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
throw de;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
throw e;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
throw e;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandlerException.java
throw (DataImportHandlerException) e;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandlerException.java
throw (DataImportHandlerException) e;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
throw e;

              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
throw e;

              
// in core/src/java/org/apache/solr/handler/SnapPuller.java
throw e;

              
// in core/src/java/org/apache/solr/handler/SnapPuller.java
throw fsyncExceptionCopy;

              
// in core/src/java/org/apache/solr/handler/SnapPuller.java
throw e;

              
// in core/src/java/org/apache/solr/handler/component/SearchHandler.java
throw (SolrException)srsp.getException();

              
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
throw iox;

              
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
throw sx;

              
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
throw ioe;

              
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
throw (RuntimeException)cause;

              
// in core/src/java/org/apache/solr/schema/IndexSchema.java
throw e;

              
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
throw unknownField;

              
// in core/src/java/org/apache/solr/search/QueryParsing.java
throw ioe;

              
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
throw ke;

              
// in core/src/java/org/apache/solr/cloud/ZkController.java
throw e;

              
// in core/src/java/org/apache/solr/cloud/ZkController.java
throw e;

              
// in core/src/java/org/apache/solr/cloud/ZkController.java
throw e;

              
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
throw e;

              
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
throw e;

              
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
throw ex;

              
// in core/src/java/org/apache/solr/core/Config.java
throw e;

              
// in core/src/java/org/apache/solr/core/Config.java
throw e;

              
// in core/src/java/org/apache/solr/core/Config.java
throw e;

              
// in core/src/java/org/apache/solr/core/Config.java
throw(e);

              
// in core/src/java/org/apache/solr/core/SolrCore.java
throw e;

              
// in core/src/java/org/apache/solr/core/SolrCore.java
throw e;

              
// in core/src/java/org/apache/solr/core/SolrCore.java
throw (SolrException)e;

              
// in core/src/java/org/apache/solr/core/SolrCore.java
throw e;

              
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
throw xforward;

              
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
throw ioe;

              
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
throw ioe;

              
// in core/src/java/org/apache/solr/util/FileUtils.java
throw exc;

              
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
throw e;

              
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
throw e;

              
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
throw e;

              
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
throw e;

            
(Domain) | SolrException | 504
              
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
private RangeInfo addRangeInfo(String collection) { List<Range> ranges; RangeInfo rangeInfo; rangeInfo = new RangeInfo(); Map<String,Slice> slices = getSlices(collection); if (slices == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Can not find collection " + collection + " in " + this); } Set<String> shards = slices.keySet(); ArrayList<String> shardList = new ArrayList<String>(shards.size()); shardList.addAll(shards); Collections.sort(shardList); ranges = hp.partitionRange(shards.size()); rangeInfo.ranges = ranges; rangeInfo.shardList = shardList; rangeInfos.put(collection, rangeInfo); return rangeInfo; }
// in solrj/src/java/org/apache/solr/common/util/StrUtils.java
public static boolean parseBool(String s) { if( s != null ) { if( s.startsWith("true") || s.startsWith("on") || s.startsWith("yes") ) { return true; } if( s.startsWith("false") || s.startsWith("off") || s.equals("no") ) { return false; } } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "invalid boolean value: "+s ); }
// in solrj/src/java/org/apache/solr/common/params/RequiredSolrParams.java
Override public String get(String param) { String val = params.get(param); if( val == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Missing required parameter: "+param ); } return val; }
// in solrj/src/java/org/apache/solr/common/params/RequiredSolrParams.java
Override public String getFieldParam(final String field, final String param) { final String fpname = fpname(field,param); String val = params.get(fpname); if (null == val) { // don't call this.get, we want a specified exception message val = params.get(param); if (null == val) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Missing required parameter: "+fpname+ " (or default: "+param+")" ); } } return val; }
// in solrj/src/java/org/apache/solr/common/params/RequiredSolrParams.java
Override public String[] getFieldParams(final String field, final String param) { final String fpname = fpname(field,param); String[] val = params.getParams(fpname); if (null == val) { // don't call this.getParams, we want a specified exception message val = params.getParams(param); if (null == val) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Missing required parameter: "+fpname+ " (or default: "+param+")" ); } } return val; }
// in solrj/src/java/org/apache/solr/common/params/RequiredSolrParams.java
Override public String[] getParams(String param) { String[] vals = params.getParams(param); if( vals == null || vals.length == 0 ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Missing required parameter: "+param ); } return vals; }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
public static FacetRangeOther get(String label) { try { return valueOf(label.toUpperCase()); } catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); } }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
public static FacetDateOther get(String label) { try { return valueOf(label.toUpperCase()); } catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); } }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
public static FacetRangeInclude get(String label) { try { return valueOf(label.toUpperCase(Locale.ENGLISH)); } catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of for range 'include' information",e); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Integer getInt(String param) { String val = get(param); try { return val==null ? null : Integer.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public int getInt(String param, int def) { String val = get(param); try { return val==null ? def : Integer.parseInt(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Integer getFieldInt(String field, String param) { String val = getFieldParam(field, param); try { return val==null ? null : Integer.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public int getFieldInt(String field, String param, int def) { String val = getFieldParam(field, param); try { return val==null ? def : Integer.parseInt(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Float getFloat(String param) { String val = get(param); try { return val==null ? null : Float.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public float getFloat(String param, float def) { String val = get(param); try { return val==null ? def : Float.parseFloat(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Double getDouble(String param) { String val = get(param); try { return val==null ? null : Double.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public double getDouble(String param, double def) { String val = get(param); try { return val==null ? def : Double.parseDouble(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Float getFieldFloat(String field, String param) { String val = getFieldParam(field, param); try { return val==null ? null : Float.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public float getFieldFloat(String field, String param, float def) { String val = getFieldParam(field, param); try { return val==null ? def : Float.parseFloat(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Double getFieldDouble(String field, String param) { String val = getFieldParam(field, param); try { return val==null ? null : Double.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public double getFieldDouble(String field, String param, double def) { String val = getFieldParam(field, param); try { return val==null ? def : Double.parseDouble(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
Override public NamedList<Object> processResponse(InputStream body, String encoding) { try { return (NamedList<Object>) new JavaBinCodec().unmarshal(body); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
Override public NamedList<Object> processResponse(InputStream body, String encoding) { try { JavaBinCodec codec = new JavaBinCodec() { @Override public SolrDocument readSolrDocument(FastInputStream dis) throws IOException { SolrDocument doc = super.readSolrDocument(dis); callback.streamSolrDocument( doc ); return null; } @Override public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException { SolrDocumentList solrDocs = new SolrDocumentList(); List list = (List) readVal(dis); solrDocs.setNumFound((Long) list.get(0)); solrDocs.setStart((Long) list.get(1)); solrDocs.setMaxScore((Float) list.get(2)); callback.streamDocListInfo( solrDocs.getNumFound(), solrDocs.getStart(), solrDocs.getMaxScore() ); // Read the Array tagByte = dis.readByte(); if( (tagByte >>> 5) != (ARR >>> 5) ) { throw new RuntimeException( "doclist must have an array" ); } int sz = readSize(dis); for (int i = 0; i < sz; i++) { // must be a SolrDocument readVal( dis ); } return solrDocs; } }; return (NamedList<Object>) codec.unmarshal(body); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public NamedList<Object> request(final SolrRequest request, final ResponseParser processor) throws SolrServerException, IOException { HttpRequestBase method = null; InputStream is = null; SolrParams params = request.getParams(); Collection<ContentStream> streams = requestWriter.getContentStreams(request); String path = requestWriter.getPath(request); if (path == null || !path.startsWith("/")) { path = DEFAULT_PATH; } ResponseParser parser = request.getResponseParser(); if (parser == null) { parser = this.parser; } // The parser 'wt=' and 'version=' params are used instead of the original // params ModifiableSolrParams wparams = new ModifiableSolrParams(params); wparams.set(CommonParams.WT, parser.getWriterType()); wparams.set(CommonParams.VERSION, parser.getVersion()); if (invariantParams != null) { wparams.add(invariantParams); } params = wparams; int tries = maxRetries + 1; try { while( tries-- > 0 ) { // Note: since we aren't do intermittent time keeping // ourselves, the potential non-timeout latency could be as // much as tries-times (plus scheduling effects) the given // timeAllowed. try { if( SolrRequest.METHOD.GET == request.getMethod() ) { if( streams != null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "GET can't send streams!" ); } method = new HttpGet( baseUrl + path + ClientUtils.toQueryString( params, false ) ); } else if( SolrRequest.METHOD.POST == request.getMethod() ) { String url = baseUrl + path; boolean isMultipart = ( streams != null && streams.size() > 1 ); LinkedList<NameValuePair> postParams = new LinkedList<NameValuePair>(); if (streams == null || isMultipart) { HttpPost post = new HttpPost(url); post.setHeader("Content-Charset", "UTF-8"); if (!this.useMultiPartPost && !isMultipart) { post.addHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8"); } List<FormBodyPart> parts = new LinkedList<FormBodyPart>(); Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String p = iter.next(); String[] vals = params.getParams(p); if (vals != null) { for (String v : vals) { if (this.useMultiPartPost || isMultipart) { parts.add(new FormBodyPart(p, new StringBody(v, Charset.forName("UTF-8")))); } else { postParams.add(new BasicNameValuePair(p, v)); } } } } if (isMultipart) { for (ContentStream content : streams) { String contentType = content.getContentType(); if(contentType==null) { contentType = "application/octet-stream"; // default } parts.add(new FormBodyPart(content.getName(), new InputStreamBody( content.getStream(), contentType, content.getName()))); } } if (parts.size() > 0) { MultipartEntity entity = new MultipartEntity(HttpMultipartMode.STRICT); for(FormBodyPart p: parts) { entity.addPart(p); } post.setEntity(entity); } else { //not using multipart post.setEntity(new UrlEncodedFormEntity(postParams, "UTF-8")); } method = post; } // It is has one stream, it is the post body, put the params in the URL else { String pstr = ClientUtils.toQueryString(params, false); HttpPost post = new HttpPost(url + pstr); // Single stream as body // Using a loop just to get the first one final ContentStream[] contentStream = new ContentStream[1]; for (ContentStream content : streams) { contentStream[0] = content; break; } if (contentStream[0] instanceof RequestWriter.LazyContentStream) { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() 
{ return false; } }); } else { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } method = post; } } else { throw new SolrServerException("Unsupported method: "+request.getMethod() ); } } catch( NoHttpResponseException r ) { method = null; if(is != null) { is.close(); } // If out of tries then just rethrow (as normal error). if (tries < 1) { throw r; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
Override public NamedList<Object> processResponse(Reader in) { XMLStreamReader parser = null; try { parser = factory.createXMLStreamReader(in); } catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); } return processResponse(parser); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
Override public NamedList<Object> processResponse(InputStream in, String encoding) { XMLStreamReader parser = null; try { parser = factory.createXMLStreamReader(in, encoding); } catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); } return processResponse(parser); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
private NamedList<Object> processResponse(XMLStreamReader parser) { try { NamedList<Object> response = null; for (int event = parser.next(); event != XMLStreamConstants.END_DOCUMENT; event = parser.next()) { switch (event) { case XMLStreamConstants.START_ELEMENT: if( response != null ) { throw new Exception( "already read the response!" ); } // only top-level element is "response String name = parser.getLocalName(); if( name.equals( "response" ) || name.equals( "result" ) ) { response = readNamedList( parser ); } else if( name.equals( "solr" ) ) { return new SimpleOrderedMap<Object>(); } else { throw new Exception( "really needs to be response or result. " + "not:"+parser.getLocalName() ); } break; } } return response; } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); } finally { try { parser.close(); } catch( Exception ex ){} } }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { Parser parser = null; String streamType = req.getParams().get(ExtractingParams.STREAM_TYPE, null); if (streamType != null) { //Cache? Parsers are lightweight to construct and thread-safe, so I'm told MediaType mt = MediaType.parse(streamType.trim().toLowerCase(Locale.ENGLISH)); parser = new DefaultParser(config.getMediaTypeRegistry()).getParsers().get(mt); } else { parser = autoDetectParser; } if (parser != null) { Metadata metadata = new Metadata(); // If you specify the resource name (the filename, roughly) with this parameter, // then Tika can make use of it in guessing the appropriate MIME type: String resourceName = req.getParams().get(ExtractingParams.RESOURCE_NAME, null); if (resourceName != null) { metadata.add(TikaMetadataKeys.RESOURCE_NAME_KEY, resourceName); } // Provide stream's content type as hint for auto detection if(stream.getContentType() != null) { metadata.add(HttpHeaders.CONTENT_TYPE, stream.getContentType()); } InputStream inputStream = null; try { inputStream = stream.getStream(); metadata.add(ExtractingMetadataConstants.STREAM_NAME, stream.getName()); metadata.add(ExtractingMetadataConstants.STREAM_SOURCE_INFO, stream.getSourceInfo()); metadata.add(ExtractingMetadataConstants.STREAM_SIZE, String.valueOf(stream.getSize())); metadata.add(ExtractingMetadataConstants.STREAM_CONTENT_TYPE, stream.getContentType()); // HtmlParser and TXTParser regard Metadata.CONTENT_ENCODING in metadata String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); if(charset != null){ metadata.add(HttpHeaders.CONTENT_ENCODING, charset); } String xpathExpr = params.get(ExtractingParams.XPATH_EXPRESSION); boolean extractOnly = params.getBool(ExtractingParams.EXTRACT_ONLY, false); SolrContentHandler handler = factory.createSolrContentHandler(metadata, params, schema); ContentHandler parsingHandler = handler; StringWriter writer = null; BaseMarkupSerializer serializer = null; if (extractOnly == true) { String extractFormat = params.get(ExtractingParams.EXTRACT_FORMAT, "xml"); writer = new StringWriter(); if (extractFormat.equals(TEXT_FORMAT)) { serializer = new TextSerializer(); serializer.setOutputCharStream(writer); serializer.setOutputFormat(new OutputFormat("Text", "UTF-8", true)); } else { serializer = new XMLSerializer(writer, new OutputFormat("XML", "UTF-8", true)); } if (xpathExpr != null) { Matcher matcher = PARSER.parse(xpathExpr); serializer.startDocument();//The MatchingContentHandler does not invoke startDocument. See http://tika.markmail.org/message/kknu3hw7argwiqin parsingHandler = new MatchingContentHandler(serializer, matcher); } else { parsingHandler = serializer; } } else if (xpathExpr != null) { Matcher matcher = PARSER.parse(xpathExpr); parsingHandler = new MatchingContentHandler(handler, matcher); } //else leave it as is try{ //potentially use a wrapper handler for parsing, but we still need the SolrContentHandler for getting the document. ParseContext context = new ParseContext();//TODO: should we design a way to pass in parse context? parser.parse(inputStream, parsingHandler, metadata, context); } catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". 
metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } if (extractOnly == false) { addDoc(handler); } else { //serializer is not null, so we need to call endDoc on it if using xpath if (xpathExpr != null){ serializer.endDocument(); } rsp.add(stream.getName(), writer.toString()); writer.close(); String[] names = metadata.names(); NamedList metadataNL = new NamedList(); for (int i = 0; i < names.length; i++) { String[] vals = metadata.getValues(names[i]); metadataNL.add(names[i], vals); } rsp.add(stream.getName() + "_metadata", metadataNL); } } catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } finally { IOUtils.closeQuietly(inputStream); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Stream type of " + streamType + " didn't match any known parsers. Please supply the " + ExtractingParams.STREAM_TYPE + " parameter."); } }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
public void inform(SolrCore core) { if (initArgs != null) { //if relative,then relative to config dir, otherwise, absolute path String tikaConfigLoc = (String) initArgs.get(CONFIG_LOCATION); if (tikaConfigLoc != null) { File configFile = new File(tikaConfigLoc); if (configFile.isAbsolute() == false) { configFile = new File(core.getResourceLoader().getConfigDir(), configFile.getPath()); } try { config = new TikaConfig(configFile); } catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); } } NamedList configDateFormats = (NamedList) initArgs.get(DATE_FORMATS); if (configDateFormats != null && configDateFormats.size() > 0) { dateFormats = new HashSet<String>(); Iterator<Map.Entry> it = configDateFormats.iterator(); while (it.hasNext()) { String format = (String) it.next().getValue(); log.info("Adding Date Format: " + format); dateFormats.add(format); } } } if (config == null) { try { config = getDefaultConfig(core.getResourceLoader().getClassLoader()); } catch (MimeTypeException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); } } factory = createFactory(); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
Override public void processAdd(AddUpdateCommand cmd) throws IOException { String text = null; try { /* get Solr document */ SolrInputDocument solrInputDocument = cmd.getSolrInputDocument(); /* get the fields to analyze */ String[] texts = getTextsToAnalyze(solrInputDocument); for (int i = 0; i < texts.length; i++) { text = texts[i]; if (text != null && text.length()>0) { /* process the text value */ JCas jcas = processText(text); UIMAToSolrMapper uimaToSolrMapper = new UIMAToSolrMapper(solrInputDocument, jcas); /* get field mapping from config */ Map<String, Map<String, MapField>> typesAndFeaturesFieldsMap = solrUIMAConfiguration .getTypesFeaturesFieldsMapping(); /* map type features on fields */ for (String typeFQN : typesAndFeaturesFieldsMap.keySet()) { uimaToSolrMapper.map(typeFQN, typesAndFeaturesFieldsMap.get(typeFQN)); } } } } catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } } super.processAdd(cmd); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
Override public Object cluster(Query query, SolrDocumentList solrDocList, Map<SolrDocument, Integer> docIds, SolrQueryRequest sreq) { try { // Prepare attributes for Carrot2 clustering call Map<String, Object> attributes = new HashMap<String, Object>(); List<Document> documents = getDocuments(solrDocList, docIds, query, sreq); attributes.put(AttributeNames.DOCUMENTS, documents); attributes.put(AttributeNames.QUERY, query.toString()); // Pass the fields on which clustering runs to the // SolrStopwordsCarrot2LexicalDataFactory attributes.put("solrFieldNames", getFieldsForClustering(sreq)); // Pass extra overriding attributes from the request, if any extractCarrotAttributes(sreq.getParams(), attributes); // Perform clustering and convert to named list // Carrot2 uses current thread's context class loader to get // certain classes (e.g. custom tokenizer/stemmer) at runtime. // To make sure classes from contrib JARs are available, // we swap the context class loader for the time of clustering. Thread ct = Thread.currentThread(); ClassLoader prev = ct.getContextClassLoader(); try { ct.setContextClassLoader(core.getResourceLoader().getClassLoader()); return clustersToNamedList(controller.process(attributes, clusteringAlgorithmClass).getClusters(), sreq.getParams()); } finally { ct.setContextClassLoader(prev); } } catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); } }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
public String init(NamedList config, final SolrCore core) { this.core = core; String result = super.init(config, core); final SolrParams initParams = SolrParams.toSolrParams(config); // Initialize Carrot2 controller. Pass initialization attributes, if any. HashMap<String, Object> initAttributes = new HashMap<String, Object>(); extractCarrotAttributes(initParams, initAttributes); // Customize the stemmer and tokenizer factories. The implementations we provide here // are included in the code base of Solr, so that it's possible to refactor // the Lucene APIs the factories rely on if needed. // Additionally, we set a custom lexical resource factory for Carrot2 that // will use both Carrot2 default stop words as well as stop words from // the StopFilter defined on the field. final AttributeBuilder attributeBuilder = BasicPreprocessingPipelineDescriptor.attributeBuilder(initAttributes); attributeBuilder.lexicalDataFactory(SolrStopwordsCarrot2LexicalDataFactory.class); if (!initAttributes.containsKey(BasicPreprocessingPipelineDescriptor.Keys.TOKENIZER_FACTORY)) { attributeBuilder.tokenizerFactory(LuceneCarrot2TokenizerFactory.class); } if (!initAttributes.containsKey(BasicPreprocessingPipelineDescriptor.Keys.STEMMER_FACTORY)) { attributeBuilder.stemmerFactory(LuceneCarrot2StemmerFactory.class); } // Pass the schema to SolrStopwordsCarrot2LexicalDataFactory. initAttributes.put("solrIndexSchema", core.getSchema()); // Customize Carrot2's resource lookup to first look for resources // using Solr's resource loader. If that fails, try loading from the classpath. DefaultLexicalDataFactoryDescriptor.attributeBuilder(initAttributes).resourceLookup( new ResourceLookup( // Solr-specific resource loading. new SolrResourceLocator(core, initParams), // Using the class loader directly because this time we want to omit the prefix new ClassLoaderLocator(core.getResourceLoader().getClassLoader()))); // Carrot2 uses current thread's context class loader to get // certain classes (e.g. custom tokenizer/stemmer) at initialization time. // To make sure classes from contrib JARs are available, // we swap the context class loader for the time of clustering. Thread ct = Thread.currentThread(); ClassLoader prev = ct.getContextClassLoader(); try { ct.setContextClassLoader(core.getResourceLoader().getClassLoader()); this.controller.init(initAttributes); } finally { ct.setContextClassLoader(prev); } SchemaField uniqueField = core.getSchema().getUniqueKeyField(); if (uniqueField == null) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, CarrotClusteringEngine.class.getSimpleName() + " requires the schema to have a uniqueKeyField"); } this.idFieldName = uniqueField.getName(); // Make sure the requested Carrot2 clustering algorithm class is available String carrotAlgorithmClassName = initParams.get(CarrotParams.ALGORITHM); this.clusteringAlgorithmClass = core.getResourceLoader().findClass(carrotAlgorithmClassName, IClusteringAlgorithm.class); return result; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
private Set<String> getFieldsForClustering(SolrQueryRequest sreq) { SolrParams solrParams = sreq.getParams(); String titleFieldSpec = solrParams.get(CarrotParams.TITLE_FIELD_NAME, "title"); String snippetFieldSpec = solrParams.get(CarrotParams.SNIPPET_FIELD_NAME, titleFieldSpec); if (StringUtils.isBlank(snippetFieldSpec)) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, CarrotParams.SNIPPET_FIELD_NAME + " must not be blank."); } final Set<String> fields = Sets.newHashSet(); fields.addAll(Arrays.asList(titleFieldSpec.split("[, ]"))); fields.addAll(Arrays.asList(snippetFieldSpec.split("[, ]"))); return fields; }
// in contrib/langid/src/java/org/apache/solr/update/processor/LanguageIdentifierUpdateProcessor.java
private void initParams(SolrParams params) { if (params != null) { // Document-centric langId params setEnabled(params.getBool(LANGUAGE_ID, true)); if(params.get(FIELDS_PARAM, "").length() > 0) { inputFields = params.get(FIELDS_PARAM, "").split(","); } langField = params.get(LANG_FIELD, DOCID_LANGFIELD_DEFAULT); langsField = params.get(LANGS_FIELD, DOCID_LANGSFIELD_DEFAULT); docIdField = params.get(DOCID_PARAM, DOCID_FIELD_DEFAULT); fallbackValue = params.get(FALLBACK); if(params.get(FALLBACK_FIELDS, "").length() > 0) { fallbackFields = params.get(FALLBACK_FIELDS).split(","); } overwrite = params.getBool(OVERWRITE, false); langWhitelist = new HashSet<String>(); threshold = params.getDouble(THRESHOLD, DOCID_THRESHOLD_DEFAULT); if(params.get(LANG_WHITELIST, "").length() > 0) { for(String lang : params.get(LANG_WHITELIST, "").split(",")) { langWhitelist.add(lang); } } // Mapping params (field centric) enableMapping = params.getBool(MAP_ENABLE, false); if(params.get(MAP_FL, "").length() > 0) { mapFields = params.get(MAP_FL, "").split(","); } else { mapFields = inputFields; } mapKeepOrig = params.getBool(MAP_KEEP_ORIG, false); mapOverwrite = params.getBool(MAP_OVERWRITE, false); mapIndividual = params.getBool(MAP_INDIVIDUAL, false); // Process individual fields String[] mapIndividualFields = {}; if(params.get(MAP_INDIVIDUAL_FL, "").length() > 0) { mapIndividualFields = params.get(MAP_INDIVIDUAL_FL, "").split(","); } else { mapIndividualFields = mapFields; } mapIndividualFieldsSet = new HashSet<String>(Arrays.asList(mapIndividualFields)); // Compile a union of the lists of fields to map allMapFieldsSet = new HashSet<String>(Arrays.asList(mapFields)); if(Arrays.equals(mapFields, mapIndividualFields)) { allMapFieldsSet.addAll(mapIndividualFieldsSet); } // Language Code mapping lcMap = new HashMap<String,String>(); if(params.get(MAP_LCMAP) != null) { for(String mapping : params.get(MAP_LCMAP).split("[, ]")) { String[] keyVal = mapping.split(":"); if(keyVal.length == 2) { lcMap.put(keyVal[0], keyVal[1]); } else { log.error("Unsupported format for langid.map.lcmap: "+mapping+". Skipping this mapping."); } } } enforceSchema = params.getBool(ENFORCE_SCHEMA, true); mapPattern = Pattern.compile(params.get(MAP_PATTERN, MAP_PATTERN_DEFAULT)); mapReplaceStr = params.get(MAP_REPLACE, MAP_REPLACE_DEFAULT); } log.debug("LangId configured"); if (inputFields.length == 0) { throw new SolrException(ErrorCode.BAD_REQUEST, "Missing or faulty configuration of LanguageIdentifierUpdateProcessor. Input fields must be specified as a comma separated list"); } }
// in contrib/langid/src/java/org/apache/solr/update/processor/LanguageIdentifierUpdateProcessor.java
protected SolrInputDocument process(SolrInputDocument doc) { String docLang = null; HashSet<String> docLangs = new HashSet<String>(); String fallbackLang = getFallbackLang(doc, fallbackFields, fallbackValue); if(langField == null || !doc.containsKey(langField) || (doc.containsKey(langField) && overwrite)) { String allText = concatFields(doc, inputFields); List<DetectedLanguage> languagelist = detectLanguage(allText); docLang = resolveLanguage(languagelist, fallbackLang); docLangs.add(docLang); log.debug("Detected main document language from fields "+inputFields+": "+docLang); if(doc.containsKey(langField) && overwrite) { log.debug("Overwritten old value "+doc.getFieldValue(langField)); } if(langField != null && langField.length() != 0) { doc.setField(langField, docLang); } } else { // langField is set, we sanity check it against whitelist and fallback docLang = resolveLanguage((String) doc.getFieldValue(langField), fallbackLang); docLangs.add(docLang); log.debug("Field "+langField+" already contained value "+docLang+", not overwriting."); } if(enableMapping) { for (String fieldName : allMapFieldsSet) { if(doc.containsKey(fieldName)) { String fieldLang; if(mapIndividual && mapIndividualFieldsSet.contains(fieldName)) { String text = (String) doc.getFieldValue(fieldName); List<DetectedLanguage> languagelist = detectLanguage(text); fieldLang = resolveLanguage(languagelist, docLang); docLangs.add(fieldLang); log.debug("Mapping field "+fieldName+" using individually detected language "+fieldLang); } else { fieldLang = docLang; log.debug("Mapping field "+fieldName+" using document global language "+fieldLang); } String mappedOutputField = getMappedField(fieldName, fieldLang); if(enforceSchema && schema.getFieldOrNull(fieldName) == null) { log.warn("Unsuccessful field name mapping to {}, field does not exist, skipping mapping.", mappedOutputField, fieldName); mappedOutputField = fieldName; } if (mappedOutputField != null) { log.debug("Mapping field {} to {}", doc.getFieldValue(docIdField), fieldLang); SolrInputField inField = doc.getField(fieldName); doc.setField(mappedOutputField, inField.getValue(), inField.getBoost()); if(!mapKeepOrig) { log.debug("Removing old field {}", fieldName); doc.removeField(fieldName); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid output field mapping for " + fieldName + " field and language: " + fieldLang); } } else { log.warn("Document {} does not contain input field {}. Skipping this field.", doc.getFieldValue(docIdField), fieldName); } } } // Set the languages field to an array of all detected languages if(langsField != null && langsField.length() != 0) { doc.setField(langsField, docLangs.toArray()); } return doc; }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/ICUNormalizer2FilterFactory.java
@Override public void init(Map<String,String> args) { super.init(args); String name = args.get("name"); if (name == null) name = "nfkc_cf"; String mode = args.get("mode"); if (mode == null) mode = "compose"; if (mode.equals("compose")) normalizer = Normalizer2.getInstance(null, name, Normalizer2.Mode.COMPOSE); else if (mode.equals("decompose")) normalizer = Normalizer2.getInstance(null, name, Normalizer2.Mode.DECOMPOSE); else throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid mode: " + mode); String filter = args.get("filter"); if (filter != null) { UnicodeSet set = new UnicodeSet(filter); if (!set.isEmpty()) { set.freeze(); normalizer = new FilteredNormalizer2(normalizer, set); } } }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/StempelPolishStemFilterFactory.java
public void inform(ResourceLoader loader) { try { stemmer = StempelStemmer.load(loader.openResource(STEMTABLE)); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE); } }
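The inform snippet above wraps the IOException in a SolrException but drops the original exception object, so the root cause never reaches the logs. A minimal sketch of the same wrap with the cause chained (loadStemTable and the resource name are hypothetical stand-ins for the Stempel specifics), using the three-argument SolrException constructor seen elsewhere in this listing:

  import java.io.IOException;
  import java.io.InputStream;
  import org.apache.solr.common.SolrException;
  import org.apache.solr.common.SolrException.ErrorCode;

  class StemTableExample {
    static final String STEMTABLE = "/stemmer.tbl"; // hypothetical resource name

    static byte[] loadOrFail(InputStream in) {
      try {
        return loadStemTable(in);
      } catch (IOException e) {
        // passing 'e' as the last argument preserves the original stack trace
        throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE, e);
      }
    }

    private static byte[] loadStemTable(InputStream in) throws IOException {
      return new byte[] { (byte) in.read() }; // placeholder for the real table parsing
    }
  }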
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/ICUTransformFilterFactory.java
@Override public void init(Map<String,String> args) { super.init(args); String id = args.get("id"); if (id == null) { throw new SolrException(ErrorCode.SERVER_ERROR, "id is required."); } int dir; String direction = args.get("direction"); if (direction == null || direction.equalsIgnoreCase("forward")) dir = Transliterator.FORWARD; else if (direction.equalsIgnoreCase("reverse")) dir = Transliterator.REVERSE; else throw new SolrException(ErrorCode.SERVER_ERROR, "invalid direction: " + direction); transliterator = Transliterator.getInstance(id, dir); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
private void setup(ResourceLoader loader, Map<String,String> args) { String custom = args.remove("custom"); String localeID = args.remove("locale"); String strength = args.remove("strength"); String decomposition = args.remove("decomposition"); String alternate = args.remove("alternate"); String caseLevel = args.remove("caseLevel"); String caseFirst = args.remove("caseFirst"); String numeric = args.remove("numeric"); String variableTop = args.remove("variableTop"); if (custom == null && localeID == null) throw new SolrException(ErrorCode.SERVER_ERROR, "Either custom or locale is required."); if (custom != null && localeID != null) throw new SolrException(ErrorCode.SERVER_ERROR, "Cannot specify both locale and custom. " + "To tailor rules for a built-in language, see the javadocs for RuleBasedCollator. " + "Then save the entire customized ruleset to a file, and use with the custom parameter"); final Collator collator; if (localeID != null) { // create from a system collator, based on Locale. collator = createFromLocale(localeID); } else { // create from a custom ruleset collator = createFromRules(custom, loader); } // set the strength flag, otherwise it will be the default. if (strength != null) { if (strength.equalsIgnoreCase("primary")) collator.setStrength(Collator.PRIMARY); else if (strength.equalsIgnoreCase("secondary")) collator.setStrength(Collator.SECONDARY); else if (strength.equalsIgnoreCase("tertiary")) collator.setStrength(Collator.TERTIARY); else if (strength.equalsIgnoreCase("quaternary")) collator.setStrength(Collator.QUATERNARY); else if (strength.equalsIgnoreCase("identical")) collator.setStrength(Collator.IDENTICAL); else throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid strength: " + strength); } // set the decomposition flag, otherwise it will be the default. if (decomposition != null) { if (decomposition.equalsIgnoreCase("no")) collator.setDecomposition(Collator.NO_DECOMPOSITION); else if (decomposition.equalsIgnoreCase("canonical")) collator.setDecomposition(Collator.CANONICAL_DECOMPOSITION); else throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid decomposition: " + decomposition); } // expert options: concrete subclasses are always a RuleBasedCollator RuleBasedCollator rbc = (RuleBasedCollator) collator; if (alternate != null) { if (alternate.equalsIgnoreCase("shifted")) { rbc.setAlternateHandlingShifted(true); } else if (alternate.equalsIgnoreCase("non-ignorable")) { rbc.setAlternateHandlingShifted(false); } else { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid alternate: " + alternate); } } if (caseLevel != null) { rbc.setCaseLevel(Boolean.parseBoolean(caseLevel)); } if (caseFirst != null) { if (caseFirst.equalsIgnoreCase("lower")) { rbc.setLowerCaseFirst(true); } else if (caseFirst.equalsIgnoreCase("upper")) { rbc.setUpperCaseFirst(true); } else { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid caseFirst: " + caseFirst); } } if (numeric != null) { rbc.setNumericCollation(Boolean.parseBoolean(numeric)); } if (variableTop != null) { rbc.setVariableTop(variableTop); } // we use 4.0 because it ensures we just encode the pure byte[] keys. analyzer = new ICUCollationKeyAnalyzer(Version.LUCENE_40, collator); }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
SolrInputDocument readDocument(XMLStreamReader reader, IndexSchema schema) throws XMLStreamException { SolrInputDocument doc = new SolrInputDocument(); String uniqueKeyField = schema.getUniqueKeyField().getName(); StringBuilder text = new StringBuilder(); String fieldName = null; boolean hasId = false; while (true) { int event = reader.next(); switch (event) { // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(reader.getText()); break; case XMLStreamConstants.END_ELEMENT: if ("doc".equals(reader.getLocalName())) { if (!hasId) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "All documents must contain a unique key value: '" + doc.toString() + "'"); } return doc; } else if ("field".equals(reader.getLocalName())) { doc.addField(fieldName, text.toString(), DEFAULT_BOOST); if (uniqueKeyField.equals(fieldName)) { hasId = true; } } break; case XMLStreamConstants.START_ELEMENT: text.setLength(0); String localName = reader.getLocalName(); if (!"field".equals(localName)) { log.warn("unexpected XML tag doc/" + localName); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag doc/" + localName); } for (int i = 0; i < reader.getAttributeCount(); i++) { String attrName = reader.getAttributeLocalName(i); if ("name".equals(attrName)) { fieldName = reader.getAttributeValue(i); } } break; } } }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
private ContentStream extractSingleContentStream(SolrQueryRequest req) { Iterable<ContentStream> streams = req.getContentStreams(); String exceptionMsg = "DocumentAnalysisRequestHandler expects a single content stream with documents to analyze"; if (streams == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, exceptionMsg); } Iterator<ContentStream> iter = streams.iterator(); if (!iter.hasNext()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, exceptionMsg); } ContentStream stream = iter.next(); if (iter.hasNext()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, exceptionMsg); } return stream; }
// in core/src/java/org/apache/solr/handler/UpdateRequestHandler.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { String type = req.getParams().get(UpdateParams.ASSUME_CONTENT_TYPE); if(type == null) { type = stream.getContentType(); } if( type == null ) { // Normal requests will not get here. throw new SolrException(ErrorCode.BAD_REQUEST, "Missing ContentType"); } int idx = type.indexOf(';'); if(idx>0) { type = type.substring(0,idx); } ContentStreamLoader loader = loaders.get(type); if(loader==null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unsupported ContentType: " +type+ " Not in: "+loaders.keySet()); } if(loader.getDefaultWT()!=null) { setDefaultWT(req,loader); } loader.load(req, rsp, stream, processor); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
protected NamedList<? extends Object> analyzeValue(String value, AnalysisContext context) { Analyzer analyzer = context.getAnalyzer(); if (!TokenizerChain.class.isInstance(analyzer)) { TokenStream tokenStream = null; try { tokenStream = analyzer.tokenStream(context.getFieldName(), new StringReader(value)); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } NamedList<List<NamedList>> namedList = new NamedList<List<NamedList>>(); namedList.add(tokenStream.getClass().getName(), convertTokensToNamedLists(analyzeTokenStream(tokenStream), context)); return namedList; } TokenizerChain tokenizerChain = (TokenizerChain) analyzer; CharFilterFactory[] cfiltfacs = tokenizerChain.getCharFilterFactories(); TokenizerFactory tfac = tokenizerChain.getTokenizerFactory(); TokenFilterFactory[] filtfacs = tokenizerChain.getTokenFilterFactories(); NamedList<Object> namedList = new NamedList<Object>(); if( cfiltfacs != null ){ String source = value; for(CharFilterFactory cfiltfac : cfiltfacs ){ CharStream reader = CharReader.get(new StringReader(source)); reader = cfiltfac.create(reader); source = writeCharStream(namedList, reader); } } TokenStream tokenStream = tfac.create(tokenizerChain.initReader(new StringReader(value))); List<AttributeSource> tokens = analyzeTokenStream(tokenStream); namedList.add(tokenStream.getClass().getName(), convertTokensToNamedLists(tokens, context)); ListBasedTokenStream listBasedTokenStream = new ListBasedTokenStream(tokens); for (TokenFilterFactory tokenFilterFactory : filtfacs) { for (final AttributeSource tok : tokens) { tok.getAttribute(TokenTrackingAttribute.class).freezeStage(); } tokenStream = tokenFilterFactory.create(listBasedTokenStream); tokens = analyzeTokenStream(tokenStream); namedList.add(tokenStream.getClass().getName(), convertTokensToNamedLists(tokens, context)); listBasedTokenStream = new ListBasedTokenStream(tokens); } return namedList; }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
private String writeCharStream(NamedList<Object> out, CharStream input ){ final int BUFFER_SIZE = 1024; char[] buf = new char[BUFFER_SIZE]; int len = 0; StringBuilder sb = new StringBuilder(); do { try { len = input.read( buf, 0, BUFFER_SIZE ); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } if( len > 0 ) sb.append(buf, 0, len); } while( len == BUFFER_SIZE ); out.add( input.getClass().getName(), sb.toString()); return sb.toString(); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private NamedList<?> getNamedListResponse(HttpPost method) throws IOException { InputStream input = null; NamedList<?> result = null; try { HttpResponse response = myHttpClient.execute(method); int status = response.getStatusLine().getStatusCode(); if (status != HttpStatus.SC_OK) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "Request failed for the url " + method); } input = response.getEntity().getContent(); result = (NamedList<?>)new JavaBinCodec().unmarshal(input); } finally { try { if (input != null) { input.close(); } } catch (Exception e) { } } return result; }
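The finally block above closes the stream inside one of the "really empty" catches counted in the statistics. Commons IO's IOUtils.closeQuietly, which this codebase already uses elsewhere (see modifyIndexProps below), expresses the same swallow-on-close intent without a hand-written empty catch; a sketch:

  import java.io.InputStream;
  import org.apache.commons.io.IOUtils;
  import org.apache.solr.common.util.JavaBinCodec;

  class QuietCloseExample {
    static Object readAndClose(InputStream in) throws Exception {
      try {
        return new JavaBinCodec().unmarshal(in);
      } finally {
        IOUtils.closeQuietly(in); // ignores null and close() failures
      }
    }
  }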
// in core/src/java/org/apache/solr/handler/SnapPuller.java
boolean fetchLatestIndex(SolrCore core, boolean force) throws IOException, InterruptedException { successfulInstall = false; replicationStartTime = System.currentTimeMillis(); try { //get the current 'replicateable' index version in the master NamedList response = null; try { response = getLatestVersion(); } catch (Exception e) { LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage()); return false; } long latestVersion = (Long) response.get(CMD_INDEX_VERSION); long latestGeneration = (Long) response.get(GENERATION); IndexCommit commit; RefCounted<SolrIndexSearcher> searcherRefCounted = null; try { searcherRefCounted = core.getNewestSearcher(false); if (searcherRefCounted == null) { SolrException.log(LOG, "No open searcher found - fetch aborted"); return false; } commit = searcherRefCounted.get().getIndexReader().getIndexCommit(); } finally { if (searcherRefCounted != null) searcherRefCounted.decref(); } if (latestVersion == 0L) { if (force && commit.getGeneration() != 0) { // since we won't get the files for an empty index, // we just clear ours and commit core.getUpdateHandler().getSolrCoreState().getIndexWriter(core).deleteAll(); SolrQueryRequest req = new LocalSolrQueryRequest(core, new ModifiableSolrParams()); core.getUpdateHandler().commit(new CommitUpdateCommand(req, false)); } //there is nothing to be replicated successfulInstall = true; return true; } if (!force && IndexDeletionPolicyWrapper.getCommitTimestamp(commit) == latestVersion) { //master and slave are already in sync just return LOG.info("Slave in sync with master."); successfulInstall = true; return true; } LOG.info("Master's generation: " + latestGeneration); LOG.info("Slave's generation: " + commit.getGeneration()); LOG.info("Starting replication process"); // get the list of files first fetchFileList(latestGeneration); // this can happen if the commit point is deleted before we fetch the file list. if(filesToDownload.isEmpty()) return false; LOG.info("Number of files in latest index in master: " + filesToDownload.size()); // Create the sync service fsyncService = Executors.newSingleThreadExecutor(); // use a synchronized list because the list is read by other threads (to show details) filesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>()); // if the generation of master is older than that of the slave, it means they are not compatible to be copied // then a new index directory is to be created and all the files need to be copied boolean isFullCopyNeeded = IndexDeletionPolicyWrapper.getCommitTimestamp(commit) >= latestVersion || force; File tmpIndexDir = createTempindexDir(core); if (isIndexStale()) isFullCopyNeeded = true; successfulInstall = false; boolean deleteTmpIdxDir = true; File indexDir = null; try { indexDir = new File(core.getIndexDir()); downloadIndexFiles(isFullCopyNeeded, tmpIndexDir, latestGeneration); LOG.info("Total time taken for download : " + ((System.currentTimeMillis() - replicationStartTime) / 1000) + " secs"); Collection<Map<String, Object>> modifiedConfFiles = getModifiedConfFiles(confFilesToDownload); if (!modifiedConfFiles.isEmpty()) { downloadConfFiles(confFilesToDownload, latestGeneration); if (isFullCopyNeeded) { successfulInstall = modifyIndexProps(tmpIndexDir.getName()); deleteTmpIdxDir = false; } else { successfulInstall = copyIndexFiles(tmpIndexDir, indexDir); } if (successfulInstall) { LOG.info("Configuration files are modified, core will be reloaded"); logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall);//write to a file the time of replication and conf files. reloadCore(); } } else { terminateAndWaitFsyncService(); if (isFullCopyNeeded) { successfulInstall = modifyIndexProps(tmpIndexDir.getName()); deleteTmpIdxDir = false; } else { successfulInstall = copyIndexFiles(tmpIndexDir, indexDir); } if (successfulInstall) { logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall); doCommit(); } } replicationStartTime = 0; return successfulInstall; } catch (ReplicationHandlerException e) { LOG.error("User aborted Replication"); return false; } catch (SolrException e) { throw e; } catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); } finally { if (deleteTmpIdxDir) delTree(tmpIndexDir); else delTree(indexDir); } } finally { if (!successfulInstall) { logReplicationTimeAndConfFiles(null, successfulInstall); } filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null; replicationStartTime = 0; fileFetcher = null; if (fsyncService != null && !fsyncService.isShutdown()) fsyncService.shutdownNow(); fsyncService = null; stop = false; fsyncException = null; } }
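The catch ladder at the end of fetchLatestIndex is the catch-rethrow pattern tallied in the statistics above: the domain exception passes through untouched, InterruptedException is re-created with a new message (losing its cause), and any other failure is wrapped as a SERVER_ERROR. Reduced to a compilable skeleton (doReplication stands in for the download and commit steps):

  import org.apache.solr.common.SolrException;
  import org.apache.solr.common.SolrException.ErrorCode;

  class RethrowLadderExample {
    static void fetch() throws InterruptedException {
      try {
        doReplication();
      } catch (SolrException e) {
        throw e; // domain error: rethrow as-is
      } catch (InterruptedException e) {
        throw new InterruptedException("Index fetch interrupted"); // re-created, cause lost
      } catch (Exception e) {
        throw new SolrException(ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); // wrap everything else
      }
    }

    private static void doReplication() throws Exception { /* download files, commit */ }
  }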
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void downloadConfFiles(List<Map<String, Object>> confFilesToDownload, long latestGeneration) throws Exception { LOG.info("Starting download of configuration files from master: " + confFilesToDownload); confFilesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>()); File tmpconfDir = new File(solrCore.getResourceLoader().getConfigDir(), "conf." + getDateAsStr(new Date())); try { boolean status = tmpconfDir.mkdirs(); if (!status) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to create temporary config folder: " + tmpconfDir.getName()); } for (Map<String, Object> file : confFilesToDownload) { String saveAs = (String) (file.get(ALIAS) == null ? file.get(NAME) : file.get(ALIAS)); fileFetcher = new FileFetcher(tmpconfDir, file, saveAs, true, latestGeneration); currentFile = file; fileFetcher.fetchFile(); confFilesDownloaded.add(new HashMap<String, Object>(file)); } // this is called before copying the files to the original conf dir // so that if there is an exception avoid corrupting the original files. terminateAndWaitFsyncService(); copyTmpConfFiles2Conf(tmpconfDir); } finally { delTree(tmpconfDir); } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void copyTmpConfFiles2Conf(File tmpconfDir) throws IOException { File confDir = new File(solrCore.getResourceLoader().getConfigDir()); for (File file : tmpconfDir.listFiles()) { File oldFile = new File(confDir, file.getName()); if (oldFile.exists()) { File backupFile = new File(confDir, oldFile.getName() + "." + getDateAsStr(new Date(oldFile.lastModified()))); boolean status = oldFile.renameTo(backupFile); if (!status) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to rename: " + oldFile + " to: " + backupFile); } } boolean status = file.renameTo(oldFile); if (status) { } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to rename: " + file + " to: " + oldFile); } } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private boolean modifyIndexProps(String tmpIdxDirName) { LOG.info("New index installed. Updating index properties..."); File idxprops = new File(solrCore.getDataDir() + "index.properties"); Properties p = new Properties(); if (idxprops.exists()) { InputStream is = null; try { is = new FileInputStream(idxprops); p.load(is); } catch (Exception e) { LOG.error("Unable to load index.properties"); } finally { IOUtils.closeQuietly(is); } } p.put("index", tmpIdxDirName); FileOutputStream os = null; try { os = new FileOutputStream(idxprops); p.store(os, "index properties"); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write index.properties", e); } finally { IOUtils.closeQuietly(os); } return true; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private int fetchPackets(FastInputStream fis) throws Exception { byte[] intbytes = new byte[4]; byte[] longbytes = new byte[8]; try { while (true) { if (stop) { stop = false; aborted = true; throw new ReplicationHandlerException("User aborted replication"); } long checkSumServer = -1; fis.readFully(intbytes); //read the size of the packet int packetSize = readInt(intbytes); if (packetSize <= 0) { LOG.warn("No content received for file: " + currentFile); return NO_CONTENT; } if (buf.length < packetSize) buf = new byte[packetSize]; if (checksum != null) { //read the checksum fis.readFully(longbytes); checkSumServer = readLong(longbytes); } //then read the packet of bytes fis.readFully(buf, 0, packetSize); //compare the checksum as sent from the master if (includeChecksum) { checksum.reset(); checksum.update(buf, 0, packetSize); long checkSumClient = checksum.getValue(); if (checkSumClient != checkSumServer) { LOG.error("Checksum not matched between client and server for: " + currentFile); //if checksum is wrong it is a problem, return for retry return 1; } } //if everything is fine, write down the packet to the file fileChannel.write(ByteBuffer.wrap(buf, 0, packetSize)); bytesDownloaded += packetSize; if (bytesDownloaded >= size) return 0; //errorcount is always set to zero after a successful packet errorCount = 0; } } catch (ReplicationHandlerException e) { throw e; } catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure, increment the error count errorCount++; //if it fails for the same packet for MAX_RETRIES, fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; } }
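fetchPackets converts repeated low-level failures into a single domain exception: each failure bumps errorCount, and only once MAX_RETRIES is exceeded does the method give up with a SolrException carrying the last cause. The same bounded-retry shape, extracted into a self-contained sketch (the limit and the Callable task are assumptions, not the handler's API):

  import java.util.concurrent.Callable;
  import org.apache.solr.common.SolrException;
  import org.apache.solr.common.SolrException.ErrorCode;

  class BoundedRetryExample {
    static final int MAX_RETRIES = 5; // assumed limit

    static <T> T retry(Callable<T> task, String what) {
      Exception last = null;
      for (int attempt = 1; attempt <= MAX_RETRIES; attempt++) {
        try {
          return task.call();
        } catch (Exception e) {
          last = e; // remember the failure, then try again
        }
      }
      // out of retries: surface the last failure as a server error
      throw new SolrException(ErrorCode.SERVER_ERROR, "Fetch failed for " + what, last);
    }
  }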
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void cleanup() { try { //close the FileOutputStream (which also closes the Channel) fileOutputStream.close(); } catch (Exception e) {/* noop */ LOG.error("Error closing the file stream: "+ this.saveAs, e); } if (bytesDownloaded != size) { //if the download is not complete then //delete the file being downloaded try { file.delete(); } catch (Exception e) { LOG.error("Error deleting file in cleanup" + e.getMessage()); } //if the failure is due to a user abort it is returned normally else an exception is thrown if (!aborted) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to download " + fileName + " completely. Downloaded " + bytesDownloaded + "!=" + size); } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
static Integer readInterval(String interval) { if (interval == null) return null; int result = 0; if (interval != null) { Matcher m = INTERVAL_PATTERN.matcher(interval.trim()); if (m.find()) { String hr = m.group(1); String min = m.group(2); String sec = m.group(3); result = 0; try { if (sec != null && sec.length() > 0) result += Integer.parseInt(sec); if (min != null && min.length() > 0) result += (60 * Integer.parseInt(min)); if (hr != null && hr.length() > 0) result += (60 * 60 * Integer.parseInt(hr)); result *= 1000; } catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); } } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); } } return result; }
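A worked example of readInterval's arithmetic: for "01:02:03" the groups are hr=1, min=2, sec=3, so the returned value is (3 + 60*2 + 60*60*1) * 1000 milliseconds:

  class IntervalMathExample {
    public static void main(String[] args) {
      int hr = 1, min = 2, sec = 3;
      int millis = (sec + 60 * min + 60 * 60 * hr) * 1000;
      System.out.println(millis); // 3723000, i.e. 1h 2m 3s in milliseconds
    }
  }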
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { final String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); InputStream is = null; XMLStreamReader parser = null; String tr = req.getParams().get(CommonParams.TR,null); if(tr!=null) { Transformer t = getTransformer(tr,req); final DOMResult result = new DOMResult(); // first step: read XML and build DOM using Transformer (this is no overhead, as XSL always produces // an internal result DOM tree, we just access it directly as input for StAX): try { is = stream.getStream(); final InputSource isrc = new InputSource(is); isrc.setEncoding(charset); final SAXSource source = new SAXSource(isrc); t.transform(source, result); } catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); } finally { IOUtils.closeQuietly(is); } // second step: feed the intermediate DOM tree into StAX parser: try { parser = inputFactory.createXMLStreamReader(new DOMSource(result.getNode())); this.processUpdate(req, processor, parser); } catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); } finally { if (parser != null) parser.close(); } } // Normal XML Loader else { try { is = stream.getStream(); if (UpdateRequestHandler.log.isTraceEnabled()) { final byte[] body = IOUtils.toByteArray(is); // TODO: The charset may be wrong, as the real charset is later // determined by the XML parser, the content-type is only used as a hint! UpdateRequestHandler.log.trace("body", new String(body, (charset == null) ? ContentStreamBase.DEFAULT_CHARSET : charset)); IOUtils.closeQuietly(is); is = new ByteArrayInputStream(body); } parser = (charset == null) ? inputFactory.createXMLStreamReader(is) : inputFactory.createXMLStreamReader(is, charset); this.processUpdate(req, processor, parser); } catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); } finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processUpdate(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException, FactoryConfigurationError, InstantiationException, IllegalAccessException, TransformerConfigurationException { AddUpdateCommand addCmd = null; SolrParams params = req.getParams(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.END_DOCUMENT: parser.close(); return; case XMLStreamConstants.START_ELEMENT: String currTag = parser.getLocalName(); if (currTag.equals(UpdateRequestHandler.ADD)) { log.trace("SolrCore.update(add)"); addCmd = new AddUpdateCommand(req); // First look for commitWithin parameter on the request, will be overwritten for individual <add>'s addCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); addCmd.overwrite = params.getBool(UpdateParams.OVERWRITE, true); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.OVERWRITE.equals(attrName)) { addCmd.overwrite = StrUtils.parseBoolean(attrVal); } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { addCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("Unknown attribute id in add:" + attrName); } } } else if ("doc".equals(currTag)) { if(addCmd != null) { log.trace("adding doc..."); addCmd.clear(); addCmd.solrDoc = readDoc(parser); processor.processAdd(addCmd); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unexpected <doc> tag without an <add> tag surrounding it."); } } else if (UpdateRequestHandler.COMMIT.equals(currTag) || UpdateRequestHandler.OPTIMIZE.equals(currTag)) { log.trace("parsing " + currTag); CommitUpdateCommand cmd = new CommitUpdateCommand(req, UpdateRequestHandler.OPTIMIZE.equals(currTag)); ModifiableSolrParams mp = new ModifiableSolrParams(); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); mp.set(attrName, attrVal); } RequestHandlerUtils.validateCommitParams(mp); SolrParams p = SolrParams.wrapDefaults(mp, req.getParams()); // default to the normal request params for commit options RequestHandlerUtils.updateCommit(cmd, p); processor.processCommit(cmd); } // end commit else if (UpdateRequestHandler.ROLLBACK.equals(currTag)) { log.trace("parsing " + currTag); RollbackUpdateCommand cmd = new RollbackUpdateCommand(req); processor.processRollback(cmd); } // end rollback else if (UpdateRequestHandler.DELETE.equals(currTag)) { log.trace("parsing delete"); processDelete(req, processor, parser); } // end delete break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processDelete(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException { // Parse the command DeleteUpdateCommand deleteCmd = new DeleteUpdateCommand(req); // First look for commitWithin parameter on the request, will be overwritten for individual <delete>'s SolrParams params = req.getParams(); deleteCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if ("fromPending".equals(attrName)) { // deprecated } else if ("fromCommitted".equals(attrName)) { // deprecated } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { deleteCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("unexpected attribute delete/@" + attrName); } } StringBuilder text = new StringBuilder(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.START_ELEMENT: String mode = parser.getLocalName(); if (!("id".equals(mode) || "query".equals(mode))) { log.warn("unexpected XML tag /delete/" + mode); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + mode); } text.setLength(0); if ("id".equals(mode)) { for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.VERSION.equals(attrName)) { deleteCmd.setVersion(Long.parseLong(attrVal)); } } } break; case XMLStreamConstants.END_ELEMENT: String currTag = parser.getLocalName(); if ("id".equals(currTag)) { deleteCmd.setId(text.toString()); } else if ("query".equals(currTag)) { deleteCmd.setQuery(text.toString()); } else if ("delete".equals(currTag)) { return; } else { log.warn("unexpected XML tag /delete/" + currTag); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + currTag); } processor.processDelete(deleteCmd); deleteCmd.clear(); break; // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(parser.getText()); break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
public SolrInputDocument readDoc(XMLStreamReader parser) throws XMLStreamException { SolrInputDocument doc = new SolrInputDocument(); String attrName = ""; for (int i = 0; i < parser.getAttributeCount(); i++) { attrName = parser.getAttributeLocalName(i); if ("boost".equals(attrName)) { doc.setDocumentBoost(Float.parseFloat(parser.getAttributeValue(i))); } else { log.warn("Unknown attribute doc/@" + attrName); } } StringBuilder text = new StringBuilder(); String name = null; float boost = 1.0f; boolean isNull = false; String update = null; while (true) { int event = parser.next(); switch (event) { // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(parser.getText()); break; case XMLStreamConstants.END_ELEMENT: if ("doc".equals(parser.getLocalName())) { return doc; } else if ("field".equals(parser.getLocalName())) { Object v = isNull ? null : text.toString(); if (update != null) { Map<String,Object> extendedValue = new HashMap<String,Object>(1); extendedValue.put(update, v); v = extendedValue; } doc.addField(name, v, boost); boost = 1.0f; } break; case XMLStreamConstants.START_ELEMENT: text.setLength(0); String localName = parser.getLocalName(); if (!"field".equals(localName)) { log.warn("unexpected XML tag doc/" + localName); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag doc/" + localName); } boost = 1.0f; update = null; String attrVal = ""; for (int i = 0; i < parser.getAttributeCount(); i++) { attrName = parser.getAttributeLocalName(i); attrVal = parser.getAttributeValue(i); if ("name".equals(attrName)) { name = attrVal; } else if ("boost".equals(attrName)) { boost = Float.parseFloat(attrVal); } else if ("null".equals(attrName)) { isNull = StrUtils.parseBoolean(attrVal); } else if ("update".equals(attrName)) { update = attrVal; } else { log.warn("Unknown attribute doc/field/@" + attrName); } } break; } } }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
@Override public void update(SolrInputDocument document, UpdateRequest updateRequest) { if (document == null) { // Perhaps commit from the parameters try { RequestHandlerUtils.handleCommit(req, processor, updateRequest.getParams(), false); RequestHandlerUtils.handleRollback(req, processor, updateRequest.getParams(), false); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback"); } return; } if (addCmd == null) { addCmd = getAddCommand(req, updateRequest.getParams()); } addCmd.solrDoc = document; try { processor.processAdd(addCmd); addCmd.clear(); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document); } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
DeleteUpdateCommand parseDelete() throws IOException { assertNextEvent( JSONParser.OBJECT_START ); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.commitWithin = commitWithin; while( true ) { int ev = parser.nextEvent(); if( ev == JSONParser.STRING ) { String key = parser.getString(); if( parser.wasKey() ) { if( "id".equals( key ) ) { cmd.setId(parser.getString()); } else if( "query".equals(key) ) { cmd.setQuery(parser.getString()); } else if( "commitWithin".equals(key) ) { cmd.commitWithin = Integer.parseInt(parser.getString()); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown key: "+key+" ["+parser.getPosition()+"]" ); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "invalid string: " + key +" at ["+parser.getPosition()+"]" ); } } else if( ev == JSONParser.OBJECT_END ) { if( cmd.getId() == null && cmd.getQuery() == null ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Missing id or query for delete ["+parser.getPosition()+"]" ); } return cmd; } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Got: "+JSONParser.getEventString( ev ) +" at ["+parser.getPosition()+"]" ); } } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
AddUpdateCommand parseAdd() throws IOException { AddUpdateCommand cmd = new AddUpdateCommand(req); cmd.commitWithin = commitWithin; cmd.overwrite = overwrite; float boost = 1.0f; while( true ) { int ev = parser.nextEvent(); if( ev == JSONParser.STRING ) { if( parser.wasKey() ) { String key = parser.getString(); if( "doc".equals( key ) ) { if( cmd.solrDoc != null ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "multiple docs in same add command" ); } ev = assertNextEvent( JSONParser.OBJECT_START ); cmd.solrDoc = parseDoc( ev ); } else if( UpdateRequestHandler.OVERWRITE.equals( key ) ) { cmd.overwrite = parser.getBoolean(); // reads next boolean } else if( UpdateRequestHandler.COMMIT_WITHIN.equals( key ) ) { cmd.commitWithin = (int)parser.getLong(); } else if( "boost".equals( key ) ) { boost = Float.parseFloat( parser.getNumberChars().toString() ); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown key: "+key+" ["+parser.getPosition()+"]" ); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Should be a key " +" at ["+parser.getPosition()+"]" ); } } else if( ev == JSONParser.OBJECT_END ) { if( cmd.solrDoc == null ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"missing solr document. "+parser.getPosition() ); } cmd.solrDoc.setDocumentBoost( boost ); return cmd; } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Got: "+JSONParser.getEventString( ev ) +" at ["+parser.getPosition()+"]" ); } } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
void assertEvent(int ev, int expected) { if( ev != expected ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Expected: "+JSONParser.getEventString( expected ) +" but got "+JSONParser.getEventString( ev ) +" at ["+parser.getPosition()+"]" ); } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private void parseExtendedFieldValue(SolrInputField sif, int ev) throws IOException { assert ev == JSONParser.OBJECT_START; float boost = 1.0f; Object normalFieldValue = null; Map<String, Object> extendedInfo = null; for (;;) { ev = parser.nextEvent(); switch (ev) { case JSONParser.STRING: String label = parser.getString(); if ("boost".equals(label)) { ev = parser.nextEvent(); if( ev != JSONParser.NUMBER && ev != JSONParser.LONG && ev != JSONParser.BIGNUMBER ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "boost should have number! "+JSONParser.getEventString(ev) ); } boost = (float)parser.getDouble(); } else if ("value".equals(label)) { normalFieldValue = parseNormalFieldValue(parser.nextEvent()); } else { // If we encounter other unknown map keys, then use a map if (extendedInfo == null) { extendedInfo = new HashMap<String, Object>(2); } // for now, the only extended info will be field values // we could either store this as an Object or a SolrInputField Object val = parseNormalFieldValue(parser.nextEvent()); extendedInfo.put(label, val); } break; case JSONParser.OBJECT_END: if (extendedInfo != null) { if (normalFieldValue != null) { extendedInfo.put("value",normalFieldValue); } sif.setValue(extendedInfo, boost); } else { sif.setValue(normalFieldValue, boost); } return; default: throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing JSON extended field value. Unexpected "+JSONParser.getEventString(ev) ); } } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private Object parseSingleFieldValue(int ev) throws IOException { switch (ev) { case JSONParser.STRING: return parser.getString(); case JSONParser.LONG: case JSONParser.NUMBER: case JSONParser.BIGNUMBER: return parser.getNumberChars().toString(); case JSONParser.BOOLEAN: return Boolean.toString(parser.getBoolean()); // for legacy reasons, single values are expected to be strings case JSONParser.NULL: parser.getNull(); return null; case JSONParser.ARRAY_START: return parseArrayFieldValue(ev); default: throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing JSON field value. Unexpected "+JSONParser.getEventString(ev) ); } }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
@Override void add(SolrInputDocument doc, int line, int column, String val) { CSVParser parser = new CSVParser(new StringReader(val), strategy); try { String[] vals = parser.getLine(); if (vals!=null) { for (String v: vals) base.add(doc,line,column,v); } else { base.add(doc,line,column,val); } } catch (IOException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,e); } }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
void prepareFields() { // Possible future optimization: for really rapid incremental indexing // from a POST, one could cache all of this setup info based on the params. // The link from FieldAdder to this would need to be severed for that to happen. fields = new SchemaField[fieldnames.length]; adders = new CSVLoaderBase.FieldAdder[fieldnames.length]; String skipStr = params.get(SKIP); List<String> skipFields = skipStr==null ? null : StrUtils.splitSmart(skipStr,','); CSVLoaderBase.FieldAdder adder = new CSVLoaderBase.FieldAdder(); CSVLoaderBase.FieldAdder adderKeepEmpty = new CSVLoaderBase.FieldAdderEmpty(); for (int i=0; i<fields.length; i++) { String fname = fieldnames[i]; // to skip a field, leave the entries in fields and adders null if (fname.length()==0 || (skipFields!=null && skipFields.contains(fname))) continue; fields[i] = schema.getField(fname); boolean keepEmpty = params.getFieldBool(fname,EMPTY,false); adders[i] = keepEmpty ? adderKeepEmpty : adder; // Order that operations are applied: split -> trim -> map -> add // so create in reverse order. // Creation of FieldAdders could be optimized and shared among fields String[] fmap = params.getFieldParams(fname,MAP); if (fmap!=null) { for (String mapRule : fmap) { String[] mapArgs = colonSplit.split(mapRule,-1); if (mapArgs.length!=2) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Map rules must be of the form 'from:to', got '"+mapRule+"'"); adders[i] = new CSVLoaderBase.FieldMapperSingle(mapArgs[0], mapArgs[1], adders[i]); } } if (params.getFieldBool(fname,TRIM,false)) { adders[i] = new CSVLoaderBase.FieldTrimmer(adders[i]); } if (params.getFieldBool(fname,SPLIT,false)) { String sepStr = params.getFieldParam(fname,SEPARATOR); char fsep = sepStr==null || sepStr.length()==0 ? ',' : sepStr.charAt(0); String encStr = params.getFieldParam(fname,ENCAPSULATOR); char fenc = encStr==null || encStr.length()==0 ? (char)-2 : encStr.charAt(0); String escStr = params.getFieldParam(fname,ESCAPE); char fesc = escStr==null || escStr.length()==0 ? CSVStrategy.ESCAPE_DISABLED : escStr.charAt(0); CSVStrategy fstrat = new CSVStrategy(fsep,fenc,CSVStrategy.COMMENTS_DISABLED,fesc, false, false, false, false); adders[i] = new CSVLoaderBase.FieldSplitter(fstrat, adders[i]); } } // look for any literal fields - literal.foo=xyzzy Iterator<String> paramNames = params.getParameterNamesIterator(); while (paramNames.hasNext()) { String pname = paramNames.next(); if (!pname.startsWith(LITERALS_PREFIX)) continue; String name = pname.substring(LITERALS_PREFIX.length()); SchemaField sf = schema.getFieldOrNull(name); if(sf == null) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid field name for literal:'"+ name +"'"); literals.put(sf, params.get(pname)); } }
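The 'from:to' map rules in prepareFields are validated eagerly, failing the whole request on the first malformed rule rather than mid-way through indexing. The check in isolation (COLON is an assumed equivalent of the class's colonSplit pattern):

  import java.util.regex.Pattern;
  import org.apache.solr.common.SolrException;
  import org.apache.solr.common.SolrException.ErrorCode;

  class MapRuleExample {
    static final Pattern COLON = Pattern.compile(":"); // assumed equivalent of colonSplit

    static String[] parseRule(String mapRule) {
      String[] mapArgs = COLON.split(mapRule, -1);
      if (mapArgs.length != 2) {
        throw new SolrException(ErrorCode.BAD_REQUEST,
            "Map rules must be of the form 'from:to', got '" + mapRule + "'");
      }
      return mapArgs; // mapArgs[0] = from, mapArgs[1] = to
    }
  }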
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
private void input_err(String msg, String[] line, int lineno) { StringBuilder sb = new StringBuilder(); sb.append(errHeader).append(", line=").append(lineno).append(",").append(msg).append("\n\tvalues={"); for (String val: line) { sb.append("'").append(val).append("',"); } sb.append('}'); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,sb.toString()); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
private void input_err(String msg, String[] lines, int lineNo, Throwable e) { StringBuilder sb = new StringBuilder(); sb.append(errHeader).append(", line=").append(lineNo).append(",").append(msg).append("\n\tvalues={"); if (lines != null) { for (String val : lines) { sb.append("'").append(val).append("',"); } } else { sb.append("NO LINES AVAILABLE"); } sb.append('}'); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,sb.toString(), e); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws IOException { errHeader = "CSVLoader: input=" + stream.getSourceInfo(); Reader reader = null; try { reader = stream.getReader(); if (skipLines>0) { if (!(reader instanceof BufferedReader)) { reader = new BufferedReader(reader); } BufferedReader r = (BufferedReader)reader; for (int i=0; i<skipLines; i++) { r.readLine(); } } CSVParser parser = new CSVParser(reader, strategy); // parse the fieldnames from the header of the file if (fieldnames==null) { fieldnames = parser.getLine(); if (fieldnames==null) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Expected fieldnames in CSV input"); } prepareFields(); } // read the rest of the CSV file for(;;) { int line = parser.getLineNumber(); // for error reporting in MT mode String[] vals = null; try { vals = parser.getLine(); } catch (IOException e) { //Catch the exception and rethrow it with more line information input_err("can't read line: " + line, null, line, e); } if (vals==null) break; if (vals.length != fields.length) { input_err("expected "+fields.length+" values but got "+vals.length, vals, line); } addDoc(line,vals); } } finally{ if (reader != null) { IOUtils.closeQuietly(reader); } } }
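Note how load catches the parser's IOException only to re-raise it through input_err with the line number attached: a catch-with-context pattern. Its skeleton as a sketch (readLineOrReport is a hypothetical helper, not the loader's API):

  import java.io.BufferedReader;
  import java.io.IOException;
  import org.apache.solr.common.SolrException;
  import org.apache.solr.common.SolrException.ErrorCode;

  class CatchWithContextExample {
    static String readLineOrReport(BufferedReader reader, int lineNo) {
      try {
        return reader.readLine();
      } catch (IOException e) {
        // re-raise with position information the caller could not reconstruct later
        throw new SolrException(ErrorCode.BAD_REQUEST, "can't read line: " + lineNo, e);
      }
    }
  }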
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
private void doSnapShoot(SolrParams params, SolrQueryResponse rsp, SolrQueryRequest req) { try { int numberToKeep = params.getInt(NUMBER_BACKUPS_TO_KEEP_REQUEST_PARAM, 0); if (numberToKeep > 0 && numberBackupsToKeep > 0) { throw new SolrException(ErrorCode.BAD_REQUEST, "Cannot use " + NUMBER_BACKUPS_TO_KEEP_REQUEST_PARAM + " if " + NUMBER_BACKUPS_TO_KEEP_INIT_PARAM + " was specified in the configuration."); } numberToKeep = Math.max(numberToKeep, numberBackupsToKeep); if (numberToKeep < 1) { numberToKeep = Integer.MAX_VALUE; } IndexDeletionPolicyWrapper delPolicy = core.getDeletionPolicy(); IndexCommit indexCommit = delPolicy.getLatestCommit(); if (indexCommit == null) { indexCommit = req.getSearcher().getIndexReader().getIndexCommit(); } // small race here before the commit point is saved new SnapShooter(core, params.get("location")).createSnapAsync( indexCommit, numberToKeep, this); } catch (Exception e) { LOG.warn("Exception during creating a snapshot", e); rsp.add("exception", e); } }
// in core/src/java/org/apache/solr/handler/RequestHandlerUtils.java
public static void validateCommitParams(SolrParams params) { Iterator<String> i = params.getParameterNamesIterator(); while (i.hasNext()) { String key = i.next(); if (!commitParams.contains(key)) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown commit parameter '" + key + "'"); } } }
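validateCommitParams is an early-catch style guard: unknown parameter names are rejected with a BAD_REQUEST before any work happens. A self-contained version of the same check (the allowed names here are assumptions, not Solr's actual commitParams set):

  import java.util.Arrays;
  import java.util.HashSet;
  import java.util.Set;
  import org.apache.solr.common.SolrException;
  import org.apache.solr.common.SolrException.ErrorCode;

  class CommitParamGuardExample {
    static final Set<String> ALLOWED =
        new HashSet<String>(Arrays.asList("waitSearcher", "softCommit", "expungeDeletes")); // assumed names

    static void validate(Iterable<String> paramNames) {
      for (String key : paramNames) {
        if (!ALLOWED.contains(key)) {
          throw new SolrException(ErrorCode.BAD_REQUEST, "Unknown commit parameter '" + key + "'");
        }
      }
    }
  }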
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); // Set field flags ReturnFields returnFields = new ReturnFields( req ); rsp.setReturnFields( returnFields ); int flags = 0; if (returnFields.wantsScore()) { flags |= SolrIndexSearcher.GET_SCORES; } String defType = params.get(QueryParsing.DEFTYPE, QParserPlugin.DEFAULT_QTYPE); String q = params.get( CommonParams.Q ); Query query = null; SortSpec sortSpec = null; List<Query> filters = null; try { if (q != null) { QParser parser = QParser.getParser(q, defType, req); query = parser.getQuery(); sortSpec = parser.getSort(true); } String[] fqs = req.getParams().getParams(CommonParams.FQ); if (fqs!=null && fqs.length!=0) { filters = new ArrayList<Query>(); for (String fq : fqs) { if (fq != null && fq.trim().length()!=0) { QParser fqp = QParser.getParser(fq, null, req); filters.add(fqp.getQuery()); } } } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } SolrIndexSearcher searcher = req.getSearcher(); MoreLikeThisHelper mlt = new MoreLikeThisHelper( params, searcher ); // Hold on to the interesting terms if relevant TermStyle termStyle = TermStyle.get( params.get( MoreLikeThisParams.INTERESTING_TERMS ) ); List<InterestingTerm> interesting = (termStyle == TermStyle.NONE ) ? null : new ArrayList<InterestingTerm>( mlt.mlt.getMaxQueryTerms() ); DocListAndSet mltDocs = null; // Parse Required Params // This will either have a single Reader or a valid query Reader reader = null; try { if (q == null || q.trim().length() < 1) { Iterable<ContentStream> streams = req.getContentStreams(); if (streams != null) { Iterator<ContentStream> iter = streams.iterator(); if (iter.hasNext()) { reader = iter.next().getReader(); } if (iter.hasNext()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "MoreLikeThis does not support multiple ContentStreams"); } } } int start = params.getInt(CommonParams.START, 0); int rows = params.getInt(CommonParams.ROWS, 10); // Find documents MoreLikeThis - either with a reader or a query // -------------------------------------------------------------------------------- if (reader != null) { mltDocs = mlt.getMoreLikeThis(reader, start, rows, filters, interesting, flags); } else if (q != null) { // Matching options boolean includeMatch = params.getBool(MoreLikeThisParams.MATCH_INCLUDE, true); int matchOffset = params.getInt(MoreLikeThisParams.MATCH_OFFSET, 0); // Find the base match DocList match = searcher.getDocList(query, null, null, matchOffset, 1, flags); // only get the first one... if (includeMatch) { rsp.add("match", match); } // This is an iterator, but we only handle the first match DocIterator iterator = match.iterator(); if (iterator.hasNext()) { // do a MoreLikeThis query for each document in results int id = iterator.nextDoc(); mltDocs = mlt.getMoreLikeThis(id, start, rows, filters, interesting, flags); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "MoreLikeThis requires either a query (?q=) or text to find similar documents."); } } finally { if (reader != null) { reader.close(); } } if( mltDocs == null ) { mltDocs = new DocListAndSet(); // avoid NPE } rsp.add( "response", mltDocs.docList ); if( interesting != null ) { if( termStyle == TermStyle.DETAILS ) { NamedList<Float> it = new NamedList<Float>(); for( InterestingTerm t : interesting ) { it.add( t.term.toString(), t.boost ); } rsp.add( "interestingTerms", it ); } else { List<String> it = new ArrayList<String>( interesting.size() ); for( InterestingTerm t : interesting ) { it.add( t.term.text()); } rsp.add( "interestingTerms", it ); } } // maybe facet the results if (params.getBool(FacetParams.FACET,false)) { if( mltDocs.docSet == null ) { rsp.add( "facet_counts", null ); } else { SimpleFacets f = new SimpleFacets(req, mltDocs.docSet, params ); rsp.add( "facet_counts", f.getFacetCounts() ); } } boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false); boolean dbgQuery = false, dbgResults = false; if (dbg == false){//if it's true, we are doing everything anyway. String[] dbgParams = req.getParams().getParams(CommonParams.DEBUG); if (dbgParams != null) { for (int i = 0; i < dbgParams.length; i++) { if (dbgParams[i].equals(CommonParams.QUERY)){ dbgQuery = true; } else if (dbgParams[i].equals(CommonParams.RESULTS)){ dbgResults = true; } } } } else { dbgQuery = true; dbgResults = true; } // Copied from StandardRequestHandler... perhaps it should be added to doStandardDebug? if (dbg == true) { try { NamedList<Object> dbgInfo = SolrPluginUtils.doStandardDebug(req, q, mlt.getRawMLTQuery(), mltDocs.docList, dbgQuery, dbgResults); if (null != dbgInfo) { if (null != filters) { dbgInfo.add("filter_queries",req.getParams().getParams(CommonParams.FQ)); List<String> fqs = new ArrayList<String>(filters.size()); for (Query fq : filters) { fqs.add(QueryParsing.toString(fq, req.getSchema())); } dbgInfo.add("parsed_filter_queries",fqs); } rsp.add("debug", dbgInfo); } } catch (Exception e) { SolrException.log(SolrCore.log, "Exception during debug", e); rsp.add("exception_during_debug", SolrException.toStr(e)); } } }
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); if (params.getBool(TermsParams.TERMS, false)) { rb.doTerms = true; } // TODO: temporary... this should go in a different component. String shards = params.get(ShardParams.SHARDS); if (shards != null) { rb.isDistrib = true; if (params.get(ShardParams.SHARDS_QT) == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No shards.qt parameter specified"); } List<String> lst = StrUtils.splitSmart(shards, ",", true); rb.shards = lst.toArray(new String[lst.size()]); } }
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
int resolveRegexpFlags(SolrParams params) { String[] flagParams = params.getParams(TermsParams.TERMS_REGEXP_FLAG); if (flagParams == null) { return 0; } int flags = 0; for (String flagParam : flagParams) { try { flags |= TermsParams.TermsRegexpFlag.valueOf(flagParam.toUpperCase(Locale.ENGLISH)).getValue(); } catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + flagParam + "'"); } } return flags; }
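resolveRegexpFlags is a small instance of the A1/A2 pattern from the overview: a JDK exception (IllegalArgumentException from Enum.valueOf) is caught and translated into the domain's SolrException with a readable message. Stripped to the pattern (the Flag enum is hypothetical):

  import java.util.Locale;
  import org.apache.solr.common.SolrException;
  import org.apache.solr.common.SolrException.ErrorCode;

  class FlagTranslationExample {
    enum Flag { CASE_INSENSITIVE, LITERAL } // hypothetical flag names

    static Flag parse(String raw) {
      try {
        return Flag.valueOf(raw.toUpperCase(Locale.ENGLISH));
      } catch (IllegalArgumentException iae) {
        // library exception becomes a client-facing 400
        throw new SolrException(ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + raw + "'");
      }
    }
  }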
// in core/src/java/org/apache/solr/handler/component/StatsValuesFactory.java
public static StatsValues createStatsValues(SchemaField sf) { FieldType fieldType = sf.getType(); if (DoubleField.class.isInstance(fieldType) || IntField.class.isInstance(fieldType) || LongField.class.isInstance(fieldType) || ShortField.class.isInstance(fieldType) || FloatField.class.isInstance(fieldType) || ByteField.class.isInstance(fieldType) || TrieField.class.isInstance(fieldType) || SortableDoubleField.class.isInstance(fieldType) || SortableIntField.class.isInstance(fieldType) || SortableLongField.class.isInstance(fieldType) || SortableFloatField.class.isInstance(fieldType)) { return new NumericStatsValues(sf); } else if (DateField.class.isInstance(fieldType)) { return new DateStatsValues(sf); } else if (StrField.class.isInstance(fieldType)) { return new StringStatsValues(sf); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Field type " + fieldType + " is not currently supported"); } }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
public ShardResponse call() throws Exception { ShardResponse srsp = new ShardResponse(); srsp.setShardRequest(sreq); srsp.setShard(shard); SimpleSolrResponse ssr = new SimpleSolrResponse(); srsp.setSolrResponse(ssr); long startTime = System.currentTimeMillis(); try { params.remove(CommonParams.WT); // use default (currently javabin) params.remove(CommonParams.VERSION); // SolrRequest req = new QueryRequest(SolrRequest.METHOD.POST, "/select"); // use generic request to avoid extra processing of queries QueryRequest req = new QueryRequest(params); req.setMethod(SolrRequest.METHOD.POST); // no need to set the response parser as binary is the default // req.setResponseParser(new BinaryResponseParser()); // if there are no shards available for a slice, urls.size()==0 if (urls.size()==0) { // TODO: what's the right error code here? We should use the same thing when // all of the servers for a shard are down. throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "no servers hosting shard: " + shard); } if (urls.size() <= 1) { String url = urls.get(0); srsp.setShardAddress(url); SolrServer server = new HttpSolrServer(url, httpClient); ssr.nl = server.request(req); } else { LBHttpSolrServer.Rsp rsp = httpShardHandlerFactory.loadbalancer.request(new LBHttpSolrServer.Req(req, urls)); ssr.nl = rsp.getResponse(); srsp.setShardAddress(rsp.getServer()); } } catch( ConnectException cex ) { srsp.setException(cex); //???? } catch (Throwable th) { srsp.setException(th); if (th instanceof SolrException) { srsp.setResponseCode(((SolrException)th).code()); } else { srsp.setResponseCode(-1); } } ssr.elapsedTime = System.currentTimeMillis() - startTime; return srsp; }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
private ShardResponse take(boolean bailOnError) {
  while (pending.size() > 0) {
    try {
      Future<ShardResponse> future = completionService.take();
      pending.remove(future);
      ShardResponse rsp = future.get();
      if (bailOnError && rsp.getException() != null) return rsp; // if exception, return immediately

      // add response to the response list... we do this after the take() and
      // not after the completion of "call" so we know when the last response
      // for a request was received. Otherwise we might return the same
      // request more than once.
      rsp.getShardRequest().responses.add(rsp);
      if (rsp.getShardRequest().responses.size() == rsp.getShardRequest().actualShards.length) {
        return rsp;
      }
    } catch (InterruptedException e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    } catch (ExecutionException e) {
      // should be impossible... the problem with catching the exception
      // at this level is we don't know what ShardRequest it applied to
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Impossible Exception",e);
    }
  }
  return null;
}
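The drain loop converts both checked concurrency exceptions into unchecked server errors at the single point where results are collected, so callers never see InterruptedException or ExecutionException. A runnable sketch of the same conversion (hypothetical names; unlike the Solr code, this sketch also restores the interrupt flag, which is the usual recommendation):

import java.util.concurrent.*;

public class Drain {
  public static void main(String[] args) {
    ExecutorService pool = Executors.newFixedThreadPool(2);
    CompletionService<Integer> cs = new ExecutorCompletionService<>(pool);
    cs.submit(() -> 21 + 21);
    try {
      System.out.println(cs.take().get()); // 42
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();  // restore the flag, then wrap
      throw new RuntimeException("interrupted while waiting for shard", e);
    } catch (ExecutionException e) {
      throw new RuntimeException("task failed", e.getCause());
    } finally {
      pool.shutdown();
    }
  }
}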
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
public void checkDistributed(ResponseBuilder rb) {
  SolrQueryRequest req = rb.req;
  SolrParams params = req.getParams();

  rb.isDistrib = params.getBool("distrib", req.getCore().getCoreDescriptor().getCoreContainer().isZooKeeperAware());
  String shards = params.get(ShardParams.SHARDS);

  // for back compat, a shards param with URLs like localhost:8983/solr will mean that this
  // search is distributed.
  boolean hasShardURL = shards != null && shards.indexOf('/') > 0;
  rb.isDistrib = hasShardURL | rb.isDistrib;

  if (rb.isDistrib) {
    // since the cost of grabbing cloud state is still up in the air, we grab it only
    // if we need it.
    CloudState cloudState = null;
    Map<String,Slice> slices = null;
    CoreDescriptor coreDescriptor = req.getCore().getCoreDescriptor();
    CloudDescriptor cloudDescriptor = coreDescriptor.getCloudDescriptor();
    ZkController zkController = coreDescriptor.getCoreContainer().getZkController();

    if (shards != null) {
      List<String> lst = StrUtils.splitSmart(shards, ",", true);
      rb.shards = lst.toArray(new String[lst.size()]);
      rb.slices = new String[rb.shards.length];
      if (zkController != null) {
        // figure out which shards are slices
        for (int i=0; i<rb.shards.length; i++) {
          if (rb.shards[i].indexOf('/') < 0) {
            // this is a logical shard
            rb.slices[i] = rb.shards[i];
            rb.shards[i] = null;
          }
        }
      }
    } else if (zkController != null) {
      // we weren't provided with a list of slices to query, so find the list that will cover the complete index
      cloudState = zkController.getCloudState();

      // This can be more efficient... we only record the name, even though we
      // have the shard info we need in the next step of mapping slice->shards

      // Stores the comma-separated list of specified collections.
      // Eg: "collection1,collection2,collection3"
      String collections = params.get("collection");
      if (collections != null) {
        // If there were one or more collections specified in the query, split
        // each parameter and store as a separate member of a List.
        List<String> collectionList = StrUtils.splitSmart(collections, ",", true);

        // First create an empty HashMap to add the slice info to.
        slices = new HashMap<String,Slice>();

        // In turn, retrieve the slices that cover each collection from the
        // cloud state and add them to the Map 'slices'.
        for (int i = 0; i < collectionList.size(); i++) {
          String collection = collectionList.get(i);
          ClientUtils.appendMap(collection, slices, cloudState.getSlices(collection));
        }
      } else {
        // If no collections were specified, default to the collection for
        // this core.
        slices = cloudState.getSlices(cloudDescriptor.getCollectionName());
        if (slices == null) {
          throw new SolrException(ErrorCode.BAD_REQUEST,
              "Could not find collection:" + cloudDescriptor.getCollectionName());
        }
      }

      // Store the logical slices in the ResponseBuilder and create a new
      // String array to hold the physical shards (which will be mapped
      // later).
      rb.slices = slices.keySet().toArray(new String[slices.size()]);
      rb.shards = new String[rb.slices.length];

      /***
      rb.slices = new String[slices.size()];
      for (int i=0; i<rb.slices.length; i++) {
        rb.slices[i] = slices.get(i).getName();
      }
      ***/
    }

    //
    // Map slices to shards
    //
    if (zkController != null) {
      for (int i=0; i<rb.shards.length; i++) {
        if (rb.shards[i] == null) {
          if (cloudState == null) {
            cloudState = zkController.getCloudState();
            slices = cloudState.getSlices(cloudDescriptor.getCollectionName());
          }
          String sliceName = rb.slices[i];

          Slice slice = slices.get(sliceName);
          if (slice==null) {
            // Treat this the same as "all servers down" for a slice, and let things continue
            // if partial results are acceptable
            rb.shards[i] = "";
            continue;
            // throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "no such shard: " + sliceName);
          }

          Map<String, ZkNodeProps> sliceShards = slice.getShards();

          // For now, recreate the | delimited list of equivalent servers
          Set<String> liveNodes = cloudState.getLiveNodes();
          StringBuilder sliceShardsStr = new StringBuilder();
          boolean first = true;
          for (ZkNodeProps nodeProps : sliceShards.values()) {
            ZkCoreNodeProps coreNodeProps = new ZkCoreNodeProps(nodeProps);
            if (!liveNodes.contains(coreNodeProps.getNodeName())
                || !coreNodeProps.getState().equals(ZkStateReader.ACTIVE)) continue;
            if (first) {
              first = false;
            } else {
              sliceShardsStr.append('|');
            }
            String url = coreNodeProps.getCoreUrl();
            if (url.startsWith("http://")) url = url.substring(7);
            sliceShardsStr.append(url);
          }

          rb.shards[i] = sliceShardsStr.toString();
        }
      }
    }
  }

  String shards_rows = params.get(ShardParams.SHARDS_ROWS);
  if(shards_rows != null) {
    rb.shards_rows = Integer.parseInt(shards_rows);
  }
  String shards_start = params.get(ShardParams.SHARDS_START);
  if(shards_start != null) {
    rb.shards_start = Integer.parseInt(shards_start);
  }
}
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
public void inform(SolrCore core) {
  String a = initArgs.get(FIELD_TYPE);
  if (a != null) {
    FieldType ft = core.getSchema().getFieldTypes().get(a);
    if (ft == null) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Unknown FieldType: '" + a + "' used in QueryElevationComponent");
    }
    analyzer = ft.getQueryAnalyzer();
  }

  SchemaField sf = core.getSchema().getUniqueKeyField();
  if( sf == null) {
    throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,
        "QueryElevationComponent requires the schema to have a uniqueKeyField." );
  }
  idSchemaFT = sf.getType();
  idField = sf.getName();

  //register the EditorialMarkerFactory
  String excludeName = initArgs.get(QueryElevationParams.EXCLUDE_MARKER_FIELD_NAME, "excluded");
  if (excludeName == null || excludeName.equals("") == true){
    excludeName = "excluded";
  }
  ExcludedMarkerFactory excludedMarkerFactory = new ExcludedMarkerFactory();
  core.addTransformerFactory(excludeName, excludedMarkerFactory);

  ElevatedMarkerFactory elevatedMarkerFactory = new ElevatedMarkerFactory();
  String markerName = initArgs.get(QueryElevationParams.EDITORIAL_MARKER_FIELD_NAME, "elevated");
  if (markerName == null || markerName.equals("") == true) {
    markerName = "elevated";
  }
  core.addTransformerFactory(markerName, elevatedMarkerFactory);

  forceElevation = initArgs.getBool(QueryElevationParams.FORCE_ELEVATION, forceElevation);
  try {
    synchronized (elevationCache) {
      elevationCache.clear();
      String f = initArgs.get(CONFIG_FILE);
      if (f == null) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "QueryElevationComponent must specify argument: '" + CONFIG_FILE
            + "' -- path to elevate.xml");
      }
      boolean exists = false;

      // check if using ZooKeeper
      ZkController zkController = core.getCoreDescriptor().getCoreContainer().getZkController();
      if (zkController != null) {
        // TODO : shouldn't have to keep reading the config name when it has been read before
        exists = zkController.configFileExists(zkController.readConfigName(core.getCoreDescriptor().getCloudDescriptor().getCollectionName()), f);
      } else {
        File fC = new File(core.getResourceLoader().getConfigDir(), f);
        File fD = new File(core.getDataDir(), f);
        if (fC.exists() == fD.exists()) {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
              "QueryElevationComponent missing config file: '" + f + "\n"
              + "either: " + fC.getAbsolutePath() + " or " + fD.getAbsolutePath() + " must exist, but not both.");
        }
        if (fC.exists()) {
          exists = true;
          log.info("Loading QueryElevation from: " + fC.getAbsolutePath());
          Config cfg = new Config(core.getResourceLoader(), f);
          elevationCache.put(null, loadElevationMap(cfg));
        }
      }

      //in other words, we think this is in the data dir, not the conf dir
      if (!exists) {
        // preload the first data
        RefCounted<SolrIndexSearcher> searchHolder = null;
        try {
          searchHolder = core.getNewestSearcher(false);
          IndexReader reader = searchHolder.get().getIndexReader();
          getElevationMap(reader, core);
        } finally {
          if (searchHolder != null) searchHolder.decref();
        }
      }
    }
  } catch (Exception ex) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Error initializing QueryElevationComponent.", ex);
  }
}
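The outer try/catch here is the "initialization barrier" pattern: whatever goes wrong while loading configuration, the caller sees a single server-side error with a stable message and the real failure attached as the cause. A minimal sketch of that shape, with hypothetical names:

public class ComponentInit {
  static void init(Runnable loadConfig) {
    try {
      loadConfig.run();
    } catch (Exception ex) {
      // one stable message for operators; the root cause rides along for debugging
      throw new RuntimeException("Error initializing component.", ex);
    }
  }
}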
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
Map<String, ElevationObj> getElevationMap(IndexReader reader, SolrCore core) throws Exception {
  synchronized (elevationCache) {
    Map<String, ElevationObj> map = elevationCache.get(null);
    if (map != null) return map;

    map = elevationCache.get(reader);
    if (map == null) {
      String f = initArgs.get(CONFIG_FILE);
      if (f == null) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "QueryElevationComponent must specify argument: " + CONFIG_FILE);
      }
      log.info("Loading QueryElevation from data dir: " + f);

      Config cfg;
      ZkController zkController = core.getCoreDescriptor().getCoreContainer().getZkController();
      if (zkController != null) {
        cfg = new Config(core.getResourceLoader(), f, null, null);
      } else {
        InputStream is = VersionedFile.getLatestFile(core.getDataDir(), f);
        cfg = new Config(core.getResourceLoader(), f, new InputSource(is), null);
      }
      map = loadElevationMap(cfg);
      elevationCache.put(reader, map);
    }
    return map;
  }
}
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
private Map<String, ElevationObj> loadElevationMap(Config cfg) throws IOException {
  XPath xpath = XPathFactory.newInstance().newXPath();
  Map<String, ElevationObj> map = new HashMap<String, ElevationObj>();
  NodeList nodes = (NodeList) cfg.evaluate("elevate/query", XPathConstants.NODESET);
  for (int i = 0; i < nodes.getLength(); i++) {
    Node node = nodes.item(i);
    String qstr = DOMUtil.getAttr(node, "text", "missing query 'text'");

    NodeList children = null;
    try {
      children = (NodeList) xpath.evaluate("doc", node, XPathConstants.NODESET);
    } catch (XPathExpressionException e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "query requires '<doc .../>' child");
    }

    ArrayList<String> include = new ArrayList<String>();
    ArrayList<String> exclude = new ArrayList<String>();
    for (int j = 0; j < children.getLength(); j++) {
      Node child = children.item(j);
      String id = DOMUtil.getAttr(child, "id", "missing 'id'");
      String e = DOMUtil.getAttr(child, EXCLUDE, null);
      if (e != null) {
        if (Boolean.valueOf(e)) {
          exclude.add(id);
          continue;
        }
      }
      include.add(id);
    }

    ElevationObj elev = new ElevationObj(qstr, include, exclude);
    if (map.containsKey(elev.analyzed)) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Boosting query defined twice for query: '" + elev.text + "' (" + elev.analyzed + "')");
    }
    map.put(elev.analyzed, elev);
  }
  return map;
}
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override
public void prepare(ResponseBuilder rb) throws IOException {
  SolrQueryRequest req = rb.req;
  SolrParams params = req.getParams();
  // A runtime param can skip
  if (!params.getBool(QueryElevationParams.ENABLE, true)) {
    return;
  }
  boolean exclusive = params.getBool(QueryElevationParams.EXCLUSIVE, false);
  // A runtime parameter can alter the config value for forceElevation
  boolean force = params.getBool(QueryElevationParams.FORCE_ELEVATION, forceElevation);
  boolean markExcludes = params.getBool(QueryElevationParams.MARK_EXCLUDES, false);
  Query query = rb.getQuery();
  String qstr = rb.getQueryString();
  if (query == null || qstr == null) {
    return;
  }

  qstr = getAnalyzedQuery(qstr);
  IndexReader reader = req.getSearcher().getIndexReader();
  ElevationObj booster = null;
  try {
    booster = getElevationMap(reader, req.getCore()).get(qstr);
  } catch (Exception ex) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex);
  }

  if (booster != null) {
    rb.req.getContext().put(BOOSTED, booster.ids);

    // Change the query to insert forced documents
    if (exclusive == true) {
      //we only want these results
      rb.setQuery(booster.include);
    } else {
      BooleanQuery newq = new BooleanQuery(true);
      newq.add(query, BooleanClause.Occur.SHOULD);
      newq.add(booster.include, BooleanClause.Occur.SHOULD);
      if (booster.exclude != null) {
        if (markExcludes == false) {
          for (TermQuery tq : booster.exclude) {
            newq.add(new BooleanClause(tq, BooleanClause.Occur.MUST_NOT));
          }
        } else {
          //we are only going to mark items as excluded, not actually exclude them. This works
          //with the EditorialMarkerFactory
          rb.req.getContext().put(EXCLUDED, booster.excludeIds);
          for (TermQuery tq : booster.exclude) {
            newq.add(new BooleanClause(tq, BooleanClause.Occur.SHOULD));
          }
        }
      }
      rb.setQuery(newq);
    }

    ElevationComparatorSource comparator = new ElevationComparatorSource(booster);
    // if the sort is 'score desc' use a custom sorting method to
    // insert documents in their proper place
    SortSpec sortSpec = rb.getSortSpec();
    if (sortSpec.getSort() == null) {
      sortSpec.setSort(new Sort(new SortField[]{
          new SortField("_elevate_", comparator, true),
          new SortField(null, SortField.Type.SCORE, false)
      }));
    } else {
      // Check if the sort is based on score
      boolean modify = false;
      SortField[] current = sortSpec.getSort().getSort();
      ArrayList<SortField> sorts = new ArrayList<SortField>(current.length + 1);
      // Perhaps force it to always sort by score
      if (force && current[0].getType() != SortField.Type.SCORE) {
        sorts.add(new SortField("_elevate_", comparator, true));
        modify = true;
      }
      for (SortField sf : current) {
        if (sf.getType() == SortField.Type.SCORE) {
          sorts.add(new SortField("_elevate_", comparator, !sf.getReverse()));
          modify = true;
        }
        sorts.add(sf);
      }
      if (modify) {
        sortSpec.setSort(new Sort(sorts.toArray(new SortField[sorts.size()])));
      }
    }
  }

  // Add debugging information
  if (rb.isDebug()) {
    List<String> match = null;
    if (booster != null) {
      // Extract the elevated terms into a list
      match = new ArrayList<String>(booster.priority.size());
      for (Object o : booster.include.clauses()) {
        TermQuery tq = (TermQuery) ((BooleanClause) o).getQuery();
        match.add(tq.getTerm().text());
      }
    }

    SimpleOrderedMap<Object> dbg = new SimpleOrderedMap<Object>();
    dbg.add("q", qstr);
    dbg.add("match", match);
    if (rb.isDebugQuery()) {
      rb.addDebugInfo("queryBoosting", dbg);
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/SearchHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception, ParseException, InstantiationException, IllegalAccessException {
  // int sleep = req.getParams().getInt("sleep",0);
  // if (sleep > 0) {log.error("SLEEPING for " + sleep); Thread.sleep(sleep);}
  ResponseBuilder rb = new ResponseBuilder(req, rsp, components);
  if (rb.requestInfo != null) {
    rb.requestInfo.setResponseBuilder(rb);
  }

  boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false);
  rb.setDebug(dbg);
  if (dbg == false){//if it's true, we are doing everything anyway.
    SolrPluginUtils.getDebugInterests(req.getParams().getParams(CommonParams.DEBUG), rb);
  }

  final RTimer timer = rb.isDebug() ? new RTimer() : null;

  ShardHandler shardHandler1 = shardHandlerFactory.getShardHandler();
  shardHandler1.checkDistributed(rb);

  if (timer == null) {
    // non-debugging prepare phase
    for( SearchComponent c : components ) {
      c.prepare(rb);
    }
  } else {
    // debugging prepare phase
    RTimer subt = timer.sub( "prepare" );
    for( SearchComponent c : components ) {
      rb.setTimer( subt.sub( c.getName() ) );
      c.prepare(rb);
      rb.getTimer().stop();
    }
    subt.stop();
  }

  if (!rb.isDistrib) {
    // a normal non-distributed request

    // The semantics of debugging vs not debugging are different enough that
    // it makes sense to have two control loops
    if(!rb.isDebug()) {
      // Process
      for( SearchComponent c : components ) {
        c.process(rb);
      }
    } else {
      // Process
      RTimer subt = timer.sub( "process" );
      for( SearchComponent c : components ) {
        rb.setTimer( subt.sub( c.getName() ) );
        c.process(rb);
        rb.getTimer().stop();
      }
      subt.stop();
      timer.stop();

      // add the timing info
      if (rb.isDebugTimings()) {
        rb.addDebugInfo("timing", timer.asNamedList() );
      }
    }
  } else {
    // a distributed request
    if (rb.outgoing == null) {
      rb.outgoing = new LinkedList<ShardRequest>();
    }
    rb.finished = new ArrayList<ShardRequest>();

    int nextStage = 0;
    do {
      rb.stage = nextStage;
      nextStage = ResponseBuilder.STAGE_DONE;

      // call all components
      for( SearchComponent c : components ) {
        // the next stage is the minimum of what all components report
        nextStage = Math.min(nextStage, c.distributedProcess(rb));
      }

      // check the outgoing queue and send requests
      while (rb.outgoing.size() > 0) {

        // submit all current request tasks at once
        while (rb.outgoing.size() > 0) {
          ShardRequest sreq = rb.outgoing.remove(0);
          sreq.actualShards = sreq.shards;
          if (sreq.actualShards==ShardRequest.ALL_SHARDS) {
            sreq.actualShards = rb.shards;
          }
          sreq.responses = new ArrayList<ShardResponse>();

          // TODO: map from shard to address[]
          for (String shard : sreq.actualShards) {
            ModifiableSolrParams params = new ModifiableSolrParams(sreq.params);
            params.remove(ShardParams.SHARDS);        // not a top-level request
            params.set("distrib", "false");           // not a top-level request
            params.remove("indent");
            params.remove(CommonParams.HEADER_ECHO_PARAMS);
            params.set(ShardParams.IS_SHARD, true);   // a sub (shard) request
            params.set(ShardParams.SHARD_URL, shard); // so the shard knows what was asked
            if (rb.requestInfo != null) {
              // we could try and detect when this is needed, but it could be tricky
              params.set("NOW", Long.toString(rb.requestInfo.getNOW().getTime()));
            }
            String shardQt = params.get(ShardParams.SHARDS_QT);
            if (shardQt == null) {
              params.remove(CommonParams.QT);
            } else {
              params.set(CommonParams.QT, shardQt);
            }
            shardHandler1.submit(sreq, shard, params);
          }
        }

        // now wait for replies, but if anyone puts more requests on
        // the outgoing queue, send them out immediately (by exiting
        // this loop)
        boolean tolerant = rb.req.getParams().getBool(ShardParams.SHARDS_TOLERANT, false);
        while (rb.outgoing.size() == 0) {
          ShardResponse srsp = tolerant ?
              shardHandler1.takeCompletedIncludingErrors():
              shardHandler1.takeCompletedOrError();
          if (srsp == null) break; // no more requests to wait for

          // Was there an exception?
          if (srsp.getException() != null) {
            // If things are not tolerant, abort everything and rethrow
            if(!tolerant) {
              shardHandler1.cancelAll();
              if (srsp.getException() instanceof SolrException) {
                throw (SolrException)srsp.getException();
              } else {
                throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, srsp.getException());
              }
            }
          }

          rb.finished.add(srsp.getShardRequest());

          // let the components see the responses to the request
          for(SearchComponent c : components) {
            c.handleResponses(rb, srsp.getShardRequest());
          }
        }
      }

      for(SearchComponent c : components) {
        c.finishStage(rb);
      }

      // we are done when the next stage is MAX_VALUE
    } while (nextStage != Integer.MAX_VALUE);
  }
}
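The non-tolerant abort path above illustrates a rethrow rule worth naming: a domain exception propagates unchanged (preserving its error code), while anything else is wrapped exactly once, so upstream handlers always see the domain type. A compact sketch with hypothetical names:

public class Rethrow {
  static class AppException extends RuntimeException {
    AppException(Throwable cause) { super(cause); }
  }

  static void rethrow(Throwable recorded) {
    if (recorded instanceof AppException) {
      throw (AppException) recorded;   // keep the original error code and message
    }
    throw new AppException(recorded);  // wrap exactly once, never twice
  }
}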
// in core/src/java/org/apache/solr/handler/component/PivotFacetHelper.java
public SimpleOrderedMap<List<NamedList<Object>>> process(ResponseBuilder rb, SolrParams params, String[] pivots) throws IOException {
  if (!rb.doFacets || pivots == null)
    return null;

  int minMatch = params.getInt( FacetParams.FACET_PIVOT_MINCOUNT, 1 );

  SimpleOrderedMap<List<NamedList<Object>>> pivotResponse = new SimpleOrderedMap<List<NamedList<Object>>>();
  for (String pivot : pivots) {
    String[] fields = pivot.split(",");  // only support two levels for now
    if( fields.length < 2 ) {
      throw new SolrException( ErrorCode.BAD_REQUEST, "Pivot Facet needs at least two fields: "+pivot );
    }

    DocSet docs = rb.getResults().docSet;
    String field = fields[0];
    String subField = fields[1];
    Deque<String> fnames = new LinkedList<String>();
    for( int i=fields.length-1; i>1; i-- ) {
      fnames.push( fields[i] );
    }

    SimpleFacets sf = getFacetImplementation(rb.req, rb.getResults().docSet, rb.req.getParams());
    NamedList<Integer> superFacets = sf.getTermCounts(field);

    pivotResponse.add(pivot, doPivots(superFacets, field, subField, fnames, rb, docs, minMatch));
  }
  return pivotResponse;
}
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
@Override
public void prepare(ResponseBuilder rb) throws IOException {
  SolrQueryRequest req = rb.req;
  SolrParams params = req.getParams();
  if (!params.getBool(COMPONENT_NAME, true)) {
    return;
  }
  SolrQueryResponse rsp = rb.rsp;

  // Set field flags
  ReturnFields returnFields = new ReturnFields( req );
  rsp.setReturnFields( returnFields );
  int flags = 0;
  if (returnFields.wantsScore()) {
    flags |= SolrIndexSearcher.GET_SCORES;
  }
  rb.setFieldFlags( flags );

  String defType = params.get(QueryParsing.DEFTYPE,QParserPlugin.DEFAULT_QTYPE);

  // get it from the response builder to give a different component a chance
  // to set it.
  String queryString = rb.getQueryString();
  if (queryString == null) {
    // this is the normal way it's set.
    queryString = params.get( CommonParams.Q );
    rb.setQueryString(queryString);
  }

  try {
    QParser parser = QParser.getParser(rb.getQueryString(), defType, req);
    Query q = parser.getQuery();
    if (q == null) {
      // normalize a null query to a query that matches nothing
      q = new BooleanQuery();
    }
    rb.setQuery( q );
    rb.setSortSpec( parser.getSort(true) );
    rb.setQparser(parser);
    rb.setScoreDoc(parser.getPaging());

    String[] fqs = req.getParams().getParams(CommonParams.FQ);
    if (fqs!=null && fqs.length!=0) {
      List<Query> filters = rb.getFilters();
      if (filters==null) {
        filters = new ArrayList<Query>(fqs.length);
      }
      for (String fq : fqs) {
        if (fq != null && fq.trim().length()!=0) {
          QParser fqp = QParser.getParser(fq, null, req);
          filters.add(fqp.getQuery());
        }
      }
      // only set the filters if they are not empty otherwise
      // fq=&someotherParam= will trigger all docs filter for every request
      // if filter cache is disabled
      if (!filters.isEmpty()) {
        rb.setFilters( filters );
      }
    }
  } catch (ParseException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
  }

  boolean grouping = params.getBool(GroupParams.GROUP, false);
  if (!grouping) {
    return;
  }

  SolrIndexSearcher.QueryCommand cmd = rb.getQueryCommand();
  SolrIndexSearcher searcher = rb.req.getSearcher();
  GroupingSpecification groupingSpec = new GroupingSpecification();
  rb.setGroupingSpec(groupingSpec);

  //TODO: move weighting of sort
  Sort groupSort = searcher.weightSort(cmd.getSort());
  if (groupSort == null) {
    groupSort = Sort.RELEVANCE;
  }

  // groupSort defaults to sort
  String groupSortStr = params.get(GroupParams.GROUP_SORT);
  //TODO: move weighting of sort
  Sort sortWithinGroup = groupSortStr == null ? groupSort : searcher.weightSort(QueryParsing.parseSort(groupSortStr, req));
  if (sortWithinGroup == null) {
    sortWithinGroup = Sort.RELEVANCE;
  }

  groupingSpec.setSortWithinGroup(sortWithinGroup);
  groupingSpec.setGroupSort(groupSort);

  String formatStr = params.get(GroupParams.GROUP_FORMAT, Grouping.Format.grouped.name());
  Grouping.Format responseFormat;
  try {
    responseFormat = Grouping.Format.valueOf(formatStr);
  } catch (IllegalArgumentException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT));
  }
  groupingSpec.setResponseFormat(responseFormat);

  groupingSpec.setFields(params.getParams(GroupParams.GROUP_FIELD));
  groupingSpec.setQueries(params.getParams(GroupParams.GROUP_QUERY));
  groupingSpec.setFunctions(params.getParams(GroupParams.GROUP_FUNC));
  groupingSpec.setGroupOffset(params.getInt(GroupParams.GROUP_OFFSET, 0));
  groupingSpec.setGroupLimit(params.getInt(GroupParams.GROUP_LIMIT, 1));
  groupingSpec.setOffset(rb.getSortSpec().getOffset());
  groupingSpec.setLimit(rb.getSortSpec().getCount());
  groupingSpec.setIncludeGroupCount(params.getBool(GroupParams.GROUP_TOTAL_COUNT, false));
  groupingSpec.setMain(params.getBool(GroupParams.GROUP_MAIN, false));
  groupingSpec.setNeedScore((cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0);
  groupingSpec.setTruncateGroups(params.getBool(GroupParams.GROUP_TRUNCATE, false));
}
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
@Override
public void process(ResponseBuilder rb) throws IOException {
  SolrQueryRequest req = rb.req;
  SolrQueryResponse rsp = rb.rsp;
  SolrParams params = req.getParams();
  if (!params.getBool(COMPONENT_NAME, true)) {
    return;
  }
  SolrIndexSearcher searcher = req.getSearcher();

  if (rb.getQueryCommand().getOffset() < 0) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "'start' parameter cannot be negative");
  }

  // -1 as flag if not set.
  long timeAllowed = (long)params.getInt( CommonParams.TIME_ALLOWED, -1 );

  // Optional: This could also be implemented by the top-level searcher sending
  // a filter that lists the ids... that would be transparent to
  // the request handler, but would be more expensive (and would preserve score
  // too if desired).
  String ids = params.get(ShardParams.IDS);
  if (ids != null) {
    SchemaField idField = req.getSchema().getUniqueKeyField();
    List<String> idArr = StrUtils.splitSmart(ids, ",", true);
    int[] luceneIds = new int[idArr.size()];
    int docs = 0;
    for (int i=0; i<idArr.size(); i++) {
      int id = req.getSearcher().getFirstMatch(
          new Term(idField.getName(), idField.getType().toInternal(idArr.get(i))));
      if (id >= 0)
        luceneIds[docs++] = id;
    }

    DocListAndSet res = new DocListAndSet();
    res.docList = new DocSlice(0, docs, luceneIds, null, docs, 0);
    if (rb.isNeedDocSet()) {
      // TODO: create a cache for this!
      List<Query> queries = new ArrayList<Query>();
      queries.add(rb.getQuery());
      List<Query> filters = rb.getFilters();
      if (filters != null) queries.addAll(filters);
      res.docSet = searcher.getDocSet(queries);
    }
    rb.setResults(res);

    ResultContext ctx = new ResultContext();
    ctx.docs = rb.getResults().docList;
    ctx.query = null; // anything?
    rsp.add("response", ctx);
    return;
  }

  SolrIndexSearcher.QueryCommand cmd = rb.getQueryCommand();
  cmd.setTimeAllowed(timeAllowed);
  SolrIndexSearcher.QueryResult result = new SolrIndexSearcher.QueryResult();

  //
  // grouping / field collapsing
  //
  GroupingSpecification groupingSpec = rb.getGroupingSpec();
  if (groupingSpec != null) {
    try {
      boolean needScores = (cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0;
      if (params.getBool(GroupParams.GROUP_DISTRIBUTED_FIRST, false)) {
        CommandHandler.Builder topsGroupsActionBuilder = new CommandHandler.Builder()
            .setQueryCommand(cmd)
            .setNeedDocSet(false) // Order matters here
            .setIncludeHitCount(true)
            .setSearcher(searcher);

        for (String field : groupingSpec.getFields()) {
          topsGroupsActionBuilder.addCommandField(new SearchGroupsFieldCommand.Builder()
              .setField(searcher.getSchema().getField(field))
              .setGroupSort(groupingSpec.getGroupSort())
              .setTopNGroups(cmd.getOffset() + cmd.getLen())
              .setIncludeGroupCount(groupingSpec.isIncludeGroupCount())
              .build()
          );
        }

        CommandHandler commandHandler = topsGroupsActionBuilder.build();
        commandHandler.execute();
        SearchGroupsResultTransformer serializer = new SearchGroupsResultTransformer(searcher);
        rsp.add("firstPhase", commandHandler.processResult(result, serializer));
        rsp.add("totalHitCount", commandHandler.getTotalHitCount());
        rb.setResult(result);
        return;
      } else if (params.getBool(GroupParams.GROUP_DISTRIBUTED_SECOND, false)) {
        CommandHandler.Builder secondPhaseBuilder = new CommandHandler.Builder()
            .setQueryCommand(cmd)
            .setTruncateGroups(groupingSpec.isTruncateGroups() && groupingSpec.getFields().length > 0)
            .setSearcher(searcher);

        for (String field : groupingSpec.getFields()) {
          String[] topGroupsParam = params.getParams(GroupParams.GROUP_DISTRIBUTED_TOPGROUPS_PREFIX + field);
          if (topGroupsParam == null) {
            topGroupsParam = new String[0];
          }

          List<SearchGroup<BytesRef>> topGroups = new ArrayList<SearchGroup<BytesRef>>(topGroupsParam.length);
          for (String topGroup : topGroupsParam) {
            SearchGroup<BytesRef> searchGroup = new SearchGroup<BytesRef>();
            if (!topGroup.equals(TopGroupsShardRequestFactory.GROUP_NULL_VALUE)) {
              searchGroup.groupValue = new BytesRef(searcher.getSchema().getField(field).getType().readableToIndexed(topGroup));
            }
            topGroups.add(searchGroup);
          }

          secondPhaseBuilder.addCommandField(
              new TopGroupsFieldCommand.Builder()
                  .setField(searcher.getSchema().getField(field))
                  .setGroupSort(groupingSpec.getGroupSort())
                  .setSortWithinGroup(groupingSpec.getSortWithinGroup())
                  .setFirstPhaseGroups(topGroups)
                  .setMaxDocPerGroup(groupingSpec.getGroupOffset() + groupingSpec.getGroupLimit())
                  .setNeedScores(needScores)
                  .setNeedMaxScore(needScores)
                  .build()
          );
        }

        for (String query : groupingSpec.getQueries()) {
          secondPhaseBuilder.addCommandField(new QueryCommand.Builder()
              .setDocsToCollect(groupingSpec.getOffset() + groupingSpec.getLimit())
              .setSort(groupingSpec.getGroupSort())
              .setQuery(query, rb.req)
              .setDocSet(searcher)
              .build()
          );
        }

        CommandHandler commandHandler = secondPhaseBuilder.build();
        commandHandler.execute();
        TopGroupsResultTransformer serializer = new TopGroupsResultTransformer(rb);
        rsp.add("secondPhase", commandHandler.processResult(result, serializer));
        rb.setResult(result);
        return;
      }

      int maxDocsPercentageToCache = params.getInt(GroupParams.GROUP_CACHE_PERCENTAGE, 0);
      boolean cacheSecondPassSearch = maxDocsPercentageToCache >= 1 && maxDocsPercentageToCache <= 100;
      Grouping.TotalCount defaultTotalCount = groupingSpec.isIncludeGroupCount() ?
          Grouping.TotalCount.grouped : Grouping.TotalCount.ungrouped;
      int limitDefault = cmd.getLen(); // this is normally from "rows"
      Grouping grouping = new Grouping(searcher, result, cmd, cacheSecondPassSearch, maxDocsPercentageToCache, groupingSpec.isMain());
      grouping.setSort(groupingSpec.getGroupSort())
          .setGroupSort(groupingSpec.getSortWithinGroup())
          .setDefaultFormat(groupingSpec.getResponseFormat())
          .setLimitDefault(limitDefault)
          .setDefaultTotalCount(defaultTotalCount)
          .setDocsPerGroupDefault(groupingSpec.getGroupLimit())
          .setGroupOffsetDefault(groupingSpec.getGroupOffset())
          .setGetGroupedDocSet(groupingSpec.isTruncateGroups());

      if (groupingSpec.getFields() != null) {
        for (String field : groupingSpec.getFields()) {
          grouping.addFieldCommand(field, rb.req);
        }
      }

      if (groupingSpec.getFunctions() != null) {
        for (String groupByStr : groupingSpec.getFunctions()) {
          grouping.addFunctionCommand(groupByStr, rb.req);
        }
      }

      if (groupingSpec.getQueries() != null) {
        for (String groupByStr : groupingSpec.getQueries()) {
          grouping.addQueryCommand(groupByStr, rb.req);
        }
      }

      if (rb.doHighlights || rb.isDebug() || params.getBool(MoreLikeThisParams.MLT, false)) {
        // we need a single list of the returned docs
        cmd.setFlags(SolrIndexSearcher.GET_DOCLIST);
      }

      grouping.execute();
      if (grouping.isSignalCacheWarning()) {
        rsp.add(
            "cacheWarning",
            String.format("Cache limit of %d percent relative to maxdoc has exceeded. Please increase cache size or disable caching.", maxDocsPercentageToCache)
        );
      }
      rb.setResult(result);

      if (grouping.mainResult != null) {
        ResultContext ctx = new ResultContext();
        ctx.docs = grouping.mainResult;
        ctx.query = null; // TODO? add the query?
        rsp.add("response", ctx);
        rsp.getToLog().add("hits", grouping.mainResult.matches());
      } else if (!grouping.getCommands().isEmpty()) { // Can never be empty since grouping.execute() checks for this.
        rsp.add("grouped", result.groupedResults);
        rsp.getToLog().add("hits", grouping.getCommands().get(0).getMatches());
      }
      return;
    } catch (ParseException e) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
    }
  }

  // normal search result
  searcher.search(result,cmd);
  rb.setResult( result );

  ResultContext ctx = new ResultContext();
  ctx.docs = rb.getResults().docList;
  ctx.query = rb.getQuery();
  rsp.add("response", ctx);
  rsp.getToLog().add("hits", rb.getResults().docList.matches());

  doFieldSortValues(rb, searcher);
  doPrefetch(rb);
}
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
@Override
public void prepare(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  rb.doHighlights = highlighter.isHighlightingEnabled(params);
  if(rb.doHighlights){
    String hlq = params.get(HighlightParams.Q);
    if(hlq != null){
      try {
        QParser parser = QParser.getParser(hlq, null, rb.req);
        rb.setHighlightQuery(parser.getHighlightQuery());
      } catch (ParseException e) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
      }
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
@Override
public void process(ResponseBuilder rb) throws IOException {
  if (rb.doHighlights) {
    SolrQueryRequest req = rb.req;
    SolrParams params = req.getParams();

    String[] defaultHighlightFields;  //TODO: get from builder by default?
    if (rb.getQparser() != null) {
      defaultHighlightFields = rb.getQparser().getDefaultHighlightFields();
    } else {
      defaultHighlightFields = params.getParams(CommonParams.DF);
    }

    Query highlightQuery = rb.getHighlightQuery();
    if(highlightQuery==null) {
      if (rb.getQparser() != null) {
        try {
          highlightQuery = rb.getQparser().getHighlightQuery();
          rb.setHighlightQuery( highlightQuery );
        } catch (Exception e) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
        }
      } else {
        highlightQuery = rb.getQuery();
        rb.setHighlightQuery( highlightQuery );
      }
    }

    if(highlightQuery != null) {
      boolean rewrite = !(Boolean.valueOf(params.get(HighlightParams.USE_PHRASE_HIGHLIGHTER, "true")) &&
          Boolean.valueOf(params.get(HighlightParams.HIGHLIGHT_MULTI_TERM, "true")));
      highlightQuery = rewrite ? highlightQuery.rewrite(req.getSearcher().getIndexReader()) : highlightQuery;
    }

    // No highlighting if there is no query -- consider q.alt="*:*
    if( highlightQuery != null ) {
      NamedList sumData = highlighter.doHighlighting(
          rb.getResults().docList,
          highlightQuery,
          req, defaultHighlightFields );
      if(sumData != null) {
        // TODO ???? add this directly to the response?
        rb.rsp.add("highlighting", sumData);
      }
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
public NamedList<?> getFieldCacheStats(String fieldName, String[] facet ) {
  SchemaField sf = searcher.getSchema().getField(fieldName);

  FieldCache.DocTermsIndex si;
  try {
    si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), fieldName);
  } catch (IOException e) {
    throw new RuntimeException( "failed to open field cache for: "+fieldName, e );
  }

  StatsValues allstats = StatsValuesFactory.createStatsValues(sf);

  final int nTerms = si.numOrd();
  if ( nTerms <= 0 || docs.size() <= 0 ) return allstats.getStatsValues();

  // don't worry about faceting if no documents match...
  List<FieldFacetStats> facetStats = new ArrayList<FieldFacetStats>();
  FieldCache.DocTermsIndex facetTermsIndex;
  for( String facetField : facet ) {
    SchemaField fsf = searcher.getSchema().getField(facetField);
    FieldType facetFieldType = fsf.getType();

    if (facetFieldType.isTokenized() || facetFieldType.isMultiValued()) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Stats can only facet on single-valued fields, not: " + facetField
          + "[" + facetFieldType + "]");
    }
    try {
      facetTermsIndex = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), facetField);
    } catch (IOException e) {
      throw new RuntimeException( "failed to open field cache for: " + facetField, e );
    }
    facetStats.add(new FieldFacetStats(facetField, facetTermsIndex, sf, fsf, nTerms));
  }

  final BytesRef tempBR = new BytesRef();
  DocIterator iter = docs.iterator();
  while (iter.hasNext()) {
    int docID = iter.nextDoc();
    BytesRef raw = si.lookup(si.getOrd(docID), tempBR);
    if( raw.length > 0 ) {
      allstats.accumulate(raw);
    } else {
      allstats.missing();
    }

    // now update the facets
    for (FieldFacetStats f : facetStats) {
      f.facet(docID, raw);
    }
  }

  for (FieldFacetStats f : facetStats) {
    allstats.addFacet(f.name, f.facetStatsValues);
  }
  return allstats.getStatsValues();
}
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
@Override
public void process(ResponseBuilder rb) throws IOException {
  SolrQueryRequest req = rb.req;
  SolrQueryResponse rsp = rb.rsp;
  SolrParams params = req.getParams();

  if (!params.getBool(COMPONENT_NAME, true)) {
    return;
  }

  String val = params.get("getVersions");
  if (val != null) {
    processGetVersions(rb);
    return;
  }

  val = params.get("getUpdates");
  if (val != null) {
    processGetUpdates(rb);
    return;
  }

  String id[] = params.getParams("id");
  String ids[] = params.getParams("ids");

  if (id == null && ids == null) {
    return;
  }

  String[] allIds = id==null ? new String[0] : id;

  if (ids != null) {
    List<String> lst = new ArrayList<String>();
    for (String s : allIds) {
      lst.add(s);
    }
    for (String idList : ids) {
      lst.addAll( StrUtils.splitSmart(idList, ",", true) );
    }
    allIds = lst.toArray(new String[lst.size()]);
  }

  SchemaField idField = req.getSchema().getUniqueKeyField();
  FieldType fieldType = idField.getType();

  SolrDocumentList docList = new SolrDocumentList();
  UpdateLog ulog = req.getCore().getUpdateHandler().getUpdateLog();

  RefCounted<SolrIndexSearcher> searcherHolder = null;

  DocTransformer transformer = rsp.getReturnFields().getTransformer();
  if (transformer != null) {
    TransformContext context = new TransformContext();
    context.req = req;
    transformer.setContext(context);
  }
  try {
    SolrIndexSearcher searcher = null;

    BytesRef idBytes = new BytesRef();
    for (String idStr : allIds) {
      fieldType.readableToIndexed(idStr, idBytes);
      if (ulog != null) {
        Object o = ulog.lookup(idBytes);
        if (o != null) {
          // should currently be a List<Oper,Ver,Doc/Id>
          List entry = (List)o;
          assert entry.size() >= 3;
          int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK;
          switch (oper) {
            case UpdateLog.ADD:
              SolrDocument doc = toSolrDoc((SolrInputDocument)entry.get(entry.size()-1), req.getSchema());
              if(transformer!=null) {
                transformer.transform(doc, -1); // unknown docID
              }
              docList.add(doc);
              break;
            case UpdateLog.DELETE:
              break;
            default:
              throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper);
          }
          continue;
        }
      }

      // didn't find it in the update log, so it should be in the newest searcher opened
      if (searcher == null) {
        searcherHolder = req.getCore().getRealtimeSearcher();
        searcher = searcherHolder.get();
      }

      // SolrCore.verbose("RealTimeGet using searcher ", searcher);

      int docid = searcher.getFirstMatch(new Term(idField.getName(), idBytes));
      if (docid < 0) continue;
      Document luceneDocument = searcher.doc(docid);
      SolrDocument doc = toSolrDoc(luceneDocument, req.getSchema());
      if( transformer != null ) {
        transformer.transform(doc, docid);
      }
      docList.add(doc);
    }

  } finally {
    if (searcherHolder != null) {
      searcherHolder.decref();
    }
  }

  // if the client specified a single id=foo, then use "doc":{
  // otherwise use a standard doclist

  if (ids == null && allIds.length <= 1) {
    // if the doc was not found, then use a value of null.
    rsp.add("doc", docList.size() > 0 ? docList.get(0) : null);
  } else {
    docList.setNumFound(docList.size());
    rsp.add("response", docList);
  }
}
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public static SolrInputDocument getInputDocument(SolrCore core, BytesRef idBytes) throws IOException {
  SolrInputDocument sid = null;
  RefCounted<SolrIndexSearcher> searcherHolder = null;
  try {
    SolrIndexSearcher searcher = null;
    UpdateLog ulog = core.getUpdateHandler().getUpdateLog();

    if (ulog != null) {
      Object o = ulog.lookup(idBytes);
      if (o != null) {
        // should currently be a List<Oper,Ver,Doc/Id>
        List entry = (List)o;
        assert entry.size() >= 3;
        int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK;
        switch (oper) {
          case UpdateLog.ADD:
            sid = (SolrInputDocument)entry.get(entry.size()-1);
            break;
          case UpdateLog.DELETE:
            return null;
          default:
            throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper);
        }
      }
    }

    if (sid == null) {
      // didn't find it in the update log, so it should be in the newest searcher opened
      if (searcher == null) {
        searcherHolder = core.getRealtimeSearcher();
        searcher = searcherHolder.get();
      }

      // SolrCore.verbose("RealTimeGet using searcher ", searcher);
      SchemaField idField = core.getSchema().getUniqueKeyField();

      int docid = searcher.getFirstMatch(new Term(idField.getName(), idBytes));
      if (docid < 0) return null;
      Document luceneDocument = searcher.doc(docid);
      sid = toSolrInputDocument(luceneDocument, core.getSchema());
    }
  } finally {
    if (searcherHolder != null) {
      searcherHolder.decref();
    }
  }
  return sid;
}
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
private String[] sliceToShards(ResponseBuilder rb, String collection, String slice) {
  String lookup = collection + '_' + slice;  // seems either form may be filled in rb.slices?

  // We use this since the shard handler already filled in the slice to shards mapping.
  // A better approach would be to avoid filling out every slice each time, or to cache
  // the mappings.

  for (int i=0; i<rb.slices.length; i++) {
    log.info("LOOKUP_SLICE:" + rb.slices[i] + "=" + rb.shards[i]);
    if (lookup.equals(rb.slices[i]) || slice.equals(rb.slices[i])) {
      return new String[]{rb.shards[i]};
    }
  }

  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't find shard '" + lookup + "'");
}
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
public void init(PluginInfo info) {
  NamedList args = info.initArgs;
  this.soTimeout = getParameter(args, HttpClientUtil.PROP_SO_TIMEOUT, soTimeout);
  this.scheme = getParameter(args, INIT_URL_SCHEME, "http://");
  this.scheme = (this.scheme.endsWith("://")) ? this.scheme : this.scheme + "://";
  this.connectionTimeout = getParameter(args, HttpClientUtil.PROP_CONNECTION_TIMEOUT, connectionTimeout);
  this.maxConnectionsPerHost = getParameter(args, HttpClientUtil.PROP_MAX_CONNECTIONS_PER_HOST, maxConnectionsPerHost);
  this.corePoolSize = getParameter(args, INIT_CORE_POOL_SIZE, corePoolSize);
  this.maximumPoolSize = getParameter(args, INIT_MAX_POOL_SIZE, maximumPoolSize);
  this.keepAliveTime = getParameter(args, MAX_THREAD_IDLE_TIME, keepAliveTime);
  this.queueSize = getParameter(args, INIT_SIZE_OF_QUEUE, queueSize);
  this.accessPolicy = getParameter(args, INIT_FAIRNESS_POLICY, accessPolicy);

  BlockingQueue<Runnable> blockingQueue = (this.queueSize == -1) ?
      new SynchronousQueue<Runnable>(this.accessPolicy) :
      new ArrayBlockingQueue<Runnable>(this.queueSize, this.accessPolicy);

  this.commExecutor = new ThreadPoolExecutor(
      this.corePoolSize,
      this.maximumPoolSize,
      this.keepAliveTime, TimeUnit.SECONDS,
      blockingQueue,
      new DefaultSolrThreadFactory("httpShardExecutor")
  );

  ModifiableSolrParams clientParams = new ModifiableSolrParams();
  clientParams.set(HttpClientUtil.PROP_MAX_CONNECTIONS_PER_HOST, maxConnectionsPerHost);
  clientParams.set(HttpClientUtil.PROP_MAX_CONNECTIONS, 10000);
  clientParams.set(HttpClientUtil.PROP_SO_TIMEOUT, soTimeout);
  clientParams.set(HttpClientUtil.PROP_CONNECTION_TIMEOUT, connectionTimeout);
  clientParams.set(HttpClientUtil.PROP_USE_RETRY, false);
  this.defaultClient = HttpClientUtil.createClient(clientParams);

  try {
    loadbalancer = new LBHttpSolrServer(defaultClient);
  } catch (MalformedURLException e) {
    // should be impossible since we're not passing any URLs here
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
  }
}
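The final catch block is the "impossible checked exception" wrap: an API forces a checked exception that cannot occur on this code path, so it is converted to an unchecked server error instead of polluting every caller's signature. A runnable sketch under the same assumption (hypothetical class name; IllegalStateException stands in for the domain exception):

import java.net.MalformedURLException;
import java.net.URL;

public class KnownGoodUrl {
  static URL of(String known) {
    try {
      return new URL(known);
    } catch (MalformedURLException e) {
      // unreachable for the constants we pass in; fail loudly if it ever happens
      throw new IllegalStateException("should be impossible: " + known, e);
    }
  }

  public static void main(String[] args) {
    System.out.println(of("http://localhost:8983/solr"));
  }
}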
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override
public void process(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  if (!params.getBool(COMPONENT_NAME, false)) {
    return;
  }

  NamedList<Object> termVectors = new NamedList<Object>();
  rb.rsp.add(TERM_VECTORS, termVectors);

  FieldOptions allFields = new FieldOptions();
  //figure out what options we have, and try to get the appropriate vector
  allFields.termFreq = params.getBool(TermVectorParams.TF, false);
  allFields.positions = params.getBool(TermVectorParams.POSITIONS, false);
  allFields.offsets = params.getBool(TermVectorParams.OFFSETS, false);
  allFields.docFreq = params.getBool(TermVectorParams.DF, false);
  allFields.tfIdf = params.getBool(TermVectorParams.TF_IDF, false);
  //boolean cacheIdf = params.getBool(TermVectorParams.IDF, false);
  //short cut to all values.
  if (params.getBool(TermVectorParams.ALL, false)) {
    allFields.termFreq = true;
    allFields.positions = true;
    allFields.offsets = true;
    allFields.docFreq = true;
    allFields.tfIdf = true;
  }

  String fldLst = params.get(TermVectorParams.FIELDS);
  if (fldLst == null) {
    fldLst = params.get(CommonParams.FL);
  }

  //use this to validate our fields
  IndexSchema schema = rb.req.getSchema();
  //Build up our per field mapping
  Map<String, FieldOptions> fieldOptions = new HashMap<String, FieldOptions>();
  NamedList<List<String>> warnings = new NamedList<List<String>>();
  List<String> noTV = new ArrayList<String>();
  List<String> noPos = new ArrayList<String>();
  List<String> noOff = new ArrayList<String>();

  //we have specific fields to retrieve
  if (fldLst != null) {
    String [] fields = SolrPluginUtils.split(fldLst);
    for (String field : fields) {
      SchemaField sf = schema.getFieldOrNull(field);
      if (sf != null) {
        if (sf.storeTermVector()) {
          FieldOptions option = fieldOptions.get(field);
          if (option == null) {
            option = new FieldOptions();
            option.fieldName = field;
            fieldOptions.put(field, option);
          }
          //get the per field mappings
          option.termFreq = params.getFieldBool(field, TermVectorParams.TF, allFields.termFreq);
          option.docFreq = params.getFieldBool(field, TermVectorParams.DF, allFields.docFreq);
          option.tfIdf = params.getFieldBool(field, TermVectorParams.TF_IDF, allFields.tfIdf);
          //Validate these are even an option
          option.positions = params.getFieldBool(field, TermVectorParams.POSITIONS, allFields.positions);
          if (option.positions && !sf.storeTermPositions()){
            noPos.add(field);
          }
          option.offsets = params.getFieldBool(field, TermVectorParams.OFFSETS, allFields.offsets);
          if (option.offsets && !sf.storeTermOffsets()){
            noOff.add(field);
          }
        } else {//field doesn't have term vectors
          noTV.add(field);
        }
      } else {
        //field doesn't exist
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "undefined field: " + field);
      }
    }
  } //else, deal with all fields

  boolean hasWarnings = false;
  if (!noTV.isEmpty()) {
    warnings.add("noTermVectors", noTV);
    hasWarnings = true;
  }
  if (!noPos.isEmpty()) {
    warnings.add("noPositions", noPos);
    hasWarnings = true;
  }
  if (!noOff.isEmpty()) {
    warnings.add("noOffsets", noOff);
    hasWarnings = true;
  }
  if (hasWarnings) {
    termVectors.add("warnings", warnings);
  }

  DocListAndSet listAndSet = rb.getResults();
  List<Integer> docIds = getInts(params.getParams(TermVectorParams.DOC_IDS));
  Iterator<Integer> iter;
  if (docIds != null && !docIds.isEmpty()) {
    iter = docIds.iterator();
  } else {
    DocList list = listAndSet.docList;
    iter = list.iterator();
  }
  SolrIndexSearcher searcher = rb.req.getSearcher();

  IndexReader reader = searcher.getIndexReader();
  //the TVMapper is a TermVectorMapper which can be used to optimize loading of Term Vectors

  SchemaField keyField = schema.getUniqueKeyField();
  String uniqFieldName = null;
  if (keyField != null) {
    uniqFieldName = keyField.getName();
  }
  //Only load the id field to get the uniqueKey of that field
  final String finalUniqFieldName = uniqFieldName;
  final List<String> uniqValues = new ArrayList<String>();
  // TODO: is this required to be single-valued? if so, we should STOP
  // once we find it...
  final StoredFieldVisitor getUniqValue = new StoredFieldVisitor() {
    @Override
    public void stringField(FieldInfo fieldInfo, String value) throws IOException {
      uniqValues.add(value);
    }

    @Override
    public void intField(FieldInfo fieldInfo, int value) throws IOException {
      uniqValues.add(Integer.toString(value));
    }

    @Override
    public void longField(FieldInfo fieldInfo, long value) throws IOException {
      uniqValues.add(Long.toString(value));
    }

    @Override
    public Status needsField(FieldInfo fieldInfo) throws IOException {
      return (fieldInfo.name.equals(finalUniqFieldName)) ? Status.YES : Status.NO;
    }
  };

  TermsEnum termsEnum = null;

  while (iter.hasNext()) {
    Integer docId = iter.next();
    NamedList<Object> docNL = new NamedList<Object>();
    termVectors.add("doc-" + docId, docNL);

    if (keyField != null) {
      reader.document(docId, getUniqValue);
      String uniqVal = null;
      if (uniqValues.size() != 0) {
        uniqVal = uniqValues.get(0);
        uniqValues.clear();
        docNL.add("uniqueKey", uniqVal);
        termVectors.add("uniqueKeyFieldName", uniqFieldName);
      }
    }

    if (!fieldOptions.isEmpty()) {
      for (Map.Entry<String, FieldOptions> entry : fieldOptions.entrySet()) {
        final String field = entry.getKey();
        final Terms vector = reader.getTermVector(docId, field);
        if (vector != null) {
          termsEnum = vector.iterator(termsEnum);
          mapOneVector(docNL, entry.getValue(), reader, docId, vector.iterator(termsEnum), field);
        }
      }
    } else {
      // extract all fields
      final Fields vectors = reader.getTermVectors(docId);
      final FieldsEnum fieldsEnum = vectors.iterator();
      String field;
      while((field = fieldsEnum.next()) != null) {
        Terms terms = fieldsEnum.terms();
        if (terms != null) {
          termsEnum = terms.iterator(termsEnum);
          mapOneVector(docNL, allFields, reader, docId, termsEnum, field);
        }
      }
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
private List<Integer> getInts(String[] vals) {
  List<Integer> result = null;
  if (vals != null && vals.length > 0) {
    result = new ArrayList<Integer>(vals.length);
    for (int i = 0; i < vals.length; i++) {
      try {
        result.add(new Integer(vals[i]));
      } catch (NumberFormatException e) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e);
      }
    }
  }
  return result;
}
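Note that the wrap above passes the NumberFormatException along as the cause, so the stack trace still identifies the offending value while the client sees a 400-style error. A small runnable sketch of the same idea (IntListParser is a hypothetical name; IllegalArgumentException stands in for the domain exception):

import java.util.ArrayList;
import java.util.List;

public class IntListParser {
  static List<Integer> parse(String[] vals) {
    List<Integer> result = new ArrayList<>(vals.length);
    for (String v : vals) {
      try {
        result.add(Integer.valueOf(v));
      } catch (NumberFormatException e) {
        // pass e as cause: don't lose the original failure
        throw new IllegalArgumentException("bad integer parameter: " + v, e);
      }
    }
    return result;
  }

  public static void main(String[] args) {
    System.out.println(parse(new String[]{"1", "2", "3"})); // [1, 2, 3]
  }
}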
// in core/src/java/org/apache/solr/handler/ContentStreamHandlerBase.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  SolrParams params = req.getParams();
  UpdateRequestProcessorChain processorChain =
      req.getCore().getUpdateProcessingChain(params.get(UpdateParams.UPDATE_CHAIN));

  UpdateRequestProcessor processor = processorChain.createProcessor(req, rsp);

  try {
    ContentStreamLoader documentLoader = newLoader(req, processor);

    Iterable<ContentStream> streams = req.getContentStreams();
    if (streams == null) {
      if (!RequestHandlerUtils.handleCommit(req, processor, params, false) && !RequestHandlerUtils.handleRollback(req, processor, params, false)) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "missing content stream");
      }
    } else {
      for (ContentStream stream : streams) {
        documentLoader.load(req, rsp, stream, processor);
      }

      // Perhaps commit from the parameters
      RequestHandlerUtils.handleCommit(req, processor, params, false);
      RequestHandlerUtils.handleRollback(req, processor, params, false);
    }
  } finally {
    // finish the request
    processor.finish();
  }
}
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromZooKeeper(SolrQueryRequest req, SolrQueryResponse rsp,
    CoreContainer coreContainer) throws KeeperException,
    InterruptedException, UnsupportedEncodingException {
  String adminFile = null;
  SolrCore core = req.getCore();
  SolrZkClient zkClient = coreContainer.getZkController().getZkClient();
  final ZkSolrResourceLoader loader = (ZkSolrResourceLoader) core
      .getResourceLoader();
  String confPath = loader.getCollectionZkPath();

  String fname = req.getParams().get("file", null);
  if (fname == null) {
    adminFile = confPath;
  } else {
    fname = fname.replace('\\', '/'); // normalize slashes
    if (hiddenFiles.contains(fname.toUpperCase(Locale.ENGLISH))) {
      throw new SolrException(ErrorCode.FORBIDDEN, "Can not access: " + fname);
    }
    if (fname.indexOf("..") >= 0) {
      throw new SolrException(ErrorCode.FORBIDDEN, "Invalid path: " + fname);
    }
    adminFile = confPath + "/" + fname;
  }

  // Make sure the file exists, is readable and is not a hidden file
  if (!zkClient.exists(adminFile, true)) {
    throw new SolrException(ErrorCode.BAD_REQUEST, "Can not find: " + adminFile);
  }

  // Show a directory listing
  List<String> children = zkClient.getChildren(adminFile, null, true);
  if (children.size() > 0) {
    NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>();
    for (String f : children) {
      if (hiddenFiles.contains(f.toUpperCase(Locale.ENGLISH))) {
        continue; // don't show 'hidden' files
      }
      if (f.startsWith(".")) {
        continue; // skip hidden system files...
      }

      SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>();
      files.add(f, fileInfo);
      List<String> fchildren = zkClient.getChildren(adminFile, null, true);
      if (fchildren.size() > 0) {
        fileInfo.add("directory", true);
      } else {
        // TODO? content type
        fileInfo.add("size", f.length());
      }
      // TODO: ?
      // fileInfo.add( "modified", new Date( f.lastModified() ) );
    }
    rsp.add("files", files);
  } else {
    // Include the file contents
    // The file logic depends on RawResponseWriter, so force its use.
    ModifiableSolrParams params = new ModifiableSolrParams(req.getParams());
    params.set(CommonParams.WT, "raw");
    req.setParams(params);
    ContentStreamBase content = new ContentStreamBase.StringStream(
        new String(zkClient.getData(adminFile, null, null, true), "UTF-8"));
    content.setContentType(req.getParams().get(USE_CONTENT_TYPE));
    rsp.add(RawResponseWriter.CONTENT, content);
  }
  rsp.setHttpCaching(false);
}
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromFileSystem(SolrQueryRequest req, SolrQueryResponse rsp)
    throws IOException {
  File adminFile = null;

  final SolrResourceLoader loader = req.getCore().getResourceLoader();
  File configdir = new File( loader.getConfigDir() );
  if (!configdir.exists()) {
    // TODO: maybe we should just open it this way to start with?
    try {
      configdir = new File( loader.getClassLoader().getResource(loader.getConfigDir()).toURI() );
    } catch (URISyntaxException e) {
      throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!");
    }
  }
  String fname = req.getParams().get("file", null);
  if( fname == null ) {
    adminFile = configdir;
  } else {
    fname = fname.replace( '\\', '/' ); // normalize slashes
    if( hiddenFiles.contains( fname.toUpperCase(Locale.ENGLISH) ) ) {
      throw new SolrException( ErrorCode.FORBIDDEN, "Can not access: "+fname );
    }
    if( fname.indexOf( ".." ) >= 0 ) {
      throw new SolrException( ErrorCode.FORBIDDEN, "Invalid path: "+fname );
    }
    adminFile = new File( configdir, fname );
  }

  // Make sure the file exists, is readable and is not a hidden file
  if( !adminFile.exists() ) {
    throw new SolrException( ErrorCode.BAD_REQUEST, "Can not find: "+adminFile.getName()
        + " ["+adminFile.getAbsolutePath()+"]" );
  }
  if( !adminFile.canRead() || adminFile.isHidden() ) {
    throw new SolrException( ErrorCode.BAD_REQUEST, "Can not show: "+adminFile.getName()
        + " ["+adminFile.getAbsolutePath()+"]" );
  }

  // Show a directory listing
  if( adminFile.isDirectory() ) {
    int basePath = configdir.getAbsolutePath().length() + 1;
    NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>();
    for( File f : adminFile.listFiles() ) {
      String path = f.getAbsolutePath().substring( basePath );
      path = path.replace( '\\', '/' ); // normalize slashes
      if( hiddenFiles.contains( path.toUpperCase(Locale.ENGLISH) ) ) {
        continue; // don't show 'hidden' files
      }

      if( f.isHidden() || f.getName().startsWith( "." ) ) {
        continue; // skip hidden system files...
      }

      SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>();
      files.add( path, fileInfo );
      if( f.isDirectory() ) {
        fileInfo.add( "directory", true );
      } else {
        // TODO? content type
        fileInfo.add( "size", f.length() );
      }
      fileInfo.add( "modified", new Date( f.lastModified() ) );
    }
    rsp.add( "files", files );
  } else {
    // Include the file contents
    //The file logic depends on RawResponseWriter, so force its use.
    ModifiableSolrParams params = new ModifiableSolrParams( req.getParams() );
    params.set( CommonParams.WT, "raw" );
    req.setParams(params);

    ContentStreamBase content = new ContentStreamBase.FileStream( adminFile );
    content.setContentType( req.getParams().get( USE_CONTENT_TYPE ) );

    rsp.add(RawResponseWriter.CONTENT, content);
  }
  rsp.setHttpCaching(false);
}
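Both handlers above run the same request-path checks before touching ZooKeeper or the filesystem: normalize slashes, refuse hidden files, refuse ".." traversal, each reported as a 403-style error. A runnable sketch of the checks in isolation (PathGuard and the HIDDEN set are hypothetical; SecurityException stands in for the FORBIDDEN domain exception):

import java.util.Locale;
import java.util.Set;

public class PathGuard {
  private static final Set<String> HIDDEN = Set.of("ELEVATE.XML");

  static String validate(String fname) {
    fname = fname.replace('\\', '/'); // normalize slashes
    if (HIDDEN.contains(fname.toUpperCase(Locale.ENGLISH))) {
      throw new SecurityException("Can not access: " + fname);
    }
    if (fname.contains("..")) {
      throw new SecurityException("Invalid path: " + fname);
    }
    return fname;
  }

  public static void main(String[] args) {
    System.out.println(validate("schema.xml"));  // ok
    System.out.println(validate("../solr.xml")); // throws SecurityException
  }
}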
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
public static ShowStyle get(String v) {
  if(v==null) return null;
  if("schema".equalsIgnoreCase(v)) return SCHEMA;
  if("index".equalsIgnoreCase(v))  return INDEX;
  if("doc".equalsIgnoreCase(v))    return DOC;
  if("all".equalsIgnoreCase(v))    return ALL;
  throw new SolrException(ErrorCode.BAD_REQUEST, "Unknown Show Style: "+v);
}
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  IndexSchema schema = req.getSchema();
  SolrIndexSearcher searcher = req.getSearcher();
  DirectoryReader reader = searcher.getIndexReader();
  SolrParams params = req.getParams();
  ShowStyle style = ShowStyle.get(params.get("show"));

  // If no doc is given, show all fields and top terms
  rsp.add("index", getIndexInfo(reader));

  if(ShowStyle.INDEX==style) {
    return; // that's all we need
  }

  Integer docId = params.getInt( DOC_ID );
  if( docId == null && params.get( ID ) != null ) {
    // Look for something with a given solr ID
    SchemaField uniqueKey = schema.getUniqueKeyField();
    String v = uniqueKey.getType().toInternal( params.get(ID) );
    Term t = new Term( uniqueKey.getName(), v );
    docId = searcher.getFirstMatch( t );
    if( docId < 0 ) {
      throw new SolrException( SolrException.ErrorCode.NOT_FOUND, "Can't find document: "+params.get( ID ) );
    }
  }

  // Read the document from the index
  if( docId != null ) {
    if( style != null && style != ShowStyle.DOC ) {
      throw new SolrException(ErrorCode.BAD_REQUEST, "missing doc param for doc style");
    }
    Document doc = null;
    try {
      doc = reader.document( docId );
    } catch( Exception ex ) {}
    if( doc == null ) {
      throw new SolrException( SolrException.ErrorCode.NOT_FOUND, "Can't find document: "+docId );
    }

    SimpleOrderedMap<Object> info = getDocumentFieldsInfo( doc, docId, reader, schema );

    SimpleOrderedMap<Object> docinfo = new SimpleOrderedMap<Object>();
    docinfo.add( "docId", docId );
    docinfo.add( "lucene", info );
    docinfo.add( "solr", doc );
    rsp.add( "doc", docinfo );
  } else if ( ShowStyle.SCHEMA == style ) {
    rsp.add( "schema", getSchemaInfo( req.getSchema() ) );
  } else {
    rsp.add( "fields", getIndexedFieldsInfo(req) ) ;
  }

  // Add some generally helpful information
  NamedList<Object> info = new SimpleOrderedMap<Object>();
  info.add( "key", getFieldFlagsKey() );
  info.add( "NOTE", "Document Frequency (df) is not updated when a document is marked for deletion. df values include deleted documents." );
  rsp.add( "info", info );
  rsp.setHttpCaching(false);
}
// in core/src/java/org/apache/solr/handler/admin/AdminHandlers.java
public void inform(SolrCore core) { String path = null; for( Map.Entry<String, SolrRequestHandler> entry : core.getRequestHandlers().entrySet() ) { if( entry.getValue() == this ) { path = entry.getKey(); break; } } if( path == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The AdminHandler is not registered with the current core." ); } if( !path.startsWith( "/" ) ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The AdminHandler needs to be registered to a path. Typically this is '/admin'" ); } // Remove the parent handler core.registerRequestHandler(path, null); if( !path.endsWith( "/" ) ) { path += "/"; } StandardHandler[] list = new StandardHandler[] { new StandardHandler( "luke", new LukeRequestHandler() ), new StandardHandler( "system", new SystemInfoHandler() ), new StandardHandler( "mbeans", new SolrInfoMBeanHandler() ), new StandardHandler( "plugins", new PluginInfoHandler() ), new StandardHandler( "threads", new ThreadDumpHandler() ), new StandardHandler( "properties", new PropertiesRequestHandler() ), new StandardHandler( "logging", new LoggingHandler() ), new StandardHandler( "file", new ShowFileRequestHandler() ) }; for( StandardHandler handler : list ) { if( core.getRequestHandler( path+handler.name ) == null ) { handler.handler.init( initArgs ); core.registerRequestHandler( path+handler.name, handler.handler ); if( handler.handler instanceof SolrCoreAware ) { ((SolrCoreAware)handler.handler).inform(core); } } } }
// in core/src/java/org/apache/solr/handler/admin/AdminHandlers.java
public void handleRequest(SolrQueryRequest req, SolrQueryResponse rsp) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The AdminHandler should never be called directly" ); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { NamedList<NamedList<NamedList<Object>>> cats = getMBeanInfo(req); if(req.getParams().getBool("diff", false)) { ContentStream body = null; try { body = req.getContentStreams().iterator().next(); } catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); } String content = IOUtils.toString(body.getReader()); NamedList<NamedList<NamedList<Object>>> ref = fromXML(content); // Normalize the output SolrQueryResponse wrap = new SolrQueryResponse(); wrap.add("solr-mbeans", cats); cats = (NamedList<NamedList<NamedList<Object>>>) BinaryResponseWriter.getParsedResponse(req, wrap).get("solr-mbeans"); // Get rid of irrelevant things ref = normalize(ref); cats = normalize(cats); // Only the changes boolean showAll = req.getParams().getBool("all", false); rsp.add("solr-mbeans", getDiff(ref,cats, showAll)); } else { rsp.add("solr-mbeans", cats); } rsp.setHttpCaching(false); // never cache, no matter what init config looks like }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
static NamedList<NamedList<NamedList<Object>>> fromXML(String content) { int idx = content.indexOf("<response>"); if(idx<0) { throw new SolrException(ErrorCode.BAD_REQUEST, "Body does not appear to be an XML response"); } try { XMLResponseParser parser = new XMLResponseParser(); return (NamedList<NamedList<NamedList<Object>>>) parser.processResponse(new StringReader(content.substring(idx))).get("solr-mbeans"); } catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unable to read original XML", ex); } }
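fromXML is the catch-wrap-rethrow idiom in its cleanest form: the caught exception is passed as the cause of the new SolrException, so the parser's stack trace survives translation into a domain error. A minimal sketch, with RuntimeException standing in for SolrException and a hypothetical doParse call:

class WrapRethrowSketch {
  static Object parseOrBadRequest(String content) {
    try {
      return doParse(content);                           // hypothetical parser
    } catch (Exception ex) {
      // pass ex as the cause so the original stack trace is preserved
      throw new RuntimeException("400 Unable to read original XML", ex);
    }
  }

  static Object doParse(String s) { return s.trim(); }
}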
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
@Override final public void init(NamedList args) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "CoreAdminHandler should not be configured in solrconf.xml\n" + "it is a special Handler configured directly by the RequestDispatcher"); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { // Make sure the cores is enabled CoreContainer cores = getCoreContainer(); if (cores == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core container instance missing"); } boolean doPersist = false; // Pick the action SolrParams params = req.getParams(); CoreAdminAction action = CoreAdminAction.STATUS; String a = params.get(CoreAdminParams.ACTION); if (a != null) { action = CoreAdminAction.get(a); if (action == null) { doPersist = this.handleCustomAction(req, rsp); } } if (action != null) { switch (action) { case CREATE: { doPersist = this.handleCreateAction(req, rsp); break; } case RENAME: { doPersist = this.handleRenameAction(req, rsp); break; } case UNLOAD: { doPersist = this.handleUnloadAction(req, rsp); break; } case STATUS: { doPersist = this.handleStatusAction(req, rsp); break; } case PERSIST: { doPersist = this.handlePersistAction(req, rsp); break; } case RELOAD: { doPersist = this.handleReloadAction(req, rsp); break; } case SWAP: { doPersist = this.handleSwapAction(req, rsp); break; } case MERGEINDEXES: { doPersist = this.handleMergeAction(req, rsp); break; } case PREPRECOVERY: { this.handleWaitForStateAction(req, rsp); break; } case REQUESTRECOVERY: { this.handleRequestRecoveryAction(req, rsp); break; } case DISTRIBURL: { this.handleDistribUrlAction(req, rsp); break; } default: { doPersist = this.handleCustomAction(req, rsp); break; } case LOAD: break; } } // Should we persist the changes? if (doPersist) { cores.persist(); rsp.add("saved", cores.getConfigFile().getAbsolutePath()); } rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleMergeAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { SolrParams params = req.getParams(); String cname = params.required().get(CoreAdminParams.CORE); SolrCore core = coreContainer.getCore(cname); SolrQueryRequest wrappedReq = null; SolrCore[] sourceCores = null; RefCounted<SolrIndexSearcher>[] searchers = null; // stores readers created from indexDir param values DirectoryReader[] readersToBeClosed = null; Directory[] dirsToBeReleased = null; if (core != null) { try { String[] dirNames = params.getParams(CoreAdminParams.INDEX_DIR); if (dirNames == null || dirNames.length == 0) { String[] sources = params.getParams("srcCore"); if (sources == null || sources.length == 0) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "At least one indexDir or srcCore must be specified"); sourceCores = new SolrCore[sources.length]; for (int i = 0; i < sources.length; i++) { String source = sources[i]; SolrCore srcCore = coreContainer.getCore(source); if (srcCore == null) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core: " + source + " does not exist"); sourceCores[i] = srcCore; } } else { readersToBeClosed = new DirectoryReader[dirNames.length]; dirsToBeReleased = new Directory[dirNames.length]; DirectoryFactory dirFactory = core.getDirectoryFactory(); for (int i = 0; i < dirNames.length; i++) { Directory dir = dirFactory.get(dirNames[i], core.getSolrConfig().indexConfig.lockType); dirsToBeReleased[i] = dir; // TODO: why doesn't this use the IR factory? what is going on here? readersToBeClosed[i] = DirectoryReader.open(dir); } } DirectoryReader[] readers = null; if (readersToBeClosed != null) { readers = readersToBeClosed; } else { readers = new DirectoryReader[sourceCores.length]; searchers = new RefCounted[sourceCores.length]; for (int i = 0; i < sourceCores.length; i++) { SolrCore solrCore = sourceCores[i]; // record the searchers so that we can decref searchers[i] = solrCore.getSearcher(); readers[i] = searchers[i].get().getIndexReader(); } } UpdateRequestProcessorChain processorChain = core.getUpdateProcessingChain(params.get(UpdateParams.UPDATE_CHAIN)); wrappedReq = new LocalSolrQueryRequest(core, req.getParams()); UpdateRequestProcessor processor = processorChain.createProcessor(wrappedReq, rsp); processor.processMergeIndexes(new MergeIndexesCommand(readers, req)); } finally { if (searchers != null) { for (RefCounted<SolrIndexSearcher> searcher : searchers) { if (searcher != null) searcher.decref(); } } if (sourceCores != null) { for (SolrCore solrCore : sourceCores) { if (solrCore != null) solrCore.close(); } } if (readersToBeClosed != null) IOUtils.closeWhileHandlingException(readersToBeClosed); if (dirsToBeReleased != null) { for (Directory dir : dirsToBeReleased) { DirectoryFactory dirFactory = core.getDirectoryFactory(); dirFactory.release(dir); } } if (wrappedReq != null) wrappedReq.close(); core.close(); } } return coreContainer.isPersistent(); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleCustomAction(SolrQueryRequest req, SolrQueryResponse rsp) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unsupported operation: " + req.getParams().get(CoreAdminParams.ACTION)); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleCreateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { try { SolrParams params = req.getParams(); String name = params.get(CoreAdminParams.NAME); //for now, do not allow creating new core with same name when in cloud mode //XXX perhaps it should just be unregistered from cloud before readding it?, //XXX perhaps we should also check that cores are of same type before adding new core to collection? if (coreContainer.getZkController() != null) { if (coreContainer.getCore(name) != null) { log.info("Re-creating a core with existing name is not allowed in cloud mode"); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core with name '" + name + "' already exists."); } } String instanceDir = params.get(CoreAdminParams.INSTANCE_DIR); if (instanceDir == null) { // instanceDir = coreContainer.getSolrHome() + "/" + name; instanceDir = name; // bare name is already relative to solr home } CoreDescriptor dcore = new CoreDescriptor(coreContainer, name, instanceDir); // fillup optional parameters String opts = params.get(CoreAdminParams.CONFIG); if (opts != null) dcore.setConfigName(opts); opts = params.get(CoreAdminParams.SCHEMA); if (opts != null) dcore.setSchemaName(opts); opts = params.get(CoreAdminParams.DATA_DIR); if (opts != null) dcore.setDataDir(opts); CloudDescriptor cd = dcore.getCloudDescriptor(); if (cd != null) { cd.setParams(req.getParams()); opts = params.get(CoreAdminParams.COLLECTION); if (opts != null) cd.setCollectionName(opts); opts = params.get(CoreAdminParams.SHARD); if (opts != null) cd.setShardId(opts); opts = params.get(CoreAdminParams.ROLES); if (opts != null) cd.setRoles(opts); Integer numShards = params.getInt(ZkStateReader.NUM_SHARDS_PROP); if (numShards != null) cd.setNumShards(numShards); } // Process all property.name=value parameters and set them as name=value core properties Properties coreProperties = new Properties(); Iterator<String> parameterNamesIterator = params.getParameterNamesIterator(); while (parameterNamesIterator.hasNext()) { String parameterName = parameterNamesIterator.next(); if(parameterName.startsWith(CoreAdminParams.PROPERTY_PREFIX)) { String parameterValue = params.get(parameterName); String propertyName = parameterName.substring(CoreAdminParams.PROPERTY_PREFIX.length()); // skip prefix coreProperties.put(propertyName, parameterValue); } } dcore.setCoreProperties(coreProperties); SolrCore core = coreContainer.create(dcore); coreContainer.register(name, core, false); rsp.add("core", core.getName()); return coreContainer.isPersistent(); } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleUnloadAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); SolrCore core = coreContainer.remove(cname); if(core == null){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core exists '" + cname + "'"); } else { if (coreContainer.getZkController() != null) { log.info("Unregistering core " + cname + " from cloudstate."); try { coreContainer.getZkController().unregister(cname, core.getCoreDescriptor().getCloudDescriptor()); } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); } catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); } } } if (params.getBool(CoreAdminParams.DELETE_INDEX, false)) { core.addCloseHook(new CloseHook() { @Override public void preClose(SolrCore core) {} @Override public void postClose(SolrCore core) { File dataDir = new File(core.getIndexDir()); File[] files = dataDir.listFiles(); if (files != null) { for (File file : files) { if (!file.delete()) { log.error(file.getAbsolutePath() + " could not be deleted on core unload"); } } if (!dataDir.delete()) log.error(dataDir.getAbsolutePath() + " could not be deleted on core unload"); } else { log.error(dataDir.getAbsolutePath() + " could not be deleted on core unload"); } } }); }
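The unload action handles InterruptedException the canonical way: it re-asserts the thread's interrupt status before translating to SERVER_ERROR, so callers higher up the stack can still observe the interruption. A sketch of just that pattern, with illustrative names:

class UnregisterSketch {
  static void unregister(String cname) {
    try {
      blockingZkCall(cname);                 // hypothetical blocking ZooKeeper call
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();    // restore the flag before wrapping
      throw new IllegalStateException("500 Could not unregister core " + cname, e);
    }
  }

  static void blockingZkCall(String c) throws InterruptedException { Thread.sleep(1); }
}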
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleStatusAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); boolean doPersist = false; NamedList<Object> status = new SimpleOrderedMap<Object>(); try { if (cname == null) { rsp.add("defaultCoreName", coreContainer.getDefaultCoreName()); for (String name : coreContainer.getCoreNames()) { status.add(name, getCoreStatus(coreContainer, name)); } } else { status.add(cname, getCoreStatus(coreContainer, cname)); } rsp.add("status", status); doPersist = false; // no state change return doPersist; } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handlePersistAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); boolean doPersist = false; String fileName = params.get(CoreAdminParams.FILE); if (fileName != null) { File file = new File(coreContainer.getConfigFile().getParentFile(), fileName); coreContainer.persistFile(file); rsp.add("saved", file.getAbsolutePath()); doPersist = false; } else if (!coreContainer.isPersistent()) { throw new SolrException(SolrException.ErrorCode.FORBIDDEN, "Persistence is not enabled"); } else doPersist = true; return doPersist; }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleReloadAction(SolrQueryRequest req, SolrQueryResponse rsp) { SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); try { coreContainer.reload(cname); return false; // no change on reload } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'reload' action", ex); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleWaitForStateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException { final SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); if (cname == null) { cname = ""; } String nodeName = params.get("nodeName"); String coreNodeName = params.get("coreNodeName"); String waitForState = params.get("state"); Boolean checkLive = params.getBool("checkLive"); int pauseFor = params.getInt("pauseFor", 0); String state = null; boolean live = false; int retry = 0; while (true) { SolrCore core = null; try { core = coreContainer.getCore(cname); if (core == null && retry == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "core not found:" + cname); } if (core != null) { // wait until we are sure the recovering node is ready // to accept updates CloudDescriptor cloudDescriptor = core.getCoreDescriptor() .getCloudDescriptor(); CloudState cloudState = coreContainer.getZkController() .getCloudState(); String collection = cloudDescriptor.getCollectionName(); Slice slice = cloudState.getSlice(collection, cloudDescriptor.getShardId()); if (slice != null) { ZkNodeProps nodeProps = slice.getShards().get(coreNodeName); if (nodeProps != null) { state = nodeProps.get(ZkStateReader.STATE_PROP); live = cloudState.liveNodesContain(nodeName); if (nodeProps != null && state.equals(waitForState)) { if (checkLive == null) { break; } else if (checkLive && live) { break; } else if (!checkLive && !live) { break; } } } } } if (retry++ == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "I was asked to wait on state " + waitForState + " for " + nodeName + " but I still do not see the request state. I see state: " + state + " live:" + live); } } finally { if (core != null) { core.close(); } } Thread.sleep(1000); } // small safety net for any updates that started with state that // kept it from sending the update to be buffered - // pause for a while to let any outstanding updates finish // System.out.println("I saw state:" + state + " sleep for " + pauseFor + // " live:" + live); Thread.sleep(pauseFor); // solrcloud_debug // try {; // LocalSolrQueryRequest r = new LocalSolrQueryRequest(core, new // ModifiableSolrParams()); // CommitUpdateCommand commitCmd = new CommitUpdateCommand(r, false); // commitCmd.softCommit = true; // core.getUpdateHandler().commit(commitCmd); // RefCounted<SolrIndexSearcher> searchHolder = // core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() // + " to replicate " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " gen:" + // core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + // core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } }
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { // Don't do anything if the framework is unknown if(watcher==null) { rsp.add("error", "Logging Not Initalized"); return; } rsp.add("watcher", watcher.getName()); SolrParams params = req.getParams(); if(params.get("threshold")!=null) { watcher.setThreshold(params.get("threshold")); } // Write something at each level if(params.get("test")!=null) { log.trace("trace message"); log.debug( "debug message"); log.info("info (with exception)", new RuntimeException("test") ); log.warn("warn (with exception)", new RuntimeException("test") ); log.error("error (with exception)", new RuntimeException("test") ); } String[] set = params.getParams("set"); if (set != null) { for (String pair : set) { String[] split = pair.split(":"); if (split.length != 2) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Invalid format, expected level:value, got " + pair); } String category = split[0]; String level = split[1]; watcher.setLogLevel(category, level); } } String since = req.getParams().get("since"); if(since != null) { long time = -1; try { time = Long.parseLong(since); } catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); } AtomicBoolean found = new AtomicBoolean(false); SolrDocumentList docs = watcher.getHistory(time, found); if(docs==null) { rsp.add("error", "History not enabled"); return; } else { SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>(); if(time>0) { info.add("since", time); info.add("found", found); } else { info.add("levels", watcher.getAllLevels()); // show for the first request } info.add("last", watcher.getLastEvent()); info.add("buffer", watcher.getHistorySize()); info.add("threshold", watcher.getThreshold()); rsp.add("info", info); rsp.add("history", docs); } } else { rsp.add("levels", watcher.getAllLevels()); List<LoggerInfo> loggers = new ArrayList<LoggerInfo>(watcher.getAllLoggers()); Collections.sort(loggers); List<SimpleOrderedMap<?>> info = new ArrayList<SimpleOrderedMap<?>>(); for(LoggerInfo wrap:loggers) { info.add(wrap.getInfo()); } rsp.add("loggers", info); } rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); // in this case, we want to default distrib to false so // we only ping the single node Boolean distrib = params.getBool("distrib"); if (distrib == null) { ModifiableSolrParams mparams = new ModifiableSolrParams(params); mparams.set("distrib", false); req.setParams(mparams); } String actionParam = params.get("action"); ACTIONS action = null; if (actionParam == null){ action = ACTIONS.PING; } else { try { action = ACTIONS.valueOf(actionParam.toUpperCase(Locale.ENGLISH)); } catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); } } switch(action){ case PING: if( isPingDisabled() ) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "Service disabled"); } handlePing(req, rsp); break; case ENABLE: handleEnable(true); break; case DISABLE: handleEnable(false); break; case STATUS: if( healthcheck == null ){ SolrException e = new SolrException (SolrException.ErrorCode.SERVICE_UNAVAILABLE, "healthcheck not configured"); rsp.setException(e); } else { rsp.add( "status", isPingDisabled() ? "disabled" : "enabled" ); } } }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
protected void handlePing(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); SolrCore core = req.getCore(); // Get the RequestHandler String qt = params.get( CommonParams.QT );//optional; you get the default otherwise SolrRequestHandler handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown RequestHandler (qt): "+qt ); } if( handler instanceof PingRequestHandler ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cannot execute the PingRequestHandler recursively" ); } // Execute the ping query and catch any possible exception Throwable ex = null; try { SolrQueryResponse pingrsp = new SolrQueryResponse(); core.execute(handler, req, pingrsp ); ex = pingrsp.getException(); } catch( Throwable th ) { ex = th; } // Send an error or an 'OK' message (response code will be 200) if( ex != null ) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Ping query caused exception: "+ex.getMessage(), ex ); } rsp.add( "status", "OK" ); }
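handlePing widens the net to catch(Throwable), merges it with any exception recorded inside the ping response, and reports either through a single SERVER_ERROR after the try block. A sketch of that capture-then-report shape, with stand-in types:

class PingSketch {
  static String ping(Runnable query) {
    Throwable ex = null;
    try {
      query.run();
    } catch (Throwable th) {
      ex = th;                               // capture; report after the try
    }
    if (ex != null) {
      throw new IllegalStateException("500 Ping query caused exception: " + ex.getMessage(), ex);
    }
    return "OK";
  }
}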
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
protected void handleEnable(boolean enable) throws SolrException { if (healthcheck == null) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "No healthcheck file defined."); } if ( enable ) { try { // write out when the file was created FileUtils.write(healthcheck, DateField.formatExternal(new Date()), "UTF-8"); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); } } else { if (healthcheck.exists() && !healthcheck.delete()){ throw new SolrException(SolrException.ErrorCode.NOT_FOUND, "Did not successfully delete healthcheck file: " +healthcheck.getAbsolutePath()); } } }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void reset(Reader input) throws IOException { try { super.reset(input); input = super.input; char[] buf = new char[32]; int len = input.read(buf); this.startOfs = correctOffset(0); this.endOfs = correctOffset(len); String v = new String(buf, 0, len); try { switch (type) { case INTEGER: ts.setIntValue(Integer.parseInt(v)); break; case FLOAT: ts.setFloatValue(Float.parseFloat(v)); break; case LONG: ts.setLongValue(Long.parseLong(v)); break; case DOUBLE: ts.setDoubleValue(Double.parseDouble(v)); break; case DATE: ts.setLongValue(dateField.parseMath(null, v).getTime()); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field"); } } catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); } }
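reset() layers two translations: malformed token text becomes BAD_REQUEST (the client sent a bad number) in the inner try, while an I/O failure in the outer try becomes SERVER_ERROR. A compact sketch of the same 400-versus-500 split, with plain JDK exceptions as stand-ins:

import java.io.IOException;
import java.io.Reader;

class TrieResetSketch {
  static long readLong(Reader input) {
    try {
      char[] buf = new char[32];
      int len = input.read(buf);
      String v = new String(buf, 0, len);
      try {
        return Long.parseLong(v.trim());
      } catch (NumberFormatException nfe) {
        throw new IllegalArgumentException("400 Invalid Number: " + v, nfe);  // client error
      }
    } catch (IOException e) {
      throw new IllegalStateException("500 Unable to create TrieIndexTokenizer", e);  // server error
    }
  }
}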
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { String path = request.getPath(); if( path == null || !path.startsWith( "/" ) ) { path = "/select"; } // Check for cores action SolrCore core = coreContainer.getCore( coreName ); if( core == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName ); } SolrParams params = request.getParams(); if( params == null ) { params = new ModifiableSolrParams(); } // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } // Perhaps the path is to manage the cores if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) { handler = coreContainer.getMultiCoreHandler(); } } if( handler == null ) { core.close(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } SolrQueryRequest req = null; try { req = _parser.buildRequestFrom( core, params, request.getContentStreams() ); req.getContext().put( "path", path ); SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); core.execute( handler, req, rsp ); if( rsp.getException() != null ) { if(rsp.getException() instanceof SolrException) { throw rsp.getException(); } throw new SolrServerException( rsp.getException() ); } // Check if this should stream results if( request.getStreamingResponseCallback() != null ) { try { final StreamingResponseCallback callback = request.getStreamingResponseCallback(); BinaryResponseWriter.Resolver resolver = new BinaryResponseWriter.Resolver( req, rsp.getReturnFields()) { @Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { // write an empty list... SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); // This will transform writeResultsBody( ctx, codec ); } }; ByteArrayOutputStream out = new ByteArrayOutputStream(); new JavaBinCodec(resolver) { @Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); } @Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); } }.marshal(rsp.getValues(), out); InputStream in = new ByteArrayInputStream(out.toByteArray()); return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in); } catch (Exception ex) { throw new RuntimeException(ex); } } // Now write it out NamedList<Object> normalized = getParsedResponse(req, rsp); return normalized; } catch( IOException iox ) { throw iox; } catch( SolrException sx ) { throw sx; } catch( Exception ex ) { throw new SolrServerException( ex ); } finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); } }
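The tail of request() is a selective rethrow chain: IOException and SolrException pass through unchanged and only genuinely unexpected exceptions are wrapped in SolrServerException, so a caller never sees double wrapping. A sketch of the shape, using RuntimeException as the stand-in domain type:

import java.io.IOException;
import java.util.concurrent.Callable;

class SelectiveRethrowSketch {
  static Object execute(Callable<Object> body) throws IOException {
    try {
      return body.call();
    } catch (IOException iox) {
      throw iox;                         // propagate I/O errors untouched
    } catch (RuntimeException rx) {
      throw rx;                          // domain (unchecked) exceptions pass through
    } catch (Exception ex) {
      throw new RuntimeException("wrapped once for the caller", ex);
    }
  }
}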
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void writeResponse() throws IOException { SolrParams params = req.getParams(); strategy = new CSVStrategy(',', '"', CSVStrategy.COMMENTS_DISABLED, CSVStrategy.ESCAPE_DISABLED, false, false, false, true); CSVStrategy strat = strategy; String sep = params.get(CSV_SEPARATOR); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } String nl = params.get(CSV_NEWLINE); if (nl!=null) { if (nl.length()==0) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid newline:'"+nl+"'"); strat.setPrinterNewline(nl); } String encapsulator = params.get(CSV_ENCAPSULATOR); String escape = params.get(CSV_ESCAPE); if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); } if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); if (encapsulator == null) { strat.setEncapsulator( CSVStrategy.ENCAPSULATOR_DISABLED); } } if (strat.getEscape() == '\\') { // If the escape is the standard backslash, then also enable // unicode escapes (it's harmless since 'u' would not otherwise // be escaped. strat.setUnicodeEscapeInterpretation(true); } printer = new CSVPrinter(writer, strategy); CSVStrategy mvStrategy = new CSVStrategy(strategy.getDelimiter(), CSVStrategy.ENCAPSULATOR_DISABLED, CSVStrategy.COMMENTS_DISABLED, '\\', false, false, false, false); strat = mvStrategy; sep = params.get(MV_SEPARATOR); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } encapsulator = params.get(MV_ENCAPSULATOR); escape = params.get(MV_ESCAPE); if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); if (escape == null) { strat.setEscape(CSVStrategy.ESCAPE_DISABLED); } } escape = params.get(MV_ESCAPE); if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); // encapsulator will already be disabled if it wasn't specified } Collection<String> fields = returnFields.getLuceneFieldNames(); Object responseObj = rsp.getValues().get("response"); boolean returnOnlyStored = false; if (fields==null) { if (responseObj instanceof SolrDocumentList) { // get the list of fields from the SolrDocumentList fields = new LinkedHashSet<String>(); for (SolrDocument sdoc: (SolrDocumentList)responseObj) { fields.addAll(sdoc.getFieldNames()); } } else { // get the list of fields from the index fields = req.getSearcher().getFieldNames(); } if (returnFields.wantsScore()) { fields.add("score"); } else { fields.remove("score"); } returnOnlyStored = true; } CSVSharedBufPrinter csvPrinterMV = new CSVSharedBufPrinter(mvWriter, mvStrategy); for (String field : fields) { if (!returnFields.wantsField(field)) { continue; } if (field.equals("score")) { CSVField csvField = new CSVField(); csvField.name = "score"; csvFields.put("score", csvField); continue; } SchemaField sf = schema.getFieldOrNull(field); if (sf == null) { FieldType ft = new StrField(); sf = new SchemaField(field, ft); } // Return only stored fields, unless an explicit field list is specified if (returnOnlyStored && sf != null && !sf.stored()) { continue; } // check for per-field overrides sep = params.get("f." + field + '.' + CSV_SEPARATOR); encapsulator = params.get("f." + field + '.' + CSV_ENCAPSULATOR); escape = params.get("f." + field + '.' + CSV_ESCAPE); CSVSharedBufPrinter csvPrinter = csvPrinterMV; if (sep != null || encapsulator != null || escape != null) { // create a new strategy + printer if there were any per-field overrides strat = (CSVStrategy)mvStrategy.clone(); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); if (escape == null) { strat.setEscape(CSVStrategy.ESCAPE_DISABLED); } } if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); if (encapsulator == null) { strat.setEncapsulator(CSVStrategy.ENCAPSULATOR_DISABLED); } } csvPrinter = new CSVSharedBufPrinter(mvWriter, strat); } CSVField csvField = new CSVField(); csvField.name = field; csvField.sf = sf; csvField.mvPrinter = csvPrinter; csvFields.put(field, csvField); } NullValue = params.get(CSV_NULL, ""); if (params.getBool(CSV_HEADER, true)) { for (CSVField csvField : csvFields.values()) { printer.print(csvField.name); } printer.println(); } if (responseObj instanceof ResultContext ) { writeDocuments(null, (ResultContext)responseObj, returnFields ); } else if (responseObj instanceof DocList) { ResultContext ctx = new ResultContext(); ctx.docs = (DocList)responseObj; writeDocuments(null, ctx, returnFields ); } else if (responseObj instanceof SolrDocumentList) { writeSolrDocumentList(null, (SolrDocumentList)responseObj, returnFields ); } }
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
public static Object getObjectFrom( String val, String type ) { if( type != null ) { try { if( "int".equals( type ) ) return Integer.valueOf( val ); if( "double".equals( type ) ) return Double.valueOf( val ); if( "float".equals( type ) ) return Float.valueOf( val ); if( "date".equals( type ) ) return DateUtil.parseDate(val); } catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unable to parse "+type+"="+val, ex ); } } return val; }
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
@Override public DocTransformer create(String field, SolrParams params, SolrQueryRequest req) { Object val = value; if( val == null ) { String v = params.get("v"); if( v == null ) { val = defaultValue; } else { val = getObjectFrom(v, params.get("t")); } if( val == null ) { throw new SolrException( ErrorCode.BAD_REQUEST, "ValueAugmenter is missing a value -- should be defined in solrconfig or inline" ); } } return new ValueAugmenter( field, val ); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
@Override public void setContext( TransformContext context ) { try { IndexReader reader = qparser.getReq().getSearcher().getIndexReader(); readerContexts = reader.getTopReaderContext().leaves(); docValuesArr = new FunctionValues[readerContexts.length]; searcher = qparser.getReq().getSearcher(); fcontext = ValueSource.newContext(searcher); this.valueSource.createWeight(fcontext, searcher); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
@Override public void transform(SolrDocument doc, int docid) { // This is only good for random-access functions try { // TODO: calculate this stuff just once across diff functions int idx = ReaderUtil.subIndex(docid, readerContexts); AtomicReaderContext rcontext = readerContexts[idx]; FunctionValues values = docValuesArr[idx]; if (values == null) { docValuesArr[idx] = values = valueSource.getValues(fcontext, rcontext); } int localId = docid - rcontext.docBase; Object val = values.objectVal(localId); if (val != null) { doc.setField( name, val ); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "exception at docid " + docid + " for valuesource " + valueSource, e); } }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
public static Style getStyle( String str ) { try { return Style.valueOf( str ); } catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unknown Explain Style: "+str ); } }
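getStyle guards Style.valueOf, which throws IllegalArgumentException for an unknown name (and NullPointerException for null), and converts either into a BAD_REQUEST naming the offending input. A self-contained sketch; the original catches Exception broadly, while this version narrows to the two types valueOf can actually throw:

class ExplainStyleSketch {
  enum Style { NL, TEXT, HTML }

  static Style style(String str) {
    try {
      return Style.valueOf(str);
    } catch (IllegalArgumentException | NullPointerException ex) {
      throw new IllegalArgumentException("400 Unknown Explain Style: " + str, ex);
    }
  }
}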
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
NamedList<Integer> getFacetCounts(Executor executor) throws IOException { CompletionService<SegFacet> completionService = new ExecutorCompletionService<SegFacet>(executor); // reuse the translation logic to go from top level set to per-segment set baseSet = docs.getTopFilter(); final AtomicReaderContext[] leaves = searcher.getTopReaderContext().leaves(); // The list of pending tasks that aren't immediately submitted // TODO: Is there a completion service, or a delegating executor that can // limit the number of concurrent tasks submitted to a bigger executor? LinkedList<Callable<SegFacet>> pending = new LinkedList<Callable<SegFacet>>(); int threads = nThreads <= 0 ? Integer.MAX_VALUE : nThreads; for (int i=0; i<leaves.length; i++) { final SegFacet segFacet = new SegFacet(leaves[i]); Callable<SegFacet> task = new Callable<SegFacet>() { public SegFacet call() throws Exception { segFacet.countTerms(); return segFacet; } }; // TODO: if limiting threads, submit by largest segment first? if (--threads >= 0) { completionService.submit(task); } else { pending.add(task); } } // now merge the per-segment results PriorityQueue<SegFacet> queue = new PriorityQueue<SegFacet>(leaves.length) { @Override protected boolean lessThan(SegFacet a, SegFacet b) { return a.tempBR.compareTo(b.tempBR) < 0; } }; boolean hasMissingCount=false; int missingCount=0; for (int i=0; i<leaves.length; i++) { SegFacet seg = null; try { Future<SegFacet> future = completionService.take(); seg = future.get(); if (!pending.isEmpty()) { completionService.submit(pending.removeFirst()); } } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } } if (seg.startTermIndex < seg.endTermIndex) { if (seg.startTermIndex==0) { hasMissingCount=true; missingCount += seg.counts[0]; seg.pos = 1; } else { seg.pos = seg.startTermIndex; } if (seg.pos < seg.endTermIndex) { seg.tenum = seg.si.getTermsEnum(); seg.tenum.seekExact(seg.pos); seg.tempBR = seg.tenum.term(); queue.add(seg); } } } FacetCollector collector; if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) { collector = new CountSortedFacetCollector(offset, limit, mincount); } else { collector = new IndexSortedFacetCollector(offset, limit, mincount); } BytesRef val = new BytesRef(); while (queue.size() > 0) { SegFacet seg = queue.top(); // make a shallow copy val.bytes = seg.tempBR.bytes; val.offset = seg.tempBR.offset; val.length = seg.tempBR.length; int count = 0; do { count += seg.counts[seg.pos - seg.startTermIndex]; // TODO: OPTIMIZATION... // if mincount>0 then seg.pos++ can skip ahead to the next non-zero entry. seg.pos++; if (seg.pos >= seg.endTermIndex) { queue.pop(); seg = queue.top(); } else { seg.tempBR = seg.tenum.next(); seg = queue.updateTop(); } } while (seg != null && val.compareTo(seg.tempBR) == 0); boolean stop = collector.collect(val, count); if (stop) break; } NamedList<Integer> res = collector.getFacetCounts(); // convert labels to readable form FieldType ft = searcher.getSchema().getFieldType(fieldName); int sz = res.size(); for (int i=0; i<sz; i++) { res.setName(i, ft.indexedToReadable(res.getName(i))); } if (missing) { if (!hasMissingCount) { missingCount = SimpleFacets.getFieldMissingCount(searcher,docs,fieldName); } res.add(null, missingCount); } return res; }
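The merge loop's ExecutionException handling unwraps the cause: a RuntimeException thrown inside a worker is rethrown as itself, any other cause is wrapped as a SERVER_ERROR, and InterruptedException restores the interrupt flag first. A sketch with stand-in exception types:

import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;

class MergeTakeSketch {
  static <T> T takeResult(Future<T> future) {
    try {
      return future.get();
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();             // restore the flag
      throw new IllegalStateException("500 interrupted while merging", e);
    } catch (ExecutionException e) {
      Throwable cause = e.getCause();
      if (cause instanceof RuntimeException) {
        throw (RuntimeException) cause;               // rethrow the worker's own error
      }
      throw new IllegalStateException("500 error in per-segment faceting", cause);
    }
  }
}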
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetCounts() { // if someone called this method, benefit of the doubt: assume true if (!params.getBool(FacetParams.FACET,true)) return null; facetResponse = new SimpleOrderedMap<Object>(); try { facetResponse.add("facet_queries", getFacetQueryCounts()); facetResponse.add("facet_fields", getFacetFieldCounts()); facetResponse.add("facet_dates", getFacetDateCounts()); facetResponse.add("facet_ranges", getFacetRangeCounts()); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); } catch (ParseException e) { throw new SolrException(ErrorCode.BAD_REQUEST, e); } return facetResponse; }
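getFacetCounts maps each checked exception type to a distinct error code at one choke point: IOException (an index problem) becomes SERVER_ERROR, ParseException (a bad facet query) becomes BAD_REQUEST. A sketch of that per-type mapping, with stand-in unchecked exceptions for the two codes:

import java.io.IOException;
import java.text.ParseException;
import java.util.concurrent.Callable;

class FacetErrorMapSketch {
  static Object facetCounts(Callable<Object> work) {
    try {
      return work.call();
    } catch (IOException e) {
      throw new IllegalStateException("500 SERVER_ERROR", e);     // index/infrastructure
    } catch (ParseException e) {
      throw new IllegalArgumentException("400 BAD_REQUEST", e);   // user input
    } catch (Exception e) {
      throw new RuntimeException(e);  // Callable.call() forces a catch-all here
    }
  }
}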
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getGroupedCounts(SolrIndexSearcher searcher, DocSet base, String field, boolean multiToken, int offset, int limit, int mincount, boolean missing, String sort, String prefix) throws IOException { GroupingSpecification groupingSpecification = rb.getGroupingSpec(); String groupField = groupingSpecification != null ? groupingSpecification.getFields()[0] : null; if (groupField == null) { throw new SolrException ( SolrException.ErrorCode.BAD_REQUEST, "Specify the group.field as parameter or local parameter" ); } BytesRef prefixBR = prefix != null ? new BytesRef(prefix) : null; TermGroupFacetCollector collector = TermGroupFacetCollector.createTermGroupFacetCollector(groupField, field, multiToken, prefixBR, 128); searcher.search(new MatchAllDocsQuery(), base.getTopFilter(), collector); boolean orderByCount = sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY); TermGroupFacetCollector.GroupedFacetResult result = collector.mergeSegmentResults(offset + limit, mincount, orderByCount); CharsRef charsRef = new CharsRef(); FieldType facetFieldType = searcher.getSchema().getFieldType(field); NamedList<Integer> facetCounts = new NamedList<Integer>(); List<TermGroupFacetCollector.FacetEntry> scopedEntries = result.getFacetEntries(offset, limit); for (TermGroupFacetCollector.FacetEntry facetEntry : scopedEntries) { facetFieldType.indexedToReadable(facetEntry.getValue(), charsRef); facetCounts.add(charsRef.toString(), facetEntry.getCount()); } if (missing) { facetCounts.add(null, result.getTotalMissingCount()); } return facetCounts; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated public void getFacetDateCounts(String dateFacet, NamedList<Object> resOuter) throws IOException, ParseException { final IndexSchema schema = searcher.getSchema(); parseParams(FacetParams.FACET_DATE, dateFacet); String f = facetValue; final NamedList<Object> resInner = new SimpleOrderedMap<Object>(); resOuter.add(key, resInner); final SchemaField sf = schema.getField(f); if (! (sf.getType() instanceof DateField)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Can not date facet on a field which is not a DateField: " + f); } final DateField ft = (DateField) sf.getType(); final String startS = required.getFieldParam(f,FacetParams.FACET_DATE_START); final Date start; try { start = ft.parseMath(null, startS); } catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); } final String endS = required.getFieldParam(f,FacetParams.FACET_DATE_END); Date end; // not final, hardend may change this try { end = ft.parseMath(null, endS); } catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); } if (end.before(start)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' comes before 'start': "+endS+" < "+startS); } final String gap = required.getFieldParam(f,FacetParams.FACET_DATE_GAP); final DateMathParser dmp = new DateMathParser(); final int minCount = params.getFieldInt(f,FacetParams.FACET_MINCOUNT, 0); String[] iStrs = params.getFieldParams(f,FacetParams.FACET_DATE_INCLUDE); // Legacy support for default of [lower,upper,edge] for date faceting // this is not handled by FacetRangeInclude.parseParam because // range faceting has differnet defaults final EnumSet<FacetRangeInclude> include = (null == iStrs || 0 == iStrs.length ) ? EnumSet.of(FacetRangeInclude.LOWER, FacetRangeInclude.UPPER, FacetRangeInclude.EDGE) : FacetRangeInclude.parseParam(iStrs); try { Date low = start; while (low.before(end)) { dmp.setNow(low); String label = ft.toExternal(low); Date high = dmp.parseMath(gap); if (end.before(high)) { if (params.getFieldBool(f,FacetParams.FACET_DATE_HARD_END,false)) { high = end; } else { end = high; } } if (high.before(low)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet infinite loop (is gap negative?)"); } final boolean includeLower = (include.contains(FacetRangeInclude.LOWER) || (include.contains(FacetRangeInclude.EDGE) && low.equals(start))); final boolean includeUpper = (include.contains(FacetRangeInclude.UPPER) || (include.contains(FacetRangeInclude.EDGE) && high.equals(end))); final int count = rangeCount(sf,low,high,includeLower,includeUpper); if (count >= minCount) { resInner.add(label, count); } low = high; } } catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); } // explicitly return the gap and end so all the counts // (including before/after/between) are meaningful - even if mincount // has removed the neighboring ranges resInner.add("gap", gap); resInner.add("start", start); resInner.add("end", end); final String[] othersP = params.getFieldParams(f,FacetParams.FACET_DATE_OTHER); if (null != othersP && 0 < othersP.length ) { final Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class); for (final String o : othersP) { others.add(FacetRangeOther.get(o)); } // no matter what other values are listed, we don't do // anything if "none" is specified. if (! others.contains(FacetRangeOther.NONE) ) { boolean all = others.contains(FacetRangeOther.ALL); if (all || others.contains(FacetRangeOther.BEFORE)) { // include upper bound if "outer" or if first gap doesn't already include it resInner.add(FacetRangeOther.BEFORE.toString(), rangeCount(sf,null,start, false, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)))))); } if (all || others.contains(FacetRangeOther.AFTER)) { // include lower bound if "outer" or if last gap doesn't already include it resInner.add(FacetRangeOther.AFTER.toString(), rangeCount(sf,end,null, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))), false)); } if (all || others.contains(FacetRangeOther.BETWEEN)) { resInner.add(FacetRangeOther.BETWEEN.toString(), rangeCount(sf,start,end, (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)), (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))); } } } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void getFacetRangeCounts(String facetRange, NamedList<Object> resOuter) throws IOException, ParseException { final IndexSchema schema = searcher.getSchema(); parseParams(FacetParams.FACET_RANGE, facetRange); String f = facetValue; final SchemaField sf = schema.getField(f); final FieldType ft = sf.getType(); RangeEndpointCalculator<?> calc = null; if (ft instanceof TrieField) { final TrieField trie = (TrieField)ft; switch (trie.getType()) { case FLOAT: calc = new FloatRangeEndpointCalculator(sf); break; case DOUBLE: calc = new DoubleRangeEndpointCalculator(sf); break; case INTEGER: calc = new IntegerRangeEndpointCalculator(sf); break; case LONG: calc = new LongRangeEndpointCalculator(sf); break; default: throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Unable to range facet on tried field of unexpected type:" + f); } } else if (ft instanceof DateField) { calc = new DateRangeEndpointCalculator(sf, null); } else if (ft instanceof SortableIntField) { calc = new IntegerRangeEndpointCalculator(sf); } else if (ft instanceof SortableLongField) { calc = new LongRangeEndpointCalculator(sf); } else if (ft instanceof SortableFloatField) { calc = new FloatRangeEndpointCalculator(sf); } else if (ft instanceof SortableDoubleField) { calc = new DoubleRangeEndpointCalculator(sf); } else { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Unable to range facet on field:" + sf); } resOuter.add(key, getFacetRangeCounts(sf, calc)); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
private <T extends Comparable<T>> NamedList getFacetRangeCounts (final SchemaField sf, final RangeEndpointCalculator<T> calc) throws IOException { final String f = sf.getName(); final NamedList<Object> res = new SimpleOrderedMap<Object>(); final NamedList<Integer> counts = new NamedList<Integer>(); res.add("counts", counts); final T start = calc.getValue(required.getFieldParam(f,FacetParams.FACET_RANGE_START)); // not final, hardend may change this T end = calc.getValue(required.getFieldParam(f,FacetParams.FACET_RANGE_END)); if (end.compareTo(start) < 0) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "range facet 'end' comes before 'start': "+end+" < "+start); } final String gap = required.getFieldParam(f, FacetParams.FACET_RANGE_GAP); // explicitly return the gap. compute this early so we are more // likely to catch parse errors before attempting math res.add("gap", calc.getGap(gap)); final int minCount = params.getFieldInt(f,FacetParams.FACET_MINCOUNT, 0); final EnumSet<FacetRangeInclude> include = FacetRangeInclude.parseParam (params.getFieldParams(f,FacetParams.FACET_RANGE_INCLUDE)); T low = start; while (low.compareTo(end) < 0) { T high = calc.addGap(low, gap); if (end.compareTo(high) < 0) { if (params.getFieldBool(f,FacetParams.FACET_RANGE_HARD_END,false)) { high = end; } else { end = high; } } if (high.compareTo(low) < 0) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "range facet infinite loop (is gap negative? did the math overflow?)"); } final boolean includeLower = (include.contains(FacetRangeInclude.LOWER) || (include.contains(FacetRangeInclude.EDGE) && 0 == low.compareTo(start))); final boolean includeUpper = (include.contains(FacetRangeInclude.UPPER) || (include.contains(FacetRangeInclude.EDGE) && 0 == high.compareTo(end))); final String lowS = calc.formatValue(low); final String highS = calc.formatValue(high); final int count = rangeCount(sf, lowS, highS, includeLower,includeUpper); if (count >= minCount) { counts.add(lowS, count); } low = high; } // explicitly return the start and end so all the counts // (including before/after/between) are meaningful - even if mincount // has removed the neighboring ranges res.add("start", start); res.add("end", end); final String[] othersP = params.getFieldParams(f,FacetParams.FACET_RANGE_OTHER); if (null != othersP && 0 < othersP.length ) { Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class); for (final String o : othersP) { others.add(FacetRangeOther.get(o)); } // no matter what other values are listed, we don't do // anything if "none" is specified. if (! others.contains(FacetRangeOther.NONE) ) { boolean all = others.contains(FacetRangeOther.ALL); final String startS = calc.formatValue(start); final String endS = calc.formatValue(end); if (all || others.contains(FacetRangeOther.BEFORE)) { // include upper bound if "outer" or if first gap doesn't already include it res.add(FacetRangeOther.BEFORE.toString(), rangeCount(sf,null,startS, false, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)))))); } if (all || others.contains(FacetRangeOther.AFTER)) { // include lower bound if "outer" or if last gap doesn't already include it res.add(FacetRangeOther.AFTER.toString(), rangeCount(sf,endS,null, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))), false)); } if (all || others.contains(FacetRangeOther.BETWEEN)) { res.add(FacetRangeOther.BETWEEN.toString(), rangeCount(sf,startS,endS, (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)), (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))); } } } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public final T getValue(final String rawval) { try { return parseVal(rawval); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public final Object getGap(final String gap) { try { return parseGap(gap); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public final T addGap(T value, String gap) { try { return parseAndAddGap(value, gap); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); } }
// in core/src/java/org/apache/solr/request/SolrRequestInfo.java
public TimeZone getClientTimeZone() { if (tz == null) { String tzStr = req.getParams().get(CommonParams.TZ); if (tzStr != null) { tz = TimeZoneUtils.getTimeZone(tzStr); if (null == tz) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Solr JVM does not support TZ: " + tzStr); } } } return tz; }
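getClientTimeZone is validate-then-throw with no try block at all: TimeZoneUtils.getTimeZone signals an unknown id by returning null, and the null check turns that into a BAD_REQUEST. A sketch assuming that null-on-unknown contract; note the raw JDK TimeZone.getTimeZone cannot be used directly because it silently falls back to GMT:

import java.util.Arrays;
import java.util.TimeZone;

class ClientTzSketch {
  static TimeZone clientTimeZone(String tzStr) {
    if (tzStr == null) return null;                    // parameter absent: no TZ
    if (!Arrays.asList(TimeZone.getAvailableIDs()).contains(tzStr)) {
      throw new IllegalArgumentException("400 Solr JVM does not support TZ: " + tzStr);
    }
    return TimeZone.getTimeZone(tzStr);
  }
}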
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException { if( abortErrorMessage != null ) { ((HttpServletResponse)response).sendError( 500, abortErrorMessage ); return; } if (this.cores == null) { ((HttpServletResponse)response).sendError( 403, "Server is shutting down" ); return; } CoreContainer cores = this.cores; SolrCore core = null; SolrQueryRequest solrReq = null; if( request instanceof HttpServletRequest) { HttpServletRequest req = (HttpServletRequest)request; HttpServletResponse resp = (HttpServletResponse)response; SolrRequestHandler handler = null; String corename = ""; try { // put the core container in request attribute req.setAttribute("org.apache.solr.CoreContainer", cores); String path = req.getServletPath(); if( req.getPathInfo() != null ) { // this lets you handle /update/commit when /update is a servlet path += req.getPathInfo(); } if( pathPrefix != null && path.startsWith( pathPrefix ) ) { path = path.substring( pathPrefix.length() ); } // check for management path String alternate = cores.getManagementPath(); if (alternate != null && path.startsWith(alternate)) { path = path.substring(0, alternate.length()); } // unused feature ? int idx = path.indexOf( ':' ); if( idx > 0 ) { // save the portion after the ':' for a 'handler' path parameter path = path.substring( 0, idx ); } // Check for the core admin page if( path.equals( cores.getAdminPath() ) ) { handler = cores.getMultiCoreHandler(); solrReq = adminRequestParser.parse(null,path, req); handleAdminRequest(req, response, handler, solrReq); return; } else { //otherwise, we should find a core from the path idx = path.indexOf( "/", 1 ); if( idx > 1 ) { // try to get the corename as a request parameter first corename = path.substring( 1, idx ); core = cores.getCore(corename); if (core != null) { path = path.substring( idx ); } } if (core == null) { if (!cores.isZooKeeperAware() ) { core = cores.getCore(""); } } } if (core == null && cores.isZooKeeperAware()) { // we couldn't find the core - lets make sure a collection was not specified instead core = getCoreByCollection(cores, corename, path); if (core != null) { // we found a core, update the path path = path.substring( idx ); } else { // try the default core core = cores.getCore(""); } // TODO: if we couldn't find it locally, look on other nodes } // With a valid core... if( core != null ) { final SolrConfig config = core.getSolrConfig(); // get or create/cache the parser for the core SolrRequestParsers parser = null; parser = parsers.get(config); if( parser == null ) { parser = new SolrRequestParsers(config); parsers.put(config, parser ); } // Determine the handler from the url path if not set // (we might already have selected the cores handler) if( handler == null && path.length() > 1 ) { // don't match "" or "/" as valid path handler = core.getRequestHandler( path ); // no handler yet but allowed to handle select; let's check if( handler == null && parser.isHandleSelect() ) { if( "/select".equals( path ) || "/select/".equals( path ) ) { solrReq = parser.parse( core, path, req ); String qt = solrReq.getParams().get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } if( qt != null && qt.startsWith("/") && (handler instanceof ContentStreamHandlerBase)) { //For security reasons it's a bad idea to allow a leading '/', ex: /select?qt=/update see SOLR-3161 //There was no restriction from Solr 1.4 thru 3.5 and it's not supported for update handlers. throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid query type. Do not use /select to access: "+qt); } } } } // With a valid handler and a valid core... if( handler != null ) { // if not a /select, create the request if( solrReq == null ) { solrReq = parser.parse( core, path, req ); } final Method reqMethod = Method.getMethod(req.getMethod()); HttpCacheHeaderUtil.setCacheControlHeader(config, resp, reqMethod); // unless we have been explicitly told not to, do cache validation // if we fail cache validation, execute the query if (config.getHttpCachingConfig().isNever304() || !HttpCacheHeaderUtil.doCacheHeaderValidation(solrReq, req, reqMethod, resp)) { SolrQueryResponse solrRsp = new SolrQueryResponse(); /* even for HEAD requests, we need to execute the handler to * ensure we don't get an error (and to make sure the correct * QueryResponseWriter is selected and we get the correct * Content-Type) */ SolrRequestInfo.setRequestInfo(new SolrRequestInfo(solrReq, solrRsp)); this.execute( req, handler, solrReq, solrRsp ); HttpCacheHeaderUtil.checkHttpCachingVeto(solrRsp, resp, reqMethod); // add info to http headers //TODO: See SOLR-232 and SOLR-267. /*try { NamedList solrRspHeader = solrRsp.getResponseHeader(); for (int i=0; i<solrRspHeader.size(); i++) { ((javax.servlet.http.HttpServletResponse) response).addHeader(("Solr-" + solrRspHeader.getName(i)), String.valueOf(solrRspHeader.getVal(i))); } } catch (ClassCastException cce) { log.log(Level.WARNING, "exception adding response header log information", cce); }*/ QueryResponseWriter responseWriter = core.getQueryResponseWriter(solrReq); writeResponse(solrRsp, response, responseWriter, solrReq, reqMethod); } return; // we are done with a valid handler } } log.debug("no handler or core retrieved for " + path + ", follow through..."); } catch (Throwable ex) { sendError( core, solrReq, request, (HttpServletResponse)response, ex ); return; } finally { if( solrReq != null ) { solrReq.close(); } if (core != null) { core.close(); } SolrRequestInfo.clearRequestInfo(); } } // Otherwise let the webapp handle the request chain.doFilter(request, response); }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrQueryRequest buildRequestFrom( SolrCore core, SolrParams params, Collection<ContentStream> streams ) throws Exception { // The content type will be applied to all streaming content String contentType = params.get( CommonParams.STREAM_CONTENTTYPE ); // Handle anything with a remoteURL String[] strs = params.getParams( CommonParams.STREAM_URL ); if( strs != null ) { if( !enableRemoteStreams ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Remote Streaming is disabled." ); } for( final String url : strs ) { ContentStreamBase stream = new ContentStreamBase.URLStream( new URL(url) ); if( contentType != null ) { stream.setContentType( contentType ); } streams.add( stream ); } } // Handle streaming files strs = params.getParams( CommonParams.STREAM_FILE ); if( strs != null ) { if( !enableRemoteStreams ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Remote Streaming is disabled." ); } for( final String file : strs ) { ContentStreamBase stream = new ContentStreamBase.FileStream( new File(file) ); if( contentType != null ) { stream.setContentType( contentType ); } streams.add( stream ); } } // Check for streams in the request parameters strs = params.getParams( CommonParams.STREAM_BODY ); if( strs != null ) { for( final String body : strs ) { ContentStreamBase stream = new ContentStreamBase.StringStream( body ); if( contentType != null ) { stream.setContentType( contentType ); } streams.add( stream ); } } SolrQueryRequestBase q = new SolrQueryRequestBase( core, params ) { }; if( streams != null && streams.size() > 0 ) { q.setContentStreams( streams ); } return q; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public static MultiMapSolrParams parseQueryString(String queryString) { Map<String,String[]> map = new HashMap<String, String[]>(); if( queryString != null && queryString.length() > 0 ) { try { for( String kv : queryString.split( "&" ) ) { int idx = kv.indexOf( '=' ); if( idx > 0 ) { String name = URLDecoder.decode( kv.substring( 0, idx ), "UTF-8"); String value = URLDecoder.decode( kv.substring( idx+1 ), "UTF-8"); MultiMapSolrParams.addParam( name, value, map ); } else { String name = URLDecoder.decode( kv, "UTF-8" ); MultiMapSolrParams.addParam( name, "", map ); } } } catch( UnsupportedEncodingException uex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, uex ); } } return new MultiMapSolrParams( map ); }
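parseQueryString also shows the standard treatment of an "impossible" checked exception: URLDecoder.decode with "UTF-8" can only fail on a non-compliant JVM, so the UnsupportedEncodingException is converted straight to a SERVER_ERROR instead of being declared. A self-contained sketch, assuming only the SolrException constructors already visible in this listing:

    import java.io.UnsupportedEncodingException;
    import java.net.URLDecoder;
    import org.apache.solr.common.SolrException;

    public final class Utf8 {
      /** UTF-8 support is mandatory for every JVM, so this catch exists only for the compiler. */
      static String decode(String s) {
        try {
          return URLDecoder.decode(s, "UTF-8");
        } catch (UnsupportedEncodingException uex) {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, uex);
        }
      }
    }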
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams( final HttpServletRequest req, ArrayList<ContentStream> streams ) throws Exception { if( !ServletFileUpload.isMultipartContent(req) ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Not multipart content! "+req.getContentType() ); } MultiMapSolrParams params = SolrRequestParsers.parseQueryString( req.getQueryString() ); // Create a factory for disk-based file items DiskFileItemFactory factory = new DiskFileItemFactory(); // Set factory constraints // TODO - configure factory.setSizeThreshold(yourMaxMemorySize); // TODO - configure factory.setRepository(yourTempDirectory); // Create a new file upload handler ServletFileUpload upload = new ServletFileUpload(factory); upload.setSizeMax( uploadLimitKB*1024 ); // Parse the request List items = upload.parseRequest(req); Iterator iter = items.iterator(); while (iter.hasNext()) { FileItem item = (FileItem) iter.next(); // If its a form field, put it in our parameter map if (item.isFormField()) { MultiMapSolrParams.addParam( item.getFieldName(), item.getString(), params.getMap() ); } // Add the stream else { streams.add( new FileItemContentStream( item ) ); } } return params; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams( final HttpServletRequest req, ArrayList<ContentStream> streams ) throws Exception { String method = req.getMethod().toUpperCase(Locale.ENGLISH); if( "GET".equals( method ) || "HEAD".equals( method )) { return new ServletSolrParams(req); } if( "POST".equals( method ) ) { String contentType = req.getContentType(); if( contentType != null ) { int idx = contentType.indexOf( ';' ); if( idx > 0 ) { // remove the charset definition "; charset=utf-8" contentType = contentType.substring( 0, idx ); } if( "application/x-www-form-urlencoded".equals( contentType.toLowerCase(Locale.ENGLISH) ) ) { return new ServletSolrParams(req); // just get the params from parameterMap } if( ServletFileUpload.isMultipartContent(req) ) { return multipart.parseParamsAndFillStreams(req, streams); } } return raw.parseParamsAndFillStreams(req, streams); } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Unsupported method: "+method ); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static long calcLastModified(final SolrQueryRequest solrReq) { final SolrCore core = solrReq.getCore(); final SolrIndexSearcher searcher = solrReq.getSearcher(); final LastModFrom lastModFrom = core.getSolrConfig().getHttpCachingConfig().getLastModFrom(); long lastMod; try { // assume default, change if needed (getOpenTime() should be fast) lastMod = LastModFrom.DIRLASTMOD == lastModFrom ? IndexDeletionPolicyWrapper.getCommitTimestamp(searcher.getIndexReader().getIndexCommit()) : searcher.getOpenTime(); } catch (IOException e) { // we're pretty freaking screwed if this happens throw new SolrException(ErrorCode.SERVER_ERROR, e); } // Get the time where the searcher has been opened // We get rid of the milliseconds because the HTTP header has only // second granularity return lastMod - (lastMod % 1000L); }
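The closing arithmetic deserves a note: HTTP validation headers carry one-second resolution, so the millisecond remainder must be dropped before comparing against If-Modified-Since. For example (hypothetical timestamp):

    long lastMod = 1334906213987L;                  // commit or searcher-open time in ms
    long headerValue = lastMod - (lastMod % 1000L); // 1334906213000: whole seconds only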
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
public String request(String path, SolrParams params, String body) throws Exception { // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { if (params == null) params = new MapSolrParams( new HashMap<String, String>() ); String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } } if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } return request(handler, params, body); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
public double getExchangeRate(String sourceCurrencyCode, String targetCurrencyCode) { if (rates == null) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "Rates not initialized."); } if (sourceCurrencyCode == null || targetCurrencyCode == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cannot get exchange rate; currency was null."); } if (rates.getTimestamp() + refreshInterval*60*1000 < System.currentTimeMillis()) { log.debug("Refresh interval has expired. Refreshing exchange rates."); reload(); } Double source = (Double) rates.getRates().get(sourceCurrencyCode); Double target = (Double) rates.getRates().get(targetCurrencyCode); if (source == null || target == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No available conversion rate from " + sourceCurrencyCode + " to " + targetCurrencyCode + ". " + "Available rates are "+listAvailableCurrencies()); } return target / source; }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override public Set<String> listAvailableCurrencies() { if (rates == null) throw new SolrException(ErrorCode.SERVER_ERROR, "Rates not initialized"); return rates.getRates().keySet(); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override public boolean reload() throws SolrException { InputStream ratesJsonStream = null; try { log.info("Reloading exchange rates from "+ratesFileLocation); try { ratesJsonStream = (new URL(ratesFileLocation)).openStream(); } catch (Exception e) { ratesJsonStream = resourceLoader.openResource(ratesFileLocation); } rates = new OpenExchangeRates(ratesJsonStream); return true; } catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); } finally { if (ratesJsonStream != null) try { ratesJsonStream.close(); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); } } }
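This reload() stacks two more idioms on top of wrap-and-rethrow: a primary open (URL) that falls back to the resource loader on any failure, and a finally block whose close() failure is itself promoted to a SERVER_ERROR. A compressed sketch; openPrimary, openFallback and parse are hypothetical stand-ins:

    import java.io.IOException;
    import java.io.InputStream;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    abstract class ReloadingProvider {
      abstract InputStream openPrimary() throws Exception; // e.g. new URL(loc).openStream()
      abstract InputStream openFallback();                 // e.g. resourceLoader.openResource(loc)
      abstract void parse(InputStream in) throws Exception;

      public boolean reload() {
        InputStream in = null;
        try {
          try { in = openPrimary(); } catch (Exception e) { in = openFallback(); }
          parse(in);
          return true;
        } catch (Exception e) {
          throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e);
        } finally {
          if (in != null) try { in.close(); } catch (IOException e) {
            throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e);
          }
        }
      }
    }

Note the cost of the last idiom: an exception thrown from finally masks any exception already propagating, so a failed close can hide the original parse error.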
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override public void init(Map<String,String> params) throws SolrException { try { ratesFileLocation = getParam(params.get(PARAM_RATES_FILE_LOCATION), DEFAULT_RATES_FILE_LOCATION); refreshInterval = Integer.parseInt(getParam(params.get(PARAM_REFRESH_INTERVAL), DEFAULT_REFRESH_INTERVAL)); // Force a refresh interval of minimum one hour, since the API does not offer better resolution if (refreshInterval < 60) { refreshInterval = 60; log.warn("Specified refreshInterval was too small. Setting to 60 minutes which is the update rate of openexchangerates.org"); } log.info("Initialized with rates="+ratesFileLocation+", refreshInterval="+refreshInterval+"."); } catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); } finally { // Removing config params custom to us params.remove(PARAM_RATES_FILE_LOCATION); params.remove(PARAM_REFRESH_INTERVAL); } }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
private void readSchema(InputSource is) { log.info("Reading Solr Schema"); try { // pass the config resource loader to avoid building an empty one for no reason: // in the current case though, the stream is valid so we wont load the resource by name Config schemaConf = new Config(loader, "schema", is, "/schema/"); Document document = schemaConf.getDocument(); final XPath xpath = schemaConf.getXPath(); final List<SchemaAware> schemaAware = new ArrayList<SchemaAware>(); Node nd = (Node) xpath.evaluate("/schema/@name", document, XPathConstants.NODE); if (nd==null) { log.warn("schema has no name!"); } else { name = nd.getNodeValue(); log.info("Schema name=" + name); } version = schemaConf.getFloat("/schema/@version", 1.0f); // load the Field Types final FieldTypePluginLoader typeLoader = new FieldTypePluginLoader(this, fieldTypes, schemaAware); String expression = "/schema/types/fieldtype | /schema/types/fieldType"; NodeList nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET); typeLoader.load( loader, nodes ); // load the Fields // Hang on to the fields that say if they are required -- this lets us set a reasonable default for the unique key Map<String,Boolean> explicitRequiredProp = new HashMap<String, Boolean>(); ArrayList<DynamicField> dFields = new ArrayList<DynamicField>(); expression = "/schema/fields/field | /schema/fields/dynamicField"; nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); NamedNodeMap attrs = node.getAttributes(); String name = DOMUtil.getAttr(attrs,"name","field definition"); log.trace("reading field def "+name); String type = DOMUtil.getAttr(attrs,"type","field " + name); FieldType ft = fieldTypes.get(type); if (ft==null) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Unknown fieldtype '" + type + "' specified on field " + name); } Map<String,String> args = DOMUtil.toMapExcept(attrs, "name", "type"); if( args.get( "required" ) != null ) { explicitRequiredProp.put( name, Boolean.valueOf( args.get( "required" ) ) ); } SchemaField f = SchemaField.create(name,ft,args); if (node.getNodeName().equals("field")) { SchemaField old = fields.put(f.getName(),f); if( old != null ) { String msg = "[schema.xml] Duplicate field definition for '" + f.getName() + "' [[["+old.toString()+"]]] and [[["+f.toString()+"]]]"; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, msg ); } log.debug("field defined: " + f); if( f.getDefaultValue() != null ) { log.debug(name+" contains default value: " + f.getDefaultValue()); fieldsWithDefaultValue.add( f ); } if (f.isRequired()) { log.debug(name+" is required in this schema"); requiredFields.add(f); } } else if (node.getNodeName().equals("dynamicField")) { // make sure nothing else has the same path addDynamicField(dFields, f); } else { // we should never get here throw new RuntimeException("Unknown field type"); } } //fields with default values are by definition required //add them to required fields, and we only have to loop once // in DocumentBuilder.getDoc() requiredFields.addAll(getFieldsWithDefaultValue()); // OK, now sort the dynamic fields largest to smallest size so we don't get // any false matches. We want to act like a compiler tool and try and match // the largest string possible. 
Collections.sort(dFields); log.trace("Dynamic Field Ordering:" + dFields); // stuff it in a normal array for faster access dynamicFields = dFields.toArray(new DynamicField[dFields.size()]); Node node = (Node) xpath.evaluate("/schema/similarity", document, XPathConstants.NODE); SimilarityFactory simFactory = readSimilarity(loader, node); if (simFactory == null) { simFactory = new DefaultSimilarityFactory(); } if (simFactory instanceof SchemaAware) { ((SchemaAware)simFactory).inform(this); } similarity = simFactory.getSimilarity(); node = (Node) xpath.evaluate("/schema/defaultSearchField/text()", document, XPathConstants.NODE); if (node==null) { log.warn("no default search field specified in schema."); } else { defaultSearchFieldName=node.getNodeValue().trim(); // throw exception if specified, but not found or not indexed if (defaultSearchFieldName!=null) { SchemaField defaultSearchField = getFields().get(defaultSearchFieldName); if ((defaultSearchField == null) || !defaultSearchField.indexed()) { String msg = "default search field '" + defaultSearchFieldName + "' not defined or not indexed" ; throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, msg ); } } log.info("default search field is "+defaultSearchFieldName); } node = (Node) xpath.evaluate("/schema/solrQueryParser/@defaultOperator", document, XPathConstants.NODE); if (node==null) { log.debug("using default query parser operator (OR)"); } else { queryParserDefaultOperator=node.getNodeValue().trim(); log.info("query parser default operator is "+queryParserDefaultOperator); } node = (Node) xpath.evaluate("/schema/uniqueKey/text()", document, XPathConstants.NODE); if (node==null) { log.warn("no uniqueKey specified in schema."); } else { uniqueKeyField=getIndexedField(node.getNodeValue().trim()); if (!uniqueKeyField.stored()) { log.error("uniqueKey is not stored - distributed search will not work"); } if (uniqueKeyField.multiValued()) { log.error("uniqueKey should not be multivalued"); } uniqueKeyFieldName=uniqueKeyField.getName(); uniqueKeyFieldType=uniqueKeyField.getType(); log.info("unique key field: "+uniqueKeyFieldName); // Unless the uniqueKeyField is marked 'required=false' then make sure it exists if( Boolean.FALSE != explicitRequiredProp.get( uniqueKeyFieldName ) ) { uniqueKeyField.required = true; requiredFields.add(uniqueKeyField); } } /////////////// parse out copyField commands /////////////// // Map<String,ArrayList<SchemaField>> cfields = new HashMap<String,ArrayList<SchemaField>>(); // expression = "/schema/copyField"; dynamicCopyFields = new DynamicCopy[] {}; expression = "//copyField"; nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { node = nodes.item(i); NamedNodeMap attrs = node.getAttributes(); String source = DOMUtil.getAttr(attrs,"source","copyField definition"); String dest = DOMUtil.getAttr(attrs,"dest", "copyField definition"); String maxChars = DOMUtil.getAttr(attrs, "maxChars"); int maxCharsInt = CopyField.UNLIMITED; if (maxChars != null) { try { maxCharsInt = Integer.parseInt(maxChars); } catch (NumberFormatException e) { log.warn("Couldn't parse maxChars attribute for copyField from " + source + " to " + dest + " as integer. The whole field will be copied."); } } registerCopyField(source, dest, maxCharsInt); } for (Map.Entry<SchemaField, Integer> entry : copyFieldTargetCounts.entrySet()) { if (entry.getValue() > 1 && !entry.getKey().multiValued()) { log.warn("Field " + entry.getKey().name + " is not multivalued "+ "and destination for multiple copyFields ("+ entry.getValue()+")"); } } //Run the callbacks on SchemaAware now that everything else is done for (SchemaAware aware : schemaAware) { aware.inform(this); } } catch (SolrException e) { throw e; } catch(Exception e) { // unexpected exception... throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Schema Parsing Failed: " + e.getMessage(), e); } // create the field analyzers refreshAnalyzers(); }
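The tail of readSchema() is this section's clearest rethrow-or-wrap: a SolrException passes through untouched, keeping its ErrorCode (BAD_REQUEST for an unknown field type, SERVER_ERROR for a duplicate field), while anything unexpected is normalized to a 500 with a uniform prefix. A minimal sketch; parseSchema is a hypothetical stand-in for the body above:

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    abstract class SchemaReader {
      abstract void parseSchema() throws Exception;

      void readSchema() {
        try {
          parseSchema();
        } catch (SolrException e) {
          throw e; // already classified: keep the original ErrorCode
        } catch (Exception e) {
          throw new SolrException(ErrorCode.SERVER_ERROR,
              "Schema Parsing Failed: " + e.getMessage(), e);
        }
      }
    }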
// in core/src/java/org/apache/solr/schema/IndexSchema.java
private void addDynamicField(List<DynamicField> dFields, SchemaField f) { boolean dup = isDuplicateDynField(dFields, f); if( !dup ) { addDynamicFieldNoDupCheck(dFields, f); } else { String msg = "[schema.xml] Duplicate DynamicField definition for '" + f.getName() + "'"; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, msg); } }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
public void registerCopyField( String source, String dest, int maxChars ) { boolean sourceIsPattern = isWildCard(source); boolean destIsPattern = isWildCard(dest); log.debug("copyField source='"+source+"' dest='"+dest+"' maxChars='"+maxChars); SchemaField d = getFieldOrNull(dest); if(d == null){ throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "copyField destination :'"+dest+"' does not exist" ); } if(sourceIsPattern) { if( destIsPattern ) { DynamicField df = null; for( DynamicField dd : dynamicFields ) { if( dd.regex.equals( dest ) ) { df = dd; break; } } if( df == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "copyField dynamic destination must match a dynamicField." ); } registerDynamicCopyField(new DynamicDestCopy(source, df, maxChars )); } else { registerDynamicCopyField(new DynamicCopy(source, d, maxChars)); } } else if( destIsPattern ) { String msg = "copyField only supports a dynamic destination if the source is also dynamic" ; throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, msg ); } else { // retrieve the field to force an exception if it doesn't exist SchemaField f = getField(source); List<CopyField> copyFieldList = copyFieldsMap.get(source); if (copyFieldList == null) { copyFieldList = new ArrayList<CopyField>(); copyFieldsMap.put(source, copyFieldList); } copyFieldList.add(new CopyField(f, d, maxChars)); copyFieldTargetCounts.put(d, (copyFieldTargetCounts.containsKey(d) ? copyFieldTargetCounts.get(d) + 1 : 1)); } }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
public SchemaField getField(String fieldName) { SchemaField f = getFieldOrNull(fieldName); if (f != null) return f; // Hmmm, default field could also be implemented with a dynamic field of "*". // It would have to be special-cased and only used if nothing else matched. /*** REMOVED -YCS if (defaultFieldType != null) return new SchemaField(fieldName,defaultFieldType); ***/ throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"undefined field: \""+fieldName+"\""); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
public FieldType getDynamicFieldType(String fieldName) { for (DynamicField df : dynamicFields) { if (df.matches(fieldName)) return df.prototype.getType(); } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"undefined field "+fieldName); }
// in core/src/java/org/apache/solr/schema/DateField.java
public Date parseMath(Date now, String val) { String math = null; final DateMathParser p = new DateMathParser(); if (null != now) p.setNow(now); if (val.startsWith(NOW)) { math = val.substring(NOW.length()); } else { final int zz = val.indexOf(Z); if (0 < zz) { math = val.substring(zz+1); try { // p.setNow(toObject(val.substring(0,zz))); p.setNow(parseDate(val.substring(0,zz+1))); } catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); } } else { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date String:'" +val+'\''); } } if (null == math || math.equals("")) { return p.getNow(); } try { return p.parseMath(math); } catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); } }
// in core/src/java/org/apache/solr/schema/DateField.java
public Date parseMathLenient(Date now, String val, SolrQueryRequest req) { String math = null; final DateMathParser p = new DateMathParser(); if (null != now) p.setNow(now); if (val.startsWith(NOW)) { math = val.substring(NOW.length()); } else { final int zz = val.indexOf(Z); if (0 < zz) { math = val.substring(zz+1); try { // p.setNow(toObject(val.substring(0,zz))); p.setNow(parseDateLenient(val.substring(0,zz+1), req)); } catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); } } else { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date String:'" +val+'\''); } } if (null == math || math.equals("")) { return p.getNow(); } try { return p.parseMath(math); } catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); } }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override protected void init(IndexSchema schema, Map<String, String> args) { super.init(schema, args); this.schema = schema; this.exchangeRateProviderClass = args.get(PARAM_RATE_PROVIDER_CLASS); this.defaultCurrency = args.get(PARAM_DEFAULT_CURRENCY); if (this.defaultCurrency == null) { this.defaultCurrency = DEFAULT_DEFAULT_CURRENCY; } if (this.exchangeRateProviderClass == null) { this.exchangeRateProviderClass = DEFAULT_RATE_PROVIDER_CLASS; } if (java.util.Currency.getInstance(this.defaultCurrency) == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid currency code " + this.defaultCurrency); } String precisionStepString = args.get(PARAM_PRECISION_STEP); if (precisionStepString == null) { precisionStepString = DEFAULT_PRECISION_STEP; } // Initialize field type for amount fieldTypeAmountRaw = new TrieLongField(); fieldTypeAmountRaw.setTypeName("amount_raw_type_tlong"); Map<String,String> map = new HashMap<String,String>(1); map.put("precisionStep", precisionStepString); fieldTypeAmountRaw.init(schema, map); // Initialize field type for currency string fieldTypeCurrency = new StrField(); fieldTypeCurrency.setTypeName("currency_type_string"); fieldTypeCurrency.init(schema, new HashMap<String,String>()); args.remove(PARAM_RATE_PROVIDER_CLASS); args.remove(PARAM_DEFAULT_CURRENCY); args.remove(PARAM_PRECISION_STEP); try { Class<? extends ExchangeRateProvider> c = schema.getResourceLoader().findClass(exchangeRateProviderClass, ExchangeRateProvider.class); provider = c.newInstance(); provider.init(args); } catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error instantiating exchange rate provider "+exchangeRateProviderClass+". Please check your FieldType configuration", e); } }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public Query getRangeQuery(QParser parser, SchemaField field, String part1, String part2, final boolean minInclusive, final boolean maxInclusive) { final CurrencyValue p1 = CurrencyValue.parse(part1, defaultCurrency); final CurrencyValue p2 = CurrencyValue.parse(part2, defaultCurrency); if (!p1.getCurrencyCode().equals(p2.getCurrencyCode())) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cannot parse range query " + part1 + " to " + part2 + ": range queries only supported when upper and lower bound have same currency."); } return getRangeQuery(parser, field, p1, p2, minInclusive, maxInclusive); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public SortField getSortField(SchemaField field, boolean reverse) { try { // Convert all values to default currency for sorting. return (new CurrencyValueSource(field, defaultCurrency, null)).getSortField(reverse); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
public double getExchangeRate(String sourceCurrencyCode, String targetCurrencyCode) { if (sourceCurrencyCode == null || targetCurrencyCode == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cannot get exchange rate; currency was null."); } if (sourceCurrencyCode.equals(targetCurrencyCode)) { return 1.0; } Double directRate = lookupRate(sourceCurrencyCode, targetCurrencyCode); if (directRate != null) { return directRate; } Double symmetricRate = lookupRate(targetCurrencyCode, sourceCurrencyCode); if (symmetricRate != null) { return 1.0 / symmetricRate; } throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No available conversion rate from " + sourceCurrencyCode + " to " + targetCurrencyCode); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public boolean reload() throws SolrException { InputStream is = null; Map<String, Map<String, Double>> tmpRates = new HashMap<String, Map<String, Double>>(); try { log.info("Reloading exchange rates from file "+this.currencyConfigFile); is = loader.openResource(currencyConfigFile); javax.xml.parsers.DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance(); try { dbf.setXIncludeAware(true); dbf.setNamespaceAware(true); } catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); } try { Document doc = dbf.newDocumentBuilder().parse(is); XPathFactory xpathFactory = XPathFactory.newInstance(); XPath xpath = xpathFactory.newXPath(); // Parse exchange rates. NodeList nodes = (NodeList) xpath.evaluate("/currencyConfig/rates/rate", doc, XPathConstants.NODESET); for (int i = 0; i < nodes.getLength(); i++) { Node rateNode = nodes.item(i); NamedNodeMap attributes = rateNode.getAttributes(); Node from = attributes.getNamedItem("from"); Node to = attributes.getNamedItem("to"); Node rate = attributes.getNamedItem("rate"); if (from == null || to == null || rate == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Exchange rate missing attributes (required: from, to, rate) " + rateNode); } String fromCurrency = from.getNodeValue(); String toCurrency = to.getNodeValue(); Double exchangeRate; if (java.util.Currency.getInstance(fromCurrency) == null || java.util.Currency.getInstance(toCurrency) == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not find from currency specified in exchange rate: " + rateNode); } try { exchangeRate = Double.parseDouble(rate.getNodeValue()); } catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); } addRate(tmpRates, fromCurrency, toCurrency, exchangeRate); } } catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } } catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); } finally { try { if (is != null) { is.close(); } } catch (IOException e) { e.printStackTrace(); } } // Atomically swap in the new rates map, if it loaded successfully this.rates = tmpRates; return true; }
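Worth highlighting: all rates are parsed into tmpRates first, and this.rates is reassigned only after the whole file loaded, so a reload that throws leaves the previous table intact. A distilled sketch; the volatile qualifier is an assumption, since the excerpt does not show the field declaration:

    import java.util.HashMap;
    import java.util.Map;

    final class RateTable {
      private volatile Map<String, Double> rates = new HashMap<>();

      void reload(Map<String, Double> parsed) {        // parsed = fully built replacement
        Map<String, Double> tmp = new HashMap<>(parsed);
        rates = tmp; // single reference write: readers see the old or new table, never half of one
      }

      Double lookup(String code) { return rates.get(code); }
    }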
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public void init(Map<String,String> params) throws SolrException { this.currencyConfigFile = params.get(PARAM_CURRENCY_CONFIG); if(currencyConfigFile == null) { throw new SolrException(ErrorCode.NOT_FOUND, "Missing required configuration "+PARAM_CURRENCY_CONFIG); } // Removing config params custom to us params.remove(PARAM_CURRENCY_CONFIG); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public void inform(ResourceLoader loader) throws SolrException { if(loader == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Needs ResourceLoader in order to load config file"); } this.loader = loader; reload(); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
public static CurrencyValue parse(String externalVal, String defaultCurrency) { String amount = externalVal; String code = defaultCurrency; if (externalVal.contains(",")) { String[] amountAndCode = externalVal.split(","); amount = amountAndCode[0]; code = amountAndCode[1]; } Currency currency = java.util.Currency.getInstance(code); if (currency == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid currency code " + code); } try { double value = Double.parseDouble(amount); long currencyValue = Math.round(value * Math.pow(10.0, currency.getDefaultFractionDigits())); return new CurrencyValue(currencyValue, code); } catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } }
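parse() stores amounts in minor units using the currency's default fraction digits, so the long value is exact. A worked example (USD has two fraction digits):

    public final class MinorUnits {
      public static void main(String[] args) {
        java.util.Currency usd = java.util.Currency.getInstance("USD");
        long minor = Math.round(4.50 * Math.pow(10.0, usd.getDefaultFractionDigits()));
        System.out.println(minor); // 450, i.e. "4.50,USD" is stored as 450 cents
      }
    }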
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override protected void init(IndexSchema schema, Map<String, String> args) { super.init(schema, args); String p = args.remove("precisionStep"); if (p != null) { precisionStepArg = Integer.parseInt(p); } // normalize the precisionStep precisionStep = precisionStepArg; if (precisionStep<=0 || precisionStep>=64) precisionStep=Integer.MAX_VALUE; String t = args.remove("type"); if (t != null) { try { type = TrieTypes.valueOf(t.toUpperCase(Locale.ENGLISH)); } catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); } } CharFilterFactory[] filterFactories = new CharFilterFactory[0]; TokenFilterFactory[] tokenFilterFactories = new TokenFilterFactory[0]; analyzer = new TokenizerChain(filterFactories, new TrieTokenizerFactory(type, precisionStep), tokenFilterFactories); // for query time we only need one token, so we use the biggest possible precisionStep: queryAnalyzer = new TokenizerChain(filterFactories, new TrieTokenizerFactory(type, Integer.MAX_VALUE), tokenFilterFactories); }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public Object toObject(IndexableField f) { final Number val = f.numericValue(); if (val != null) { return (type == TrieTypes.DATE) ? new Date(val.longValue()) : val; } else { // the following code is "deprecated" and only to support pre-3.2 indexes using the old BinaryField encoding: final BytesRef bytes = f.binaryValue(); if (bytes==null) return badFieldString(f); switch (type) { case INTEGER: return toInt(bytes.bytes, bytes.offset); case FLOAT: return Float.intBitsToFloat(toInt(bytes.bytes, bytes.offset)); case LONG: return toLong(bytes.bytes, bytes.offset); case DOUBLE: return Double.longBitsToDouble(toLong(bytes.bytes, bytes.offset)); case DATE: return new Date(toLong(bytes.bytes, bytes.offset)); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + f.name()); } } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public SortField getSortField(SchemaField field, boolean top) { field.checkSortability(); Object missingValue = null; boolean sortMissingLast = field.sortMissingLast(); boolean sortMissingFirst = field.sortMissingFirst(); switch (type) { case INTEGER: if( sortMissingLast ) { missingValue = top ? Integer.MIN_VALUE : Integer.MAX_VALUE; } else if( sortMissingFirst ) { missingValue = top ? Integer.MAX_VALUE : Integer.MIN_VALUE; } return new SortField( field.getName(), FieldCache.NUMERIC_UTILS_INT_PARSER, top).setMissingValue(missingValue); case FLOAT: if( sortMissingLast ) { missingValue = top ? Float.NEGATIVE_INFINITY : Float.POSITIVE_INFINITY; } else if( sortMissingFirst ) { missingValue = top ? Float.POSITIVE_INFINITY : Float.NEGATIVE_INFINITY; } return new SortField( field.getName(), FieldCache.NUMERIC_UTILS_FLOAT_PARSER, top).setMissingValue(missingValue); case DATE: // fallthrough case LONG: if( sortMissingLast ) { missingValue = top ? Long.MIN_VALUE : Long.MAX_VALUE; } else if( sortMissingFirst ) { missingValue = top ? Long.MAX_VALUE : Long.MIN_VALUE; } return new SortField( field.getName(), FieldCache.NUMERIC_UTILS_LONG_PARSER, top).setMissingValue(missingValue); case DOUBLE: if( sortMissingLast ) { missingValue = top ? Double.NEGATIVE_INFINITY : Double.POSITIVE_INFINITY; } else if( sortMissingFirst ) { missingValue = top ? Double.POSITIVE_INFINITY : Double.NEGATIVE_INFINITY; } return new SortField( field.getName(), FieldCache.NUMERIC_UTILS_DOUBLE_PARSER, top).setMissingValue(missingValue); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + field.name); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public ValueSource getValueSource(SchemaField field, QParser qparser) { field.checkFieldCacheSource(qparser); switch (type) { case INTEGER: return new IntFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_INT_PARSER ); case FLOAT: return new FloatFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_FLOAT_PARSER ); case DATE: return new TrieDateFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_LONG_PARSER ); case LONG: return new LongFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_LONG_PARSER ); case DOUBLE: return new DoubleFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_DOUBLE_PARSER ); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + field.name); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public Query getRangeQuery(QParser parser, SchemaField field, String min, String max, boolean minInclusive, boolean maxInclusive) { int ps = precisionStep; Query query = null; switch (type) { case INTEGER: query = NumericRangeQuery.newIntRange(field.getName(), ps, min == null ? null : Integer.parseInt(min), max == null ? null : Integer.parseInt(max), minInclusive, maxInclusive); break; case FLOAT: query = NumericRangeQuery.newFloatRange(field.getName(), ps, min == null ? null : Float.parseFloat(min), max == null ? null : Float.parseFloat(max), minInclusive, maxInclusive); break; case LONG: query = NumericRangeQuery.newLongRange(field.getName(), ps, min == null ? null : Long.parseLong(min), max == null ? null : Long.parseLong(max), minInclusive, maxInclusive); break; case DOUBLE: query = NumericRangeQuery.newDoubleRange(field.getName(), ps, min == null ? null : Double.parseDouble(min), max == null ? null : Double.parseDouble(max), minInclusive, maxInclusive); break; case DATE: query = NumericRangeQuery.newLongRange(field.getName(), ps, min == null ? null : dateField.parseMath(null, min).getTime(), max == null ? null : dateField.parseMath(null, max).getTime(), minInclusive, maxInclusive); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field"); } return query; }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public void readableToIndexed(CharSequence val, BytesRef result) { String s = val.toString(); switch (type) { case INTEGER: NumericUtils.intToPrefixCoded(Integer.parseInt(s), 0, result); break; case FLOAT: NumericUtils.intToPrefixCoded(NumericUtils.floatToSortableInt(Float.parseFloat(s)), 0, result); break; case LONG: NumericUtils.longToPrefixCoded(Long.parseLong(s), 0, result); break; case DOUBLE: NumericUtils.longToPrefixCoded(NumericUtils.doubleToSortableLong(Double.parseDouble(s)), 0, result); break; case DATE: NumericUtils.longToPrefixCoded(dateField.parseMath(null, s).getTime(), 0, result); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public String indexedToReadable(String _indexedForm) { final BytesRef indexedForm = new BytesRef(_indexedForm); switch (type) { case INTEGER: return Integer.toString( NumericUtils.prefixCodedToInt(indexedForm) ); case FLOAT: return Float.toString( NumericUtils.sortableIntToFloat(NumericUtils.prefixCodedToInt(indexedForm)) ); case LONG: return Long.toString( NumericUtils.prefixCodedToLong(indexedForm) ); case DOUBLE: return Double.toString( NumericUtils.sortableLongToDouble(NumericUtils.prefixCodedToLong(indexedForm)) ); case DATE: return dateField.toExternal( new Date(NumericUtils.prefixCodedToLong(indexedForm)) ); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public CharsRef indexedToReadable(BytesRef indexedForm, CharsRef charsRef) { final String value; switch (type) { case INTEGER: value = Integer.toString( NumericUtils.prefixCodedToInt(indexedForm) ); break; case FLOAT: value = Float.toString( NumericUtils.sortableIntToFloat(NumericUtils.prefixCodedToInt(indexedForm)) ); break; case LONG: value = Long.toString( NumericUtils.prefixCodedToLong(indexedForm) ); break; case DOUBLE: value = Double.toString( NumericUtils.sortableLongToDouble(NumericUtils.prefixCodedToLong(indexedForm)) ); break; case DATE: value = dateField.toExternal( new Date(NumericUtils.prefixCodedToLong(indexedForm)) ); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } charsRef.grow(value.length()); charsRef.length = value.length(); value.getChars(0, charsRef.length, charsRef.chars, 0); return charsRef; }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public Object toObject(SchemaField sf, BytesRef term) { switch (type) { case INTEGER: return NumericUtils.prefixCodedToInt(term); case FLOAT: return NumericUtils.sortableIntToFloat(NumericUtils.prefixCodedToInt(term)); case LONG: return NumericUtils.prefixCodedToLong(term); case DOUBLE: return NumericUtils.sortableLongToDouble(NumericUtils.prefixCodedToLong(term)); case DATE: return new Date(NumericUtils.prefixCodedToLong(term)); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public String storedToIndexed(IndexableField f) { final BytesRef bytes = new BytesRef(NumericUtils.BUF_SIZE_LONG); final Number val = f.numericValue(); if (val != null) { switch (type) { case INTEGER: NumericUtils.intToPrefixCoded(val.intValue(), 0, bytes); break; case FLOAT: NumericUtils.intToPrefixCoded(NumericUtils.floatToSortableInt(val.floatValue()), 0, bytes); break; case LONG: //fallthrough! case DATE: NumericUtils.longToPrefixCoded(val.longValue(), 0, bytes); break; case DOUBLE: NumericUtils.longToPrefixCoded(NumericUtils.doubleToSortableLong(val.doubleValue()), 0, bytes); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + f.name()); } } else { // the following code is "deprecated" and only to support pre-3.2 indexes using the old BinaryField encoding: final BytesRef bytesRef = f.binaryValue(); if (bytesRef==null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid field contents: "+f.name()); switch (type) { case INTEGER: NumericUtils.intToPrefixCoded(toInt(bytesRef.bytes, bytesRef.offset), 0, bytes); break; case FLOAT: { // WARNING: Code Duplication! Keep in sync with o.a.l.util.NumericUtils! // copied from NumericUtils to not convert to/from float two times // code in next 2 lines is identical to: int v = NumericUtils.floatToSortableInt(Float.intBitsToFloat(toInt(arr))); int v = toInt(bytesRef.bytes, bytesRef.offset); if (v<0) v ^= 0x7fffffff; NumericUtils.intToPrefixCoded(v, 0, bytes); break; } case LONG: //fallthrough! case DATE: NumericUtils.longToPrefixCoded(toLong(bytesRef.bytes, bytesRef.offset), 0, bytes); break; case DOUBLE: { // WARNING: Code Duplication! Keep in sync with o.a.l.util.NumericUtils! // copied from NumericUtils to not convert to/from double two times // code in next 2 lines is identical to: long v = NumericUtils.doubleToSortableLong(Double.longBitsToDouble(toLong(arr))); long v = toLong(bytesRef.bytes, bytesRef.offset); if (v<0) v ^= 0x7fffffffffffffffL; NumericUtils.longToPrefixCoded(v, 0, bytes); break; } default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + f.name()); } } return bytes.utf8ToString(); }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public IndexableField createField(SchemaField field, Object value, float boost) { boolean indexed = field.indexed(); boolean stored = field.stored(); if (!indexed && !stored) { if (log.isTraceEnabled()) log.trace("Ignoring unindexed/unstored field: " + field); return null; } FieldType ft = new FieldType(); ft.setStored(stored); ft.setTokenized(true); ft.setIndexed(indexed); ft.setOmitNorms(field.omitNorms()); ft.setIndexOptions(getIndexOptions(field, value.toString())); switch (type) { case INTEGER: ft.setNumericType(NumericType.INT); break; case FLOAT: ft.setNumericType(NumericType.FLOAT); break; case LONG: ft.setNumericType(NumericType.LONG); break; case DOUBLE: ft.setNumericType(NumericType.DOUBLE); break; case DATE: ft.setNumericType(NumericType.LONG); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } ft.setNumericPrecisionStep(precisionStep); final org.apache.lucene.document.Field f; switch (type) { case INTEGER: int i = (value instanceof Number) ? ((Number)value).intValue() : Integer.parseInt(value.toString()); f = new org.apache.lucene.document.IntField(field.getName(), i, ft); break; case FLOAT: float fl = (value instanceof Number) ? ((Number)value).floatValue() : Float.parseFloat(value.toString()); f = new org.apache.lucene.document.FloatField(field.getName(), fl, ft); break; case LONG: long l = (value instanceof Number) ? ((Number)value).longValue() : Long.parseLong(value.toString()); f = new org.apache.lucene.document.LongField(field.getName(), l, ft); break; case DOUBLE: double d = (value instanceof Number) ? ((Number)value).doubleValue() : Double.parseDouble(value.toString()); f = new org.apache.lucene.document.DoubleField(field.getName(), d, ft); break; case DATE: Date date = (value instanceof Date) ? ((Date)value) : dateField.parseMath(null, value.toString()); f = new org.apache.lucene.document.LongField(field.getName(), date.getTime(), ft); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } f.setBoost(boost); return f; }
// in core/src/java/org/apache/solr/schema/TrieField.java
public static String getMainValuePrefix(org.apache.solr.schema.FieldType ft) { if (ft instanceof TrieDateField) ft = ((TrieDateField) ft).wrappedField; if (ft instanceof TrieField) { final TrieField trie = (TrieField)ft; if (trie.precisionStep == Integer.MAX_VALUE) return null; switch (trie.type) { case INTEGER: case FLOAT: return INT_PREFIX; case LONG: case DOUBLE: case DATE: return LONG_PREFIX; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + trie.type); } } return null; }
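Every switch in the TrieField block above ends the same way: a default branch that throws SERVER_ERROR. The point is fail-fast exhaustiveness: if a new TrieTypes constant were added without updating a switch, callers would get a loud 500 instead of a silently wrong value. A sketch of the shape; this TrieTypes enum and the bit-width mapping are stand-ins:

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class TrieWidths {
      enum TrieTypes { INTEGER, FLOAT, LONG, DOUBLE, DATE }

      static int bits(TrieTypes type) {
        switch (type) {
          case INTEGER:
          case FLOAT:  return 32;
          case LONG:
          case DOUBLE:
          case DATE:   return 64;
          default:     // unreachable today; loud if the enum ever grows
            throw new SolrException(ErrorCode.SERVER_ERROR,
                "Unknown type for trie field: " + type);
        }
      }
    }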
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public IndexableField[] createFields(SchemaField field, Object value, float boost) { String externalVal = value.toString(); //we could have tileDiff + 3 fields (two for the lat/lon, one for storage) IndexableField[] f = new IndexableField[(field.indexed() ? 2 : 0) + (field.stored() ? 1 : 0)]; if (field.indexed()) { int i = 0; double[] latLon; try { latLon = ParseUtils.parseLatitudeLongitude(null, externalVal); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } //latitude SchemaField lat = subField(field, i); f[i] = lat.createField(String.valueOf(latLon[LAT]), lat.omitNorms() ? 1F : boost); i++; //longitude SchemaField lon = subField(field, i); f[i] = lon.createField(String.valueOf(latLon[LON]), lon.omitNorms() ? 1F : boost); } if (field.stored()) { FieldType customType = new FieldType(); customType.setStored(true); f[f.length - 1] = createField(field.getName(), externalVal, customType, boost); } return f; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public Query getRangeQuery(QParser parser, SchemaField field, String part1, String part2, boolean minInclusive, boolean maxInclusive) { int dimension = 2; String[] p1; String[] p2; try { p1 = ParseUtils.parsePoint(null, part1, dimension); p2 = ParseUtils.parsePoint(null, part2, dimension); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } BooleanQuery result = new BooleanQuery(true); for (int i = 0; i < dimension; i++) { SchemaField subSF = subField(field, i); // points must currently be ordered... should we support specifying any two opposite corner points? result.add(subSF.getType().getRangeQuery(parser, subSF, p1[i], p2[i], minInclusive, maxInclusive), BooleanClause.Occur.MUST); } return result; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public Query getFieldQuery(QParser parser, SchemaField field, String externalVal) { int dimension = 2; String[] p1 = new String[0]; try { p1 = ParseUtils.parsePoint(null, externalVal, dimension); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } BooleanQuery bq = new BooleanQuery(true); for (int i = 0; i < dimension; i++) { SchemaField sf = subField(field, i); Query tq = sf.getType().getFieldQuery(parser, sf, p1[i]); bq.add(tq, BooleanClause.Occur.MUST); } return bq; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public Query createSpatialQuery(QParser parser, SpatialOptions options) { double[] point = null; try { point = ParseUtils.parseLatitudeLongitude(options.pointStr); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } // lat & lon in degrees double latCenter = point[LAT]; double lonCenter = point[LON]; DistanceCalculator distCalc = new GeodesicSphereDistCalc.Haversine(options.units.earthRadius()); SpatialContext ctx = new SimpleSpatialContext(options.units,distCalc,null); Rectangle bbox = DistanceUtils.calcBoxByDistFromPtDEG(latCenter, lonCenter, options.distance, ctx); double latMin = bbox.getMinY(); double latMax = bbox.getMaxY(); double lonMin, lonMax, lon2Min, lon2Max; if (bbox.getCrossesDateLine()) { lonMin = -180; lonMax = bbox.getMaxX(); lon2Min = bbox.getMinX(); lon2Max = 180; } else { lonMin = bbox.getMinX(); lonMax = bbox.getMaxX(); lon2Min = -180; lon2Max = 180; } // Now that we've figured out the ranges, build them! SchemaField latField = subField(options.field, LAT); SchemaField lonField = subField(options.field, LON); SpatialDistanceQuery spatial = new SpatialDistanceQuery(); if (options.bbox) { BooleanQuery result = new BooleanQuery(); Query latRange = latField.getType().getRangeQuery(parser, latField, String.valueOf(latMin), String.valueOf(latMax), true, true); result.add(latRange, BooleanClause.Occur.MUST); if (lonMin != -180 || lonMax != 180) { Query lonRange = lonField.getType().getRangeQuery(parser, lonField, String.valueOf(lonMin), String.valueOf(lonMax), true, true); if (lon2Min != -180 || lon2Max != 180) { // another valid longitude range BooleanQuery bothLons = new BooleanQuery(); bothLons.add(lonRange, BooleanClause.Occur.SHOULD); lonRange = lonField.getType().getRangeQuery(parser, lonField, String.valueOf(lon2Min), String.valueOf(lon2Max), true, true); bothLons.add(lonRange, BooleanClause.Occur.SHOULD); lonRange = bothLons; } result.add(lonRange, BooleanClause.Occur.MUST); } spatial.bboxQuery = result; } spatial.origField = options.field.getName(); spatial.latSource = latField.getType().getValueSource(latField, parser); spatial.lonSource = lonField.getType().getValueSource(lonField, parser); spatial.latMin = latMin; spatial.latMax = latMax; spatial.lonMin = lonMin; spatial.lonMax = lonMax; spatial.lon2Min = lon2Min; spatial.lon2Max = lon2Max; spatial.lon2 = lon2Min != -180 || lon2Max != 180; spatial.latCenter = latCenter; spatial.lonCenter = lonCenter; spatial.dist = options.distance; spatial.planetRadius = options.radius; spatial.calcDist = !options.bbox; return spatial; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public SortField getSortField(SchemaField field, boolean top) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Sorting not supported on LatLonType " + field.getName()); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public DelegatingCollector getFilterCollector(IndexSearcher searcher) { try { return new SpatialCollector(new SpatialWeight(searcher)); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
public void add(Object current) { if (!(current instanceof MultiTermAwareComponent)) return; AbstractAnalysisFactory newComponent = ((MultiTermAwareComponent)current).getMultiTermComponent(); if (newComponent instanceof TokenFilterFactory) { if (filters == null) { filters = new ArrayList<TokenFilterFactory>(2); } filters.add((TokenFilterFactory)newComponent); } else if (newComponent instanceof TokenizerFactory) { tokenizer = (TokenizerFactory)newComponent; } else if (newComponent instanceof CharFilterFactory) { if (charFilters == null) { charFilters = new ArrayList<CharFilterFactory>(1); } charFilters.add( (CharFilterFactory)newComponent); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown analysis component from MultiTermAwareComponent: " + newComponent); } }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
private Analyzer readAnalyzer(Node node) throws XPathExpressionException {
  final SolrResourceLoader loader = schema.getResourceLoader();

  // parent node used to be passed in as "fieldtype"
  // if (!fieldtype.hasChildNodes()) return null;
  // Node node = DOMUtil.getChild(fieldtype,"analyzer");

  if (node == null) return null;
  NamedNodeMap attrs = node.getAttributes();
  String analyzerName = DOMUtil.getAttr(attrs, "class");
  if (analyzerName != null) {
    try {
      // No need to be core-aware as Analyzers are not in the core-aware list
      final Class<? extends Analyzer> clazz = loader.findClass(analyzerName, Analyzer.class);
      try {
        // first try to use a ctor with version parameter
        // (needed for many new Analyzers that have no default one anymore)
        Constructor<? extends Analyzer> cnstr = clazz.getConstructor(Version.class);
        final String matchVersionStr = DOMUtil.getAttr(attrs, LUCENE_MATCH_VERSION_PARAM);
        final Version luceneMatchVersion = (matchVersionStr == null)
            ? schema.getDefaultLuceneMatchVersion()
            : Config.parseLuceneVersionString(matchVersionStr);
        if (luceneMatchVersion == null) {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
              "Configuration Error: Analyzer '" + clazz.getName() +
              "' needs a 'luceneMatchVersion' parameter");
        }
        return cnstr.newInstance(luceneMatchVersion);
      } catch (NoSuchMethodException nsme) {
        // otherwise use default ctor
        return clazz.newInstance();
      }
    } catch (Exception e) {
      log.error("Cannot load analyzer: " + analyzerName, e);
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Cannot load analyzer: " + analyzerName, e);
    }
  }

  // Load the CharFilters
  final ArrayList<CharFilterFactory> charFilters = new ArrayList<CharFilterFactory>();
  AbstractPluginLoader<CharFilterFactory> charFilterLoader =
      new AbstractPluginLoader<CharFilterFactory>("[schema.xml] analyzer/charFilter",
          CharFilterFactory.class, false, false) {
    @Override
    protected void init(CharFilterFactory plugin, Node node) throws Exception {
      if (plugin != null) {
        final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
        String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
        plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
        plugin.init(params);
        charFilters.add(plugin);
      }
    }

    @Override
    protected CharFilterFactory register(String name, CharFilterFactory plugin) {
      return null; // used for map registration
    }
  };
  charFilterLoader.load(loader, (NodeList) xpath.evaluate("./charFilter", node, XPathConstants.NODESET));

  // Load the Tokenizer
  // Although an analyzer only allows a single Tokenizer, we load a list to make sure
  // the configuration is ok
  final ArrayList<TokenizerFactory> tokenizers = new ArrayList<TokenizerFactory>(1);
  AbstractPluginLoader<TokenizerFactory> tokenizerLoader =
      new AbstractPluginLoader<TokenizerFactory>("[schema.xml] analyzer/tokenizer",
          TokenizerFactory.class, false, false) {
    @Override
    protected void init(TokenizerFactory plugin, Node node) throws Exception {
      if (!tokenizers.isEmpty()) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "The schema defines multiple tokenizers for: " + node);
      }
      final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
      String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
      plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
      plugin.init(params);
      tokenizers.add(plugin);
    }

    @Override
    protected TokenizerFactory register(String name, TokenizerFactory plugin) {
      return null; // used for map registration
    }
  };
  tokenizerLoader.load(loader, (NodeList) xpath.evaluate("./tokenizer", node, XPathConstants.NODESET));

  // Make sure something was loaded
  if (tokenizers.isEmpty()) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "analyzer without class or tokenizer & filter list");
  }

  // Load the Filters
  final ArrayList<TokenFilterFactory> filters = new ArrayList<TokenFilterFactory>();
  AbstractPluginLoader<TokenFilterFactory> filterLoader =
      new AbstractPluginLoader<TokenFilterFactory>("[schema.xml] analyzer/filter",
          TokenFilterFactory.class, false, false) {
    @Override
    protected void init(TokenFilterFactory plugin, Node node) throws Exception {
      if (plugin != null) {
        final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
        String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
        plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
        plugin.init(params);
        filters.add(plugin);
      }
    }

    @Override
    protected TokenFilterFactory register(String name, TokenFilterFactory plugin) throws Exception {
      return null; // used for map registration
    }
  };
  filterLoader.load(loader, (NodeList) xpath.evaluate("./filter", node, XPathConstants.NODESET));

  return new TokenizerChain(charFilters.toArray(new CharFilterFactory[charFilters.size()]),
      tokenizers.get(0), filters.toArray(new TokenFilterFactory[filters.size()]));
}
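readAnalyzer resolves the configured analyzer class reflectively: it prefers a constructor taking a Lucene Version, falls back to the no-arg constructor when that signature is absent, and wraps any other failure as a SERVER_ERROR. A minimal sketch of just that fallback idiom (illustrative class, not part of Solr):

import java.lang.reflect.Constructor;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.util.Version;
import org.apache.solr.common.SolrException;

// Sketch of the two-step construction used by readAnalyzer: prefer the
// Version-aware constructor, fall back to the default one.
final class AnalyzerFactorySketch {
  static Analyzer instantiate(Class<? extends Analyzer> clazz, Version matchVersion) {
    try {
      try {
        // Many newer analyzers only expose a ctor taking a Version.
        Constructor<? extends Analyzer> cnstr = clazz.getConstructor(Version.class);
        return cnstr.newInstance(matchVersion);
      } catch (NoSuchMethodException nsme) {
        // Older analyzers: fall back to the no-arg constructor.
        return clazz.newInstance();
      }
    } catch (Exception e) {
      // Anything else (access, instantiation, invocation) is a config problem.
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Cannot load analyzer: " + clazz.getName(), e);
    }
  }
}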
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
@Override
protected void init(TokenizerFactory plugin, Node node) throws Exception {
  if (!tokenizers.isEmpty()) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "The schema defines multiple tokenizers for: " + node);
  }
  final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
  String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
  plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
  plugin.init(params);
  tokenizers.add(plugin);
}
// in core/src/java/org/apache/solr/schema/PointType.java
@Override
protected void init(IndexSchema schema, Map<String, String> args) {
  SolrParams p = new MapSolrParams(args);
  dimension = p.getInt(DIMENSION, DEFAULT_DIMENSION);
  if (dimension < 1) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "The dimension must be > 0: " + dimension);
  }
  args.remove(DIMENSION);
  this.schema = schema;
  super.init(schema, args);

  // cache suffixes
  createSuffixCache(dimension);
}
// in core/src/java/org/apache/solr/schema/PointType.java
@Override
public IndexableField[] createFields(SchemaField field, Object value, float boost) {
  String externalVal = value.toString();
  String[] point = new String[0];
  try {
    point = ParseUtils.parsePoint(null, externalVal, dimension);
  } catch (InvalidShapeException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
  }

  // TODO: this doesn't currently support polyFields as sub-field types
  IndexableField[] f = new IndexableField[(field.indexed() ? dimension : 0) + (field.stored() ? 1 : 0)];

  if (field.indexed()) {
    for (int i = 0; i < dimension; i++) {
      f[i] = subField(field, i).createField(point[i], boost);
    }
  }

  if (field.stored()) {
    String storedVal = externalVal; // normalize or not?
    FieldType customType = new FieldType();
    customType.setStored(true);
    f[f.length - 1] = createField(field.getName(), storedVal, customType, boost);
  }
  return f;
}
// in core/src/java/org/apache/solr/schema/PointType.java
@Override
public SortField getSortField(SchemaField field, boolean top) {
  throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
      "Sorting not supported on PointType " + field.getName());
}
// in core/src/java/org/apache/solr/schema/PointType.java
@Override
public Query getFieldQuery(QParser parser, SchemaField field, String externalVal) {
  String[] p1 = new String[0];
  try {
    p1 = ParseUtils.parsePoint(null, externalVal, dimension);
  } catch (InvalidShapeException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
  }
  // TODO: should we assert that p1.length == dimension?
  BooleanQuery bq = new BooleanQuery(true);
  for (int i = 0; i < dimension; i++) {
    SchemaField sf = subField(field, i);
    Query tq = sf.getType().getFieldQuery(parser, sf, p1[i]);
    bq.add(tq, BooleanClause.Occur.MUST);
  }
  return bq;
}
// in core/src/java/org/apache/solr/schema/PointType.java
public Query createSpatialQuery(QParser parser, SpatialOptions options) {
  Query result = null;
  double[] point = new double[0];
  try {
    point = ParseUtils.parsePointDouble(null, options.pointStr, dimension);
  } catch (InvalidShapeException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
  }
  if (dimension == 1) {
    // TODO: Handle distance measures
    String lower = String.valueOf(point[0] - options.distance);
    String upper = String.valueOf(point[0] + options.distance);
    SchemaField subSF = subField(options.field, 0);
    // points must currently be ordered... should we support specifying any two opposite corner points?
    result = subSF.getType().getRangeQuery(parser, subSF, lower, upper, true, true);
  } else {
    BooleanQuery tmp = new BooleanQuery();
    // TODO: Handle distance measures, as this assumes Euclidean
    double[] ur = DistanceUtils.vectorBoxCorner(point, null, options.distance, true);
    double[] ll = DistanceUtils.vectorBoxCorner(point, null, options.distance, false);
    for (int i = 0; i < ur.length; i++) {
      SchemaField subSF = subField(options.field, i);
      Query range = subSF.getType().getRangeQuery(parser, subSF,
          String.valueOf(ll[i]), String.valueOf(ur[i]), true, true);
      tmp.add(range, BooleanClause.Occur.MUST);
    }
    result = tmp;
  }
  return result;
}
// in core/src/java/org/apache/solr/schema/UUIDField.java
@Override
public String toInternal(String val) {
  if (val == null || 0 == val.length() || NEW.equals(val)) {
    return UUID.randomUUID().toString().toLowerCase(Locale.ENGLISH);
  } else {
    // we do some basic validation if 'val' looks like an UUID
    if (val.length() != 36 ||
        val.charAt(8) != DASH || val.charAt(13) != DASH ||
        val.charAt(18) != DASH || val.charAt(23) != DASH) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Invalid UUID String: '" + val + "'");
    }
    return val.toLowerCase(Locale.ENGLISH);
  }
}
// in core/src/java/org/apache/solr/schema/GeoHashField.java
public Query createSpatialQuery(QParser parser, SpatialOptions options) {
  double[] point = new double[0];
  try {
    point = ParseUtils.parsePointDouble(null, options.pointStr, 2);
  } catch (InvalidShapeException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
  }
  String geohash = GeohashUtils.encodeLatLon(point[0], point[1]);
  // TODO: optimize this
  return new SolrConstantScoreQuery(new ValueSourceRangeFilter(
      new GeohashHaversineFunction(getValueSource(options.field, parser),
          new LiteralValueSource(geohash), options.radius),
      "0", String.valueOf(options.distance), true, true));
}
// in core/src/java/org/apache/solr/schema/GeoHashField.java
@Override
public String toInternal(String val) {
  // validate that the string is of the form
  // latitude, longitude
  double[] latLon = new double[0];
  try {
    latLon = ParseUtils.parseLatitudeLongitude(null, val);
  } catch (InvalidShapeException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
  }
  return GeohashUtils.encodeLatLon(latLon[0], latLon[1]);
}
// in core/src/java/org/apache/solr/schema/ExternalFileField.java
@Override
protected void init(IndexSchema schema, Map<String, String> args) {
  restrictProps(SORT_MISSING_FIRST | SORT_MISSING_LAST);
  // valType has never been used for anything except to throw an error, so make it optional since the
  // code (see getValueSource) gives you a FileFloatSource.
  String ftypeS = args.remove("valType");
  if (ftypeS != null) {
    ftype = schema.getFieldTypes().get(ftypeS);
    if (ftype != null && !(ftype instanceof FloatField) && !(ftype instanceof TrieFloatField)) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Only float and pfloat (Trie|Float)Field are currently supported as external field type. Got " + ftypeS);
    }
  }
  keyFieldName = args.remove("keyField");
  String defValS = args.remove("defVal");
  defVal = defValS == null ? 0 : Float.parseFloat(defValS);
  this.schema = schema;
}
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void checkSortability() throws SolrException {
  if (!indexed()) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "can not sort on unindexed field: " + getName());
  }
  if (multiValued()) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "can not sort on multivalued field: " + getName());
  }
}
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void checkFieldCacheSource(QParser parser) throws SolrException {
  if (!indexed()) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "can not use FieldCache on unindexed field: " + getName());
  }
  if (multiValued()) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "can not use FieldCache on multivalued field: " + getName());
  }
}
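Note that SolrException extends RuntimeException, so the "throws SolrException" clause on these guard methods is documentation rather than a compile-time obligation on callers. A hypothetical caller sketch:

import org.apache.solr.common.SolrException;
import org.apache.solr.schema.SchemaField;

// Hypothetical caller: the guard either returns silently or aborts the
// request with a 400; no catch is required because the exception is unchecked.
final class SortGuardSketch {
  static void sortOn(SchemaField field) {
    field.checkSortability(); // throws SolrException(BAD_REQUEST) if unusable
    // ... proceed to build the SortField ...
  }
}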
// in core/src/java/org/apache/solr/schema/TextField.java
public static BytesRef analyzeMultiTerm(String field, String part, Analyzer analyzerIn) {
  if (part == null) return null;

  TokenStream source;
  try {
    source = analyzerIn.tokenStream(field, new StringReader(part));
    source.reset();
  } catch (IOException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Unable to initialize TokenStream to analyze multiTerm term: " + part, e);
  }

  TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class);
  BytesRef bytes = termAtt.getBytesRef();

  try {
    if (!source.incrementToken())
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "analyzer returned no terms for multiTerm term: " + part);
    termAtt.fillBytesRef();
    if (source.incrementToken())
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "analyzer returned too many terms for multiTerm term: " + part);
  } catch (IOException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "error analyzing range part: " + part, e);
  }

  try {
    source.end();
    source.close();
  } catch (IOException e) {
    throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e);
  }

  return BytesRef.deepCopyOf(bytes);
}
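analyzeMultiTerm insists that analysis of the input produces exactly one token, mapping both underproduction and overproduction to BAD_REQUEST. A stripped-down sketch of that exactly-one-token check (it collapses the separate close-failure handling of the real method into the same catch):

import java.io.IOException;
import java.io.StringReader;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.solr.common.SolrException;

// Sketch: require exactly one token from an analyzed value, mapping both
// "no terms" and "too many terms" to a client-side 400 error.
final class OneTokenSketch {
  static void requireSingleToken(Analyzer analyzer, String field, String part) {
    try {
      TokenStream source = analyzer.tokenStream(field, new StringReader(part));
      source.reset();
      if (!source.incrementToken())
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "analyzer returned no terms for: " + part);
      if (source.incrementToken())
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "analyzer returned too many terms for: " + part);
      source.end();
      source.close();
    } catch (IOException e) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "error analyzing: " + part, e);
    }
  }
}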
// in core/src/java/org/apache/solr/schema/CollationField.java
private void setup(ResourceLoader loader, Map<String,String> args) {
  String custom = args.remove("custom");
  String language = args.remove("language");
  String country = args.remove("country");
  String variant = args.remove("variant");
  String strength = args.remove("strength");
  String decomposition = args.remove("decomposition");

  final Collator collator;

  if (custom == null && language == null)
    throw new SolrException(ErrorCode.SERVER_ERROR, "Either custom or language is required.");

  if (custom != null && (language != null || country != null || variant != null))
    throw new SolrException(ErrorCode.SERVER_ERROR, "Cannot specify both language and custom. "
        + "To tailor rules for a built-in language, see the javadocs for RuleBasedCollator. "
        + "Then save the entire customized ruleset to a file, and use with the custom parameter");

  if (language != null) {
    // create from a system collator, based on Locale.
    collator = createFromLocale(language, country, variant);
  } else {
    // create from a custom ruleset
    collator = createFromRules(custom, loader);
  }

  // set the strength flag, otherwise it will be the default.
  if (strength != null) {
    if (strength.equalsIgnoreCase("primary"))
      collator.setStrength(Collator.PRIMARY);
    else if (strength.equalsIgnoreCase("secondary"))
      collator.setStrength(Collator.SECONDARY);
    else if (strength.equalsIgnoreCase("tertiary"))
      collator.setStrength(Collator.TERTIARY);
    else if (strength.equalsIgnoreCase("identical"))
      collator.setStrength(Collator.IDENTICAL);
    else
      throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid strength: " + strength);
  }

  // set the decomposition flag, otherwise it will be the default.
  if (decomposition != null) {
    if (decomposition.equalsIgnoreCase("no"))
      collator.setDecomposition(Collator.NO_DECOMPOSITION);
    else if (decomposition.equalsIgnoreCase("canonical"))
      collator.setDecomposition(Collator.CANONICAL_DECOMPOSITION);
    else if (decomposition.equalsIgnoreCase("full"))
      collator.setDecomposition(Collator.FULL_DECOMPOSITION);
    else
      throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid decomposition: " + decomposition);
  }

  // we use 4.0 because it ensures we just encode the pure byte[] keys.
  analyzer = new CollationKeyAnalyzer(Version.LUCENE_40, collator);
}
// in core/src/java/org/apache/solr/schema/CollationField.java
private Collator createFromLocale(String language, String country, String variant) {
  Locale locale;
  if (language != null && country == null && variant != null)
    throw new SolrException(ErrorCode.SERVER_ERROR, "To specify variant, country is required");
  else if (language != null && country != null && variant != null)
    locale = new Locale(language, country, variant);
  else if (language != null && country != null)
    locale = new Locale(language, country);
  else
    locale = new Locale(language);
  return Collator.getInstance(locale);
}
// in core/src/java/org/apache/solr/schema/AbstractSubTypeFieldType.java
@Override
protected void init(IndexSchema schema, Map<String, String> args) {
  this.schema = schema; // it's not a first class citizen for the IndexSchema
  SolrParams p = new MapSolrParams(args);
  String subFT = p.get(SUB_FIELD_TYPE);
  String subSuffix = p.get(SUB_FIELD_SUFFIX);
  if (subFT != null) {
    args.remove(SUB_FIELD_TYPE);
    subType = schema.getFieldTypeByName(subFT.trim());
    suffix = POLY_FIELD_SEPARATOR + subType.typeName;
  } else if (subSuffix != null) {
    args.remove(SUB_FIELD_SUFFIX);
    suffix = subSuffix;
  } else {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "The field type: " + typeName + " must specify the " + SUB_FIELD_TYPE
        + " attribute or the " + SUB_FIELD_SUFFIX + " attribute.");
  }
}
// in core/src/java/org/apache/solr/schema/FieldType.java
protected String getArg(String n, Map<String,String> args) {
  String s = args.remove(n);
  if (s == null) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Missing parameter '" + n + "' for FieldType=" + typeName + args);
  }
  return s;
}
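getArg shows the consume-or-fail idiom: parameters are removed from the args map as they are read, so anything left over at the end of init() can itself be flagged as an error. A sketch of the same helper in isolation (the error message mirrors the one above, including the remaining map appended to it):

import java.util.Map;
import org.apache.solr.common.SolrException;

// Sketch: read a required parameter destructively so that leftovers in
// the map can later be reported as unknown arguments.
final class RequiredArgSketch {
  static String require(String name, Map<String, String> args, String typeName) {
    String s = args.remove(name);
    if (s == null) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Missing parameter '" + name + "' for FieldType=" + typeName + args);
    }
    return s;
  }
}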
// in core/src/java/org/apache/solr/schema/FieldType.java
public IndexableField createField(SchemaField field, Object value, float boost) {
  if (!field.indexed() && !field.stored()) {
    if (log.isTraceEnabled())
      log.trace("Ignoring unindexed/unstored field: " + field);
    return null;
  }

  String val;
  try {
    val = toInternal(value.toString());
  } catch (RuntimeException e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Error while creating field '" + field + "' from value '" + value + "'", e);
  }
  if (val == null) return null;

  org.apache.lucene.document.FieldType newType = new org.apache.lucene.document.FieldType();
  newType.setIndexed(field.indexed());
  newType.setTokenized(field.isTokenized());
  newType.setStored(field.stored());
  newType.setOmitNorms(field.omitNorms());
  newType.setIndexOptions(getIndexOptions(field, val));
  newType.setStoreTermVectors(field.storeTermVector());
  newType.setStoreTermVectorOffsets(field.storeTermOffsets());
  newType.setStoreTermVectorPositions(field.storeTermPositions());

  return createField(field.getName(), val, newType, boost);
}
// in core/src/java/org/apache/solr/schema/FieldType.java
public void setAnalyzer(Analyzer analyzer) {
  throw new SolrException(ErrorCode.SERVER_ERROR,
      "FieldType: " + this.getClass().getSimpleName()
      + " (" + typeName + ") does not support specifying an analyzer");
}
// in core/src/java/org/apache/solr/schema/FieldType.java
public void setQueryAnalyzer(Analyzer analyzer) {
  throw new SolrException(ErrorCode.SERVER_ERROR,
      "FieldType: " + this.getClass().getSimpleName()
      + " (" + typeName + ") does not support specifying an analyzer");
}
// in core/src/java/org/apache/solr/search/SpatialFilterQParser.java
@Override
public Query parse() throws ParseException {
  // if more than one, we need to treat them as a point...
  // TODO: Should we accept multiple fields
  String[] fields = localParams.getParams("f");
  if (fields == null || fields.length == 0) {
    String field = getParam(SpatialParams.FIELD);
    if (field == null)
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          " missing sfield for spatial request");
    fields = new String[] {field};
  }

  String pointStr = getParam(SpatialParams.POINT);
  if (pointStr == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        SpatialParams.POINT + " missing.");
  }

  double dist = -1;
  String distS = getParam(SpatialParams.DISTANCE);
  if (distS != null) dist = Double.parseDouble(distS);
  if (dist < 0) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        SpatialParams.DISTANCE + " must be >= 0");
  }

  String measStr = localParams.get(SpatialParams.MEASURE);
  // TODO: Need to do something with Measures

  Query result = null;
  // fields is valid at this point
  if (fields.length == 1) {
    SchemaField sf = req.getSchema().getField(fields[0]);
    FieldType type = sf.getType();
    if (type instanceof SpatialQueryable) {
      double radius = localParams.getDouble(SpatialParams.SPHERE_RADIUS, DistanceUtils.EARTH_MEAN_RADIUS_KM);
      SpatialOptions opts = new SpatialOptions(pointStr, dist, sf, measStr, radius, DistanceUnits.KILOMETERS);
      opts.bbox = bbox;
      result = ((SpatialQueryable) type).createSpatialQuery(this, opts);
    } else {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "The field " + fields[0] + " does not support spatial filtering");
    }
  } else { // fields.length > 1
    // TODO: Not sure about this just yet, is there a way to delegate, or do we just have a helper class?
    // Seems like we could just use FunctionQuery, but then what about scoring
    /*List<ValueSource> sources = new ArrayList<ValueSource>(fields.length);
    for (String field : fields) {
      SchemaField sf = schema.getField(field);
      sources.add(sf.getType().getValueSource(sf, this));
    }
    MultiValueSource vs = new VectorValueSource(sources);
    ValueSourceRangeFilter rf = new ValueSourceRangeFilter(vs, "0", String.valueOf(dist), true, true);
    result = new SolrConstantScoreQuery(rf);*/
  }
  return result;
}
// in core/src/java/org/apache/solr/search/Grouping.java
public void execute() throws IOException {
  if (commands.isEmpty()) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Specify at least one field, function or query to group by.");
  }

  DocListAndSet out = new DocListAndSet();
  qr.setDocListAndSet(out);

  SolrIndexSearcher.ProcessedFilter pf = searcher.getProcessedFilter(cmd.getFilter(), cmd.getFilterList());
  final Filter luceneFilter = pf.filter;
  maxDoc = searcher.maxDoc();

  needScores = (cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0;
  boolean cacheScores = false;
  // NOTE: Change this when groupSort can be specified per group
  if (!needScores && !commands.isEmpty()) {
    if (commands.get(0).groupSort == null) {
      cacheScores = true;
    } else {
      for (SortField field : commands.get(0).groupSort.getSort()) {
        if (field.getType() == SortField.Type.SCORE) {
          cacheScores = true;
          break;
        }
      }
    }
  } else if (needScores) {
    cacheScores = needScores;
  }
  getDocSet = (cmd.getFlags() & SolrIndexSearcher.GET_DOCSET) != 0;
  getDocList = (cmd.getFlags() & SolrIndexSearcher.GET_DOCLIST) != 0;
  query = QueryUtils.makeQueryable(cmd.getQuery());

  for (Command cmd : commands) {
    cmd.prepare();
  }

  AbstractAllGroupHeadsCollector<?> allGroupHeadsCollector = null;
  List<Collector> collectors = new ArrayList<Collector>(commands.size());
  for (Command cmd : commands) {
    Collector collector = cmd.createFirstPassCollector();
    if (collector != null) {
      collectors.add(collector);
    }
    if (getGroupedDocSet && allGroupHeadsCollector == null) {
      collectors.add(allGroupHeadsCollector = cmd.createAllGroupCollector());
    }
  }

  Collector allCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()]));
  DocSetCollector setCollector = null;
  if (getDocSet && allGroupHeadsCollector == null) {
    setCollector = new DocSetDelegateCollector(maxDoc >> 6, maxDoc, allCollectors);
    allCollectors = setCollector;
  }

  CachingCollector cachedCollector = null;
  if (cacheSecondPassSearch && allCollectors != null) {
    int maxDocsToCache = (int) Math.round(maxDoc * (maxDocsPercentageToCache / 100.0d));
    // Only makes sense to cache if we cache more than zero.
    // Maybe we should have a minimum and a maximum, that defines the window we would like caching for.
    if (maxDocsToCache > 0) {
      allCollectors = cachedCollector = CachingCollector.create(allCollectors, cacheScores, maxDocsToCache);
    }
  }

  if (pf.postFilter != null) {
    pf.postFilter.setLastDelegate(allCollectors);
    allCollectors = pf.postFilter;
  }

  if (allCollectors != null) {
    searchWithTimeLimiter(luceneFilter, allCollectors);
  }

  if (getGroupedDocSet && allGroupHeadsCollector != null) {
    FixedBitSet fixedBitSet = allGroupHeadsCollector.retrieveGroupHeads(maxDoc);
    long[] bits = fixedBitSet.getBits();
    OpenBitSet openBitSet = new OpenBitSet(bits, bits.length);
    qr.setDocSet(new BitDocSet(openBitSet));
  } else if (getDocSet) {
    qr.setDocSet(setCollector.getDocSet());
  }

  collectors.clear();
  for (Command cmd : commands) {
    Collector collector = cmd.createSecondPassCollector();
    if (collector != null)
      collectors.add(collector);
  }

  if (!collectors.isEmpty()) {
    Collector secondPhaseCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()]));
    if (collectors.size() > 0) {
      if (cachedCollector != null) {
        if (cachedCollector.isCached()) {
          cachedCollector.replay(secondPhaseCollectors);
        } else {
          signalCacheWarning = true;
          logger.warn(String.format("The grouping cache is active, but not used because it exceeded the max cache limit of %d percent", maxDocsPercentageToCache));
          logger.warn("Please increase cache size or disable group caching.");
          searchWithTimeLimiter(luceneFilter, secondPhaseCollectors);
        }
      } else {
        if (pf.postFilter != null) {
          pf.postFilter.setLastDelegate(secondPhaseCollectors);
          secondPhaseCollectors = pf.postFilter;
        }
        searchWithTimeLimiter(luceneFilter, secondPhaseCollectors);
      }
    }
  }

  for (Command cmd : commands) {
    cmd.finish();
  }

  qr.groupedResults = grouped;

  if (getDocList) {
    int sz = idSet.size();
    int[] ids = new int[sz];
    int idx = 0;
    for (int val : idSet) {
      ids[idx++] = val;
    }
    qr.setDocList(new DocSlice(0, sz, ids, null, maxMatches, maxScore));
  }
}
// in core/src/java/org/apache/solr/search/ReturnFields.java
private void add(String fl, NamedList<String> rename, DocTransformers augmenters, SolrQueryRequest req) {
  if (fl == null) {
    return;
  }
  try {
    QueryParsing.StrParser sp = new QueryParsing.StrParser(fl);

    for (;;) {
      sp.opt(',');
      sp.eatws();
      if (sp.pos >= sp.end) break;

      int start = sp.pos;

      // short circuit test for a really simple field name
      String key = null;
      String field = getFieldName(sp);
      char ch = sp.ch();

      if (field != null) {
        if (sp.opt(':')) {
          // this was a key, not a field name
          key = field;
          field = null;
          sp.eatws();
          start = sp.pos;
        } else {
          if (ch == ' ' || ch == ',' || ch == 0) {
            addField(field, key, augmenters, req);
            continue;
          }
          // an invalid field name... reset the position pointer to retry
          sp.pos = start;
          field = null;
        }
      }

      if (key != null) {
        // we read "key : "
        field = sp.getId(null);
        ch = sp.ch();
        if (field != null && (ch == ' ' || ch == ',' || ch == 0)) {
          rename.add(field, key);
          addField(field, key, augmenters, req);
          continue;
        }
        // an invalid field name... reset the position pointer to retry
        sp.pos = start;
        field = null;
      }

      if (field == null) {
        // We didn't find a simple name, so let's see if it's a globbed field name.
        // Globbing only works with field names of the recommended form (roughly like java identifiers)
        field = sp.getGlobbedId(null);
        ch = sp.ch();
        if (field != null && (ch == ' ' || ch == ',' || ch == 0)) {
          // "*" looks and acts like a glob, but we give it special treatment
          if ("*".equals(field)) {
            _wantsAllFields = true;
          } else {
            globs.add(field);
          }
          continue;
        }
        // an invalid glob
        sp.pos = start;
      }

      String funcStr = sp.val.substring(start);

      // Is it an augmenter of the form [augmenter_name foo=1 bar=myfield]?
      // This is identical to localParams syntax except it uses [] instead of {!}
      if (funcStr.startsWith("[")) {
        Map<String,String> augmenterArgs = new HashMap<String,String>();
        int end = QueryParsing.parseLocalParams(funcStr, 0, augmenterArgs, req.getParams(), "[", ']');
        sp.pos += end;

        // [foo] is short for [type=foo] in localParams syntax
        String augmenterName = augmenterArgs.remove("type");
        String disp = key;
        if (disp == null) {
          disp = '[' + augmenterName + ']';
        }

        TransformerFactory factory = req.getCore().getTransformerFactory(augmenterName);
        if (factory != null) {
          MapSolrParams augmenterParams = new MapSolrParams(augmenterArgs);
          augmenters.addTransformer(factory.create(disp, augmenterParams, req));
        } else {
          // unknown transformer?
        }
        addField(field, disp, augmenters, req);
        continue;
      }

      // let's try it as a function instead
      QParser parser = QParser.getParser(funcStr, FunctionQParserPlugin.NAME, req);
      Query q = null;
      ValueSource vs = null;

      try {
        if (parser instanceof FunctionQParser) {
          FunctionQParser fparser = (FunctionQParser) parser;
          fparser.setParseMultipleSources(false);
          fparser.setParseToEnd(false);

          q = fparser.getQuery();

          if (fparser.localParams != null) {
            if (fparser.valFollowedParams) {
              // need to find the end of the function query via the string parser
              int leftOver = fparser.sp.end - fparser.sp.pos;
              sp.pos = sp.end - leftOver; // reset our parser to the same amount of leftover
            } else {
              // the value was via the "v" param in localParams, so we need to find
              // the end of the local params themselves to pick up where we left off
              sp.pos = start + fparser.localParamsEnd;
            }
          } else {
            // need to find the end of the function query via the string parser
            int leftOver = fparser.sp.end - fparser.sp.pos;
            sp.pos = sp.end - leftOver; // reset our parser to the same amount of leftover
          }
        } else {
          // A QParser that's not for function queries.
          // It must have been specified via local params.
          q = parser.getQuery();
          assert parser.getLocalParams() != null;
          sp.pos = start + parser.localParamsEnd;
        }

        if (q instanceof FunctionQuery) {
          vs = ((FunctionQuery) q).getValueSource();
        } else {
          vs = new QueryValueSource(q, 0.0f);
        }

        if (key == null) {
          SolrParams localParams = parser.getLocalParams();
          if (localParams != null) {
            key = localParams.get("key");
          }
          if (key == null) {
            // use the function name itself as the field name
            key = sp.val.substring(start, sp.pos);
          }
        }
        if (key == null) {
          key = funcStr;
        }
        okFieldNames.add(key);
        okFieldNames.add(funcStr);
        augmenters.addTransformer(new ValueSourceAugmenter(key, parser, vs));
      } catch (ParseException e) {
        // try again, simple rules for a field name with no whitespace
        sp.pos = start;
        field = sp.getSimpleString();

        if (req.getSchema().getFieldOrNull(field) != null) {
          // OK, it was an oddly named field
          fields.add(field);
          if (key != null) {
            rename.add(field, key);
          }
        } else {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Error parsing fieldname: " + e.getMessage(), e);
        }
      } // end try as function
    } // end for(;;)
  } catch (ParseException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname", e);
  }
}
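The catch (ParseException e) block above is a recovery path, not a rethrow: the parser position is reset and the token is retried as a plain field name, and only if that also fails is a BAD_REQUEST raised. A compressed sketch of that retry shape, with parseAsFunction and isKnownField as hypothetical stand-ins:

import org.apache.solr.common.SolrException;

// Sketch of the retry-on-parse-failure pattern in ReturnFields.add():
// first try the input as a function query; if that fails, fall back to
// treating it as a plain field name, and only then report a 400.
final class FlRetrySketch {
  static void addEntry(String fl) {
    try {
      parseAsFunction(fl);                 // hypothetical stand-in
    } catch (Exception parseFailure) {
      if (!isKnownField(fl)) {             // hypothetical stand-in
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Error parsing fieldname: " + fl, parseFailure);
      }
      // otherwise: it was just an oddly named field; accept it
    }
  }
  private static void parseAsFunction(String s) throws Exception { /* stand-in */ }
  private static boolean isKnownField(String s) { return true; /* stand-in */ }
}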
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override
public Weight createWeight(IndexSearcher searcher) {
  try {
    return new SolrConstantScoreQuery.ConstantWeight(searcher);
  } catch (IOException e) {
    // TODO: remove this if ConstantScoreQuery.createWeight adds IOException
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
  }
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static Sort parseSort(String sortSpec, SolrQueryRequest req) {
  if (sortSpec == null || sortSpec.length() == 0) return null;

  List<SortField> lst = new ArrayList<SortField>(4);

  try {
    StrParser sp = new StrParser(sortSpec);
    while (sp.pos < sp.end) {
      sp.eatws();

      final int start = sp.pos;

      // short circuit test for a really simple field name
      String field = sp.getId(null);
      Exception qParserException = null;

      if (field == null || !Character.isWhitespace(sp.peekChar())) {
        // let's try it as a function instead
        field = null;
        String funcStr = sp.val.substring(start);

        QParser parser = QParser.getParser(funcStr, FunctionQParserPlugin.NAME, req);
        Query q = null;
        try {
          if (parser instanceof FunctionQParser) {
            FunctionQParser fparser = (FunctionQParser) parser;
            fparser.setParseMultipleSources(false);
            fparser.setParseToEnd(false);

            q = fparser.getQuery();

            if (fparser.localParams != null) {
              if (fparser.valFollowedParams) {
                // need to find the end of the function query via the string parser
                int leftOver = fparser.sp.end - fparser.sp.pos;
                sp.pos = sp.end - leftOver; // reset our parser to the same amount of leftover
              } else {
                // the value was via the "v" param in localParams, so we need to find
                // the end of the local params themselves to pick up where we left off
                sp.pos = start + fparser.localParamsEnd;
              }
            } else {
              // need to find the end of the function query via the string parser
              int leftOver = fparser.sp.end - fparser.sp.pos;
              sp.pos = sp.end - leftOver; // reset our parser to the same amount of leftover
            }
          } else {
            // A QParser that's not for function queries.
            // It must have been specified via local params.
            q = parser.getQuery();
            assert parser.getLocalParams() != null;
            sp.pos = start + parser.localParamsEnd;
          }

          Boolean top = sp.getSortDirection();
          if (null != top) {
            // we have a Query and a valid direction
            if (q instanceof FunctionQuery) {
              lst.add(((FunctionQuery) q).getValueSource().getSortField(top));
            } else {
              lst.add((new QueryValueSource(q, 0.0f)).getSortField(top));
            }
            continue;
          }
        } catch (IOException ioe) {
          throw ioe;
        } catch (Exception e) {
          // hang onto this in case the string isn't a full field name either
          qParserException = e;
        }
      }

      // if we made it here, we either have a "simple" field name,
      // or there was a problem parsing the string as a complex func/query
      if (field == null) {
        // try again, simple rules for a field name with no whitespace
        sp.pos = start;
        field = sp.getSimpleString();
      }
      Boolean top = sp.getSortDirection();
      if (null == top) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Can't determine a Sort Order (asc or desc) in sort spec " + sp);
      }

      if (SCORE.equals(field)) {
        if (top) {
          lst.add(SortField.FIELD_SCORE);
        } else {
          lst.add(new SortField(null, SortField.Type.SCORE, true));
        }
      } else if (DOCID.equals(field)) {
        lst.add(new SortField(null, SortField.Type.DOC, top));
      } else {
        // try to find the field
        SchemaField sf = req.getSchema().getFieldOrNull(field);
        if (null == sf) {
          if (null != qParserException) {
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                "sort param could not be parsed as a query, and is not a "
                + "field that exists in the index: " + field, qParserException);
          }
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "sort param field can't be found: " + field);
        }
        lst.add(sf.getSortField(top));
      }
    }
  } catch (ParseException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e);
  } catch (IOException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e);
  }

  // normalize a sort on score desc to null
  if (lst.size() == 1 && lst.get(0) == SortField.FIELD_SCORE) {
    return null;
  }

  return new Sort(lst.toArray(new SortField[lst.size()]));
}
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/SearchGroupShardResponseProcessor.java
public void process(ResponseBuilder rb, ShardRequest shardRequest) {
  SortSpec ss = rb.getSortSpec();
  Sort groupSort = rb.getGroupingSpec().getGroupSort();
  String[] fields = rb.getGroupingSpec().getFields();

  Map<String, List<Collection<SearchGroup<BytesRef>>>> commandSearchGroups = new HashMap<String, List<Collection<SearchGroup<BytesRef>>>>();
  Map<String, Map<SearchGroup<BytesRef>, Set<String>>> tempSearchGroupToShards = new HashMap<String, Map<SearchGroup<BytesRef>, Set<String>>>();
  for (String field : fields) {
    commandSearchGroups.put(field, new ArrayList<Collection<SearchGroup<BytesRef>>>(shardRequest.responses.size()));
    tempSearchGroupToShards.put(field, new HashMap<SearchGroup<BytesRef>, Set<String>>());
    if (!rb.searchGroupToShards.containsKey(field)) {
      rb.searchGroupToShards.put(field, new HashMap<SearchGroup<BytesRef>, Set<String>>());
    }
  }

  SearchGroupsResultTransformer serializer = new SearchGroupsResultTransformer(rb.req.getSearcher());
  try {
    int maxElapsedTime = 0;
    int hitCountDuringFirstPhase = 0;
    for (ShardResponse srsp : shardRequest.responses) {
      maxElapsedTime = (int) Math.max(maxElapsedTime, srsp.getSolrResponse().getElapsedTime());
      @SuppressWarnings("unchecked")
      NamedList<NamedList> firstPhaseResult = (NamedList<NamedList>) srsp.getSolrResponse().getResponse().get("firstPhase");
      Map<String, Pair<Integer, Collection<SearchGroup<BytesRef>>>> result = serializer.transformToNative(firstPhaseResult, groupSort, null, srsp.getShard());
      for (String field : commandSearchGroups.keySet()) {
        Pair<Integer, Collection<SearchGroup<BytesRef>>> firstPhaseCommandResult = result.get(field);
        Integer groupCount = firstPhaseCommandResult.getA();
        if (groupCount != null) {
          Integer existingGroupCount = rb.mergedGroupCounts.get(field);
          // Assuming groups don't cross shard boundary...
          rb.mergedGroupCounts.put(field, existingGroupCount != null ? existingGroupCount + groupCount : groupCount);
        }

        Collection<SearchGroup<BytesRef>> searchGroups = firstPhaseCommandResult.getB();
        if (searchGroups == null) {
          continue;
        }

        commandSearchGroups.get(field).add(searchGroups);
        for (SearchGroup<BytesRef> searchGroup : searchGroups) {
          Map<SearchGroup<BytesRef>, java.util.Set<String>> map = tempSearchGroupToShards.get(field);
          Set<String> shards = map.get(searchGroup);
          if (shards == null) {
            shards = new HashSet<String>();
            map.put(searchGroup, shards);
          }
          shards.add(srsp.getShard());
        }
      }
      hitCountDuringFirstPhase += (Integer) srsp.getSolrResponse().getResponse().get("totalHitCount");
    }
    rb.totalHitCount = hitCountDuringFirstPhase;
    rb.firstPhaseElapsedTime = maxElapsedTime;
    for (String groupField : commandSearchGroups.keySet()) {
      List<Collection<SearchGroup<BytesRef>>> topGroups = commandSearchGroups.get(groupField);
      Collection<SearchGroup<BytesRef>> mergedTopGroups = SearchGroup.merge(topGroups, ss.getOffset(), ss.getCount(), groupSort);
      if (mergedTopGroups == null) {
        continue;
      }

      rb.mergedSearchGroups.put(groupField, mergedTopGroups);
      for (SearchGroup<BytesRef> mergedTopGroup : mergedTopGroups) {
        rb.searchGroupToShards.get(groupField).put(mergedTopGroup, tempSearchGroupToShards.get(groupField).get(mergedTopGroup));
      }
    }
  } catch (IOException e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
  }
}
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public Query parse() throws ParseException {
  String fromField = getParam("from");
  String fromIndex = getParam("fromIndex");
  String toField = getParam("to");
  String v = localParams.get("v");
  Query fromQuery;
  long fromCoreOpenTime = 0;

  if (fromIndex != null && !fromIndex.equals(req.getCore().getCoreDescriptor().getName())) {
    CoreContainer container = req.getCore().getCoreDescriptor().getCoreContainer();

    final SolrCore fromCore = container.getCore(fromIndex);
    RefCounted<SolrIndexSearcher> fromHolder = null;

    if (fromCore == null) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Cross-core join: no such core " + fromIndex);
    }

    LocalSolrQueryRequest otherReq = new LocalSolrQueryRequest(fromCore, params);
    try {
      QParser parser = QParser.getParser(v, "lucene", otherReq);
      fromQuery = parser.getQuery();
      fromHolder = fromCore.getRegisteredSearcher();
      if (fromHolder != null) fromCoreOpenTime = fromHolder.get().getOpenTime();
    } finally {
      otherReq.close();
      fromCore.close();
      if (fromHolder != null) fromHolder.decref();
    }
  } else {
    QParser fromQueryParser = subQuery(v, null);
    fromQuery = fromQueryParser.getQuery();
  }

  JoinQuery jq = new JoinQuery(fromField, toField, fromIndex, fromQuery);
  jq.fromCoreOpenTime = fromCoreOpenTime;
  return jq;
}
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
private void checkNullField(String field) throws SolrException {
  if (field == null && defaultField == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "no field name specified in query and no defaultSearchField defined in schema.xml");
  }
}
// in core/src/java/org/apache/solr/search/DocSetBase.java
public void add(int doc) {
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unsupported Operation");
}
// in core/src/java/org/apache/solr/search/DocSetBase.java
public void addUnique(int doc) {
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unsupported Operation");
}
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override
public ValueSource parse(FunctionQParser fp) throws ParseException {
  double radius = fp.parseDouble();
  // SOLR-2114, make the convert flag required, since the parser doesn't support much
  // in the way of lookahead or the ability to convert a String into a ValueSource
  boolean convert = Boolean.parseBoolean(fp.parseArg());

  MultiValueSource pv1;
  MultiValueSource pv2;

  ValueSource one = fp.parseValueSource();
  ValueSource two = fp.parseValueSource();
  if (fp.hasMoreArguments()) {
    List<ValueSource> s1 = new ArrayList<ValueSource>();
    s1.add(one);
    s1.add(two);
    pv1 = new VectorValueSource(s1);
    ValueSource x2 = fp.parseValueSource();
    ValueSource y2 = fp.parseValueSource();
    List<ValueSource> s2 = new ArrayList<ValueSource>();
    s2.add(x2);
    s2.add(y2);
    pv2 = new VectorValueSource(s2);
  } else {
    // check to see if we have multiValue source
    if (one instanceof MultiValueSource && two instanceof MultiValueSource) {
      pv1 = (MultiValueSource) one;
      pv2 = (MultiValueSource) two;
    } else {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Input must either be 2 MultiValueSources, or there must be 4 ValueSources");
    }
  }

  return new HaversineFunction(pv1, pv2, radius, convert);
}
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
private static MVResult getMultiValueSources(List<ValueSource> sources) {
  MVResult mvr = new MVResult();
  if (sources.size() % 2 != 0) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Illegal number of sources. There must be an even number of sources");
  }
  if (sources.size() == 2) {
    // check to see if these are MultiValueSource
    boolean s1MV = sources.get(0) instanceof MultiValueSource;
    boolean s2MV = sources.get(1) instanceof MultiValueSource;
    if (s1MV && s2MV) {
      mvr.mv1 = (MultiValueSource) sources.get(0);
      mvr.mv2 = (MultiValueSource) sources.get(1);
    } else if (s1MV || s2MV) {
      // if one is a MultiValueSource, then the other one needs to be too.
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Illegal number of sources. There must be an even number of sources");
    } else {
      mvr.mv1 = new VectorValueSource(Collections.singletonList(sources.get(0)));
      mvr.mv2 = new VectorValueSource(Collections.singletonList(sources.get(1)));
    }
  } else {
    int dim = sources.size() / 2;
    List<ValueSource> sources1 = new ArrayList<ValueSource>(dim);
    List<ValueSource> sources2 = new ArrayList<ValueSource>(dim);
    // Get dim value sources for the first vector
    splitSources(dim, sources, sources1, sources2);
    mvr.mv1 = new VectorValueSource(sources1);
    mvr.mv2 = new VectorValueSource(sources2);
  }
  return mvr;
}
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
public ValueSource getValueSource(FunctionQParser fp, String arg) {
  if (arg == null) return null;
  SchemaField f = fp.req.getSchema().getField(arg);
  if (f.getType().getClass() == DateField.class) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Can't use ms() function on non-numeric legacy date field " + arg);
  }
  return f.getType().getValueSource(f, fp);
}
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override
public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException {
  if (context.get(this) == null) {
    SolrRequestInfo requestInfo = SolrRequestInfo.getRequestInfo();
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "testfunc: unweighted value source detected. delegate=" + source
        + " request=" + (requestInfo == null ? "null" : requestInfo.getReq()));
  }
  return source.getValues(context, readerContext);
}
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected Formatter getFormatter(String fieldName, SolrParams params) {
  String str = params.getFieldParam(fieldName, HighlightParams.FORMATTER);
  SolrFormatter formatter = formatters.get(str);
  if (formatter == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown formatter: " + str);
  }
  return formatter.getFormatter(fieldName, params);
}
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected Encoder getEncoder(String fieldName, SolrParams params) {
  String str = params.getFieldParam(fieldName, HighlightParams.ENCODER);
  SolrEncoder encoder = encoders.get(str);
  if (encoder == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown encoder: " + str);
  }
  return encoder.getEncoder(fieldName, params);
}
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected Fragmenter getFragmenter(String fieldName, SolrParams params) {
  String fmt = params.getFieldParam(fieldName, HighlightParams.FRAGMENTER);
  SolrFragmenter frag = fragmenters.get(fmt);
  if (frag == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown fragmenter: " + fmt);
  }
  return frag.getFragmenter(fieldName, params);
}
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected FragListBuilder getFragListBuilder(String fieldName, SolrParams params) {
  String flb = params.getFieldParam(fieldName, HighlightParams.FRAG_LIST_BUILDER);
  SolrFragListBuilder solrFlb = fragListBuilders.get(flb);
  if (solrFlb == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown fragListBuilder: " + flb);
  }
  return solrFlb.getFragListBuilder(params);
}
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private SolrFragmentsBuilder getSolrFragmentsBuilder(String fieldName, SolrParams params) {
  String fb = params.getFieldParam(fieldName, HighlightParams.FRAGMENTS_BUILDER);
  SolrFragmentsBuilder solrFb = fragmentsBuilders.get(fb);
  if (solrFb == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown fragmentsBuilder: " + fb);
  }
  return solrFb;
}
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private BoundaryScanner getBoundaryScanner(String fieldName, SolrParams params) {
  String bs = params.getFieldParam(fieldName, HighlightParams.BOUNDARY_SCANNER);
  SolrBoundaryScanner solrBs = boundaryScanners.get(bs);
  if (solrBs == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown boundaryScanner: " + bs);
  }
  return solrBs.getBoundaryScanner(fieldName, params);
}
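The six getters above all share one shape: look a plugin up by name and throw BAD_REQUEST on a miss. A generic helper capturing the idiom (hypothetical, not present in Solr):

import java.util.Map;
import org.apache.solr.common.SolrException;

// Sketch: the lookup-or-throw shape shared by DefaultSolrHighlighter's
// formatter/encoder/fragmenter/fragListBuilder/fragmentsBuilder/
// boundaryScanner getters, factored into one generic method.
final class PluginLookupSketch {
  static <T> T lookupOrThrow(Map<String, T> registry, String name, String kind) {
    T plugin = registry.get(name);
    if (plugin == null) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Unknown " + kind + ": " + name);
    }
    return plugin;
  }
}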
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private void doHighlightingByHighlighter(Query query, SolrQueryRequest req, NamedList docSummaries,
    int docId, Document doc, String fieldName) throws IOException {
  final SolrIndexSearcher searcher = req.getSearcher();
  final IndexSchema schema = searcher.getSchema();

  // TODO: Currently in trunk highlighting numeric fields is broken (Lucene) -
  // so we disable them until fixed (see LUCENE-3080)!
  // BEGIN: Hack
  final SchemaField schemaField = schema.getFieldOrNull(fieldName);
  if (schemaField != null && (
      (schemaField.getType() instanceof org.apache.solr.schema.TrieField) ||
      (schemaField.getType() instanceof org.apache.solr.schema.TrieDateField))) return;
  // END: Hack

  SolrParams params = req.getParams();
  IndexableField[] docFields = doc.getFields(fieldName);
  List<String> listFields = new ArrayList<String>();
  for (IndexableField field : docFields) {
    listFields.add(field.stringValue());
  }

  String[] docTexts = (String[]) listFields.toArray(new String[listFields.size()]);

  // according to Document javadoc, doc.getValues() never returns null. check empty instead of null
  if (docTexts.length == 0) return;

  TokenStream tstream = null;
  int numFragments = getMaxSnippets(fieldName, params);
  boolean mergeContiguousFragments = isMergeContiguousFragments(fieldName, params);

  String[] summaries = null;
  List<TextFragment> frags = new ArrayList<TextFragment>();

  TermOffsetsTokenStream tots = null; // to be non-null iff we're using TermOffsets optimization
  try {
    TokenStream tvStream = TokenSources.getTokenStream(searcher.getIndexReader(), docId, fieldName);
    if (tvStream != null) {
      tots = new TermOffsetsTokenStream(tvStream);
    }
  } catch (IllegalArgumentException e) {
    // No problem. But we can't use TermOffsets optimization.
  }

  for (int j = 0; j < docTexts.length; j++) {
    if (tots != null) {
      // if we're using TermOffsets optimization, then get the next
      // field value's TokenStream (i.e. get field j's TokenStream) from tots:
      tstream = tots.getMultiValuedTokenStream(docTexts[j].length());
    } else {
      // fall back to analyzer
      tstream = createAnalyzerTStream(schema, fieldName, docTexts[j]);
    }

    int maxCharsToAnalyze = params.getFieldInt(fieldName, HighlightParams.MAX_CHARS,
        Highlighter.DEFAULT_MAX_CHARS_TO_ANALYZE);

    Highlighter highlighter;
    if (Boolean.valueOf(req.getParams().get(HighlightParams.USE_PHRASE_HIGHLIGHTER, "true"))) {
      if (maxCharsToAnalyze < 0) {
        tstream = new CachingTokenFilter(tstream);
      } else {
        tstream = new CachingTokenFilter(new OffsetLimitTokenFilter(tstream, maxCharsToAnalyze));
      }

      // get highlighter
      highlighter = getPhraseHighlighter(query, fieldName, req, (CachingTokenFilter) tstream);

      // after highlighter initialization, reset tstream since construction of highlighter already used it
      tstream.reset();
    } else {
      // use "the old way"
      highlighter = getHighlighter(query, fieldName, req);
    }

    if (maxCharsToAnalyze < 0) {
      highlighter.setMaxDocCharsToAnalyze(docTexts[j].length());
    } else {
      highlighter.setMaxDocCharsToAnalyze(maxCharsToAnalyze);
    }

    try {
      TextFragment[] bestTextFragments = highlighter.getBestTextFragments(tstream, docTexts[j],
          mergeContiguousFragments, numFragments);
      for (int k = 0; k < bestTextFragments.length; k++) {
        if ((bestTextFragments[k] != null) && (bestTextFragments[k].getScore() > 0)) {
          frags.add(bestTextFragments[k]);
        }
      }
    } catch (InvalidTokenOffsetsException e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    }
  }

  // sort such that the fragments with the highest score come first
  Collections.sort(frags, new Comparator<TextFragment>() {
    public int compare(TextFragment arg0, TextFragment arg1) {
      return Math.round(arg1.getScore() - arg0.getScore());
    }
  });

  // convert fragments back into text
  // TODO: we can include score and position information in output as snippet attributes
  if (frags.size() > 0) {
    ArrayList<String> fragTexts = new ArrayList<String>();
    for (TextFragment fragment : frags) {
      if ((fragment != null) && (fragment.getScore() > 0)) {
        fragTexts.add(fragment.toString());
      }
      if (fragTexts.size() >= numFragments) break;
    }
    summaries = fragTexts.toArray(new String[0]);
    if (summaries.length > 0) docSummaries.add(fieldName, summaries);
  }

  // no summaries made, copy text from alternate field
  if (summaries == null || summaries.length == 0) {
    alternateField(docSummaries, params, doc, fieldName);
  }
}
// in core/src/java/org/apache/solr/highlight/BreakIteratorBoundaryScanner.java
@Override
protected BoundaryScanner get(String fieldName, SolrParams params) {
  // construct Locale
  String language = params.getFieldParam(fieldName, HighlightParams.BS_LANGUAGE);
  String country = params.getFieldParam(fieldName, HighlightParams.BS_COUNTRY);
  if (country != null && language == null) {
    throw new SolrException(ErrorCode.BAD_REQUEST,
        HighlightParams.BS_LANGUAGE + " parameter cannot be null when you specify " + HighlightParams.BS_COUNTRY);
  }
  Locale locale = null;
  if (language != null) {
    locale = country == null ? new Locale(language) : new Locale(language, country);
  }

  // construct BreakIterator
  String type = params.getFieldParam(fieldName, HighlightParams.BS_TYPE, "WORD").toLowerCase();
  BreakIterator bi = null;
  if (type.equals("character")) {
    bi = locale == null ? BreakIterator.getCharacterInstance() : BreakIterator.getCharacterInstance(locale);
  } else if (type.equals("word")) {
    bi = locale == null ? BreakIterator.getWordInstance() : BreakIterator.getWordInstance(locale);
  } else if (type.equals("line")) {
    bi = locale == null ? BreakIterator.getLineInstance() : BreakIterator.getLineInstance(locale);
  } else if (type.equals("sentence")) {
    bi = locale == null ? BreakIterator.getSentenceInstance() : BreakIterator.getSentenceInstance(locale);
  } else {
    throw new SolrException(ErrorCode.BAD_REQUEST, type + " is invalid for parameter " + HighlightParams.BS_TYPE);
  }

  return new org.apache.lucene.search.vectorhighlight.BreakIteratorBoundaryScanner(bi);
}
// in core/src/java/org/apache/solr/highlight/SolrFragmentsBuilder.java
protected char getMultiValuedSeparatorChar(SolrParams params) {
  String separator = params.get(HighlightParams.MULTI_VALUED_SEPARATOR, " ");
  if (separator.length() > 1) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        HighlightParams.MULTI_VALUED_SEPARATOR + " parameter must be a char, but is \"" + separator + "\"");
  }
  return separator.charAt(0);
}
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override
void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException {
  if (cc != null) {
    String coreName = leaderProps.get(ZkStateReader.CORE_NAME_PROP);
    SolrCore core = null;
    try {
      // the first time we are run, we will get a startupCore - after
      // we will get null and must use cc.getCore
      core = cc.getCore(coreName);

      if (core == null) {
        cancelElection();
        throw new SolrException(ErrorCode.SERVER_ERROR,
            "Fatal Error, SolrCore not found:" + coreName + " in " + cc.getCoreNames());
      }

      // should I be leader?
      if (weAreReplacement && !shouldIBeLeader(leaderProps)) {
        // System.out.println("there is a better leader candidate it appears");
        rejoinLeaderElection(leaderSeqPath, core);
        return;
      }

      if (weAreReplacement) {
        if (zkClient.exists(leaderPath, true)) {
          zkClient.delete(leaderPath, -1, true);
        }
        // System.out.println("I may be the new Leader:" + leaderPath
        // + " - I need to try and sync");
        boolean success = syncStrategy.sync(zkController, core, leaderProps);
        if (!success && anyoneElseActive()) {
          rejoinLeaderElection(leaderSeqPath, core);
          return;
        }
      }

      // If I am going to be the leader I have to be active
      // System.out.println("I am leader go active");
      core.getUpdateHandler().getSolrCoreState().cancelRecovery();
      zkController.publish(core.getCoreDescriptor(), ZkStateReader.ACTIVE);
    } finally {
      if (core != null) {
        core.close();
      }
    }
  }

  super.runLeaderProcess(weAreReplacement);
}
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public void parseConfig() {
  if (zkProps == null) {
    zkProps = new SolrZkServerProps();
    // set default data dir
    // TODO: use something based on IP+port??? support ensemble all from same solr home?
    zkProps.setDataDir(dataHome);
    zkProps.zkRun = zkRun;
    zkProps.solrPort = solrPort;
  }

  try {
    props = SolrZkServerProps.getProperties(confHome + '/' + "zoo.cfg");
    SolrZkServerProps.injectServers(props, zkRun, zkHost);
    zkProps.parseProperties(props);
    if (zkProps.getClientPortAddress() == null) {
      zkProps.setClientPort(Integer.parseInt(solrPort) + 1000);
    }
  } catch (QuorumPeerConfig.ConfigException e) {
    if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
  } catch (IOException e) {
    if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
  }
}
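parseConfig makes the severity of a config failure depend on the run mode: a broken or missing zoo.cfg is fatal only when this node is meant to run embedded ZooKeeper (zkRun != null), and is silently ignored otherwise. A sketch of that conditional-rethrow shape, with readConfig as a hypothetical stand-in:

import java.io.IOException;
import org.apache.solr.common.SolrException;

// Sketch of the mode-dependent severity in SolrZkServer.parseConfig().
final class ConditionalRethrowSketch {
  static void loadZkConfig(String zkRun, String path) {
    try {
      readConfig(path); // hypothetical stand-in for the zoo.cfg parse
    } catch (IOException e) {
      if (zkRun != null) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
      }
      // not running embedded ZK: the config is irrelevant, swallow
    }
  }
  private static void readConfig(String path) throws IOException { /* stand-in */ }
}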
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
@Override
public void run() {
  try {
    if (zkProps.getServers().size() > 1) {
      QuorumPeerMain zkServer = new QuorumPeerMain();
      zkServer.runFromConfig(zkProps);
    } else {
      ServerConfig sc = new ServerConfig();
      sc.readFrom(zkProps);
      ZooKeeperServerMain zkServer = new ZooKeeperServerMain();
      zkServer.runFromConfig(sc);
    }
    log.info("ZooKeeper Server exited.");
  } catch (Throwable e) {
    log.error("ZooKeeper Server ERROR", e);
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
  }
}
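run() is a thread entry point, so it catches Throwable, logs, and wraps: nothing, including Errors, may escape silently. A sketch of the same barrier around an arbitrary server body:

import org.apache.slf4j.Logger;
import org.apache.slf4j.LoggerFactory;
import org.apache.solr.common.SolrException;

// Sketch of the catch-Throwable barrier in SolrZkServer.run(): log
// everything before wrapping, so the failure reaches both the log and
// the thread's uncaught-exception handler.
final class ThreadBarrierSketch {
  private static final Logger log = LoggerFactory.getLogger(ThreadBarrierSketch.class);

  static void runServer(Runnable serverBody) {
    try {
      serverBody.run();
    } catch (Throwable e) {
      log.error("ZooKeeper Server ERROR", e);
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    }
  }
}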
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void replicate(String nodeName, SolrCore core, ZkNodeProps leaderprops, String baseUrl)
    throws SolrServerException, IOException {
  String leaderBaseUrl = leaderprops.get(ZkStateReader.BASE_URL_PROP);
  ZkCoreNodeProps leaderCNodeProps = new ZkCoreNodeProps(leaderprops);
  String leaderUrl = leaderCNodeProps.getCoreUrl();

  log.info("Attempting to replicate from " + leaderUrl);

  // if we are the leader, either we are trying to recover faster
  // than our ephemeral timed out or we are the only node
  if (!leaderBaseUrl.equals(baseUrl)) {
    // send commit
    commitOnLeader(leaderUrl);

    // use rep handler directly, so we can do this sync rather than async
    SolrRequestHandler handler = core.getRequestHandler(REPLICATION_HANDLER);
    if (handler instanceof LazyRequestHandlerWrapper) {
      handler = ((LazyRequestHandlerWrapper) handler).getWrappedHandler();
    }
    ReplicationHandler replicationHandler = (ReplicationHandler) handler;

    if (replicationHandler == null) {
      throw new SolrException(ErrorCode.SERVICE_UNAVAILABLE,
          "Skipping recovery, no " + REPLICATION_HANDLER + " handler found");
    }

    ModifiableSolrParams solrParams = new ModifiableSolrParams();
    solrParams.set(ReplicationHandler.MASTER_URL, leaderUrl + "replication");

    if (isClosed()) retries = INTERRUPTED;
    // TODO: look into making sure force=true does not download files we already have
    boolean success = replicationHandler.doFetch(solrParams, true);

    if (!success) {
      throw new SolrException(ErrorCode.SERVER_ERROR, "Replication for recovery failed.");
    }

    // solrcloud_debug
    // try {
    //   RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false);
    //   SolrIndexSearcher searcher = searchHolder.get();
    //   try {
    //     System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replicated "
    //         + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " from " + leaderUrl + " gen:" + core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + core.getDataDir());
    //   } finally {
    //     searchHolder.decref();
    //   }
    // } catch (Exception e) {
    //
    // }
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void init() {
  try {
    // makes nodes zkNode
    cmdExecutor.ensureExists(ZkStateReader.LIVE_NODES_ZKNODE, zkClient);

    Overseer.createClientNodes(zkClient, getNodeName());
    createEphemeralLiveNode();
    cmdExecutor.ensureExists(ZkStateReader.COLLECTIONS_ZKNODE, zkClient);

    syncNodeState();

    overseerElector = new LeaderElector(zkClient);
    ElectionContext context = new OverseerElectionContext(getNodeName(), zkClient, zkStateReader);
    overseerElector.setup(context);
    overseerElector.joinElection(context);
    zkStateReader.createClusterStateWatchersAndUpdate();
  } catch (IOException e) {
    log.error("", e);
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e);
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
    log.error("", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (KeeperException e) {
    log.error("", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
}
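init() shows the InterruptedException convention used throughout ZkController: restore the thread's interrupted status before converting the checked exception into an unchecked ZooKeeperException. A minimal sketch of that handler:

import org.apache.solr.common.SolrException;
import org.apache.solr.common.cloud.ZooKeeperException;

// Sketch: re-assert the interrupt flag before wrapping, so callers that
// poll Thread.interrupted() still see the interruption.
final class InterruptHandlingSketch {
  static void waitForZk(Object monitor) {
    try {
      synchronized (monitor) {
        monitor.wait(1000);
      }
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt(); // restore the flag first
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
}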
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String register(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores) throws Exception {
  final String baseUrl = getBaseUrl();

  final CloudDescriptor cloudDesc = desc.getCloudDescriptor();
  final String collection = cloudDesc.getCollectionName();

  final String coreZkNodeName = getNodeName() + "_" + coreName;

  String shardId = cloudDesc.getShardId();

  Map<String,String> props = new HashMap<String,String>();
  // we only put a subset of props into the leader node
  props.put(ZkStateReader.BASE_URL_PROP, baseUrl);
  props.put(ZkStateReader.CORE_NAME_PROP, coreName);
  props.put(ZkStateReader.NODE_NAME_PROP, getNodeName());

  if (log.isInfoEnabled()) {
    log.info("Register shard - core:" + coreName + " address:" + baseUrl + " shardId:" + shardId);
  }

  ZkNodeProps leaderProps = new ZkNodeProps(props);

  try {
    joinElection(desc);
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (KeeperException e) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (IOException e) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }

  // rather than look in the cluster state file, we go straight to the zknodes
  // here, because on cluster restart there could be stale leader info in the
  // cluster state node that won't be updated for a moment
  String leaderUrl = getLeaderProps(collection, cloudDesc.getShardId()).getCoreUrl();

  // now wait until our current cloud state contains the latest leader
  String cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000);
  int tries = 0;
  while (!leaderUrl.equals(cloudStateLeader)) {
    if (tries == 60) {
      throw new SolrException(ErrorCode.SERVER_ERROR,
          "There is conflicting information about the leader of shard: " + cloudDesc.getShardId());
    }
    Thread.sleep(1000);
    tries++;
    cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000);
  }

  String ourUrl = ZkCoreNodeProps.getCoreUrl(baseUrl, coreName);
  log.info("We are " + ourUrl + " and leader is " + leaderUrl);
  boolean isLeader = leaderUrl.equals(ourUrl);

  SolrCore core = null;
  if (cc != null) { // CoreContainer only null in tests
    try {
      core = cc.getCore(desc.getName());

      // recover from local transaction log and wait for it to complete before
      // going active
      // TODO: should this be moved to another thread? To recoveryStrat?
      // TODO: should this actually be done earlier, before (or as part of)
      // leader election perhaps?
      // TODO: if I'm the leader, ensure that a replica that is trying to recover waits until I'm
      // active (or don't make me the
      // leader until my local replay is done.

      UpdateLog ulog = core.getUpdateHandler().getUpdateLog();
      if (!core.isReloaded() && ulog != null) {
        Future<UpdateLog.RecoveryInfo> recoveryFuture = core.getUpdateHandler().getUpdateLog().recoverFromLog();
        if (recoveryFuture != null) {
          recoveryFuture.get(); // NOTE: this could potentially block for
          // minutes or more!
          // TODO: public as recovering in the mean time?
          // TODO: in the future we could do peersync in parallel with recoverFromLog
        } else {
          log.info("No LogReplay needed for core=" + core.getName() + " baseURL=" + baseUrl);
        }
      }

      boolean didRecovery = checkRecovery(coreName, desc, recoverReloadedCores, isLeader, cloudDesc,
          collection, coreZkNodeName, shardId, leaderProps, core, cc);
      if (!didRecovery) {
        publishAsActive(baseUrl, desc, coreZkNodeName, coreName);
      }
    } finally {
      if (core != null) {
        core.close();
      }
    }
  } else {
    publishAsActive(baseUrl, desc, coreZkNodeName, coreName);
  }

  // make sure we have an updated cluster state right away
  zkStateReader.updateCloudState(true);

  return shardId;
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void publishState(CoreDescriptor cd, String shardZkNodeName, String coreName, Map<String,String> props) { CloudDescriptor cloudDesc = cd.getCloudDescriptor(); if (cloudDesc.getRoles() != null) { props.put(ZkStateReader.ROLES_PROP, cloudDesc.getRoles()); } if (cloudDesc.getShardId() == null && needsToBeAssignedShardId(cd, zkStateReader.getCloudState(), shardZkNodeName)) { // publish with no shard id so we are assigned one, and then look for it doPublish(shardZkNodeName, coreName, props, cloudDesc); String shardId; try { shardId = doGetShardIdProcess(coreName, cloudDesc); } catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Interrupted"); } cloudDesc.setShardId(shardId); } if (!props.containsKey(ZkStateReader.SHARD_ID_PROP) && cloudDesc.getShardId() != null) { props.put(ZkStateReader.SHARD_ID_PROP, cloudDesc.getShardId()); } doPublish(shardZkNodeName, coreName, props, cloudDesc); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private String doGetShardIdProcess(String coreName, CloudDescriptor descriptor) throws InterruptedException { final String shardZkNodeName = getNodeName() + "_" + coreName; int retryCount = 120; while (retryCount-- > 0) { final String shardId = zkStateReader.getCloudState().getShardId( shardZkNodeName); if (shardId != null) { return shardId; } try { Thread.sleep(500); } catch (InterruptedException e) { Thread.currentThread().interrupt(); } } throw new SolrException(ErrorCode.SERVER_ERROR, "Could not get shard_id for core: " + coreName); }
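doGetShardIdProcess() is the section's clearest example of bounded polling: retry a fixed number of times, sleep between attempts while preserving the interrupt flag, and throw only once the budget is exhausted. A generic sketch under those assumptions, using only JDK types (poll and its parameters are hypothetical):

import java.util.function.Supplier;

class PollingExample {
  /** Polls until source yields a value, or fails after the retry budget is spent. */
  static String poll(Supplier<String> source, int retries, long sleepMs) {
    while (retries-- > 0) {
      String value = source.get();
      if (value != null) return value;
      try {
        Thread.sleep(sleepMs);
      } catch (InterruptedException e) {
        // like the original: keep the interrupt flag set but keep polling
        Thread.currentThread().interrupt();
      }
    }
    throw new IllegalStateException("value did not appear within the retry budget");
  }
}

A call such as poll(() -> lookupShardId(), 120, 500) would mirror the 120 x 500 ms budget above (lookupShardId being a hypothetical supplier).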
// in core/src/java/org/apache/solr/cloud/ZkController.java
private ZkCoreNodeProps waitForLeaderToSeeDownState( CoreDescriptor descriptor, final String coreZkNodeName) { CloudDescriptor cloudDesc = descriptor.getCloudDescriptor(); String collection = cloudDesc.getCollectionName(); String shard = cloudDesc.getShardId(); ZkCoreNodeProps leaderProps = null; int retries = 6; for (int i = 0; i < retries; i++) { try { // go straight to zk, not the cloud state - we must have current info leaderProps = getLeaderProps(collection, shard); break; } catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } } } String leaderBaseUrl = leaderProps.getBaseUrl(); String leaderCoreName = leaderProps.getCoreName(); String ourUrl = ZkCoreNodeProps.getCoreUrl(getBaseUrl(), descriptor.getName()); boolean isLeader = leaderProps.getCoreUrl().equals(ourUrl); if (!isLeader && !SKIP_AUTO_RECOVERY) { HttpSolrServer server = null; server = new HttpSolrServer(leaderBaseUrl); server.setConnectionTimeout(45000); server.setSoTimeout(45000); WaitForState prepCmd = new WaitForState(); prepCmd.setCoreName(leaderCoreName); prepCmd.setNodeName(getNodeName()); prepCmd.setCoreNodeName(coreZkNodeName); prepCmd.setState(ZkStateReader.DOWN); prepCmd.setPauseFor(0); // let's retry a couple times - perhaps the leader just went down, // or perhaps he is just not quite ready for us yet retries = 6; for (int i = 0; i < retries; i++) { try { server.request(prepCmd); break; } catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } } } server.shutdown(); } return leaderProps; }
// in core/src/java/org/apache/solr/update/UpdateLog.java
private void ensureLog() { if (tlog == null) { String newLogName = String.format(Locale.ENGLISH, LOG_FILENAME_PATTERN, TLOG_NAME, id); try { tlog = new TransactionLog(new File(tlogDir, newLogName), globalStrings); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e); } } }
// in core/src/java/org/apache/solr/update/UpdateLog.java
private void update() { int numUpdates = 0; updateList = new ArrayList<List<Update>>(logList.size()); deleteByQueryList = new ArrayList<Update>(); deleteList = new ArrayList<DeleteUpdate>(); updates = new HashMap<Long,Update>(numRecordsToKeep); for (TransactionLog oldLog : logList) { List<Update> updatesForLog = new ArrayList<Update>(); TransactionLog.ReverseReader reader = null; try { reader = oldLog.getReverseReader(); while (numUpdates < numRecordsToKeep) { Object o = reader.next(); if (o==null) break; try { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; // TODO: refactor this out so we get common error handling int opAndFlags = (Integer)entry.get(0); if (latestOperation == 0) { latestOperation = opAndFlags; } int oper = opAndFlags & UpdateLog.OPERATION_MASK; long version = (Long) entry.get(1); switch (oper) { case UpdateLog.ADD: case UpdateLog.DELETE: case UpdateLog.DELETE_BY_QUERY: Update update = new Update(); update.log = oldLog; update.pointer = reader.position(); update.version = version; updatesForLog.add(update); updates.put(version, update); if (oper == UpdateLog.DELETE_BY_QUERY) { deleteByQueryList.add(update); } else if (oper == UpdateLog.DELETE) { deleteList.add(new DeleteUpdate(version, (byte[])entry.get(2))); } break; case UpdateLog.COMMIT: break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } } catch (ClassCastException cl) { log.warn("Unexpected log entry or corrupt log. Entry=" + o, cl); // would be caused by a corrupt transaction log } catch (Exception ex) { log.warn("Exception reverse reading log", ex); break; } } } catch (IOException e) { // failure to read a log record isn't fatal log.error("Exception reading versions from log",e); } finally { if (reader != null) reader.close(); } updateList.add(updatesForLog); } }
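Note the layered triage in update(): a ClassCastException on a single entry is logged as corruption and skipped, any other per-entry exception ends the scan of that log, and a failure reading the log as a whole is explicitly non-fatal ("failure to read a log record isn't fatal"). A compact, JDK-only sketch of that triage; the reader and record shape are hypothetical:

import java.util.Iterator;
import java.util.logging.Logger;

class LogScanExample {
  private static final Logger log = Logger.getLogger("scan");

  /** Scans entries, skipping corrupt ones; a scan failure is logged, not rethrown. */
  static int scan(Iterator<Object> reader) {
    int seen = 0;
    try {
      while (reader.hasNext()) {
        Object o = reader.next();
        try {
          String entry = (String) o; // stand-in for the expected record shape
          if (!entry.isEmpty()) seen++;
        } catch (ClassCastException cl) {
          log.warning("Unexpected log entry or corrupt log. Entry=" + o); // skip it
        } catch (RuntimeException ex) {
          log.warning("Exception reading log entry: " + ex);
          break; // give up on this log, but don't fail the caller
        }
      }
    } catch (RuntimeException e) {
      log.severe("Exception reading versions from log: " + e); // non-fatal by design
    }
    return seen;
  }
}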
// in core/src/java/org/apache/solr/update/UpdateLog.java
public void doReplay(TransactionLog translog) { try { loglog.warn("Starting log replay " + translog + " active="+activeLog + " starting pos=" + recoveryInfo.positionOfStart); tlogReader = translog.getReader(recoveryInfo.positionOfStart); // NOTE: we don't currently handle a core reload during recovery. This would cause the core // to change underneath us. // TODO: use the standard request factory? We won't get any custom configuration instantiating this way. RunUpdateProcessorFactory runFac = new RunUpdateProcessorFactory(); DistributedUpdateProcessorFactory magicFac = new DistributedUpdateProcessorFactory(); runFac.init(new NamedList()); magicFac.init(new NamedList()); UpdateRequestProcessor proc = magicFac.getInstance(req, rsp, runFac.getInstance(req, rsp, null)); long commitVersion = 0; int operationAndFlags = 0; for(;;) { Object o = null; if (cancelApplyBufferUpdate) break; try { if (testing_logReplayHook != null) testing_logReplayHook.run(); o = null; o = tlogReader.next(); if (o == null && activeLog) { if (!finishing) { // block to prevent new adds, but don't immediately unlock since // we could be starved from ever completing recovery. Only unlock // after we've finished this recovery. // NOTE: our own updates won't be blocked since the thread holding a write lock can // lock a read lock. versionInfo.blockUpdates(); finishing = true; o = tlogReader.next(); } else { // we had previously blocked updates, so this "null" from the log is final. // Wait until our final commit to change the state and unlock. // This is only so no new updates are written to the current log file, and is // only an issue if we crash before the commit (and we are paying attention // to incomplete log files). // // versionInfo.unblockUpdates(); } } } catch (InterruptedException e) { SolrException.log(log,e); } catch (IOException e) { SolrException.log(log,e); } catch (Throwable e) { SolrException.log(log,e); } if (o == null) break; try { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; operationAndFlags = (Integer)entry.get(0); int oper = operationAndFlags & OPERATION_MASK; long version = (Long) entry.get(1); switch (oper) { case UpdateLog.ADD: { recoveryInfo.adds++; // byte[] idBytes = (byte[]) entry.get(2); SolrInputDocument sdoc = (SolrInputDocument)entry.get(entry.size()-1); AddUpdateCommand cmd = new AddUpdateCommand(req); // cmd.setIndexedId(new BytesRef(idBytes)); cmd.solrDoc = sdoc; cmd.setVersion(version); cmd.setFlags(UpdateCommand.REPLAY | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) log.debug("add " + cmd); proc.processAdd(cmd); break; } case UpdateLog.DELETE: { recoveryInfo.deletes++; byte[] idBytes = (byte[]) entry.get(2); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.setIndexedId(new BytesRef(idBytes)); cmd.setVersion(version); cmd.setFlags(UpdateCommand.REPLAY | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) log.debug("delete " + cmd); proc.processDelete(cmd); break; } case UpdateLog.DELETE_BY_QUERY: { recoveryInfo.deleteByQuery++; String query = (String)entry.get(2); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.query = query; cmd.setVersion(version); cmd.setFlags(UpdateCommand.REPLAY | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) log.debug("deleteByQuery " + cmd); proc.processDelete(cmd); break; } case UpdateLog.COMMIT: { commitVersion = version; break; } default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } if (rsp.getException() != null) { loglog.error("REPLAY_ERR: Exception replaying log", rsp.getException()); throw rsp.getException(); } } catch (IOException ex) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: IOException reading log", ex); // could be caused by an incomplete flush if recovering from log } catch (ClassCastException cl) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: Unexpected log entry or corrupt log. Entry=" + o, cl); // would be caused by a corrupt transaction log } catch (Throwable ex) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: Exception replaying log", ex); // something wrong with the request? } } CommitUpdateCommand cmd = new CommitUpdateCommand(req, false); cmd.setVersion(commitVersion); cmd.softCommit = false; cmd.waitSearcher = true; cmd.setFlags(UpdateCommand.REPLAY); try { if (debug) log.debug("commit " + cmd); uhandler.commit(cmd); // this should cause a commit to be added to the incomplete log and avoid it being replayed again after a restart. } catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: final commit.", ex); } if (!activeLog) { // if we are replaying an old tlog file, we need to add a commit to the end // so we don't replay it again if we restart right after. // if the last operation we replayed had FLAG_GAP set, we want to use that again so we don't lose it // as the flag on the last operation. translog.writeCommit(cmd, operationFlags | (operationAndFlags & ~OPERATION_MASK)); } try { proc.finish(); } catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: finish()", ex); } } finally { if (tlogReader != null) tlogReader.close(); translog.decref(); } }
// in core/src/java/org/apache/solr/update/AddUpdateCommand.java
public BytesRef getIndexedId() { if (indexedId == null) { IndexSchema schema = req.getSchema(); SchemaField sf = schema.getUniqueKeyField(); if (sf != null) { if (solrDoc != null) { SolrInputField field = solrDoc.getField(sf.getName()); int count = field==null ? 0 : field.getValueCount(); if (count == 0) { if (overwrite) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Document is missing mandatory uniqueKey field: " + sf.getName()); } } else if (count > 1) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Document contains multiple values for uniqueKey field: " + field); } else { indexedId = new BytesRef(); sf.getType().readableToIndexed(field.getFirstValue().toString(), indexedId); } } } } return indexedId; }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void deleteByQuery(DeleteUpdateCommand cmd) throws IOException { deleteByQueryCommands.incrementAndGet(); deleteByQueryCommandsCumulative.incrementAndGet(); boolean madeIt=false; try { Query q; try { // TODO: move this higher in the stack? QParser parser = QParser.getParser(cmd.query, "lucene", cmd.req); q = parser.getQuery(); q = QueryUtils.makeQueryable(q); // peer-sync can cause older deleteByQueries to be executed and could // delete newer documents. We prevent this by adding a clause restricting // version. if ((cmd.getFlags() & UpdateCommand.PEER_SYNC) != 0) { BooleanQuery bq = new BooleanQuery(); bq.add(q, Occur.MUST); SchemaField sf = core.getSchema().getField(VersionInfo.VERSION_FIELD); ValueSource vs = sf.getType().getValueSource(sf, null); ValueSourceRangeFilter filt = new ValueSourceRangeFilter(vs, null, Long.toString(Math.abs(cmd.version)), true, true); FunctionRangeQuery range = new FunctionRangeQuery(filt); bq.add(range, Occur.MUST); q = bq; } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } boolean delAll = MatchAllDocsQuery.class == q.getClass(); // // synchronized to prevent deleteByQuery from running during the "open new searcher" // part of a commit. DBQ needs to signal that a fresh reader will be needed for // a realtime view of the index. When a new searcher is opened after a DBQ, that // flag can be cleared. If those things happen concurrently, it's not thread safe. // synchronized (this) { if (delAll) { deleteAll(); } else { solrCoreState.getIndexWriter(core).deleteDocuments(q); } if (ulog != null) ulog.deleteByQuery(cmd); } madeIt = true; updateDeleteTrackers(cmd); } finally { if (!madeIt) { numErrors.incrementAndGet(); numErrorsCumulative.incrementAndGet(); } } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void decref() { try { solrCoreState.decref(this); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
public static DistribPhase parseParam(final String param) { if (param == null || param.trim().isEmpty()) { return NONE; } try { return valueOf(param); } catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Illegal value for " + DISTRIB_UPDATE_PARAM + ": " + param, e); } }
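parseParam() above translates the JDK's IllegalArgumentException (thrown by Enum.valueOf) into a domain exception carrying an HTTP-style error code, so a bad parameter surfaces as a client error rather than a server fault. A sketch of the same shape with a hypothetical BadRequestException and enum:

class BadRequestException extends RuntimeException {
  BadRequestException(String msg, Throwable cause) { super(msg, cause); }
}

enum Phase { NONE, TOLEADER, FROMLEADER }

class PhaseParser {
  /** Maps a missing parameter to a default and an unknown one to a client error. */
  static Phase parse(String param) {
    if (param == null || param.trim().isEmpty()) return Phase.NONE;
    try {
      return Phase.valueOf(param);
    } catch (IllegalArgumentException e) {
      // keep the original exception as the cause for debuggability
      throw new BadRequestException("Illegal phase value: " + param, e);
    }
  }
}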
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private boolean versionAdd(AddUpdateCommand cmd) throws IOException { BytesRef idBytes = cmd.getIndexedId(); if (vinfo == null || idBytes == null) { super.processAdd(cmd); return false; } // This is only the hash for the bucket, and must be based only on the uniqueKey (i.e. do not use a pluggable hash here) int bucketHash = Hash.murmurhash3_x86_32(idBytes.bytes, idBytes.offset, idBytes.length, 0); // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find any existing version in the document // TODO: don't reuse update commands any more! long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { SolrInputField versionField = cmd.getSolrInputDocument().getField(VersionInfo.VERSION_FIELD); if (versionField != null) { Object o = versionField.getValue(); versionOnUpdate = o instanceof Number ? ((Number) o).longValue() : Long.parseLong(o.toString()); } else { // Find the version String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } } boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; VersionBucket bucket = vinfo.bucket(bucketHash); vinfo.lockForUpdate(); try { synchronized (bucket) { // we obtain the version when synchronized and then do the add so we can ensure that // if version1 < version2 then version1 is actually added before version2. // even if we don't store the version field, synchronizing on the bucket // will enable us to know what version happened first, and thus enable // realtime-get to work reliably. // TODO: if versions aren't stored, do we need to set on the cmd anyway for some reason? // there may be other reasons in the future for a version on the commands if (versionsStored) { long bucketVersion = bucket.highest; if (leaderLogic) { boolean updated = getUpdatedDocument(cmd); if (updated && versionOnUpdate == -1) { versionOnUpdate = 1; // implied "doc must exist" for now... } if (versionOnUpdate != 0) { Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); long foundVersion = lastVersion == null ? -1 : lastVersion; if ( versionOnUpdate == foundVersion || (versionOnUpdate < 0 && foundVersion < 0) || (versionOnUpdate==1 && foundVersion > 0) ) { // we're ok if versions match, or if both are negative (all missing docs are equal), or if cmd // specified it must exist (versionOnUpdate==1) and it does. } else { throw new SolrException(ErrorCode.CONFLICT, "version conflict for " + cmd.getPrintableId() + " expected=" + versionOnUpdate + " actual=" + foundVersion); } } long version = vinfo.getNewClock(); cmd.setVersion(version); cmd.getSolrInputDocument().setField(VersionInfo.VERSION_FIELD, version); bucket.updateHighest(version); } else { // The leader forwarded us this update. cmd.setVersion(versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.add(cmd); return true; } // if we aren't the leader, then we need to check that updates were not re-ordered if (bucketVersion != 0 && bucketVersion < versionOnUpdate) { // we're OK... this update has a version higher than anything we've seen // in this bucket so far, so we know that no reordering has yet occurred. bucket.updateHighest(versionOnUpdate); } else { // there have been updates higher than the current update. we need to check // the specific version for this id. Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); if (lastVersion != null && Math.abs(lastVersion) >= versionOnUpdate) { // This update is a repeat, or was reordered. We need to drop this update. return true; } } } } doLocalAdd(cmd); } // end synchronized (bucket) } finally { vinfo.unlockForUpdate(); } return false; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
boolean getUpdatedDocument(AddUpdateCommand cmd) throws IOException { SolrInputDocument sdoc = cmd.getSolrInputDocument(); boolean update = false; for (SolrInputField sif : sdoc.values()) { if (sif.getValue() instanceof Map) { update = true; break; } } if (!update) return false; BytesRef id = cmd.getIndexedId(); SolrInputDocument oldDoc = RealTimeGetComponent.getInputDocument(cmd.getReq().getCore(), id); if (oldDoc == null) { // not found... allow this in the future (depending on the details of the update, or if the user explicitly sets it). // could also just not change anything here and let the optimistic locking throw the error throw new SolrException(ErrorCode.CONFLICT, "Document not found for update. id=" + cmd.getPrintableId()); } oldDoc.remove(VERSION_FIELD); for (SolrInputField sif : sdoc.values()) { Object val = sif.getValue(); if (val instanceof Map) { for (Entry<String,Object> entry : ((Map<String,Object>) val).entrySet()) { String key = entry.getKey(); Object fieldVal = entry.getValue(); if ("add".equals(key)) { oldDoc.addField( sif.getName(), fieldVal, sif.getBoost()); } else if ("set".equals(key)) { oldDoc.setField(sif.getName(), fieldVal, sif.getBoost()); } else if ("inc".equals(key)) { SolrInputField numericField = oldDoc.get(sif.getName()); if (numericField == null) { oldDoc.setField(sif.getName(), fieldVal, sif.getBoost()); } else { // TODO: fieldtype needs externalToObject? String oldValS = numericField.getFirstValue().toString(); SchemaField sf = cmd.getReq().getSchema().getField(sif.getName()); BytesRef term = new BytesRef(); sf.getType().readableToIndexed(oldValS, term); Object oldVal = sf.getType().toObject(sf, term); String fieldValS = fieldVal.toString(); Number result; if (oldVal instanceof Long) { result = ((Long) oldVal).longValue() + Long.parseLong(fieldValS); } else if (oldVal instanceof Float) { result = ((Float) oldVal).floatValue() + Float.parseFloat(fieldValS); } else if (oldVal instanceof Double) { result = ((Double) oldVal).doubleValue() + Double.parseDouble(fieldValS); } else { // int, short, byte result = ((Integer) oldVal).intValue() + Integer.parseInt(fieldValS); } oldDoc.setField(sif.getName(), result, sif.getBoost()); } } } } else { // normal fields are treated as a "set" oldDoc.put(sif.getName(), sif); } } cmd.solrDoc = oldDoc; return true; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
public void doDeleteByQuery(DeleteUpdateCommand cmd) throws IOException { // even in non zk mode, tests simulate updates from a leader if(!zkEnabled) { isLeader = getNonZkLeaderAssumption(req); } else { zkCheck(); } // NONE: we are the first to receive this deleteByQuery // - it must be forwarded to the leader of every shard // TO: we are a leader receiving a forwarded deleteByQuery... we must: // - block all updates (use VersionInfo) // - flush *all* updates going to our replicas // - forward the DBQ to our replicas and wait for the response // - log + execute the local DBQ // FROM: we are a replica receiving a DBQ from our leader // - log + execute the local DBQ DistribPhase phase = DistribPhase.parseParam(req.getParams().get(DISTRIB_UPDATE_PARAM)); if (zkEnabled && DistribPhase.NONE == phase) { boolean leaderForAnyShard = false; // start off by assuming we are not a leader for any shard Map<String,Slice> slices = zkController.getCloudState().getSlices(collection); if (slices == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Cannot find collection:" + collection + " in " + zkController.getCloudState().getCollections()); } ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(DISTRIB_UPDATE_PARAM, DistribPhase.TOLEADER.toString()); List<Node> leaders = new ArrayList<Node>(slices.size()); for (Map.Entry<String,Slice> sliceEntry : slices.entrySet()) { String sliceName = sliceEntry.getKey(); ZkNodeProps leaderProps; try { leaderProps = zkController.getZkStateReader().getLeaderProps(collection, sliceName); } catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); } // TODO: What if leaders changed in the meantime? // should we send out slice-at-a-time and if a node returns "hey, I'm not a leader" (or we get an error because it went down) then look up the new leader? // Am I the leader for this slice? ZkCoreNodeProps coreLeaderProps = new ZkCoreNodeProps(leaderProps); String leaderNodeName = coreLeaderProps.getCoreNodeName(); String coreName = req.getCore().getName(); String coreNodeName = zkController.getNodeName() + "_" + coreName; isLeader = coreNodeName.equals(leaderNodeName); if (isLeader) { // don't forward to ourself leaderForAnyShard = true; } else { leaders.add(new StdNode(coreLeaderProps)); } } params.remove("commit"); // this will be distributed from the local commit cmdDistrib.distribDelete(cmd, leaders, params); if (!leaderForAnyShard) { return; } // change the phase to TOLEADER so we look up and forward to our own replicas (if any) phase = DistribPhase.TOLEADER; } List<Node> replicas = null; if (zkEnabled && DistribPhase.TOLEADER == phase) { // This core should be a leader replicas = setupRequest(); } if (vinfo == null) { super.processDelete(cmd); return; } // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find the version long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } versionOnUpdate = Math.abs(versionOnUpdate); // normalize to positive version boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; if (!leaderLogic && versionOnUpdate==0) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing _version_ on update from leader"); } vinfo.blockUpdates(); try { if (versionsStored) { if (leaderLogic) { long version = vinfo.getNewClock(); cmd.setVersion(-version); // TODO update versions in all buckets doLocalDelete(cmd); } else { cmd.setVersion(-versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.deleteByQuery(cmd); return; } doLocalDelete(cmd); } } // since we don't know which documents were deleted, the easiest thing to do is to invalidate // all real-time caches (i.e. UpdateLog) which involves also getting a new version of the IndexReader // (so cache misses will see up-to-date data) } finally { vinfo.unblockUpdates(); } // TODO: need to handle reorders to replicas somehow // forward to all replicas if (leaderLogic && replicas != null) { ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(VERSION_FIELD, Long.toString(cmd.getVersion())); params.set(DISTRIB_UPDATE_PARAM, DistribPhase.FROMLEADER.toString()); cmdDistrib.distribDelete(cmd, replicas, params); cmdDistrib.finish(); } if (returnVersions && rsp != null) { if (deleteByQueryResponse == null) { deleteByQueryResponse = new NamedList<String>(); rsp.add("deleteByQuery",deleteByQueryResponse); } deleteByQueryResponse.add(cmd.getQuery(), cmd.getVersion()); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private void zkCheck() { int retries = 10; while (!zkController.isConnected()) { if (retries-- == 0) { throw new SolrException(ErrorCode.SERVICE_UNAVAILABLE, "Cannot talk to ZooKeeper - Updates are disabled."); } try { Thread.sleep(100); } catch (InterruptedException e) { Thread.currentThread().interrupt(); break; } } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private boolean versionDelete(DeleteUpdateCommand cmd) throws IOException { BytesRef idBytes = cmd.getIndexedId(); if (vinfo == null || idBytes == null) { super.processDelete(cmd); return false; } // This is only the hash for the bucket, and must be based only on the uniqueKey (i.e. do not use a pluggable hash here) int bucketHash = Hash.murmurhash3_x86_32(idBytes.bytes, idBytes.offset, idBytes.length, 0); // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find the version long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } long signedVersionOnUpdate = versionOnUpdate; versionOnUpdate = Math.abs(versionOnUpdate); // normalize to positive version boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; if (!leaderLogic && versionOnUpdate==0) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing _version_ on update from leader"); } VersionBucket bucket = vinfo.bucket(bucketHash); vinfo.lockForUpdate(); try { synchronized (bucket) { if (versionsStored) { long bucketVersion = bucket.highest; if (leaderLogic) { if (signedVersionOnUpdate != 0) { Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); long foundVersion = lastVersion == null ? -1 : lastVersion; if ( (signedVersionOnUpdate == foundVersion) || (signedVersionOnUpdate < 0 && foundVersion < 0) || (signedVersionOnUpdate == 1 && foundVersion > 0) ) { // we're ok if versions match, or if both are negative (all missing docs are equal), or if cmd // specified it must exist (versionOnUpdate==1) and it does. } else { throw new SolrException(ErrorCode.CONFLICT, "version conflict for " + cmd.getId() + " expected=" + signedVersionOnUpdate + " actual=" + foundVersion); } } long version = vinfo.getNewClock(); cmd.setVersion(-version); bucket.updateHighest(version); } else { cmd.setVersion(-versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.delete(cmd); return true; } // if we aren't the leader, then we need to check that updates were not re-ordered if (bucketVersion != 0 && bucketVersion < versionOnUpdate) { // we're OK... this update has a version higher than anything we've seen // in this bucket so far, so we know that no reordering has yet occurred. bucket.updateHighest(versionOnUpdate); } else { // there have been updates higher than the current update. we need to check // the specific version for this id. Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); if (lastVersion != null && Math.abs(lastVersion) >= versionOnUpdate) { // This update is a repeat, or was reordered. We need to drop this update. return true; } } } } doLocalDelete(cmd); return false; } // end synchronized (bucket) } finally { vinfo.unlockForUpdate(); } }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessorChain.java
public void init(PluginInfo info) { final String infomsg = "updateRequestProcessorChain \"" + (null != info.name ? info.name : "") + "\"" + (info.isDefault() ? " (default)" : ""); // wrap in an ArrayList so we know we can do fast index lookups // and that add(int,Object) is supported List<UpdateRequestProcessorFactory> list = new ArrayList (solrCore.initPlugins(info.getChildren("processor"),UpdateRequestProcessorFactory.class,null)); if(list.isEmpty()){ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, infomsg + " requires at least one processor"); } int numDistrib = 0; int runIndex = -1; // hi->lo in case multiple run instances, add before first one // (no idea why someone might use multiple run instances, but just in case) for (int i = list.size()-1; 0 <= i; i--) { UpdateRequestProcessorFactory factory = list.get(i); if (factory instanceof DistributingUpdateProcessorFactory) { numDistrib++; } if (factory instanceof RunUpdateProcessorFactory) { runIndex = i; } } if (1 < numDistrib) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, infomsg + " may not contain more than one " + "instance of DistributingUpdateProcessorFactory"); } if (0 <= runIndex && 0 == numDistrib) { // by default, add distrib processor immediately before run DistributedUpdateProcessorFactory distrib = new DistributedUpdateProcessorFactory(); distrib.init(new NamedList()); list.add(runIndex, distrib); log.info("inserting DistributedUpdateProcessorFactory into " + infomsg); } chain = list.toArray(new UpdateRequestProcessorFactory[list.size()]); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { final SolrInputDocument doc = cmd.getSolrInputDocument(); // make a copy we can iterate over while mutating the doc final Collection<String> fieldNames = new ArrayList<String>(doc.getFieldNames()); for (final String fname : fieldNames) { if (! selector.shouldMutate(fname)) continue; final SolrInputField src = doc.get(fname); SolrInputField dest = null; try { dest = mutate(src); } catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); } if (null == dest) { doc.remove(fname); } else { // semantics of what happens if dest has diff name are hard // we could treat it as a copy, or a rename // for now, don't allow it. if (! fname.equals(dest.getName()) ) { throw new SolrException(SERVER_ERROR, "mutate returned field with different name: " + fname + " => " + dest.getName()); } doc.put(dest.getName(), dest); } } super.processAdd(cmd); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
public static FieldNameSelector createFieldNameSelector (final SolrResourceLoader loader, final IndexSchema schema, final Set<String> fields, final Set<String> typeNames, final Collection<String> typeClasses, final Collection<Pattern> regexes, final FieldNameSelector defSelector) { final Collection<Class> classes = new ArrayList<Class>(typeClasses.size()); for (String t : typeClasses) { try { classes.add(loader.findClass(t, Object.class)); } catch (Exception e) { throw new SolrException(SERVER_ERROR, "Can't resolve typeClass: " + t, e); } } if (classes.isEmpty() && typeNames.isEmpty() && regexes.isEmpty() && fields.isEmpty()) { return defSelector; } return new ConfigurableFieldNameSelector (schema, fields, typeNames, classes, regexes); }
// in core/src/java/org/apache/solr/update/processor/SignatureUpdateProcessorFactory.java
public void inform(SolrCore core) { final SchemaField field = core.getSchema().getFieldOrNull(getSignatureField()); if (null == field) { throw new SolrException (ErrorCode.SERVER_ERROR, "Can't use signatureField which does not exist in schema: " + getSignatureField()); } if (getOverwriteDupes() && ( ! field.indexed() ) ) { throw new SolrException (ErrorCode.SERVER_ERROR, "Can't set overwriteDupes when signatureField is not indexed: " + getSignatureField()); } }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
protected final FieldMutatingUpdateProcessor.FieldNameSelector getSelector() { if (null != selector) return selector; throw new SolrException(SERVER_ERROR, "selector was never initialized, "+ " inform(SolrCore) never called???"); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
private static Collection<String> oneOrMany(final NamedList args, final String key) { List<String> result = new ArrayList<String>(args.size() / 2); final String err = "init arg '" + key + "' must be a string " + "(ie: 'str'), or an array (ie: 'arr') containing strings; found: "; for (Object o = args.remove(key); null != o; o = args.remove(key)) { if (o instanceof String) { result.add((String)o); continue; } if (o instanceof Object[]) { o = Arrays.asList((Object[]) o); } if (o instanceof Collection) { for (Object item : (Collection)o) { if (! (item instanceof String)) { throw new SolrException(SERVER_ERROR, err + item.getClass()); } result.add((String)item); } continue; } // who knows what the hell we have throw new SolrException(SERVER_ERROR, err + o.getClass()); } return result; }
// in core/src/java/org/apache/solr/update/VersionInfo.java
public Long getVersionFromIndex(BytesRef idBytes) { // TODO: we could cache much of this and invalidate during a commit. // TODO: most DocValues classes are threadsafe - expose which. RefCounted<SolrIndexSearcher> newestSearcher = core.getRealtimeSearcher(); try { SolrIndexSearcher searcher = newestSearcher.get(); long lookup = searcher.lookupId(idBytes); if (lookup < 0) return null; ValueSource vs = versionField.getType().getValueSource(versionField, null); Map context = ValueSource.newContext(searcher); vs.createWeight(context, searcher); FunctionValues fv = vs.getValues(context, searcher.getTopReaderContext().leaves()[(int)(lookup>>32)]); long ver = fv.longVal((int)lookup); return ver; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error reading version from index", e); } finally { if (newestSearcher != null) { newestSearcher.decref(); } } }
// in core/src/java/org/apache/solr/update/SolrIndexConfig.java
private void assertWarnOrFail(String reason, boolean assertCondition, boolean failCondition) { if(assertCondition) { return; } else if(failCondition) { throw new SolrException(ErrorCode.FORBIDDEN, reason); } else { log.warn(reason); } }
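assertWarnOrFail() condenses a useful configuration policy: the same violated condition is either merely logged or escalated to a hard error, depending on a fail flag. A JDK-only sketch (IllegalStateException standing in for the FORBIDDEN SolrException):

import java.util.logging.Logger;

class ConfigCheck {
  private static final Logger log = Logger.getLogger("config");

  /** Passes silently, throws when failCondition demands it, otherwise warns. */
  static void assertWarnOrFail(String reason, boolean assertCondition, boolean failCondition) {
    if (assertCondition) {
      return; // the configuration is fine
    } else if (failCondition) {
      throw new IllegalStateException(reason); // violation is configured to be fatal
    } else {
      log.warning(reason); // violation is tolerated, but surfaced
    }
  }
}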
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
protected void addSingleField(SchemaField sfield, String val, float boost) { //System.out.println("###################ADDING FIELD "+sfield+"="+val); // we don't check for a null val ourselves because a solr.FieldType // might actually want to map it to something. If createField() // returns null, then we don't store the field. if (sfield.isPolyField()) { IndexableField[] fields = sfield.createFields(val, boost); if (fields.length > 0) { if (!sfield.multiValued()) { String oldValue = map.put(sfield.getName(), val); if (oldValue != null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "ERROR: multiple values encountered for non multiValued field " + sfield.getName() + ": first='" + oldValue + "' second='" + val + "'"); } } // Add each field for (IndexableField field : fields) { doc.add(field); } } } else { IndexableField field = sfield.createField(val, boost); if (field != null) { if (!sfield.multiValued()) { String oldValue = map.put(sfield.getName(), val); if (oldValue != null) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"ERROR: multiple values encountered for non multiValued field " + sfield.getName() + ": first='" + oldValue + "' second='" + val + "'"); } } } doc.add(field); } }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
public void addField(String name, String val, float boost) { SchemaField sfield = schema.getFieldOrNull(name); if (sfield != null) { addField(sfield,val,boost); } // Check if we should copy this field to any other fields. // This could happen whether it is explicit or not. final List<CopyField> copyFields = schema.getCopyFieldsList(name); if (copyFields != null) { for(CopyField cf : copyFields) { addSingleField(cf.getDestination(), cf.getLimitedValue( val ), boost); } } // error if this field name doesn't match anything if (sfield==null && (copyFields==null || copyFields.size()==0)) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"ERROR:unknown field '" + name + "'"); } }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
public Document getDoc() throws IllegalArgumentException { // Check for all required fields -- Note, all fields with a // default value are defacto 'required' fields. List<String> missingFields = null; for (SchemaField field : schema.getRequiredFields()) { if (doc.getField(field.getName() ) == null) { if (field.getDefaultValue() != null) { addField(doc, field, field.getDefaultValue(), 1.0f); } else { if (missingFields==null) { missingFields = new ArrayList<String>(1); } missingFields.add(field.getName()); } } } if (missingFields != null) { StringBuilder builder = new StringBuilder(); // add the uniqueKey if possible if( schema.getUniqueKeyField() != null ) { String n = schema.getUniqueKeyField().getName(); String v = doc.getField( n ).stringValue(); builder.append( "Document ["+n+"="+v+"] " ); } builder.append("missing required fields: " ); for (String field : missingFields) { builder.append(field); builder.append(" "); } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, builder.toString()); } Document ret = doc; doc=null; return ret; }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
public static Document toDocument( SolrInputDocument doc, IndexSchema schema ) { Document out = new Document(); final float docBoost = doc.getDocumentBoost(); // Load fields from SolrDocument to Document for( SolrInputField field : doc ) { String name = field.getName(); SchemaField sfield = schema.getFieldOrNull(name); boolean used = false; float boost = field.getBoost(); boolean omitNorms = sfield != null && sfield.omitNorms(); // Make sure it has the correct number if( sfield!=null && !sfield.multiValued() && field.getValueCount() > 1 ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"multiple values encountered for non multiValued field " + sfield.getName() + ": " +field.getValue() ); } if (omitNorms && boost != 1.0F) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"cannot set an index-time boost, norms are omitted for field " + sfield.getName() + ": " +field.getValue() ); } // load each field value boolean hasField = false; try { for( Object v : field ) { if( v == null ) { continue; } hasField = true; if (sfield != null) { used = true; addField(out, sfield, v, omitNorms ? 1F : docBoost*boost); } // Check if we should copy this field to any other fields. // This could happen whether it is explicit or not. List<CopyField> copyFields = schema.getCopyFieldsList(name); for (CopyField cf : copyFields) { SchemaField destinationField = cf.getDestination(); // check if the copy field is a multivalued or not if (!destinationField.multiValued() && out.getField(destinationField.getName()) != null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"multiple values encountered for non multiValued copy field " + destinationField.getName() + ": " + v); } used = true; // Perhaps trim the length of a copy field Object val = v; if( val instanceof String && cf.getMaxChars() > 0 ) { val = cf.getLimitedValue((String)val); } addField(out, destinationField, val, destinationField.omitNorms() ? 1F : docBoost*boost); } // In lucene, the boost for a given field is the product of the // document boost and *all* boosts on values of that field. // For multi-valued fields, we only want to set the boost on the // first field. boost = docBoost; } } catch( SolrException ex ) { throw ex; } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); } // make sure the field was used somehow... if( !used && hasField ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"unknown field '" +name + "'"); } } // Now validate required fields or add default values // fields with default values are defacto 'required' for (SchemaField field : schema.getRequiredFields()) { if (out.getField(field.getName() ) == null) { if (field.getDefaultValue() != null) { addField(out, field, field.getDefaultValue(), 1.0f); } else { String msg = getID(doc, schema) + "missing required field: " + field.getName(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, msg ); } } } return out; }
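toDocument() uses the "let domain exceptions pass, wrap everything else with context" idiom (the same shape appears in SolrCore.createInstance further below): a SolrException is rethrown untouched because it already carries an ErrorCode, while any foreign exception is wrapped together with the failing field's details. A sketch with a hypothetical DomainException:

class DomainException extends RuntimeException {
  DomainException(String msg, Throwable cause) { super(msg, cause); }
}

class WrapExample {
  /** Runs body, preserving domain errors and adding context to foreign ones. */
  static void run(Runnable body, String fieldName) {
    try {
      body.run();
    } catch (DomainException ex) {
      throw ex; // already classified; don't bury the original error code
    } catch (RuntimeException ex) {
      // attach the context the caller will actually need to debug this
      throw new DomainException("Error adding field '" + fieldName + "': " + ex.getMessage(), ex);
    }
  }
}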
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
void checkResponses(boolean block) { while (pending != null && pending.size() > 0) { try { Future<Request> future = block ? completionService.take() : completionService.poll(); if (future == null) return; pending.remove(future); try { Request sreq = future.get(); if (sreq.rspCode != 0) { // error during request // if there is a retry url, we want to retry... // TODO: but we really should only retry on connection errors... if (sreq.retries < 5 && sreq.node.checkRetry()) { sreq.retries++; sreq.rspCode = 0; sreq.exception = null; Thread.sleep(500); submit(sreq); checkResponses(block); } else { Exception e = sreq.exception; Error error = new Error(); error.e = e; error.node = sreq.node; response.errors.add(error); response.sreq = sreq; SolrException.log(SolrCore.log, "shard update error " + sreq.node, sreq.exception); } } } catch (ExecutionException e) { // shouldn't happen since we catch exceptions ourselves SolrException.log(SolrCore.log, "error sending update request to shard", e); } } catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "interrupted waiting for shard update response", e); } } }
// in core/src/java/org/apache/solr/update/PeerSync.java
private boolean handleUpdates(ShardResponse srsp) { // we retrieved the last N updates from the replica List<Object> updates = (List<Object>)srsp.getSolrResponse().getResponse().get("updates"); SyncShardRequest sreq = (SyncShardRequest) srsp.getShardRequest(); if (updates.size() < sreq.requestedUpdates.size()) { log.error(msg() + " Requested " + sreq.requestedUpdates.size() + " updates from " + sreq.shards[0] + " but retrieved " + updates.size()); return false; } ModifiableSolrParams params = new ModifiableSolrParams(); params.set(DISTRIB_UPDATE_PARAM, FROMLEADER.toString()); // params.set("peersync",true); // debugging SolrQueryRequest req = new LocalSolrQueryRequest(uhandler.core, params); SolrQueryResponse rsp = new SolrQueryResponse(); RunUpdateProcessorFactory runFac = new RunUpdateProcessorFactory(); DistributedUpdateProcessorFactory magicFac = new DistributedUpdateProcessorFactory(); runFac.init(new NamedList()); magicFac.init(new NamedList()); UpdateRequestProcessor proc = magicFac.getInstance(req, rsp, runFac.getInstance(req, rsp, null)); Collections.sort(updates, updateRecordComparator); Object o = null; long lastVersion = 0; try { // Apply oldest updates first for (Object obj : updates) { // should currently be a List<Oper,Ver,Doc/Id> o = obj; List<Object> entry = (List<Object>)o; if (debug) { log.debug(msg() + "raw update record " + o); } int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK; long version = (Long) entry.get(1); if (version == lastVersion && version != 0) continue; lastVersion = version; switch (oper) { case UpdateLog.ADD: { // byte[] idBytes = (byte[]) entry.get(2); SolrInputDocument sdoc = (SolrInputDocument)entry.get(entry.size()-1); AddUpdateCommand cmd = new AddUpdateCommand(req); // cmd.setIndexedId(new BytesRef(idBytes)); cmd.solrDoc = sdoc; cmd.setVersion(version); cmd.setFlags(UpdateCommand.PEER_SYNC | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) { log.debug(msg() + "add " + cmd); } proc.processAdd(cmd); break; } case UpdateLog.DELETE: { byte[] idBytes = (byte[]) entry.get(2); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.setIndexedId(new BytesRef(idBytes)); cmd.setVersion(version); cmd.setFlags(UpdateCommand.PEER_SYNC | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) { log.debug(msg() + "delete " + cmd); } proc.processDelete(cmd); break; } case UpdateLog.DELETE_BY_QUERY: { String query = (String)entry.get(2); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.query = query; cmd.setVersion(version); cmd.setFlags(UpdateCommand.PEER_SYNC | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) { log.debug(msg() + "deleteByQuery " + cmd); } proc.processDelete(cmd); break; } default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } } } catch (IOException e) { // TODO: should this be handled separately as a problem with us? // I guess it probably already will by causing replication to be kicked off. sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; } catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; } finally { try { proc.finish(); } catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,finish()", e); return false; } } return true; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public String readExternString(FastInputStream fis) throws IOException { int idx = readSize(fis); if (idx != 0) {// idx != 0 is the index of the extern string // no need to synchronize globalStringList - it's only updated before the first record is written to the log return globalStringList.get(idx - 1); } else {// idx == 0 means it has a string value // this shouldn't happen with this codec subclass. throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Corrupt transaction log"); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long writeData(Object o) { LogCodec codec = new LogCodec(); try { long pos = fos.size(); // if we had flushed, this should be equal to channel.position() codec.init(fos); codec.writeVal(o); return pos; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long write(AddUpdateCommand cmd, int flags) { LogCodec codec = new LogCodec(); long pos = 0; synchronized (this) { try { pos = fos.size(); // if we had flushed, this should be equal to channel.position() SolrInputDocument sdoc = cmd.getSolrInputDocument(); if (pos == 0) { // TODO: needs to be changed if we start writing a header first addGlobalStrings(sdoc.getFieldNames()); writeLogHeader(codec); pos = fos.size(); } /*** System.out.println("###writing at " + pos + " fos.size()=" + fos.size() + " raf.length()=" + raf.length()); if (pos != fos.size()) { throw new RuntimeException("ERROR" + "###writing at " + pos + " fos.size()=" + fos.size() + " raf.length()=" + raf.length()); } ***/ codec.init(fos); codec.writeTag(JavaBinCodec.ARR, 3); codec.writeInt(UpdateLog.ADD | flags); // should just take one byte codec.writeLong(cmd.getVersion()); codec.writeSolrInputDocument(cmd.getSolrInputDocument()); endRecord(pos); // fos.flushBuffer(); // flush later return pos; } catch (IOException e) { // TODO: reset our file pointer back to "pos", the start of this record. throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error logging add", e); } } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long writeDelete(DeleteUpdateCommand cmd, int flags) { LogCodec codec = new LogCodec(); synchronized (this) { try { long pos = fos.size(); // if we had flushed, this should be equal to channel.position() if (pos == 0) { writeLogHeader(codec); pos = fos.size(); } codec.init(fos); codec.writeTag(JavaBinCodec.ARR, 3); codec.writeInt(UpdateLog.DELETE | flags); // should just take one byte codec.writeLong(cmd.getVersion()); BytesRef br = cmd.getIndexedId(); codec.writeByteArray(br.bytes, br.offset, br.length); endRecord(pos); // fos.flushBuffer(); // flush later return pos; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long writeDeleteByQuery(DeleteUpdateCommand cmd, int flags) { LogCodec codec = new LogCodec(); synchronized (this) { try { long pos = fos.size(); // if we had flushed, this should be equal to channel.position() if (pos == 0) { writeLogHeader(codec); pos = fos.size(); } codec.init(fos); codec.writeTag(JavaBinCodec.ARR, 3); codec.writeInt(UpdateLog.DELETE_BY_QUERY | flags); // should just take one byte codec.writeLong(cmd.getVersion()); codec.writeStr(cmd.query); endRecord(pos); // fos.flushBuffer(); // flush later return pos; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long writeCommit(CommitUpdateCommand cmd, int flags) { LogCodec codec = new LogCodec(); synchronized (this) { try { long pos = fos.size(); // if we had flushed, this should be equal to channel.position() if (pos == 0) { writeLogHeader(codec); pos = fos.size(); } codec.init(fos); codec.writeTag(JavaBinCodec.ARR, 3); codec.writeInt(UpdateLog.COMMIT | flags); // should just take one byte codec.writeLong(cmd.getVersion()); codec.writeStr(END_MESSAGE); // ensure these bytes are (almost) last in the file endRecord(pos); fos.flush(); // flush since this will be the last record in a log file assert fos.size() == channel.size(); return pos; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public Object lookup(long pos) { // A negative position can result from a log replay (which does not re-log, but does // update the version map. This is OK since the node won't be ACTIVE when this happens. if (pos < 0) return null; try { // make sure any unflushed buffer has been flushed synchronized (this) { // TODO: optimize this by keeping track of what we have flushed up to fos.flushBuffer(); /*** System.out.println("###flushBuffer to " + fos.size() + " raf.length()=" + raf.length() + " pos="+pos); if (fos.size() != raf.length() || pos >= fos.size() ) { throw new RuntimeException("ERROR" + "###flushBuffer to " + fos.size() + " raf.length()=" + raf.length() + " pos="+pos); } ***/ } ChannelFastInputStream fis = new ChannelFastInputStream(channel, pos); LogCodec codec = new LogCodec(); return codec.readVal(fis); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public void incref() { int result = refcount.incrementAndGet(); if (result <= 1) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "incref on a closed log: " + this); } }
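incref() guards against resurrecting a closed resource: if incrementAndGet() returns a value of 1 or less, the count had already dropped to zero, so the caller gets an exception instead of a handle to a closed log. A sketch of that guard:

import java.util.concurrent.atomic.AtomicInteger;

class RefCountedLog {
  private final AtomicInteger refcount = new AtomicInteger(1); // creator holds one reference

  void incref() {
    if (refcount.incrementAndGet() <= 1) {
      // the count was already zero: the log has been closed, refuse the reference
      throw new IllegalStateException("incref on a closed log: " + this);
    }
  }

  void decref() {
    if (refcount.decrementAndGet() == 0) {
      close(); // last reference released
    }
  }

  private void close() { /* release file handles, delete if requested, etc. */ }
}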
// in core/src/java/org/apache/solr/update/TransactionLog.java
public void finish(UpdateLog.SyncLevel syncLevel) { if (syncLevel == UpdateLog.SyncLevel.NONE) return; try { synchronized (this) { fos.flushBuffer(); } if (syncLevel == UpdateLog.SyncLevel.FSYNC) { // Since fsync is outside of synchronized block, we can end up with a partial // last record on power failure (which is OK, and does not represent an error... // we just need to be aware of it when reading). raf.getFD().sync(); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
private void close() { try { if (debug) { log.debug("Closing tlog" + this); } synchronized (this) { fos.flush(); fos.close(); } if (deleteOnClose) { tlogFile.delete(); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/core/Config.java
public Object evaluate(String path, QName type) { XPath xpath = xpathFactory.newXPath(); try { String xstr=normalize(path); // TODO: instead of prepending /prefix/, we could do the search rooted at /prefix... Object o = xpath.evaluate(xstr, doc, type); return o; } catch (XPathExpressionException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + path +" for " + name,e); } }
// in core/src/java/org/apache/solr/core/Config.java
public Node getNode(String path, boolean errIfMissing) { XPath xpath = xpathFactory.newXPath(); Node nd = null; String xstr = normalize(path); try { nd = (Node)xpath.evaluate(xstr, doc, XPathConstants.NODE); if (nd==null) { if (errIfMissing) { throw new RuntimeException(name + " missing "+path); } else { log.debug(name + " missing optional " + path); return null; } } log.trace(name + ":" + path + "=" + nd); return nd; } catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); } catch (SolrException e) { throw(e); } catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); } }
// in core/src/java/org/apache/solr/core/Config.java
public static final Version parseLuceneVersionString(final String matchVersion) { final Version version; try { version = Version.parseLeniently(matchVersion); } catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); } if (version == Version.LUCENE_CURRENT && !versionWarningAlreadyLogged.getAndSet(true)) { log.warn( "You should not use LUCENE_CURRENT as luceneMatchVersion property: "+ "if you use this setting, and then Solr upgrades to a newer release of Lucene, "+ "sizable changes may happen. If precise back compatibility is important "+ "then you should instead explicitly specify an actual Lucene version." ); } return version; }
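parseLuceneVersionString() enriches a library exception before rethrowing it: the IllegalArgumentException from Version.parseLeniently is wrapped with the full list of valid values, which is far more actionable for whoever wrote the config. A generic sketch of the same idea for any enum:

import java.util.Arrays;

class EnumParse {
  /** Parses an enum constant, rethrowing with the valid values listed. */
  static <E extends Enum<E>> E parseOrExplain(Class<E> type, String value) {
    try {
      return Enum.valueOf(type, value);
    } catch (IllegalArgumentException iae) {
      // same exception type, but now the message tells the user what is allowed
      throw new IllegalArgumentException("Invalid value '" + value
          + "', valid values are: " + Arrays.toString(type.getEnumConstants()), iae);
    }
  }
}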
// in core/src/java/org/apache/solr/core/RequestHandlers.java
void initHandlersFromConfig(SolrConfig config ){ // use link map so we iterate in the same order Map<PluginInfo,SolrRequestHandler> handlers = new LinkedHashMap<PluginInfo,SolrRequestHandler>(); for (PluginInfo info : config.getPluginInfos(SolrRequestHandler.class.getName())) { try { SolrRequestHandler requestHandler; String startup = info.attributes.get("startup") ; if( startup != null ) { if( "lazy".equals(startup) ) { log.info("adding lazy requestHandler: " + info.className); requestHandler = new LazyRequestHandlerWrapper( core, info.className, info.initArgs ); } else { throw new Exception( "Unknown startup value: '"+startup+"' for: "+info.className ); } } else { requestHandler = core.createRequestHandler(info.className); } handlers.put(info,requestHandler); SolrRequestHandler old = register(info.name, requestHandler); if(old != null) { log.warn("Multiple requestHandler registered to the same name: " + info.name + " ignoring: " + old.getClass().getName()); } if(info.isDefault()){ old = register("",requestHandler); if(old != null) log.warn("Multiple default requestHandler registered" + " ignoring: " + old.getClass().getName()); } log.info("created "+info.name+": " + info.className); } catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); } } // we've now registered all handlers, time to init them in the same order for (Map.Entry<PluginInfo,SolrRequestHandler> entry : handlers.entrySet()) { PluginInfo info = entry.getKey(); SolrRequestHandler requestHandler = entry.getValue(); if (requestHandler instanceof PluginInfoInitialized) { ((PluginInfoInitialized) requestHandler).init(info); } else{ requestHandler.init(info.initArgs); } } if(get("") == null) register("", get("/select"));//defacto default handler if(get("") == null) register("", get("standard"));//old default handler name; TODO remove? if(get("") == null) log.warn("no default request handler is registered (either '/select' or 'standard')"); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
public synchronized SolrRequestHandler getWrappedHandler() {
  if (_handler == null) {
    try {
      SolrRequestHandler handler = core.createRequestHandler(_className);
      handler.init(_args);
      if (handler instanceof SolrCoreAware) {
        ((SolrCoreAware) handler).inform(core);
      }
      _handler = handler;
    } catch (Exception ex) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex);
    }
  }
  return _handler;
}
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
private static Directory injectLockFactory(Directory dir, String lockPath, String rawLockType) throws IOException {
  if (null == rawLockType) {
    // we default to "simple" for backwards compatibility
    log.warn("No lockType configured for " + dir + " assuming 'simple'");
    rawLockType = "simple";
  }
  final String lockType = rawLockType.toLowerCase(Locale.ENGLISH).trim();
  if ("simple".equals(lockType)) {
    // multiple SimpleFSLockFactory instances should be OK
    dir.setLockFactory(new SimpleFSLockFactory(lockPath));
  } else if ("native".equals(lockType)) {
    dir.setLockFactory(new NativeFSLockFactory(lockPath));
  } else if ("single".equals(lockType)) {
    if (!(dir.getLockFactory() instanceof SingleInstanceLockFactory))
      dir.setLockFactory(new SingleInstanceLockFactory());
  } else if ("none".equals(lockType)) {
    // Recipe for disaster
    log.error("CONFIGURATION WARNING: locks are disabled on " + dir);
    dir.setLockFactory(NoLockFactory.getNoLockFactory());
  } else {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Unrecognized lockType: " + rawLockType);
  }
  return dir;
}
// in core/src/java/org/apache/solr/core/SolrCore.java
private <T> T createInstance(String className, Class<T> cast, String msg) {
  Class<? extends T> clazz = null;
  if (msg == null) msg = "SolrCore Object";
  try {
    clazz = getResourceLoader().findClass(className, cast);
    // Most of the classes do not have constructors which take a SolrCore argument
    // (it is recommended to obtain SolrCore by implementing SolrCoreAware), so asking
    // for one would invariably cause a NoSuchMethodException. Instead, iterate
    // through the list of available constructors.
    Constructor[] cons = clazz.getConstructors();
    for (Constructor con : cons) {
      Class[] types = con.getParameterTypes();
      if (types.length == 1 && types[0] == SolrCore.class) {
        return (T) con.newInstance(this);
      }
    }
    return getResourceLoader().newInstance(className, cast); // use the empty constructor
  } catch (SolrException e) {
    throw e;
  } catch (Exception e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Error Instantiating " + msg + ", " + className + " failed to instantiate " + cast.getName(), e);
  }
}
// in core/src/java/org/apache/solr/core/SolrCore.java
private UpdateHandler createReloadedUpdateHandler(String className, String msg, UpdateHandler updateHandler) {
  Class<? extends UpdateHandler> clazz = null;
  if (msg == null) msg = "SolrCore Object";
  try {
    clazz = getResourceLoader().findClass(className, UpdateHandler.class);
    // Most of the classes do not have constructors which take a SolrCore argument
    // (it is recommended to obtain SolrCore by implementing SolrCoreAware), so asking
    // for one would invariably cause a NoSuchMethodException. Instead, iterate
    // through the list of available constructors.
    Constructor justSolrCoreCon = null;
    Constructor[] cons = clazz.getConstructors();
    for (Constructor con : cons) {
      Class[] types = con.getParameterTypes();
      if (types.length == 2 && types[0] == SolrCore.class && types[1] == UpdateHandler.class) {
        return (UpdateHandler) con.newInstance(this, updateHandler);
      }
    }
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Error Instantiating " + msg + ", " + className + " could not find proper constructor for " + UpdateHandler.class.getName());
  } catch (SolrException e) {
    throw e;
  } catch (Exception e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Error Instantiating " + msg + ", " + className + " failed to instantiate " + UpdateHandler.class.getName(), e);
  }
}
// in core/src/java/org/apache/solr/core/SolrCore.java
public UpdateRequestProcessorChain getUpdateProcessingChain(final String name) {
  UpdateRequestProcessorChain chain = updateProcessorChains.get(name);
  if (chain == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "unknown UpdateRequestProcessorChain: " + name);
  }
  return chain;
}
// in core/src/java/org/apache/solr/core/SolrCore.java
public SearchComponent getSearchComponent(String name) {
  SearchComponent component = searchComponents.get(name);
  if (component == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Unknown Search Component: " + name);
  }
  return component;
}
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> openNewSearcher(boolean updateHandlerReopens, boolean realtime) {
  SolrIndexSearcher tmp;
  RefCounted<SolrIndexSearcher> newestSearcher = null;
  boolean nrt = solrConfig.reopenReaders && updateHandlerReopens;
  openSearcherLock.lock();
  try {
    String newIndexDir = getNewIndexDir();
    File indexDirFile = null;
    File newIndexDirFile = null;
    // if it's not a normal near-realtime update, check that paths haven't changed.
    if (!nrt) {
      indexDirFile = new File(getIndexDir()).getCanonicalFile();
      newIndexDirFile = new File(newIndexDir).getCanonicalFile();
    }
    synchronized (searcherLock) {
      newestSearcher = realtimeSearcher;
      if (newestSearcher != null) {
        newestSearcher.incref(); // the matching decref is in the finally block
      }
    }
    if (newestSearcher != null && solrConfig.reopenReaders && (nrt || indexDirFile.equals(newIndexDirFile))) {
      DirectoryReader newReader;
      DirectoryReader currentReader = newestSearcher.get().getIndexReader();
      if (updateHandlerReopens) {
        // SolrCore.verbose("start reopen from",previousSearcher,"writer=",writer);
        IndexWriter writer = getUpdateHandler().getSolrCoreState().getIndexWriter(this);
        newReader = DirectoryReader.openIfChanged(currentReader, writer, true);
      } else {
        // verbose("start reopen without writer, reader=", currentReader);
        newReader = DirectoryReader.openIfChanged(currentReader);
        // verbose("reopen result", newReader);
      }
      if (newReader == null) {
        // if this is a request for a realtime searcher, just return the same searcher if there haven't been any changes.
        if (realtime) {
          newestSearcher.incref();
          return newestSearcher;
        }
        currentReader.incRef();
        newReader = currentReader;
      }
      // for now, turn off caches if this is for a realtime reader (caches take a little while to instantiate)
      tmp = new SolrIndexSearcher(this, schema, (realtime ? "realtime" : "main"), newReader, true, !realtime, true, directoryFactory);
    } else {
      // verbose("non-reopen START:");
      tmp = new SolrIndexSearcher(this, newIndexDir, schema, getSolrConfig().indexConfig, "main", true, directoryFactory);
      // verbose("non-reopen DONE: searcher=",tmp);
    }
    List<RefCounted<SolrIndexSearcher>> searcherList = realtime ? _realtimeSearchers : _searchers;
    RefCounted<SolrIndexSearcher> newSearcher = newHolder(tmp, searcherList); // refcount now at 1
    // Increment reference again for "realtimeSearcher" variable. It should be at 2 after.
    // When it's decremented by both the caller of this method, and by realtimeSearcher being replaced,
    // it will be closed.
    newSearcher.incref();
    synchronized (searcherLock) {
      if (realtimeSearcher != null) {
        realtimeSearcher.decref();
      }
      realtimeSearcher = newSearcher;
      searcherList.add(realtimeSearcher);
    }
    return newSearcher;
  } catch (Exception e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error opening new searcher", e);
  } finally {
    openSearcherLock.unlock();
    if (newestSearcher != null) {
      newestSearcher.decref();
    }
  }
}
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> getSearcher(boolean forceNew, boolean returnSearcher, final Future[] waitSearcher, boolean updateHandlerReopens) throws IOException {
  // it may take some time to open an index.... we may need to make
  // sure that two threads aren't trying to open one at the same time
  // if it isn't necessary.
  synchronized (searcherLock) {
    // see if we can return the current searcher
    if (_searcher != null && !forceNew) {
      if (returnSearcher) {
        _searcher.incref();
        return _searcher;
      } else {
        return null;
      }
    }
    // check to see if we can wait for someone else's searcher to be set
    if (onDeckSearchers > 0 && !forceNew && _searcher == null) {
      try {
        searcherLock.wait();
      } catch (InterruptedException e) {
        log.info(SolrException.toStr(e));
      }
    }
    // check again: see if we can return right now
    if (_searcher != null && !forceNew) {
      if (returnSearcher) {
        _searcher.incref();
        return _searcher;
      } else {
        return null;
      }
    }
    // At this point, we know we need to open a new searcher...
    // first: increment count to signal other threads that we are
    // opening a new searcher.
    onDeckSearchers++;
    if (onDeckSearchers < 1) {
      // should never happen... just a sanity check
      log.error(logid + "ERROR!!! onDeckSearchers is " + onDeckSearchers);
      onDeckSearchers = 1; // reset
    } else if (onDeckSearchers > maxWarmingSearchers) {
      onDeckSearchers--;
      String msg = "Error opening new searcher. exceeded limit of maxWarmingSearchers=" + maxWarmingSearchers + ", try again later.";
      log.warn(logid + "" + msg);
      // HTTP 503==service unavailable, or 409==Conflict
      throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, msg);
    } else if (onDeckSearchers > 1) {
      log.warn(logid + "PERFORMANCE WARNING: Overlapping onDeckSearchers=" + onDeckSearchers);
    }
  }
  // a signal to decrement onDeckSearchers if something goes wrong.
  final boolean[] decrementOnDeckCount = new boolean[]{true};
  RefCounted<SolrIndexSearcher> currSearcherHolder = null; // searcher we are autowarming from
  RefCounted<SolrIndexSearcher> searchHolder = null;
  boolean success = false;
  openSearcherLock.lock();
  try {
    searchHolder = openNewSearcher(updateHandlerReopens, false);
    // the searchHolder will be incremented once already (and it will eventually be assigned to _searcher when registered)
    // increment it again if we are going to return it to the caller.
    if (returnSearcher) {
      searchHolder.incref();
    }
    final RefCounted<SolrIndexSearcher> newSearchHolder = searchHolder;
    final SolrIndexSearcher newSearcher = newSearchHolder.get();
    boolean alreadyRegistered = false;
    synchronized (searcherLock) {
      if (_searcher == null) {
        // if there isn't a current searcher then we may
        // want to register this one before warming is complete instead of waiting.
        if (solrConfig.useColdSearcher) {
          registerSearcher(newSearchHolder);
          decrementOnDeckCount[0] = false;
          alreadyRegistered = true;
        }
      } else {
        // get a reference to the current searcher for purposes of autowarming.
        currSearcherHolder = _searcher;
        currSearcherHolder.incref();
      }
    }
    final SolrIndexSearcher currSearcher = currSearcherHolder == null ? null : currSearcherHolder.get();
    Future future = null;
    // warm the new searcher based on the current searcher.
    // should this go before the other event handlers or after?
    if (currSearcher != null) {
      future = searcherExecutor.submit(
        new Callable() {
          public Object call() throws Exception {
            try {
              newSearcher.warm(currSearcher);
            } catch (Throwable e) {
              SolrException.log(log, e);
            }
            return null;
          }
        }
      );
    }
    if (currSearcher == null && firstSearcherListeners.size() > 0) {
      future = searcherExecutor.submit(
        new Callable() {
          public Object call() throws Exception {
            try {
              for (SolrEventListener listener : firstSearcherListeners) {
                listener.newSearcher(newSearcher, null);
              }
            } catch (Throwable e) {
              SolrException.log(log, null, e);
            }
            return null;
          }
        }
      );
    }
// in core/src/java/org/apache/solr/core/SolrCore.java
public void execute(SolrRequestHandler handler, SolrQueryRequest req, SolrQueryResponse rsp) {
  if (handler == null) {
    String msg = "Null Request Handler '" + req.getParams().get(CommonParams.QT) + "'";
    if (log.isWarnEnabled()) log.warn(logid + msg + ":" + req);
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, msg);
  }
  // setup response header and handle request
  final NamedList<Object> responseHeader = new SimpleOrderedMap<Object>();
  rsp.add("responseHeader", responseHeader);
  // toLog is a local ref to the same NamedList used by the request
  NamedList<Object> toLog = rsp.getToLog();
  // for back compat, we set these now just in case other code
  // are expecting them during handleRequest
  toLog.add("webapp", req.getContext().get("webapp"));
  toLog.add("path", req.getContext().get("path"));
  toLog.add("params", "{" + req.getParamString() + "}");
  // TODO: this doesn't seem to be working correctly and causes problems with the example server and distrib (for example /spell)
  // if (req.getParams().getBool(ShardParams.IS_SHARD,false) && !(handler instanceof SearchHandler))
  //   throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"isShard is only acceptable with search handlers");
  handler.handleRequest(req, rsp);
  setResponseHeaderValues(handler, req, rsp);
  if (log.isInfoEnabled() && toLog.size() > 0) {
    StringBuilder sb = new StringBuilder(logid);
    for (int i = 0; i < toLog.size(); i++) {
      String name = toLog.getName(i);
      Object val = toLog.getVal(i);
      if (name != null) {
        sb.append(name).append('=');
      }
      sb.append(val).append(' ');
    }
    log.info(sb.toString());
  }
}
// in core/src/java/org/apache/solr/core/SolrCore.java
public static void setResponseHeaderValues(SolrRequestHandler handler, SolrQueryRequest req, SolrQueryResponse rsp) {
  // TODO should check that responseHeader has not been replaced by handler
  NamedList<Object> responseHeader = rsp.getResponseHeader();
  final int qtime = (int) (rsp.getEndTime() - req.getStartTime());
  int status = 0;
  Exception exception = rsp.getException();
  if (exception != null) {
    if (exception instanceof SolrException)
      status = ((SolrException) exception).code();
    else
      status = 500;
  }
  responseHeader.add("status", status);
  responseHeader.add("QTime", qtime);
  if (rsp.getToLog().size() > 0) {
    rsp.getToLog().add("status", status);
    rsp.getToLog().add("QTime", qtime);
  }
  SolrParams params = req.getParams();
  if (params.getBool(CommonParams.HEADER_ECHO_HANDLER, false)) {
    responseHeader.add("handler", handler.getName());
  }
  // Values for echoParams... false/true/all or false/explicit/all ???
  String ep = params.get(CommonParams.HEADER_ECHO_PARAMS, null);
  if (ep != null) {
    EchoParamStyle echoParams = EchoParamStyle.get(ep);
    if (echoParams == null) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Invalid value '" + ep + "' for " + CommonParams.HEADER_ECHO_PARAMS +
          " parameter, use '" + EchoParamStyle.EXPLICIT + "' or '" + EchoParamStyle.ALL + "'");
    }
    if (echoParams == EchoParamStyle.EXPLICIT) {
      responseHeader.add("params", req.getOriginalParams().toNamedList());
    } else if (echoParams == EchoParamStyle.ALL) {
      responseHeader.add("params", req.getParams().toNamedList());
    }
  }
}
// in core/src/java/org/apache/solr/core/SolrCore.java
private void initQParsers() {
  initPlugins(qParserPlugins, QParserPlugin.class);
  // default parsers
  for (int i = 0; i < QParserPlugin.standardPlugins.length; i += 2) {
    try {
      String name = (String) QParserPlugin.standardPlugins[i];
      if (null == qParserPlugins.get(name)) {
        Class<QParserPlugin> clazz = (Class<QParserPlugin>) QParserPlugin.standardPlugins[i + 1];
        QParserPlugin plugin = clazz.newInstance();
        qParserPlugins.put(name, plugin);
        plugin.init(null);
      }
    } catch (Exception e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    }
  }
}
// in core/src/java/org/apache/solr/core/SolrCore.java
public QParserPlugin getQueryPlugin(String parserName) {
  QParserPlugin plugin = qParserPlugins.get(parserName);
  if (plugin != null) return plugin;
  throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
      "Unknown query type '" + parserName + "'");
}
// in core/src/java/org/apache/solr/core/SolrCore.java
private void initValueSourceParsers() {
  initPlugins(valueSourceParsers, ValueSourceParser.class);
  // default value source parsers
  for (Map.Entry<String, ValueSourceParser> entry : ValueSourceParser.standardValueSourceParsers.entrySet()) {
    try {
      String name = entry.getKey();
      if (null == valueSourceParsers.get(name)) {
        ValueSourceParser valueSourceParser = entry.getValue();
        valueSourceParsers.put(name, valueSourceParser);
        valueSourceParser.init(null);
      }
    } catch (Exception e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    }
  }
}
// in core/src/java/org/apache/solr/core/SolrCore.java
private void initTransformerFactories() {
  // Load any transformer factories
  initPlugins(transformerFactories, TransformerFactory.class);
  // Tell each transformer what its name is
  for (Map.Entry<String, TransformerFactory> entry : TransformerFactory.defaultFactories.entrySet()) {
    try {
      String name = entry.getKey();
      if (null == valueSourceParsers.get(name)) {
        TransformerFactory f = entry.getValue();
        transformerFactories.put(name, f);
        // f.init(null); default ones don't need init
      }
    } catch (Exception e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    }
  }
}
// in core/src/java/org/apache/solr/core/SolrCore.java
public synchronized QueryResponseWriter getWrappedWriter() {
  if (_writer == null) {
    try {
      QueryResponseWriter writer = createQueryResponseWriter(_className);
      writer.init(_args);
      _writer = writer;
    } catch (Exception ex) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex);
    }
  }
  return _writer;
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public <T> Class<? extends T> findClass(String cname, Class<T> expectedType, String... subpackages) {
  if (subpackages == null || subpackages.length == 0 || subpackages == packages) {
    subpackages = packages;
    String c = classNameCache.get(cname);
    if (c != null) {
      try {
        return Class.forName(c, true, classLoader).asSubclass(expectedType);
      } catch (ClassNotFoundException e) {
        // this is unlikely
        log.error("Unable to load cached class-name : " + c + " for shortname : " + cname + e);
      }
    }
  }
  Class<? extends T> clazz = null;
  // first try cname == full name
  try {
    return Class.forName(cname, true, classLoader).asSubclass(expectedType);
  } catch (ClassNotFoundException e) {
    String newName = cname;
    if (newName.startsWith(project)) {
      newName = cname.substring(project.length() + 1);
    }
    for (String subpackage : subpackages) {
      try {
        String name = base + '.' + subpackage + newName;
        log.trace("Trying class name " + name);
        return clazz = Class.forName(name, true, classLoader).asSubclass(expectedType);
      } catch (ClassNotFoundException e1) {
        // ignore... assume first exception is best.
      }
    }
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Error loading class '" + cname + "'", e);
  } finally {
    // cache the shortname vs FQN if it is loaded by the webapp classloader and it is loaded
    // using a shortname
    if (clazz != null && clazz.getClassLoader() == SolrResourceLoader.class.getClassLoader() &&
        !cname.equals(clazz.getName()) && (subpackages.length == 0 || subpackages == packages)) {
      // store in the cache
      classNameCache.put(cname, clazz.getName());
    }
  }
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public <T> T newInstance(String cname, Class<T> expectedType, String... subpackages) {
  Class<? extends T> clazz = findClass(cname, expectedType, subpackages);
  if (clazz == null) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Can not find class: " + cname + " in " + classLoader);
  }
  T obj = null;
  try {
    obj = clazz.newInstance();
  } catch (Exception e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Error instantiating class: '" + clazz.getName() + "'", e);
  }
  if (!live) {
    if (obj instanceof SolrCoreAware) {
      assertAwareCompatibility(SolrCoreAware.class, obj);
      waitingForCore.add((SolrCoreAware) obj);
    }
    if (org.apache.solr.util.plugin.ResourceLoaderAware.class.isInstance(obj)) {
      log.warn("Class [{}] uses org.apache.solr.util.plugin.ResourceLoaderAware " +
          "which is deprecated. Change to org.apache.lucene.analysis.util.ResourceLoaderAware.", cname);
    }
    if (obj instanceof ResourceLoaderAware) {
      assertAwareCompatibility(ResourceLoaderAware.class, obj);
      waitingForResources.add((ResourceLoaderAware) obj);
    }
    if (obj instanceof SolrInfoMBean) {
      // TODO: Assert here?
      infoMBeans.add((SolrInfoMBean) obj);
    }
  }
  return obj;
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public CoreAdminHandler newAdminHandlerInstance(final CoreContainer coreContainer, String cname, String... subpackages) {
  Class<? extends CoreAdminHandler> clazz = findClass(cname, CoreAdminHandler.class, subpackages);
  if (clazz == null) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Can not find class: " + cname + " in " + classLoader);
  }
  CoreAdminHandler obj = null;
  try {
    Constructor<? extends CoreAdminHandler> ctor = clazz.getConstructor(CoreContainer.class);
    obj = ctor.newInstance(coreContainer);
  } catch (Exception e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Error instantiating class: '" + clazz.getName() + "'", e);
  }
  if (!live) {
    // TODO: Does SolrCoreAware make sense here since in a multi-core context
    // which core are we talking about ?
    if (org.apache.solr.util.plugin.ResourceLoaderAware.class.isInstance(obj)) {
      log.warn("Class [{}] uses org.apache.solr.util.plugin.ResourceLoaderAware " +
          "which is deprecated. Change to org.apache.lucene.analysis.util.ResourceLoaderAware.", cname);
    }
    if (obj instanceof ResourceLoaderAware) {
      assertAwareCompatibility(ResourceLoaderAware.class, obj);
      waitingForResources.add((ResourceLoaderAware) obj);
    }
  }
  return obj;
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public <T> T newInstance(String cName, Class<T> expectedType, String[] subPackages, Class[] params, Object[] args) {
  Class<? extends T> clazz = findClass(cName, expectedType, subPackages);
  if (clazz == null) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Can not find class: " + cName + " in " + classLoader);
  }
  T obj = null;
  try {
    Constructor<? extends T> constructor = clazz.getConstructor(params);
    obj = constructor.newInstance(args);
  } catch (Exception e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Error instantiating class: '" + clazz.getName() + "'", e);
  }
  if (!live) {
    if (obj instanceof SolrCoreAware) {
      assertAwareCompatibility(SolrCoreAware.class, obj);
      waitingForCore.add((SolrCoreAware) obj);
    }
    if (org.apache.solr.util.plugin.ResourceLoaderAware.class.isInstance(obj)) {
      log.warn("Class [{}] uses org.apache.solr.util.plugin.ResourceLoaderAware " +
          "which is deprecated. Change to org.apache.lucene.analysis.util.ResourceLoaderAware.", cName);
    }
    if (obj instanceof ResourceLoaderAware) {
      assertAwareCompatibility(ResourceLoaderAware.class, obj);
      waitingForResources.add((ResourceLoaderAware) obj);
    }
    if (obj instanceof SolrInfoMBean) {
      // TODO: Assert here?
      infoMBeans.add((SolrInfoMBean) obj);
    }
  }
  return obj;
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
void assertAwareCompatibility(Class aware, Object obj) {
  Class[] valid = awareCompatibility.get(aware);
  if (valid == null) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Unknown Aware interface: " + aware);
  }
  for (Class v : valid) {
    if (v.isInstance(obj)) {
      return;
    }
  }
  StringBuilder builder = new StringBuilder();
  builder.append("Invalid 'Aware' object: ").append(obj);
  builder.append(" -- ").append(aware.getName());
  builder.append(" must be an instance of: ");
  for (Class v : valid) {
    builder.append("[").append(v.getName()).append("] ");
  }
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, builder.toString());
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, InputSource cfgis) throws ParserConfigurationException, IOException, SAXException {
  if (null == dir) {
    // don't rely on SolrResourceLoader(), determine explicitly first
    dir = SolrResourceLoader.locateSolrHome();
  }
  log.info("Loading CoreContainer using Solr Home: '{}'", dir);
  this.loader = new SolrResourceLoader(dir);
  solrHome = loader.getInstanceDir();
  Config cfg = new Config(loader, null, cfgis, null, false);
  // keep orig config for persist to consult
  try {
    this.cfg = new Config(loader, null, copyDoc(cfg.getDocument()));
  } catch (TransformerException e) {
    throw new SolrException(ErrorCode.SERVER_ERROR, "", e);
  }
  cfg.substituteProperties();
  // Initialize Logging
  if (cfg.getBool("solr/logging/@enabled", true)) {
    String slf4jImpl = null;
    String fname = cfg.get("solr/logging/watcher/@class", null);
    try {
      slf4jImpl = StaticLoggerBinder.getSingleton().getLoggerFactoryClassStr();
      if (fname == null) {
        if (slf4jImpl.indexOf("Log4j") > 0) {
          log.warn("Log watching is not yet implemented for log4j");
        } else if (slf4jImpl.indexOf("JDK") > 0) {
          fname = "JUL";
        }
      }
    } catch (Throwable ex) {
      log.warn("Unable to read SLF4J version. LogWatcher will be disabled: " + ex);
    }
    // Now load the framework
    if (fname != null) {
      if ("JUL".equalsIgnoreCase(fname)) {
        logging = new JulWatcher(slf4jImpl);
      }
      // else if( "Log4j".equals(fname) ) {
      //   logging = new Log4jWatcher(slf4jImpl);
      // }
      else {
        try {
          logging = loader.newInstance(fname, LogWatcher.class);
        } catch (Throwable e) {
          log.warn("Unable to load LogWatcher", e);
        }
      }
      if (logging != null) {
        ListenerConfig v = new ListenerConfig();
        v.size = cfg.getInt("solr/logging/watcher/@size", 50);
        v.threshold = cfg.get("solr/logging/watcher/@threshold", null);
        if (v.size > 0) {
          log.info("Registering Log Listener");
          logging.registerListener(v, this);
        }
      }
    }
  }
  String dcoreName = cfg.get("solr/cores/@defaultCoreName", null);
  if (dcoreName != null && !dcoreName.isEmpty()) {
    defaultCoreName = dcoreName;
  }
  persistent = cfg.getBool("solr/@persistent", false);
  libDir = cfg.get("solr/@sharedLib", null);
  zkHost = cfg.get("solr/@zkHost", null);
  adminPath = cfg.get("solr/cores/@adminPath", null);
  shareSchema = cfg.getBool("solr/cores/@shareSchema", DEFAULT_SHARE_SCHEMA);
  zkClientTimeout = cfg.getInt("solr/cores/@zkClientTimeout", DEFAULT_ZK_CLIENT_TIMEOUT);
  hostPort = cfg.get("solr/cores/@hostPort", DEFAULT_HOST_PORT);
  hostContext = cfg.get("solr/cores/@hostContext", DEFAULT_HOST_CONTEXT);
  host = cfg.get("solr/cores/@host", null);
  if (shareSchema) {
    indexSchemaCache = new ConcurrentHashMap<String, IndexSchema>();
  }
  adminHandler = cfg.get("solr/cores/@adminHandler", null);
  managementPath = cfg.get("solr/cores/@managementPath", null);
  zkClientTimeout = Integer.parseInt(System.getProperty("zkClientTimeout", Integer.toString(zkClientTimeout)));
  initZooKeeper(zkHost, zkClientTimeout);
  if (libDir != null) {
    File f = FileUtils.resolvePath(new File(dir), libDir);
    log.info("loading shared library: " + f.getAbsolutePath());
    libLoader = SolrResourceLoader.createClassLoader(f, null);
  }
  if (adminPath != null) {
    if (adminHandler == null) {
      coreAdminHandler = new CoreAdminHandler(this);
    } else {
      coreAdminHandler = this.createMultiCoreHandler(adminHandler);
    }
  }
  try {
    containerProperties = readProperties(cfg, ((NodeList) cfg.evaluate(DEFAULT_HOST_CONTEXT, XPathConstants.NODESET)).item(0));
  } catch (Throwable e) {
    SolrException.log(log, null, e);
  }
  NodeList nodes = (NodeList) cfg.evaluate("solr/cores/core", XPathConstants.NODESET);
  for (int i = 0; i < nodes.getLength(); i++) {
    Node node = nodes.item(i);
    try {
      String rawName = DOMUtil.getAttr(node, "name", null);
      if (null == rawName) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "Each core in solr.xml must have a 'name'");
      }
      String name = rawName;
      CoreDescriptor p = new CoreDescriptor(this, name, DOMUtil.getAttr(node, "instanceDir", null));
      // deal with optional settings
      String opt = DOMUtil.getAttr(node, "config", null);
      if (opt != null) {
        p.setConfigName(opt);
      }
      opt = DOMUtil.getAttr(node, "schema", null);
      if (opt != null) {
        p.setSchemaName(opt);
      }
      if (zkController != null) {
        opt = DOMUtil.getAttr(node, "shard", null);
        if (opt != null && opt.length() > 0) {
          p.getCloudDescriptor().setShardId(opt);
        }
        opt = DOMUtil.getAttr(node, "collection", null);
        if (opt != null) {
          p.getCloudDescriptor().setCollectionName(opt);
        }
        opt = DOMUtil.getAttr(node, "roles", null);
        if (opt != null) {
          p.getCloudDescriptor().setRoles(opt);
        }
      }
      opt = DOMUtil.getAttr(node, "properties", null);
      if (opt != null) {
        p.setPropertiesName(opt);
      }
      opt = DOMUtil.getAttr(node, CoreAdminParams.DATA_DIR, null);
      if (opt != null) {
        p.setDataDir(opt);
      }
      p.setCoreProperties(readProperties(cfg, node));
      SolrCore core = create(p);
      register(name, core, false);
      // track original names
      coreToOrigName.put(core, rawName);
    } catch (Throwable ex) {
      SolrException.log(log, null, ex);
    }
  }
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void reload(String name) throws ParserConfigurationException, IOException, SAXException {
  name = checkDefault(name);
  SolrCore core;
  synchronized (cores) {
    core = cores.get(name);
  }
  if (core == null)
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name);
  CoreDescriptor cd = core.getCoreDescriptor();
  File instanceDir = new File(cd.getInstanceDir());
  if (!instanceDir.isAbsolute()) {
    instanceDir = new File(getSolrHome(), cd.getInstanceDir());
  }
  log.info("Reloading SolrCore '{}' using instanceDir: {}", cd.getName(), instanceDir.getAbsolutePath());
  SolrResourceLoader solrLoader;
  if (zkController == null) {
    solrLoader = new SolrResourceLoader(instanceDir.getAbsolutePath(), libLoader,
        getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(), cd.getCoreProperties()));
  } else {
    try {
      String collection = cd.getCloudDescriptor().getCollectionName();
      zkController.createCollectionZkNode(cd.getCloudDescriptor());
      String zkConfigName = zkController.readConfigName(collection);
      if (zkConfigName == null) {
        log.error("Could not find config name for collection:" + collection);
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR,
            "Could not find config name for collection:" + collection);
      }
      solrLoader = new ZkSolrResourceLoader(instanceDir.getAbsolutePath(), zkConfigName, libLoader,
          getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(), cd.getCoreProperties()), zkController);
    } catch (KeeperException e) {
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
  SolrCore newCore = core.reload(solrLoader);
  // keep core to orig name link
  String origName = coreToOrigName.remove(core);
  if (origName != null) {
    coreToOrigName.put(newCore, origName);
  }
  register(name, newCore, false);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void swap(String n0, String n1) {
  if (n0 == null || n1 == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can not swap unnamed cores.");
  }
  n0 = checkDefault(n0);
  n1 = checkDefault(n1);
  synchronized (cores) {
    SolrCore c0 = cores.get(n0);
    SolrCore c1 = cores.get(n1);
    if (c0 == null)
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core: " + n0);
    if (c1 == null)
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core: " + n1);
    cores.put(n0, c1);
    cores.put(n1, c0);
    c0.setName(n1);
    c0.getCoreDescriptor().name = n1;
    c1.setName(n0);
    c1.getCoreDescriptor().name = n0;
  }
  log.info("swapped: " + n0 + " with " + n1);
}
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
void persistFile(File file, SolrXMLDef solrXMLDef) {
  log.info("Persisting cores config to " + file);
  File tmpFile = null;
  try {
    // write in temp first
    tmpFile = File.createTempFile("solr", ".xml", file.getParentFile());
    java.io.FileOutputStream out = new java.io.FileOutputStream(tmpFile);
    Writer writer = new BufferedWriter(new OutputStreamWriter(out, "UTF-8"));
    try {
      persist(writer, solrXMLDef);
    } finally {
      writer.close();
      out.close();
    }
    // rename over origin or copy if this fails
    if (tmpFile != null) {
      if (tmpFile.renameTo(file))
        tmpFile = null;
      else
        fileCopy(tmpFile, file);
    }
  } catch (java.io.FileNotFoundException xnf) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xnf);
  } catch (java.io.IOException xio) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xio);
  } finally {
    if (tmpFile != null) {
      if (!tmpFile.delete())
        tmpFile.deleteOnExit();
    }
  }
}
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
private void unregister(String key, SolrInfoMBean infoBean) {
  if (server == null)
    return;
  try {
    ObjectName name = getObjectName(key, infoBean);
    if (server.isRegistered(name) && coreHashCode.equals(server.getAttribute(name, "coreHashCode"))) {
      server.unregisterMBean(name);
    }
  } catch (Exception e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Failed to unregister info bean: " + key, e);
  }
}
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static DocList doSimpleQuery(String sreq, SolrQueryRequest req, int start, int limit) throws IOException {
  List<String> commands = StrUtils.splitSmart(sreq, ';');
  String qs = commands.size() >= 1 ? commands.get(0) : "";
  try {
    Query query = QParser.getParser(qs, null, req).getQuery();
    // If the first non-query, non-filter command is a simple sort on an indexed field, then
    // we can use the Lucene sort ability.
    Sort sort = null;
    if (commands.size() >= 2) {
      sort = QueryParsing.parseSort(commands.get(1), req);
    }
    DocList results = req.getSearcher().getDocList(query, (DocSet) null, sort, start, limit);
    return results;
  } catch (ParseException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs);
  }
}
// in core/src/java/org/apache/solr/util/DOMUtil.java
public static String substituteProperty(String value, Properties coreProperties) {
  if (value == null || value.indexOf('$') == -1) {
    return value;
  }
  List<String> fragments = new ArrayList<String>();
  List<String> propertyRefs = new ArrayList<String>();
  parsePropertyString(value, fragments, propertyRefs);
  StringBuilder sb = new StringBuilder();
  Iterator<String> i = fragments.iterator();
  Iterator<String> j = propertyRefs.iterator();
  while (i.hasNext()) {
    String fragment = i.next();
    if (fragment == null) {
      String propertyName = j.next();
      String defaultValue = null;
      int colon_index = propertyName.indexOf(':');
      if (colon_index > -1) {
        defaultValue = propertyName.substring(colon_index + 1);
        propertyName = propertyName.substring(0, colon_index);
      }
      if (coreProperties != null) {
        fragment = coreProperties.getProperty(propertyName);
      }
      if (fragment == null) {
        fragment = System.getProperty(propertyName, defaultValue);
      }
      if (fragment == null) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "No system property or default value specified for " + propertyName + " value:" + value);
      }
    }
    sb.append(fragment);
  }
  return sb.toString();
}
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
public T load(ResourceLoader loader, NodeList nodes) {
  List<PluginInitInfo> info = new ArrayList<PluginInitInfo>();
  T defaultPlugin = null;
  if (nodes != null) {
    for (int i = 0; i < nodes.getLength(); i++) {
      Node node = nodes.item(i);
      String name = null;
      try {
        name = DOMUtil.getAttr(node, "name", requireName ? type : null);
        String className = DOMUtil.getAttr(node, "class", type);
        String defaultStr = DOMUtil.getAttr(node, "default", null);
        T plugin = create(loader, name, className, node);
        log.debug("created " + ((name != null) ? name : "") + ": " + plugin.getClass().getName());
        // Either initialize now or wait till everything has been registered
        if (preRegister) {
          info.add(new PluginInitInfo(plugin, node));
        } else {
          init(plugin, node);
        }
        T old = register(name, plugin);
        if (old != null && !(name == null && !requireName)) {
          throw new SolrException(ErrorCode.SERVER_ERROR,
              "Multiple " + type + " registered to the same name: " + name + " ignoring: " + old);
        }
        if (defaultStr != null && Boolean.parseBoolean(defaultStr)) {
          if (defaultPlugin != null) {
            throw new SolrException(ErrorCode.SERVER_ERROR,
                "Multiple default " + type + " plugins: " + defaultPlugin + " AND " + name);
          }
          defaultPlugin = plugin;
        }
      } catch (Exception ex) {
        SolrException e = new SolrException(ErrorCode.SERVER_ERROR,
            "Plugin init failure for " + type +
            (null != name ? (" \"" + name + "\"") : "") + ": " + ex.getMessage(), ex);
        throw e;
      }
    }
  }
  // If everything needs to be registered *first*, this will initialize later
  for (PluginInitInfo pinfo : info) {
    try {
      init(pinfo.plugin, pinfo.node);
    } catch (Exception ex) {
      SolrException e = new SolrException(ErrorCode.SERVER_ERROR,
          "Plugin Initializing failure for " + type, ex);
      throw e;
    }
  }
  return defaultPlugin;
}
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
public T loadSingle(ResourceLoader loader, Node node) {
  List<PluginInitInfo> info = new ArrayList<PluginInitInfo>();
  T plugin = null;
  try {
    String name = DOMUtil.getAttr(node, "name", requireName ? type : null);
    String className = DOMUtil.getAttr(node, "class", type);
    plugin = create(loader, name, className, node);
    log.debug("created " + name + ": " + plugin.getClass().getName());
    // Either initialize now or wait till everything has been registered
    if (preRegister) {
      info.add(new PluginInitInfo(plugin, node));
    } else {
      init(plugin, node);
    }
    T old = register(name, plugin);
    if (old != null && !(name == null && !requireName)) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Multiple " + type + " registered to the same name: " + name + " ignoring: " + old);
    }
  } catch (Exception ex) {
    SolrException e = new SolrException(ErrorCode.SERVER_ERROR,
        "Plugin init failure for " + type, ex);
    throw e;
  }
  // If everything needs to be registered *first*, this will initialize later
  for (PluginInitInfo pinfo : info) {
    try {
      init(pinfo.plugin, pinfo.node);
    } catch (Exception ex) {
      SolrException e = new SolrException(ErrorCode.SERVER_ERROR,
          "Plugin init failure for " + type, ex);
      throw e;
    }
  }
  return plugin;
}
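Both loader methods above follow a two-phase scheme: optionally register every plugin first, then run init() in a second pass, wrapping any failure in a SERVER_ERROR that names the plugin type. A condensed sketch of that register-then-init flow, with a hypothetical Plugin interface standing in for Solr's plugin types:

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// A minimal sketch of the register-first, init-later flow; Plugin and the
// preRegister flag are illustrative stand-ins, not Solr types.
public class TwoPhaseLoader {
  interface Plugin { void init(); }

  static Map<String, Plugin> load(Map<String, Plugin> configured, boolean preRegister) {
    Map<String, Plugin> registry = new LinkedHashMap<>();
    List<Plugin> deferred = new ArrayList<>();
    for (Map.Entry<String, Plugin> e : configured.entrySet()) {
      try {
        if (preRegister) {
          deferred.add(e.getValue()); // init only after everything is registered
        } else {
          e.getValue().init();
        }
        Plugin old = registry.put(e.getKey(), e.getValue());
        if (old != null) {
          throw new IllegalStateException("Multiple plugins registered to: " + e.getKey());
        }
      } catch (Exception ex) {
        // wrap with context naming the failing plugin, as the loader does
        throw new RuntimeException("Plugin init failure for " + e.getKey(), ex);
      }
    }
    // second pass: every name is now resolvable by the plugins being initialized
    for (Plugin p : deferred) {
      p.init();
    }
    return registry;
  }

  public static void main(String[] args) {
    Map<String, Plugin> cfg = new LinkedHashMap<>();
    cfg.put("echo", () -> System.out.println("echo initialized"));
    load(cfg, true);
  }
}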
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of range 'include' information",e); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
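The twelve identical catch blocks above come from SolrParams' family of typed getters, each of which converts a parse failure into a BAD_REQUEST. A plausible reconstruction of one such getter (the body is illustrative, not copied from SolrParams, and assumes the class's abstract raw accessor get(String)):

// Illustrative reconstruction of a typed getter inside SolrParams that
// yields the "catch(Exception) -> BAD_REQUEST" blocks listed above.
public Integer getInt(String param, Integer def) {
  String val = get(param); // assumed raw-string accessor on SolrParams
  try {
    return val == null ? def : Integer.valueOf(val);
  } catch (Exception ex) {
    // invalid client input is reported as 400, preserving the parse message
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex);
  }
}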
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (MimeTypeException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
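The Tika and UIMA snippets above share a tolerate-or-fail switch: with an "ignore errors" flag set, the failure is logged with enough context to locate the offending document; otherwise it is wrapped and rethrown. A minimal sketch of the pattern, with hypothetical process/ignoreErrors stand-ins rather than Solr APIs:

// Sketch of the tolerate-or-fail switch, assuming an extraction step
// `process` and an `ignoreErrors` flag; both are illustrative stand-ins.
public class TolerantProcessor {
  void process(String text) throws Exception {
    // stand-in for the Tika extraction / UIMA analysis step
  }

  void processDocument(String docId, String text, boolean ignoreErrors) {
    try {
      process(text);
    } catch (Exception e) {
      // keep enough context (id + text prefix) to locate the document later
      String context = "docId=" + docId + " text=\""
          + text.substring(0, Math.min(text.length(), 100)) + "...\"";
      if (ignoreErrors) {
        // degrade gracefully: skip this document but record what happened
        System.err.println("skip processing due to " + e.getLocalizedMessage() + " " + context);
      } else {
        throw new RuntimeException("processing error: " + context, e);
      }
    }
  }
}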
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/StempelPolishStemFilterFactory.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (Throwable e) { LOG.error( DataImporter.MSG.LOAD_EXP, e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, DataImporter.MSG.INVALID_CONFIG, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write index.properties", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; }
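This SnapPuller catch block implements bounded retry: failures increment an error counter, and only after MAX_RETRIES does the code escalate to a SERVER_ERROR. A self-contained sketch of the idiom, assuming a retryable fetchOnce step (illustrative, not Solr's actual transfer code):

import java.util.function.Supplier;

// Sketch of the bounded-retry idiom: retry the same unit of work until a
// failure budget is exhausted, then surface the last cause with context.
public class BoundedRetry {
  static final int MAX_RETRIES = 5; // illustrative budget

  static byte[] fetchWithRetry(Supplier<byte[]> fetchOnce) {
    int errorCount = 0;
    while (true) {
      try {
        return fetchOnce.get();
      } catch (RuntimeException e) {
        errorCount++; // count this failure against the budget
        if (errorCount > MAX_RETRIES) {
          // give up: escalate with the final cause attached
          throw new RuntimeException("Fetch failed after " + MAX_RETRIES + " retries", e);
        }
        // otherwise loop and retry the same packet
      }
    }
  }
}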
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback"); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,e); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + flagParam + "'"); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (ExecutionException e) { // should be impossible... the problem with catching the exception // at this level is we don't know what ShardRequest it applied to throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Impossible Exception",e); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error initializing QueryElevationComponent.", ex); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "query requires '<doc .../>' child"); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (MalformedURLException e) { // should be impossible since we're not passing any URLs here throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unable to read original XML", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'reload' action", ex); }
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unable to parse "+type+"="+val, ex ); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "exception at docid " + docid + " for valuesource " + valueSource, e); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unknown Explain Style: "+str ); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } }
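These two catch blocks pair the standard executor idioms: restore the thread's interrupt status before wrapping an InterruptedException, and unwrap an ExecutionException so a RuntimeException cause propagates unchanged. A compact sketch of both, with RuntimeException standing in for SolrException(SERVER_ERROR, ...):

import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;

// Sketch of the InterruptedException / ExecutionException handling idioms
// from the PerSegmentSingleValuedFaceting snippets above.
public class FutureIdioms {
  static <T> T await(Future<T> future) {
    try {
      return future.get();
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt(); // restore interrupt status first
      throw new RuntimeException(e);
    } catch (ExecutionException e) {
      Throwable cause = e.getCause();
      if (cause instanceof RuntimeException) {
        throw (RuntimeException) cause; // propagate unchecked causes unchanged
      }
      throw new RuntimeException("Error in background task", cause);
    }
  }
}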
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (ParseException e) { throw new SolrException(ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IllegalStateException ise) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, ise.getMessage()); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Exception e) { //unlikely throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,e); }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
catch( UnsupportedEncodingException uex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, uex ); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (IOException e) { // we're pretty freaking screwed if this happens throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch(Exception e) { /* unexpected exception... */ throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Schema Parsing Failed: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error instansiating exhange rate provider "+exchangeRateProviderClass+". Please check your FieldType configuration", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/TrieField.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/FieldType.java
catch (RuntimeException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error while creating field '" + field + "' from value '" + value + "'", e); }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { /* try again, simple rules for a field name with no whitespace */ sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { /* OK, it was an oddly named field */ fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } }
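
This ReturnFields catch is one of the few recovery-oriented catches in the list: it retries with a simpler parsing rule and only rethrows when the fallback also fails, preserving the first failure as the cause. The shape, reduced to a sketch with illustrative names:

public class FallbackParse {
  static int parse(String s) {
    try {
      return Integer.parseInt(s);                  // strict attempt
    } catch (NumberFormatException e) {
      try {
        return Integer.parseInt(s.trim());         // simpler fallback rule
      } catch (NumberFormatException ignored) {
        // fallback failed too: report the original failure as the cause
        throw new IllegalArgumentException("Error parsing: " + s, e);
      }
    }
  }

  public static void main(String[] args) {
    System.out.println(parse(" 7 "));              // prints 7
  }
}
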
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname", e); }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
catch (IOException e) { /* TODO: remove this if ConstantScoreQuery.createWeight adds IOException */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/TopGroupsShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/SearchGroupShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (QuorumPeerConfig.ConfigException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Throwable e) { log.error("ZooKeeper Server ERROR", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Interrupted"); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
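
Both ZkController catches implement a fixed-interval retry loop: log the failure, sleep two seconds, and give up only on the last attempt. Note that the terminal SolrException is built from a fresh message with no cause attached, so the stack trace of the final failure is lost. A rough reconstruction of the loop with illustrative names (unlike the original, this sketch keeps the last failure as the cause):

public class RetryLoop {
  interface Action { void run() throws Exception; }

  static void runWithRetries(Action action, int retries) {
    for (int i = 0; i < retries; i++) {
      try {
        action.run();
        return;                                    // success, stop retrying
      } catch (Exception e) {
        System.err.println("attempt " + (i + 1) + " failed: " + e);
        try {
          Thread.sleep(2000);                      // same fixed backoff as above
        } catch (InterruptedException e1) {
          Thread.currentThread().interrupt();
        }
        if (i == retries - 1) {
          throw new RuntimeException("giving up after " + retries + " attempts", e);
        }
      }
    }
  }

  public static void main(String[] args) {
    runWithRetries(() -> System.out.println("ok"), 3);
  }
}
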
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Illegal value for " + DISTRIB_UPDATE_PARAM + ": " + param, e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); }
// in core/src/java/org/apache/solr/update/processor/MinFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/RegexReplaceProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid regex: " + patternParam, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (Exception e) { throw new SolrException(SERVER_ERROR, "Can't resolve typeClass: " + t, e); }
// in core/src/java/org/apache/solr/update/processor/MaxFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException (SERVER_ERROR, "Invalid 'fieldRegex' pattern: " + s, e); }
// in core/src/java/org/apache/solr/update/VersionInfo.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error reading version from index", e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "interrupted waiting for shard update response", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { /* TODO: reset our file pointer back to "pos", the start of this record. */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error logging add", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + path +" for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " +cast.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " + UpdateHandler.class.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { latch.countDown(); /* release the latch, otherwise we block trying to do the close. This should be fine, since counting down on a latch of 0 is still fine */ /* close down the searcher and any other resources, if it exists, as this is not recoverable */ close(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, null, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error opening new searcher", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { if (e instanceof SolrException) throw (SolrException)e; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
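
This SolrCore catch shows the guard against double-wrapping: a SolrException already carries its ErrorCode, so it is rethrown untouched, and only foreign exceptions get wrapped. The generic shape, with a hypothetical AppException standing in for SolrException:

public class WrapOnce {
  static class AppException extends RuntimeException {
    AppException(Throwable cause) { super(cause); }
  }

  static void invoke(Runnable r) {
    try {
      r.run();
    } catch (Exception e) {
      if (e instanceof AppException) throw (AppException) e;  // already domain-typed
      throw new AppException(e);                              // wrap exactly once
    }
  }

  public static void main(String[] args) {
    invoke(() -> System.out.println("ok"));
  }
}
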
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (CharacterCodingException ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading resource (wrong encoding?): " + resource, ex); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { /* ignore... assume first exception is best. */ } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.FileNotFoundException xnf) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xnf); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.IOException xio) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xio); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to unregister info bean: " + key, e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); }
// in core/src/java/org/apache/solr/util/DOMUtil.java
catch (NumberFormatException nfe) { throw new SolrException (SolrException.ErrorCode.SERVER_ERROR, "Value " + (null != name ? ("of '" +name+ "' ") : "") + "can not be parsed as '" +type+ "': \"" + textValue + "\"", nfe); }
15
              
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleCreateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { try { SolrParams params = req.getParams(); String name = params.get(CoreAdminParams.NAME); /* for now, do not allow creating new core with same name when in cloud mode */ /* XXX perhaps it should just be unregistered from cloud before readding it?, */ /* XXX perhaps we should also check that cores are of same type before adding new core to collection? */ if (coreContainer.getZkController() != null) { if (coreContainer.getCore(name) != null) { log.info("Re-creating a core with existing name is not allowed in cloud mode"); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core with name '" + name + "' already exists."); } } String instanceDir = params.get(CoreAdminParams.INSTANCE_DIR); if (instanceDir == null) { /* instanceDir = coreContainer.getSolrHome() + "/" + name; */ instanceDir = name; /* bare name is already relative to solr home */ } CoreDescriptor dcore = new CoreDescriptor(coreContainer, name, instanceDir); /* fillup optional parameters */ String opts = params.get(CoreAdminParams.CONFIG); if (opts != null) dcore.setConfigName(opts); opts = params.get(CoreAdminParams.SCHEMA); if (opts != null) dcore.setSchemaName(opts); opts = params.get(CoreAdminParams.DATA_DIR); if (opts != null) dcore.setDataDir(opts); CloudDescriptor cd = dcore.getCloudDescriptor(); if (cd != null) { cd.setParams(req.getParams()); opts = params.get(CoreAdminParams.COLLECTION); if (opts != null) cd.setCollectionName(opts); opts = params.get(CoreAdminParams.SHARD); if (opts != null) cd.setShardId(opts); opts = params.get(CoreAdminParams.ROLES); if (opts != null) cd.setRoles(opts); Integer numShards = params.getInt(ZkStateReader.NUM_SHARDS_PROP); if (numShards != null) cd.setNumShards(numShards); } /* Process all property.name=value parameters and set them as name=value core properties */ Properties coreProperties = new Properties(); Iterator<String> parameterNamesIterator = params.getParameterNamesIterator(); while (parameterNamesIterator.hasNext()) { String parameterName = parameterNamesIterator.next(); if(parameterName.startsWith(CoreAdminParams.PROPERTY_PREFIX)) { String parameterValue = params.get(parameterName); String propertyName = parameterName.substring(CoreAdminParams.PROPERTY_PREFIX.length()); /* skip prefix */ coreProperties.put(propertyName, parameterValue); } } dcore.setCoreProperties(coreProperties); SolrCore core = coreContainer.create(dcore); coreContainer.register(name, core, false); rsp.add("core", core.getName()); return coreContainer.isPersistent(); } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleRenameAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); String name = params.get(CoreAdminParams.OTHER); String cname = params.get(CoreAdminParams.CORE); boolean doPersist = false; if (cname.equals(name)) return doPersist; doPersist = coreContainer.isPersistent(); coreContainer.rename(cname, name); return doPersist; }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleUnloadAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); SolrCore core = coreContainer.remove(cname); if(core == null){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core exists '" + cname + "'"); } else { if (coreContainer.getZkController() != null) { log.info("Unregistering core " + cname + " from cloudstate."); try { coreContainer.getZkController().unregister(cname, core.getCoreDescriptor().getCloudDescriptor()); } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); } catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); } } } if (params.getBool(CoreAdminParams.DELETE_INDEX, false)) { core.addCloseHook(new CloseHook() { @Override public void preClose(SolrCore core) {} @Override public void postClose(SolrCore core) { File dataDir = new File(core.getIndexDir()); File[] files = dataDir.listFiles(); if (files != null) { for (File file : files) { if (!file.delete()) { log.error(file.getAbsolutePath() + " could not be deleted on core unload"); } } if (!dataDir.delete()) log.error(dataDir.getAbsolutePath() + " could not be deleted on core unload"); } else { log.error(dataDir.getAbsolutePath() + " could not be deleted on core unload"); } } }); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleStatusAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); boolean doPersist = false; NamedList<Object> status = new SimpleOrderedMap<Object>(); try { if (cname == null) { rsp.add("defaultCoreName", coreContainer.getDefaultCoreName()); for (String name : coreContainer.getCoreNames()) { status.add(name, getCoreStatus(coreContainer, name)); } } else { status.add(cname, getCoreStatus(coreContainer, cname)); } rsp.add("status", status); doPersist = false; /* no state change */ return doPersist; } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handlePersistAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); boolean doPersist = false; String fileName = params.get(CoreAdminParams.FILE); if (fileName != null) { File file = new File(coreContainer.getConfigFile().getParentFile(), fileName); coreContainer.persistFile(file); rsp.add("saved", file.getAbsolutePath()); doPersist = false; } else if (!coreContainer.isPersistent()) { throw new SolrException(SolrException.ErrorCode.FORBIDDEN, "Persistence is not enabled"); } else doPersist = true; return doPersist; }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
protected void handleEnable(boolean enable) throws SolrException { if (healthcheck == null) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "No healthcheck file defined."); } if ( enable ) { try { /* write out when the file was created */ FileUtils.write(healthcheck, DateField.formatExternal(new Date()), "UTF-8"); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); } } else { if (healthcheck.exists() && !healthcheck.delete()){ throw new SolrException(SolrException.ErrorCode.NOT_FOUND, "Did not successfully delete healthcheck file: " +healthcheck.getAbsolutePath()); } } }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override public boolean reload() throws SolrException { InputStream ratesJsonStream = null; try { log.info("Reloading exchange rates from "+ratesFileLocation); try { ratesJsonStream = (new URL(ratesFileLocation)).openStream(); } catch (Exception e) { ratesJsonStream = resourceLoader.openResource(ratesFileLocation); } rates = new OpenExchangeRates(ratesJsonStream); return true; } catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); } finally { if (ratesJsonStream != null) try { ratesJsonStream.close(); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); } } }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override public void init(Map<String,String> params) throws SolrException { try { ratesFileLocation = getParam(params.get(PARAM_RATES_FILE_LOCATION), DEFAULT_RATES_FILE_LOCATION); refreshInterval = Integer.parseInt(getParam(params.get(PARAM_REFRESH_INTERVAL), DEFAULT_REFRESH_INTERVAL)); /* Force a refresh interval of minimum one hour, since the API does not offer better resolution */ if (refreshInterval < 60) { refreshInterval = 60; log.warn("Specified refreshInterval was too small. Setting to 60 minutes which is the update rate of openexchangerates.org"); } log.info("Initialized with rates="+ratesFileLocation+", refreshInterval="+refreshInterval+"."); } catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); } finally { /* Removing config params custom to us */ params.remove(PARAM_RATES_FILE_LOCATION); params.remove(PARAM_REFRESH_INTERVAL); } }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override public void inform(ResourceLoader loader) throws SolrException { resourceLoader = loader; reload(); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public boolean reload() throws SolrException { InputStream is = null; Map<String, Map<String, Double>> tmpRates = new HashMap<String, Map<String, Double>>(); try { log.info("Reloading exchange rates from file "+this.currencyConfigFile); is = loader.openResource(currencyConfigFile); javax.xml.parsers.DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance(); try { dbf.setXIncludeAware(true); dbf.setNamespaceAware(true); } catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); } try { Document doc = dbf.newDocumentBuilder().parse(is); XPathFactory xpathFactory = XPathFactory.newInstance(); XPath xpath = xpathFactory.newXPath(); /* Parse exchange rates. */ NodeList nodes = (NodeList) xpath.evaluate("/currencyConfig/rates/rate", doc, XPathConstants.NODESET); for (int i = 0; i < nodes.getLength(); i++) { Node rateNode = nodes.item(i); NamedNodeMap attributes = rateNode.getAttributes(); Node from = attributes.getNamedItem("from"); Node to = attributes.getNamedItem("to"); Node rate = attributes.getNamedItem("rate"); if (from == null || to == null || rate == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Exchange rate missing attributes (required: from, to, rate) " + rateNode); } String fromCurrency = from.getNodeValue(); String toCurrency = to.getNodeValue(); Double exchangeRate; if (java.util.Currency.getInstance(fromCurrency) == null || java.util.Currency.getInstance(toCurrency) == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not find from currency specified in exchange rate: " + rateNode); } try { exchangeRate = Double.parseDouble(rate.getNodeValue()); } catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); } addRate(tmpRates, fromCurrency, toCurrency, exchangeRate); } } catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } } catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); } finally { try { if (is != null) { is.close(); } } catch (IOException e) { e.printStackTrace(); } } /* Atomically swap in the new rates map, if it loaded successfully */ this.rates = tmpRates; return true; }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public void init(Map<String,String> params) throws SolrException { this.currencyConfigFile = params.get(PARAM_CURRENCY_CONFIG); if(currencyConfigFile == null) { throw new SolrException(ErrorCode.NOT_FOUND, "Missing required configuration "+PARAM_CURRENCY_CONFIG); } /* Removing config params custom to us */ params.remove(PARAM_CURRENCY_CONFIG); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public void inform(ResourceLoader loader) throws SolrException { if(loader == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Needs ResourceLoader in order to load config file"); } this.loader = loader; reload(); }
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void checkSortability() throws SolrException { if (! indexed() ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not sort on unindexed field: " + getName()); } if ( multiValued() ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not sort on multivalued field: " + getName()); } }
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void checkFieldCacheSource(QParser parser) throws SolrException { if (! indexed() ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not use FieldCache on unindexed field: " + getName()); } if ( multiValued() ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not use FieldCache on multivalued field: " + getName()); } }
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
private void checkNullField(String field) throws SolrException { if (field == null && defaultField == null) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "no field name specified in query and no defaultSearchField defined in schema.xml"); } }
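
All of the declarations above put throws SolrException in the signature even though SolrException extends RuntimeException (see the class listing at the top of this sheet), so the clause is documentation only: the compiler does not force any caller to catch it. A sketch of the effect, with a hypothetical AppException standing in for SolrException:

public class AdvisoryThrows {
  static class AppException extends RuntimeException {
    AppException(String msg) { super(msg); }
  }

  // The throws clause documents the failure mode, as in checkSortability() above,
  // but callers remain free to ignore it.
  static void check(boolean ok) throws AppException {
    if (!ok) throw new AppException("invalid request");
  }

  public static void main(String[] args) {
    check(true);   // no try/catch required by the compiler
  }
}
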
(Lib) RuntimeException 145
              
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void printLayout(String path, int indent, StringBuilder string) throws KeeperException, InterruptedException { byte[] data = getData(path, null, null, true); List<String> children = getChildren(path, null, true); StringBuilder dent = new StringBuilder(); for (int i = 0; i < indent; i++) { dent.append(" "); } string.append(dent + path + " (" + children.size() + ")" + NEWL); if (data != null) { try { String dataString = new String(data, "UTF-8"); if ((!path.endsWith(".txt") && !path.endsWith(".xml")) || path.endsWith(ZkStateReader.CLUSTER_STATE)) { if (path.endsWith(".xml")) { /* this is the cluster state in xml format - lets pretty print */ dataString = prettyPrint(dataString); } string.append(dent + "DATA:\n" + dent + " " + dataString.replaceAll("\n", "\n" + dent + " ") + NEWL); } else { string.append(dent + "DATA: ...supressed..." + NEWL); } } catch (UnsupportedEncodingException e) { /* can't happen - UTF-8 */ throw new RuntimeException(e); } } for (String child : children) { if (!child.equals("quota")) { try { printLayout(path + (path.equals("/") ? "" : "/") + child, indent + 1, string); } catch (NoNodeException e) { /* must have gone away */ } } } }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public static String prettyPrint(String input, int indent) { try { Source xmlInput = new StreamSource(new StringReader(input)); StringWriter stringWriter = new StringWriter(); StreamResult xmlOutput = new StreamResult(stringWriter); TransformerFactory transformerFactory = TransformerFactory.newInstance(); transformerFactory.setAttribute("indent-number", indent); Transformer transformer = transformerFactory.newTransformer(); transformer.setOutputProperty(OutputKeys.INDENT, "yes"); transformer.transform(xmlInput, xmlOutput); return xmlOutput.getWriter().toString(); } catch (Exception e) { throw new RuntimeException("Problem pretty printing XML", e); } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public static Object fromJSON(byte[] utf8) { /* convert directly from bytes to chars */ /* and parse directly from that instead of going through */ /* intermediate strings or readers */ CharArr chars = new CharArr(); ByteUtils.UTF8toUTF16(utf8, 0, utf8.length, chars); JSONParser parser = new JSONParser(chars.getArray(), chars.getStart(), chars.length()); try { return ObjectBuilder.getVal(parser); } catch (IOException e) { throw new RuntimeException(e); /* should never happen w/o using real IO */ } }
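
fromJSON parses entirely from memory, so the declared IOException can only come from the parser's Reader abstraction, never from real I/O; the catch exists to satisfy the compiler, and the comment records why. The same "can't happen" shape in a standalone sketch:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class CantHappen {
  // In-memory readers still declare IOException, so this catch only converts
  // an impossible checked exception into an unchecked one.
  static String firstLine(String s) {
    try {
      return new BufferedReader(new StringReader(s)).readLine();
    } catch (IOException e) {
      throw new RuntimeException(e);  // should never happen without real IO
    }
  }

  public static void main(String[] args) {
    System.out.println(firstLine("hello\nworld"));  // prints hello
  }
}
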
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public ZkNodeProps getLeaderProps(String collection, String shard, int timeout) throws InterruptedException { long timeoutAt = System.currentTimeMillis() + timeout; while (System.currentTimeMillis() < timeoutAt) { if (cloudState != null) { final CloudState currentState = cloudState; final ZkNodeProps nodeProps = currentState.getLeader(collection, shard); if (nodeProps != null) { return nodeProps; } } Thread.sleep(50); } throw new RuntimeException("No registered leader was found, collection:" + collection + " slice:" + shard); }
// in solrj/src/java/org/apache/solr/common/util/IteratorChain.java
public void addIterator(Iterator<E> it) { if(itit!=null) throw new RuntimeException("all Iterators must be added before calling hasNext()"); iterators.add(it); }
// in solrj/src/java/org/apache/solr/common/util/IteratorChain.java
public E next() { if(current==null) { throw new RuntimeException("For an IteratorChain, hasNext() MUST be called before calling next()"); } return current.next(); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Object unmarshal(InputStream is) throws IOException { FastInputStream dis = FastInputStream.wrap(is); version = dis.readByte(); if (version != VERSION) { throw new RuntimeException("Invalid version (expected " + VERSION + ", but " + version + ") or the data in not in 'javabin' format"); } return readVal(dis); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Object readVal(FastInputStream dis) throws IOException { tagByte = dis.readByte(); /* if ((tagByte & 0xe0) == 0) { */ /* if top 3 bits are clear, this is a normal tag */ /* OK, try type + size in single byte */ switch (tagByte >>> 5) { case STR >>> 5: return readStr(dis); case SINT >>> 5: return readSmallInt(dis); case SLONG >>> 5: return readSmallLong(dis); case ARR >>> 5: return readArray(dis); case ORDERED_MAP >>> 5: return readOrderedMap(dis); case NAMED_LST >>> 5: return readNamedList(dis); case EXTERN_STRING >>> 5: return readExternString(dis); } switch (tagByte) { case NULL: return null; case DATE: return new Date(dis.readLong()); case INT: return dis.readInt(); case BOOL_TRUE: return Boolean.TRUE; case BOOL_FALSE: return Boolean.FALSE; case FLOAT: return dis.readFloat(); case DOUBLE: return dis.readDouble(); case LONG: return dis.readLong(); case BYTE: return dis.readByte(); case SHORT: return dis.readShort(); case MAP: return readMap(dis); case SOLRDOC: return readSolrDocument(dis); case SOLRDOCLST: return readSolrDocumentList(dis); case BYTEARR: return readByteArray(dis); case ITERATOR: return readIterator(dis); case END: return END_OBJ; case SOLRINPUTDOC: return readSolrInputDocument(dis); } throw new RuntimeException("Unknown type " + tagByte); }
// in solrj/src/java/org/apache/solr/common/params/MapSolrParams.java
@Override public String toString() { StringBuilder sb = new StringBuilder(128); try { boolean first=true; for (Map.Entry<String,String> entry : map.entrySet()) { String key = entry.getKey(); String val = entry.getValue(); if (!first) sb.append('&'); first=false; sb.append(key); sb.append('='); StrUtils.partialURLEncodeVal(sb, val==null ? "" : val); } } catch (IOException e) {throw new RuntimeException(e);} /* can't happen */ return sb.toString(); }
// in solrj/src/java/org/apache/solr/common/params/MultiMapSolrParams.java
@Override public String toString() { StringBuilder sb = new StringBuilder(128); try { boolean first=true; for (Map.Entry<String,String[]> entry : map.entrySet()) { String key = entry.getKey(); String[] valarr = entry.getValue(); for (String val : valarr) { if (!first) sb.append('&'); first=false; sb.append(key); sb.append('='); StrUtils.partialURLEncodeVal(sb, val==null ? "" : val); } } } catch (IOException e) {throw new RuntimeException(e);} /* can't happen */ return sb.toString(); }
// in solrj/src/java/org/apache/solr/common/params/ModifiableSolrParams.java
@Override public String toString() { StringBuilder sb = new StringBuilder(128); try { boolean first=true; for (Map.Entry<String,String[]> entry : vals.entrySet()) { String key = entry.getKey(); String[] valarr = entry.getValue(); for (String val : valarr) { if (!first) sb.append('&'); first=false; sb.append(key); sb.append('='); if( val != null ) { sb.append( URLEncoder.encode( val, "UTF-8" ) ); } } } } catch (IOException e) {throw new RuntimeException(e);} /* can't happen */ return sb.toString(); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
private ContentStream getDelegate() { if (contentStream == null) { try { contentStream = getContentStream(req); } catch (IOException e) { throw new RuntimeException("Unable to write xml into a stream", e); } } return contentStream; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); if( action.equals(CoreAdminAction.CREATE) ) { params.set( CoreAdminParams.NAME, core ); } else { params.set( CoreAdminParams.CORE, core ); } params.set( CoreAdminParams.INSTANCE_DIR, instanceDir); if (configName != null) { params.set( CoreAdminParams.CONFIG, configName); } if (schemaName != null) { params.set( CoreAdminParams.SCHEMA, schemaName); } if (dataDir != null) { params.set( CoreAdminParams.DATA_DIR, dataDir); } if (collection != null) { params.set( CoreAdminParams.COLLECTION, collection); } if (numShards != null) { params.set( ZkStateReader.NUM_SHARDS_PROP, numShards); } if (shardId != null) { params.set( CoreAdminParams.SHARD, shardId); } if (roles != null) { params.set( CoreAdminParams.ROLES, roles); } return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); params.set( CoreAdminParams.CORE, core ); if (nodeName != null) { params.set( "nodeName", nodeName); } if (coreNodeName != null) { params.set( "coreNodeName", coreNodeName); } if (state != null) { params.set( "state", state); } if (checkLive != null) { params.set( "checkLive", checkLive); } if (pauseFor != null) { params.set( "pauseFor", pauseFor); } return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); params.set( CoreAdminParams.CORE, core ); return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); if (fileName != null) { params.set( CoreAdminParams.FILE, fileName); } return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if (action == null) { throw new RuntimeException("no action specified!"); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set(CoreAdminParams.ACTION, action.toString()); params.set(CoreAdminParams.CORE, core); if (indexDirs != null) { for (String indexDir : indexDirs) { params.set(CoreAdminParams.INDEX_DIR, indexDir); } } if (srcCores != null) { for (String srcCore : srcCores) { params.set(CoreAdminParams.SRC_CORE, srcCore); } } return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); params.set( CoreAdminParams.CORE, core ); if (other != null) { params.set(CoreAdminParams.OTHER, other); } return params; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
@Override public NamedList<Object> processResponse(Reader reader) { throw new RuntimeException("Cannot handle character stream"); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
public Reader getReader() throws IOException { throw new RuntimeException("No reader available . this is a binarystream"); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
@Override public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException { SolrDocumentList solrDocs = new SolrDocumentList(); List list = (List) readVal(dis); solrDocs.setNumFound((Long) list.get(0)); solrDocs.setStart((Long) list.get(1)); solrDocs.setMaxScore((Float) list.get(2)); callback.streamDocListInfo( solrDocs.getNumFound(), solrDocs.getStart(), solrDocs.getMaxScore() ); /* Read the Array */ tagByte = dis.readByte(); if( (tagByte >>> 5) != (ARR >>> 5) ) { throw new RuntimeException( "doclist must have an array" ); } int sz = readSize(dis); for (int i = 0; i < sz; i++) { /* must be a SolrDocument */ readVal( dis ); } return solrDocs; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public String removeSolrServer(String server) { try { server = new URL(server).toExternalForm(); } catch (MalformedURLException e) { throw new RuntimeException(e); } if (server.endsWith("/")) { server = server.substring(0, server.length() - 1); } /* there is a small race condition here - if the server is in the process of being moved between */ /* lists, we could fail to remove it. */ removeFromAlive(server); zombieServers.remove(server); return null; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected NamedList<Object> readNamedList( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } StringBuilder builder = new StringBuilder(); NamedList<Object> nl = new SimpleOrderedMap<Object>(); KnownType type = null; String name = null; /* just eat up the events... */ int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); /* reset the text */ type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } /** The name in a NamedList can actually be null if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } **/ if( !type.isLeaf ) { switch( type ) { case LST: nl.add( name, readNamedList( parser ) ); depth--; continue; case ARR: nl.add( name, readArray( parser ) ); depth--; continue; case RESULT: nl.add( name, readDocuments( parser ) ); depth--; continue; case DOC: nl.add( name, readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return nl; } /* System.out.println( "NL:ELEM:"+type+"::"+name+"::"+builder ); */ nl.add( name, type.read( builder.toString().trim() ) ); break; case XMLStreamConstants.SPACE: /* TODO? should this be trimmed? make sure it only gets one/two space? */ case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected List<Object> readArray( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"arr".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'arr', not: "+parser.getLocalName() ); } StringBuilder builder = new StringBuilder(); KnownType type = null; List<Object> vals = new ArrayList<Object>(); int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; KnownType t = KnownType.get( parser.getLocalName() ); if( t == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } if( type == null ) { type = t; } /*** actually, there is no rule that arrays need the same type else if( type != t && !(t == KnownType.NULL || type == KnownType.NULL)) { throw new RuntimeException( "arrays must have the same type! ("+type+"!="+t+") "+parser.getLocalName() ); } ***/ type = t; builder.setLength( 0 ); /* reset the text */ if( !type.isLeaf ) { switch( type ) { case LST: vals.add( readNamedList( parser ) ); depth--; continue; case ARR: vals.add( readArray( parser ) ); depth--; continue; case RESULT: vals.add( readDocuments( parser ) ); depth--; continue; case DOC: vals.add( readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return vals; /* the last element is itself */ } /* System.out.println( "ARR:"+type+"::"+builder ); */ Object val = type.read( builder.toString().trim() ); if( val == null && type != KnownType.NULL) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } vals.add( val ); break; case XMLStreamConstants.SPACE: /* TODO? should this be trimmed? make sure it only gets one/two space? */ case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocumentList readDocuments( XMLStreamReader parser ) throws XMLStreamException { SolrDocumentList docs = new SolrDocumentList(); /* Parse the attributes */ for( int i=0; i<parser.getAttributeCount(); i++ ) { String n = parser.getAttributeLocalName( i ); String v = parser.getAttributeValue( i ); if( "numFound".equals( n ) ) { docs.setNumFound( Long.parseLong( v ) ); } else if( "start".equals( n ) ) { docs.setStart( Long.parseLong( v ) ); } else if( "maxScore".equals( n ) ) { docs.setMaxScore( Float.parseFloat( v ) ); } } /* Read through each document */ int event; while( true ) { event = parser.next(); if( XMLStreamConstants.START_ELEMENT == event ) { if( !"doc".equals( parser.getLocalName() ) ) { throw new RuntimeException( "should be doc! "+parser.getLocalName() + " :: " + parser.getLocation() ); } docs.add( readDocument( parser ) ); } else if ( XMLStreamConstants.END_ELEMENT == event ) { return docs; /* only happens once */ } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocument readDocument( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"doc".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'lst', not: "+parser.getLocalName() ); } SolrDocument doc = new SolrDocument(); StringBuilder builder = new StringBuilder(); KnownType type = null; String name = null; /* just eat up the events... */ int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); /* reset the text */ type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } /* Handle multi-valued fields */ if( type == KnownType.ARR ) { for( Object val : readArray( parser ) ) { doc.addField( name, val ); } depth--; /* the array reading clears out the 'endElement' */ } else if( type == KnownType.LST ) { doc.addField( name, readNamedList( parser ) ); depth--; } else if( !type.isLeaf ) { System.out.println("nbot leaf!:" + type); throw new XMLStreamException( "must be value or array", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return doc; } /* System.out.println( "FIELD:"+type+"::"+name+"::"+builder ); */ Object val = type.read( builder.toString().trim() ); if( val == null ) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } doc.addField( name, val ); break; case XMLStreamConstants.SPACE: /* TODO? should this be trimmed? make sure it only gets one/two space? */ case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
public static String toQueryString( SolrParams params, boolean xml ) { StringBuilder sb = new StringBuilder(128); try { String amp = xml ? "&amp;" : "&"; boolean first=true; Iterator<String> names = params.getParameterNamesIterator(); while( names.hasNext() ) { String key = names.next(); String[] valarr = params.getParams( key ); if( valarr == null ) { sb.append( first?"?":amp ); sb.append(key); first=false; } else { for (String val : valarr) { sb.append( first? "?":amp ); sb.append(key); if( val != null ) { sb.append('='); sb.append( URLEncoder.encode( val, "UTF-8" ) ); } first=false; } } } } catch (IOException e) {throw new RuntimeException(e);} /* can't happen */ return sb.toString(); }
// in solrj/src/java/org/apache/noggit/CharArr.java
@Override public void flush() { try { sink.write(buf, start, end-start); } catch (IOException e) { throw new RuntimeException(e); } start = end = 0; }
// in solrj/src/java/org/apache/noggit/CharArr.java
@Override public void write(char b[], int off, int len) { int space = buf.length - end; if (len < space) { unsafeWrite(b, off, len); } else if (len < buf.length) { unsafeWrite(b, off, space); flush(); unsafeWrite(b, off+space, len-space); } else { flush(); try { sink.write(b, off, len); } catch (IOException e) { throw new RuntimeException(e); } } }
// in solrj/src/java/org/apache/noggit/CharArr.java
@Override public void write(String s, int stringOffset, int len) { int space = buf.length - end; if (len < space) { s.getChars(stringOffset, stringOffset+len, buf, end); end += len; } else if (len < buf.length) { /* if the data to write is small enough, buffer it. */ s.getChars(stringOffset, stringOffset+space, buf, end); flush(); s.getChars(stringOffset+space, stringOffset+len, buf, 0); end = len-space; } else { flush(); /* don't buffer, just write to sink */ try { sink.write(s, stringOffset, len); } catch (IOException e) { throw new RuntimeException(e); } } }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
private Collator createFromRules(String fileName, ResourceLoader loader) { InputStream input = null; try { input = loader.openResource(fileName); String rules = IOUtils.toString(input, "UTF-8"); return new RuleBasedCollator(rules); } catch (Exception e) { /* io error or invalid rules */ throw new RuntimeException(e); } finally { IOUtils.closeQuietly(input); } }
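
createFromRules closes its stream with IOUtils.closeQuietly in a finally block and collapses every failure, I/O or bad rules alike, into one RuntimeException. In modern Java the same wrap-everything shape is usually written with try-with-resources; a stand-in sketch that reads from the filesystem instead of Solr's ResourceLoader:

import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class LoadRules {
  // Sketch only: try-with-resources replaces the finally/closeQuietly pair above.
  static String readRules(String fileName) {
    try (InputStream input = Files.newInputStream(Paths.get(fileName))) {
      return new String(input.readAllBytes(), StandardCharsets.UTF_8);
    } catch (Exception e) {        // io error or invalid rules
      throw new RuntimeException(e);
    }
  }

  public static void main(String[] args) throws Exception {
    Path p = Files.createTempFile("rules", ".txt");
    Files.write(p, "a < b < c".getBytes(StandardCharsets.UTF_8));
    System.out.println(readRules(p.toString()));   // prints a < b < c
  }
}
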
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
private BytesRef analyzeRangePart(String field, String part) { TokenStream source; try { source = analyzer.tokenStream(field, new StringReader(part)); source.reset(); } catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); } TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class); BytesRef bytes = termAtt.getBytesRef(); /* we control the analyzer here: most errors are impossible */ try { if (!source.incrementToken()) throw new IllegalArgumentException("analyzer returned no terms for range part: " + part); termAtt.fillBytesRef(); assert !source.incrementToken(); } catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); } try { source.end(); source.close(); } catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); } return BytesRef.deepCopyOf(bytes); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
@Override
public void run() {
  try {
    xpathReader.streamRecords(data, new XPathRecordReader.Handler() {
      @SuppressWarnings("unchecked")
      public void handle(Map<String, Object> record, String xpath) {
        if (isEnd.get()) {
          throwExp.set(false);
          // To end the streaming; otherwise the parsing will go on forever
          // even though the consumer has gone away
          throw new RuntimeException("BREAK");
        }
        Map<String, Object> row;
        try {
          row = readRow(record, xpath);
        } catch (Exception e) {
          isEnd.set(true);
          return;
        }
        offer(row);
      }
    });
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
static File getFile(String basePath, String query) { try { File file0 = new File(query); File file = file0; if (!file.isAbsolute()) file = new File(basePath + query); if (file.isFile() && file.canRead()) { LOG.debug("Accessing File: " + file.toString()); return file; } else if (file != file0) if (file0.isFile() && file0.canRead()) { LOG.debug("Accessing File0: " + file0.toString()); return file0; } throw new FileNotFoundException("Could not find file: " + query); } catch (FileNotFoundException e) { throw new RuntimeException(e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldStreamDataSource.java
@Override
public InputStream getData(String query) {
  Object o = wrapper.getVariableResolver().resolve(dataField);
  if (o == null) {
    throw new DataImportHandlerException(SEVERE, "No field available for name : " + dataField);
  }
  if (o instanceof Blob) {
    Blob blob = (Blob) o;
    try {
      // Most of the JDBC drivers have getBinaryStream defined as public,
      // so let us just check it
      Method m = blob.getClass().getDeclaredMethod("getBinaryStream");
      if (Modifier.isPublic(m.getModifiers())) {
        return (InputStream) m.invoke(blob);
      } else {
        // force invoke
        m.setAccessible(true);
        return (InputStream) m.invoke(blob);
      }
    } catch (Exception e) {
      LOG.info("Unable to get data from BLOB");
      return null;
    }
  } else if (o instanceof byte[]) {
    byte[] bytes = (byte[]) o;
    return new ByteArrayInputStream(bytes);
  } else {
    throw new RuntimeException("unsupported type : " + o.getClass());
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void addField0(String xpath, String name, boolean multiValued, boolean isRecord, int flags) {
  if (!xpath.startsWith("/"))
    throw new RuntimeException("xpath must start with '/' : " + xpath);
  List<String> paths = splitEscapeQuote(xpath);
  // deal with how split behaves when the separator starts a string!
  if ("".equals(paths.get(0).trim()))
    paths.remove(0);
  rootNode.build(paths, name, multiValued, isRecord, flags);
  rootNode.buildOptimise(null);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
public void streamRecords(Reader r, Handler handler) { try { XMLStreamReader parser = factory.createXMLStreamReader(r); rootNode.parse(parser, handler, new HashMap<String, Object>(), new Stack<Set<String>>(), false); } catch (Exception e) { throw new RuntimeException(e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
private void buildDocument(VariableResolverImpl vr, DocWrapper doc, Map<String,Object> pk, EntityProcessorWrapper epw, boolean isRoot, ContextImpl parentCtx) { List<EntityProcessorWrapper> entitiesToDestroy = new ArrayList<EntityProcessorWrapper>(); try { buildDocument(vr, doc, pk, epw, isRoot, parentCtx, entitiesToDestroy); } catch (Exception e) { throw new RuntimeException(e); } finally { for (EntityProcessorWrapper entityWrapper : entitiesToDestroy) { entityWrapper.destroy(); } resetEntity(epw); } }
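buildDocument shows the catch-wrap-finally shape used throughout DocBuilder: the original exception is rethrown as a RuntimeException, while the finally block guarantees that every entity wrapper registered so far is destroyed even when the build fails. A stripped-down sketch of the shape; the Resource type and method names are illustrative:

    import java.util.ArrayList;
    import java.util.List;

    final class CleanupShape {
      interface Resource { void destroy(); }

      static void process(List<Resource> toDestroy) {
        try {
          doWork(toDestroy);                 // may register resources, then fail
        } catch (Exception e) {
          throw new RuntimeException(e);     // unify failures as unchecked
        } finally {
          for (Resource r : toDestroy) {     // runs on success and failure
            r.destroy();
          }
        }
      }

      static void doWork(List<Resource> registered) throws Exception {
        registered.add(new Resource() { public void destroy() {
          System.out.println("destroyed");
        }});
        throw new Exception("boom");
      }

      public static void main(String[] args) {
        try { process(new ArrayList<Resource>()); }
        catch (RuntimeException e) { System.out.println("caught: " + e.getCause().getMessage()); }
      }
    }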
// in contrib/velocity/src/java/org/apache/solr/response/SolrParamResourceLoader.java
@Override
public InputStream getResourceStream(String s) throws ResourceNotFoundException {
  String template = templates.get(s);
  try {
    return template == null ? null : new ByteArrayInputStream(template.getBytes("UTF-8"));
  } catch (UnsupportedEncodingException e) {
    throw new RuntimeException(e); // may not happen
  }
}
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
private VelocityEngine getEngine(SolrQueryRequest request) { VelocityEngine engine = new VelocityEngine(); String template_root = request.getParams().get("v.base_dir"); File baseDir = new File(request.getCore().getResourceLoader().getConfigDir(), "velocity"); if (template_root != null) { baseDir = new File(template_root); } engine.setProperty(RuntimeConstants.FILE_RESOURCE_LOADER_PATH, baseDir.getAbsolutePath()); engine.setProperty("params.resource.loader.instance", new SolrParamResourceLoader(request)); SolrVelocityResourceLoader resourceLoader = new SolrVelocityResourceLoader(request.getCore().getSolrConfig().getResourceLoader()); engine.setProperty("solr.resource.loader.instance", resourceLoader); // TODO: Externalize Velocity properties engine.setProperty(RuntimeConstants.RESOURCE_LOADER, "params,file,solr"); String propFile = request.getParams().get("v.properties"); try { if (propFile == null) engine.init(); else { InputStream is = null; try { is = resourceLoader.getResourceStream(propFile); Properties props = new Properties(); props.load(is); engine.init(props); } finally { if (is != null) is.close(); } } } catch (Exception e) { throw new RuntimeException(e); } return engine; }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
protected Set<BytesRef> getQueryTokenSet(String query, Analyzer analyzer) {
  try {
    final Set<BytesRef> tokens = new HashSet<BytesRef>();
    final TokenStream tokenStream = analyzer.tokenStream("", new StringReader(query));
    final TermToBytesRefAttribute bytesAtt = tokenStream.getAttribute(TermToBytesRefAttribute.class);
    final BytesRef bytes = bytesAtt.getBytesRef();
    tokenStream.reset();
    while (tokenStream.incrementToken()) {
      bytesAtt.fillBytesRef();
      tokens.add(BytesRef.deepCopyOf(bytes));
    }
    tokenStream.end();
    tokenStream.close();
    return tokens;
  } catch (IOException ioe) {
    throw new RuntimeException("Error occurred while iterating over tokenstream", ioe);
  }
}
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
private List<AttributeSource> analyzeTokenStream(TokenStream tokenStream) {
  final List<AttributeSource> tokens = new ArrayList<AttributeSource>();
  final PositionIncrementAttribute posIncrAtt = tokenStream.addAttribute(PositionIncrementAttribute.class);
  final TokenTrackingAttribute trackerAtt = tokenStream.addAttribute(TokenTrackingAttribute.class);
  // for backwards compatibility, add all "common" attributes
  tokenStream.addAttribute(OffsetAttribute.class);
  tokenStream.addAttribute(TypeAttribute.class);
  try {
    tokenStream.reset();
    int position = 0;
    while (tokenStream.incrementToken()) {
      position += posIncrAtt.getPositionIncrement();
      trackerAtt.setActPosition(position);
      tokens.add(tokenStream.cloneAttributes());
    }
  } catch (IOException ioe) {
    throw new RuntimeException("Error occurred while iterating over tokenstream", ioe);
  }
  return tokens;
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
  throw new RuntimeException("This is a binary writer, cannot write to a character stream");
}
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override
public URL[] getDocs() {
  try {
    return new URL[]{ new URL("http://wiki.apache.org/solr/QueryElevationComponent") };
  } catch (MalformedURLException e) {
    throw new RuntimeException(e);
  }
}
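new URL(...) declares MalformedURLException even for a compile-time constant, so this override has no choice but to catch and wrap. The same idiom as a reusable helper, in a minimal sketch (the helper name is illustrative):

    import java.net.MalformedURLException;
    import java.net.URL;

    final class Urls {
      /** For URLs known valid at compile time; malformed input is a programming
       *  error, so the checked exception becomes an unchecked one. */
      static URL constant(String spec) {
        try {
          return new URL(spec);
        } catch (MalformedURLException e) {
          throw new RuntimeException(e); // unreachable for a well-formed literal
        }
      }

      public static void main(String[] args) {
        System.out.println(constant("http://wiki.apache.org/solr/QueryElevationComponent"));
      }
    }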
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
public void inform(SolrCore core) { if (initParams != null) { LOG.info("Initializing spell checkers"); boolean hasDefault = false; for (int i = 0; i < initParams.size(); i++) { if (initParams.getName(i).equals("spellchecker")) { NamedList spellchecker = (NamedList) initParams.getVal(i); String className = (String) spellchecker.get("classname"); // TODO: this is a little bit sneaky: warn if class isnt supplied // so that its mandatory in a future release? if (className == null) className = IndexBasedSpellChecker.class.getName(); SolrResourceLoader loader = core.getResourceLoader(); SolrSpellChecker checker = loader.newInstance(className, SolrSpellChecker.class); if (checker != null) { String dictionary = checker.init(spellchecker, core); if (dictionary != null) { boolean isDefault = dictionary.equals(SolrSpellChecker.DEFAULT_DICTIONARY_NAME); if (isDefault == true && hasDefault == false){ hasDefault = true; } else if (isDefault == true && hasDefault == true){ throw new RuntimeException("More than one dictionary is missing name."); } spellCheckers.put(dictionary, checker); } else { if (hasDefault == false){ spellCheckers.put(SolrSpellChecker.DEFAULT_DICTIONARY_NAME, checker); hasDefault = true; } else { throw new RuntimeException("More than one dictionary is missing name."); } } // Register event listeners for this SpellChecker core.registerFirstSearcherListener(new SpellCheckerListener(core, checker, false, false)); boolean buildOnCommit = Boolean.parseBoolean((String) spellchecker.get("buildOnCommit")); boolean buildOnOptimize = Boolean.parseBoolean((String) spellchecker.get("buildOnOptimize")); if (buildOnCommit || buildOnOptimize) { LOG.info("Registering newSearcher listener for spellchecker: " + checker.getDictionaryName()); core.registerNewSearcherListener(new SpellCheckerListener(core, checker, buildOnCommit, buildOnOptimize)); } } else { throw new RuntimeException("Can't load spell checker: " + className); } } } Map<String, QueryConverter> queryConverters = new HashMap<String, QueryConverter>(); core.initPlugins(queryConverters,QueryConverter.class); //ensure that there is at least one query converter defined if (queryConverters.size() == 0) { LOG.info("No queryConverter defined, using default converter"); queryConverters.put("queryConverter", new SpellingQueryConverter()); } //there should only be one if (queryConverters.size() == 1) { queryConverter = queryConverters.values().iterator().next(); IndexSchema schema = core.getSchema(); String fieldTypeName = (String) initParams.get("queryAnalyzerFieldType"); FieldType fieldType = schema.getFieldTypes().get(fieldTypeName); Analyzer analyzer = fieldType == null ? new WhitespaceAnalyzer(core.getSolrConfig().luceneMatchVersion) : fieldType.getQueryAnalyzer(); //TODO: There's got to be a better way! Where's Spring when you need it? queryConverter.setAnalyzer(analyzer); } } }
// in core/src/java/org/apache/solr/handler/component/ShardDoc.java
Comparator getCachedComparator(String fieldname, SortField.Type type, FieldComparatorSource factory) { Comparator comparator = null; switch (type) { case SCORE: comparator = comparatorScore(fieldname); break; case STRING: comparator = comparatorNatural(fieldname); break; case CUSTOM: if (factory instanceof MissingStringLastComparatorSource){ comparator = comparatorMissingStringLast(fieldname); } else { // TODO: support other types such as random... is there a way to // support generically? Perhaps just comparing Object comparator = comparatorNatural(fieldname); // throw new RuntimeException("Custom sort not supported factory is "+factory.getClass()); } break; case DOC: // TODO: we can support this! throw new RuntimeException("Doc sort not supported"); default: comparator = comparatorNatural(fieldname); break; } return comparator; }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
public NamedList<?> getFieldCacheStats(String fieldName, String[] facet ) { SchemaField sf = searcher.getSchema().getField(fieldName); FieldCache.DocTermsIndex si; try { si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), fieldName); } catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); } StatsValues allstats = StatsValuesFactory.createStatsValues(sf); final int nTerms = si.numOrd(); if ( nTerms <= 0 || docs.size() <= 0 ) return allstats.getStatsValues(); // don't worry about faceting if no documents match... List<FieldFacetStats> facetStats = new ArrayList<FieldFacetStats>(); FieldCache.DocTermsIndex facetTermsIndex; for( String facetField : facet ) { SchemaField fsf = searcher.getSchema().getField(facetField); FieldType facetFieldType = fsf.getType(); if (facetFieldType.isTokenized() || facetFieldType.isMultiValued()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Stats can only facet on single-valued fields, not: " + facetField + "[" + facetFieldType + "]"); } try { facetTermsIndex = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), facetField); } catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); } facetStats.add(new FieldFacetStats(facetField, facetTermsIndex, sf, fsf, nTerms)); } final BytesRef tempBR = new BytesRef(); DocIterator iter = docs.iterator(); while (iter.hasNext()) { int docID = iter.nextDoc(); BytesRef raw = si.lookup(si.getOrd(docID), tempBR); if( raw.length > 0 ) { allstats.accumulate(raw); } else { allstats.missing(); } // now update the facets for (FieldFacetStats f : facetStats) { f.facet(docID, raw); } } for (FieldFacetStats f : facetStats) { allstats.addFacet(f.name, f.facetStatsValues); } return allstats.getStatsValues(); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
private static int getDocFreq(IndexReader reader, String field, BytesRef term) { int result = 1; try { result = reader.docFreq(field, term); } catch (IOException e) { throw new RuntimeException(e); } return result; }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override
public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException {
  String path = request.getPath();
  if( path == null || !path.startsWith( "/" ) ) {
    path = "/select";
  }

  // Check for cores action
  SolrCore core = coreContainer.getCore( coreName );
  if( core == null ) {
    throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName );
  }

  SolrParams params = request.getParams();
  if( params == null ) {
    params = new ModifiableSolrParams();
  }

  // Extract the handler from the path or params
  SolrRequestHandler handler = core.getRequestHandler( path );
  if( handler == null ) {
    if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path ) ) {
      String qt = params.get( CommonParams.QT );
      handler = core.getRequestHandler( qt );
      if( handler == null ) {
        throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt );
      }
    }
    // Perhaps the path is to manage the cores
    if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) {
      handler = coreContainer.getMultiCoreHandler();
    }
  }
  if( handler == null ) {
    core.close();
    throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path );
  }

  SolrQueryRequest req = null;
  try {
    req = _parser.buildRequestFrom( core, params, request.getContentStreams() );
    req.getContext().put( "path", path );
    SolrQueryResponse rsp = new SolrQueryResponse();
    SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp));

    core.execute( handler, req, rsp );
    if( rsp.getException() != null ) {
      if(rsp.getException() instanceof SolrException) {
        throw rsp.getException();
      }
      throw new SolrServerException( rsp.getException() );
    }

    // Check if this should stream results
    if( request.getStreamingResponseCallback() != null ) {
      try {
        final StreamingResponseCallback callback = request.getStreamingResponseCallback();
        BinaryResponseWriter.Resolver resolver =
          new BinaryResponseWriter.Resolver( req, rsp.getReturnFields()) {
            @Override
            public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException {
              // write an empty list...
              SolrDocumentList docs = new SolrDocumentList();
              docs.setNumFound( ctx.docs.matches() );
              docs.setStart( ctx.docs.offset() );
              docs.setMaxScore( ctx.docs.maxScore() );
              codec.writeSolrDocumentList( docs );
              // This will transform
              writeResultsBody( ctx, codec );
            }
          };

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        new JavaBinCodec(resolver) {
          @Override
          public void writeSolrDocument(SolrDocument doc) throws IOException {
            callback.streamSolrDocument( doc );
            //super.writeSolrDocument( doc, fields );
          }

          @Override
          public void writeSolrDocumentList(SolrDocumentList docs) throws IOException {
            if( docs.size() > 0 ) {
              SolrDocumentList tmp = new SolrDocumentList();
              tmp.setMaxScore( docs.getMaxScore() );
              tmp.setNumFound( docs.getNumFound() );
              tmp.setStart( docs.getStart() );
              docs = tmp;
            }
            callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() );
            super.writeSolrDocumentList(docs);
          }
        }.marshal(rsp.getValues(), out);

        InputStream in = new ByteArrayInputStream(out.toByteArray());
        return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in);
      } catch (Exception ex) {
        throw new RuntimeException(ex);
      }
    }

    // Now write it out
    NamedList<Object> normalized = getParsedResponse(req, rsp);
    return normalized;
  } catch( IOException iox ) {
    throw iox;
  } catch( SolrException sx ) {
    throw sx;
  } catch( Exception ex ) {
    throw new SolrServerException( ex );
  } finally {
    if (req != null) req.close();
    core.close();
    SolrRequestInfo.clearRequestInfo();
  }
}
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public void start(boolean waitForSolr) throws Exception { // if started before, make a new server if (startedBefore) { waitOnSolr = false; init(solrHome, context, lastPort, stopAtShutdown); } else { startedBefore = true; } if( dataDir != null) { System.setProperty("solr.data.dir", dataDir); } if(shards != null) { System.setProperty("shard", shards); } if (!server.isRunning()) { server.start(); } synchronized (JettySolrRunner.this) { int cnt = 0; while (!waitOnSolr) { this.wait(100); if (cnt++ == 5) { throw new RuntimeException("Jetty/Solr unresponsive"); } } } System.clearProperty("shard"); System.clearProperty("solr.data.dir"); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
private int getFirstConnectorPort() { Connector[] conns = server.getConnectors(); if (0 == conns.length) { throw new RuntimeException("Jetty Server has no Connectors"); } return conns[0].getLocalPort(); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
public String getContentType(SolrQueryRequest request, SolrQueryResponse response) { Transformer t = null; try { t = getTransformer(request); } catch(Exception e) { // TODO should our parent interface throw (IO)Exception? throw new RuntimeException("getTransformer fails in getContentType",e); } String mediaType = t.getOutputProperty("media-type"); if (mediaType == null || mediaType.length()==0) { // This did not happen in my tests, mediaTypeFromXslt is set to "text/xml" // if the XSLT transform does not contain an xsl:output element. Not sure // if this is standard behavior or if it's just my JVM/libraries mediaType = DEFAULT_CONTENT_TYPE; } if (!mediaType.contains("charset")) { String encoding = t.getOutputProperty("encoding"); if (encoding == null || encoding.length()==0) { encoding = "UTF-8"; } mediaType = mediaType + "; charset=" + encoding; } return mediaType; }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
  throw new RuntimeException("This is a binary writer, cannot write to a character stream");
}
// in core/src/java/org/apache/solr/request/UnInvertedField.java
public StatsValues getStats(SolrIndexSearcher searcher, DocSet baseDocs, String[] facet) throws IOException {
  // this function is ripped off nearly wholesale from the getCounts function to use
  // for multiValued fields within the StatsComponent. may be useful to find common
  // functionality between the two and refactor code somewhat
  use.incrementAndGet();

  SchemaField sf = searcher.getSchema().getField(field);
  // FieldType ft = sf.getType();

  StatsValues allstats = StatsValuesFactory.createStatsValues(sf);

  DocSet docs = baseDocs;
  int baseSize = docs.size();
  int maxDoc = searcher.maxDoc();

  if (baseSize <= 0) return allstats;

  DocSet missing = docs.andNot( searcher.getDocSet(new TermRangeQuery(field, null, null, false, false)) );

  int i = 0;
  final FieldFacetStats[] finfo = new FieldFacetStats[facet.length];
  // Initialize facetstats, if facets have been passed in
  FieldCache.DocTermsIndex si;
  for (String f : facet) {
    SchemaField facet_sf = searcher.getSchema().getField(f);
    try {
      si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), f);
    } catch (IOException e) {
      throw new RuntimeException("failed to open field cache for: " + f, e);
    }
    finfo[i] = new FieldFacetStats(f, si, sf, facet_sf, numTermsInField);
    i++;
  }

  final int[] index = this.index;
  final int[] counts = new int[numTermsInField]; // keep track of the number of times we see each word in the field for all the documents in the docset

  TermsEnum te = getOrdTermsEnum(searcher.getAtomicReader());

  boolean doNegative = false;
  if (finfo.length == 0) {
    // if we're collecting statistics with a facet field, can't do inverted counting
    doNegative = baseSize > maxDoc >> 1 && termInstances > 0 && docs instanceof BitDocSet;
  }

  if (doNegative) {
    OpenBitSet bs = (OpenBitSet) ((BitDocSet) docs).getBits().clone();
    bs.flip(0, maxDoc);
    // TODO: when iterator across negative elements is available, use that
    // instead of creating a new bitset and inverting.
    docs = new BitDocSet(bs, maxDoc - baseSize);
    // simply negating will mean that we have deleted docs in the set.
    // that should be OK, as their entries in our table should be empty.
  }

  // For the biggest terms, do straight set intersections
  for (TopTerm tt : bigTerms.values()) {
    // TODO: counts could be deferred if sorted==false
    if (tt.termNum >= 0 && tt.termNum < numTermsInField) {
      final Term t = new Term(field, tt.term);
      if (finfo.length == 0) {
        counts[tt.termNum] = searcher.numDocs(new TermQuery(t), docs);
      } else {
        // COULD BE VERY SLOW
        // if we're collecting stats for facet fields, we need to iterate on all matching documents
        DocSet bigTermDocSet = searcher.getDocSet(new TermQuery(t)).intersection(docs);
        DocIterator iter = bigTermDocSet.iterator();
        while (iter.hasNext()) {
          int doc = iter.nextDoc();
          counts[tt.termNum]++;
          for (FieldFacetStats f : finfo) {
            f.facetTermNum(doc, tt.termNum);
          }
        }
      }
    }
  }

  if (termInstances > 0) {
    DocIterator iter = docs.iterator();
    while (iter.hasNext()) {
      int doc = iter.nextDoc();
      int code = index[doc];

      if ((code & 0xff) == 1) {
        int pos = code >>> 8;
        int whichArray = (doc >>> 16) & 0xff;
        byte[] arr = tnums[whichArray];
        int tnum = 0;
        for (; ;) {
          int delta = 0;
          for (; ;) {
            byte b = arr[pos++];
            delta = (delta << 7) | (b & 0x7f);
            if ((b & 0x80) == 0) break;
          }
          if (delta == 0) break;
          tnum += delta - TNUM_OFFSET;
          counts[tnum]++;
          for (FieldFacetStats f : finfo) {
            f.facetTermNum(doc, tnum);
          }
        }
      } else {
        int tnum = 0;
        int delta = 0;
        for (; ;) {
          delta = (delta << 7) | (code & 0x7f);
          if ((code & 0x80) == 0) {
            if (delta == 0) break;
            tnum += delta - TNUM_OFFSET;
            counts[tnum]++;
            for (FieldFacetStats f : finfo) {
              f.facetTermNum(doc, tnum);
            }
            delta = 0;
          }
          code >>>= 8;
        }
      }
    }
  }

  // add results in index order
  for (i = 0; i < numTermsInField; i++) {
    int c = doNegative ? maxTermCounts[i] - counts[i] : counts[i];
    if (c == 0) continue;
    BytesRef value = getTermValue(te, i);
    allstats.accumulate(value, c);
    // as we've parsed the termnum into a value, lets also accumulate fieldfacet statistics
    for (FieldFacetStats f : finfo) {
      f.accumulateTermNum(i, value);
    }
  }

  int c = missing.size();
  allstats.addMissing(c);

  if (finfo.length > 0) {
    for (FieldFacetStats f : finfo) {
      Map<String, StatsValues> facetStatsValues = f.facetStatsValues;
      FieldType facetType = searcher.getSchema().getFieldType(f.name);
      for (Map.Entry<String,StatsValues> entry : facetStatsValues.entrySet()) {
        String termLabel = entry.getKey();
        int missingCount = searcher.numDocs(new TermQuery(new Term(f.name, facetType.toInternal(termLabel))), missing);
        entry.getValue().addMissing(missingCount);
      }
      allstats.addFacet(f.name, facetStatsValues);
    }
  }

  return allstats;
}
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public String calcEtag(final long currentIndexVersion) { if (currentIndexVersion != indexVersionCache) { indexVersionCache=currentIndexVersion; try { etagCache = "\"" + new String(Base64.encodeBase64((Long.toHexString (Long.reverse(indexVersionCache)) + etagSeed).getBytes()), "US-ASCII") + "\""; } catch (UnsupportedEncodingException e) { throw new RuntimeException(e); // may not happen } } return etagCache; }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
private SchemaField getIndexedField(String fname) { SchemaField f = getFields().get(fname); if (f==null) { throw new RuntimeException("unknown field '" + fname + "'"); } if (!f.indexed()) { throw new RuntimeException("'"+fname+"' is not an indexed field:" + f); } return f; }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
private void readSchema(InputSource is) {
  log.info("Reading Solr Schema");

  try {
    // pass the config resource loader to avoid building an empty one for no reason:
    // in the current case though, the stream is valid so we won't load the resource by name
    Config schemaConf = new Config(loader, "schema", is, "/schema/");
    Document document = schemaConf.getDocument();
    final XPath xpath = schemaConf.getXPath();
    final List<SchemaAware> schemaAware = new ArrayList<SchemaAware>();
    Node nd = (Node) xpath.evaluate("/schema/@name", document, XPathConstants.NODE);
    if (nd==null) {
      log.warn("schema has no name!");
    } else {
      name = nd.getNodeValue();
      log.info("Schema name=" + name);
    }

    version = schemaConf.getFloat("/schema/@version", 1.0f);

    // load the Field Types
    final FieldTypePluginLoader typeLoader = new FieldTypePluginLoader(this, fieldTypes, schemaAware);
    String expression = "/schema/types/fieldtype | /schema/types/fieldType";
    NodeList nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET);
    typeLoader.load( loader, nodes );

    // load the Fields
    // Hang on to the fields that say if they are required -- this lets us set a reasonable default for the unique key
    Map<String,Boolean> explicitRequiredProp = new HashMap<String, Boolean>();
    ArrayList<DynamicField> dFields = new ArrayList<DynamicField>();
    expression = "/schema/fields/field | /schema/fields/dynamicField";
    nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET);

    for (int i=0; i<nodes.getLength(); i++) {
      Node node = nodes.item(i);

      NamedNodeMap attrs = node.getAttributes();

      String name = DOMUtil.getAttr(attrs,"name","field definition");
      log.trace("reading field def "+name);
      String type = DOMUtil.getAttr(attrs,"type","field " + name);

      FieldType ft = fieldTypes.get(type);
      if (ft==null) {
        throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Unknown fieldtype '" + type + "' specified on field " + name);
      }

      Map<String,String> args = DOMUtil.toMapExcept(attrs, "name", "type");
      if( args.get( "required" ) != null ) {
        explicitRequiredProp.put( name, Boolean.valueOf( args.get( "required" ) ) );
      }

      SchemaField f = SchemaField.create(name,ft,args);

      if (node.getNodeName().equals("field")) {
        SchemaField old = fields.put(f.getName(),f);
        if( old != null ) {
          String msg = "[schema.xml] Duplicate field definition for '"
            + f.getName() + "' [[["+old.toString()+"]]] and [[["+f.toString()+"]]]";
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, msg );
        }
        log.debug("field defined: " + f);
        if( f.getDefaultValue() != null ) {
          log.debug(name+" contains default value: " + f.getDefaultValue());
          fieldsWithDefaultValue.add( f );
        }
        if (f.isRequired()) {
          log.debug(name+" is required in this schema");
          requiredFields.add(f);
        }
      } else if (node.getNodeName().equals("dynamicField")) {
        // make sure nothing else has the same path
        addDynamicField(dFields, f);
      } else {
        // we should never get here
        throw new RuntimeException("Unknown field type");
      }
    }

    // fields with default values are by definition required
    // add them to required fields, and we only have to loop once
    // in DocumentBuilder.getDoc()
    requiredFields.addAll(getFieldsWithDefaultValue());

    // OK, now sort the dynamic fields largest to smallest size so we don't get
    // any false matches. We want to act like a compiler tool and try and match
    // the largest string possible.
    Collections.sort(dFields);

    log.trace("Dynamic Field Ordering:" + dFields);

    // stuff it in a normal array for faster access
    dynamicFields = dFields.toArray(new DynamicField[dFields.size()]);

    Node node = (Node) xpath.evaluate("/schema/similarity", document, XPathConstants.NODE);
    SimilarityFactory simFactory = readSimilarity(loader, node);
    if (simFactory == null) {
      simFactory = new DefaultSimilarityFactory();
    }
    if (simFactory instanceof SchemaAware) {
      ((SchemaAware)simFactory).inform(this);
    }
    similarity = simFactory.getSimilarity();

    node = (Node) xpath.evaluate("/schema/defaultSearchField/text()", document, XPathConstants.NODE);
    if (node==null) {
      log.warn("no default search field specified in schema.");
    } else {
      defaultSearchFieldName=node.getNodeValue().trim();
      // throw exception if specified, but not found or not indexed
      if (defaultSearchFieldName!=null) {
        SchemaField defaultSearchField = getFields().get(defaultSearchFieldName);
        if ((defaultSearchField == null) || !defaultSearchField.indexed()) {
          String msg = "default search field '" + defaultSearchFieldName + "' not defined or not indexed" ;
          throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, msg );
        }
      }
      log.info("default search field is "+defaultSearchFieldName);
    }

    node = (Node) xpath.evaluate("/schema/solrQueryParser/@defaultOperator", document, XPathConstants.NODE);
    if (node==null) {
      log.debug("using default query parser operator (OR)");
    } else {
      queryParserDefaultOperator=node.getNodeValue().trim();
      log.info("query parser default operator is "+queryParserDefaultOperator);
    }

    node = (Node) xpath.evaluate("/schema/uniqueKey/text()", document, XPathConstants.NODE);
    if (node==null) {
      log.warn("no uniqueKey specified in schema.");
    } else {
      uniqueKeyField=getIndexedField(node.getNodeValue().trim());
      if (!uniqueKeyField.stored()) {
        log.error("uniqueKey is not stored - distributed search will not work");
      }
      if (uniqueKeyField.multiValued()) {
        log.error("uniqueKey should not be multivalued");
      }
      uniqueKeyFieldName=uniqueKeyField.getName();
      uniqueKeyFieldType=uniqueKeyField.getType();
      log.info("unique key field: "+uniqueKeyFieldName);

      // Unless the uniqueKeyField is marked 'required=false' then make sure it exists
      if( Boolean.FALSE != explicitRequiredProp.get( uniqueKeyFieldName ) ) {
        uniqueKeyField.required = true;
        requiredFields.add(uniqueKeyField);
      }
    }

    /////////////// parse out copyField commands ///////////////
    // Map<String,ArrayList<SchemaField>> cfields = new HashMap<String,ArrayList<SchemaField>>();
    // expression = "/schema/copyField";

    dynamicCopyFields = new DynamicCopy[] {};
    expression = "//copyField";
    nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET);

    for (int i=0; i<nodes.getLength(); i++) {
      node = nodes.item(i);
      NamedNodeMap attrs = node.getAttributes();

      String source = DOMUtil.getAttr(attrs,"source","copyField definition");
      String dest   = DOMUtil.getAttr(attrs,"dest",  "copyField definition");
      String maxChars = DOMUtil.getAttr(attrs, "maxChars");
      int maxCharsInt = CopyField.UNLIMITED;
      if (maxChars != null) {
        try {
          maxCharsInt = Integer.parseInt(maxChars);
        } catch (NumberFormatException e) {
          log.warn("Couldn't parse maxChars attribute for copyField from "
                  + source + " to " + dest + " as integer. The whole field will be copied.");
        }
      }
      registerCopyField(source, dest, maxCharsInt);
    }

    for (Map.Entry<SchemaField, Integer> entry : copyFieldTargetCounts.entrySet()) {
      if (entry.getValue() > 1 && !entry.getKey().multiValued()) {
        log.warn("Field " + entry.getKey().name + " is not multivalued "+
                 "and destination for multiple copyFields ("+
                 entry.getValue()+")");
      }
    }

    // Run the callbacks on SchemaAware now that everything else is done
    for (SchemaAware aware : schemaAware) {
      aware.inform(this);
    }
  } catch (SolrException e) {
    throw e;
  } catch(Exception e) {
    // unexpected exception...
    throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Schema Parsing Failed: " + e.getMessage(), e);
  }

  // create the field analyzers
  refreshAnalyzers();
}
// in core/src/java/org/apache/solr/schema/DateField.java
@Override
public Date toObject(IndexableField f) {
  try {
    return parseDate( toExternal(f) );
  } catch( ParseException ex ) {
    throw new RuntimeException( ex );
  }
}
// in core/src/java/org/apache/solr/schema/BinaryField.java
@Override
public SortField getSortField(SchemaField field, boolean top) {
  throw new RuntimeException("Cannot sort on a Binary field");
}
// in core/src/java/org/apache/solr/schema/SimplePreAnalyzedParser.java
static final int charToNibble(char c) { if (c >= '0' && c <= '9') { return c - '0'; } else if (c >= 'a' && c <= 'f') { return 0xa + (c - 'a'); } else if (c >= 'A' && c <= 'F') { return 0xA + (c - 'A'); } else { throw new RuntimeException("Not a hex character: '" + c + "'"); } }
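charToNibble validates with exhaustive range checks and throws a RuntimeException for anything non-hex. The JDK's Character.digit(c, 16) performs the same mapping and signals failure with -1, so a hedged equivalent looks like the sketch below; note that Character.digit is slightly broader than the ASCII-only original, since it also accepts non-ASCII Unicode digits:

    final class Hex {
      /** Same contract as charToNibble: 0-15 for a hex digit, unchecked
       *  exception otherwise; Character.digit returns -1 on bad input. */
      static int charToNibble(char c) {
        int v = Character.digit(c, 16);
        if (v < 0) {
          throw new RuntimeException("Not a hex character: '" + c + "'");
        }
        return v;
      }

      public static void main(String[] args) {
        System.out.println(charToNibble('f')); // 15
      }
    }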
// in core/src/java/org/apache/solr/schema/SchemaField.java
static int calcProps(String name, FieldType ft, Map<String, String> props) { int trueProps = parseProperties(props,true); int falseProps = parseProperties(props,false); int p = ft.properties; // // If any properties were explicitly turned off, then turn off other properties // that depend on that. // if (on(falseProps,STORED)) { int pp = STORED | BINARY; if (on(pp,trueProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting stored field options:" + props); } p &= ~pp; } if (on(falseProps,INDEXED)) { int pp = (INDEXED | STORE_TERMVECTORS | STORE_TERMPOSITIONS | STORE_TERMOFFSETS | SORT_MISSING_FIRST | SORT_MISSING_LAST); if (on(pp,trueProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting 'true' field options for non-indexed field:" + props); } p &= ~pp; } if (on(falseProps,INDEXED)) { int pp = (OMIT_NORMS | OMIT_TF_POSITIONS | OMIT_POSITIONS); if (on(pp,falseProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting 'false' field options for non-indexed field:" + props); } p &= ~pp; } if (on(trueProps,OMIT_TF_POSITIONS)) { int pp = (OMIT_POSITIONS | OMIT_TF_POSITIONS); if (on(pp, falseProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting tf and position field options:" + props); } p &= ~pp; } if (on(falseProps,STORE_TERMVECTORS)) { int pp = (STORE_TERMVECTORS | STORE_TERMPOSITIONS | STORE_TERMOFFSETS); if (on(pp,trueProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting termvector field options:" + props); } p &= ~pp; } // override sort flags if (on(trueProps,SORT_MISSING_FIRST)) { p &= ~SORT_MISSING_LAST; } if (on(trueProps,SORT_MISSING_LAST)) { p &= ~SORT_MISSING_FIRST; } p &= ~falseProps; p |= trueProps; return p; }
// in core/src/java/org/apache/solr/schema/TextField.java
public static BytesRef analyzeMultiTerm(String field, String part, Analyzer analyzerIn) { if (part == null) return null; TokenStream source; try { source = analyzerIn.tokenStream(field, new StringReader(part)); source.reset(); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); } TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class); BytesRef bytes = termAtt.getBytesRef(); try { if (!source.incrementToken()) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"analyzer returned no terms for multiTerm term: " + part); termAtt.fillBytesRef(); if (source.incrementToken()) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"analyzer returned too many terms for multiTerm term: " + part); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); } try { source.end(); source.close(); } catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); } return BytesRef.deepCopyOf(bytes); }
// in core/src/java/org/apache/solr/schema/TextField.java
static Query parseFieldQuery(QParser parser, Analyzer analyzer, String field, String queryText) {
  int phraseSlop = 0;
  boolean enablePositionIncrements = true;

  // most of the following code is taken from the Lucene QueryParser

  // Use the analyzer to get all the tokens, and then build a TermQuery,
  // PhraseQuery, or nothing based on the term count

  TokenStream source;
  try {
    source = analyzer.tokenStream(field, new StringReader(queryText));
    source.reset();
  } catch (IOException e) {
    throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e);
  }
  CachingTokenFilter buffer = new CachingTokenFilter(source);
  CharTermAttribute termAtt = null;
  PositionIncrementAttribute posIncrAtt = null;
  int numTokens = 0;

  try {
    buffer.reset();
  } catch (IOException e) {
    throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e);
  }

  if (buffer.hasAttribute(CharTermAttribute.class)) {
    termAtt = buffer.getAttribute(CharTermAttribute.class);
  }
  if (buffer.hasAttribute(PositionIncrementAttribute.class)) {
    posIncrAtt = buffer.getAttribute(PositionIncrementAttribute.class);
  }

  int positionCount = 0;
  boolean severalTokensAtSamePosition = false;

  boolean hasMoreTokens = false;
  if (termAtt != null) {
    try {
      hasMoreTokens = buffer.incrementToken();
      while (hasMoreTokens) {
        numTokens++;
        int positionIncrement = (posIncrAtt != null) ? posIncrAtt.getPositionIncrement() : 1;
        if (positionIncrement != 0) {
          positionCount += positionIncrement;
        } else {
          severalTokensAtSamePosition = true;
        }
        hasMoreTokens = buffer.incrementToken();
      }
    } catch (IOException e) {
      // ignore
    }
  }

  try {
    // rewind the buffer stream
    buffer.reset();
    // close original stream - all tokens buffered
    source.close();
  } catch (IOException e) {
    // ignore
  }

  if (numTokens == 0)
    return null;
  else if (numTokens == 1) {
    String term = null;
    try {
      boolean hasNext = buffer.incrementToken();
      assert hasNext == true;
      term = termAtt.toString();
    } catch (IOException e) {
      // safe to ignore, because we know the number of tokens
    }
    // return newTermQuery(new Term(field, term));
    return new TermQuery(new Term(field, term));
  } else {
    if (severalTokensAtSamePosition) {
      if (positionCount == 1) {
        // no phrase query:
        // BooleanQuery q = newBooleanQuery(true);
        BooleanQuery q = new BooleanQuery(true);
        for (int i = 0; i < numTokens; i++) {
          String term = null;
          try {
            boolean hasNext = buffer.incrementToken();
            assert hasNext == true;
            term = termAtt.toString();
          } catch (IOException e) {
            // safe to ignore, because we know the number of tokens
          }
          // Query currentQuery = newTermQuery(new Term(field, term));
          Query currentQuery = new TermQuery(new Term(field, term));
          q.add(currentQuery, BooleanClause.Occur.SHOULD);
        }
        return q;
      } else {
        // phrase query:
        // MultiPhraseQuery mpq = newMultiPhraseQuery();
        MultiPhraseQuery mpq = new MultiPhraseQuery();
        mpq.setSlop(phraseSlop);
        List multiTerms = new ArrayList();
        int position = -1;
        for (int i = 0; i < numTokens; i++) {
          String term = null;
          int positionIncrement = 1;
          try {
            boolean hasNext = buffer.incrementToken();
            assert hasNext == true;
            term = termAtt.toString();
            if (posIncrAtt != null) {
              positionIncrement = posIncrAtt.getPositionIncrement();
            }
          } catch (IOException e) {
            // safe to ignore, because we know the number of tokens
          }

          if (positionIncrement > 0 && multiTerms.size() > 0) {
            if (enablePositionIncrements) {
              mpq.add((Term[])multiTerms.toArray(new Term[0]),position);
            } else {
              mpq.add((Term[])multiTerms.toArray(new Term[0]));
            }
            multiTerms.clear();
          }
          position += positionIncrement;
          multiTerms.add(new Term(field, term));
        }
        if (enablePositionIncrements) {
          mpq.add((Term[])multiTerms.toArray(new Term[0]),position);
        } else {
          mpq.add((Term[])multiTerms.toArray(new Term[0]));
        }
        return mpq;
      }
    } else {
      // PhraseQuery pq = newPhraseQuery();
      PhraseQuery pq = new PhraseQuery();
      pq.setSlop(phraseSlop);
      int position = -1;

      for (int i = 0; i < numTokens; i++) {
        String term = null;
        int positionIncrement = 1;

        try {
          boolean hasNext = buffer.incrementToken();
          assert hasNext == true;
          term = termAtt.toString();
          if (posIncrAtt != null) {
            positionIncrement = posIncrAtt.getPositionIncrement();
          }
        } catch (IOException e) {
          // safe to ignore, because we know the number of tokens
        }

        if (enablePositionIncrements) {
          position += positionIncrement;
          pq.add(new Term(field, term),position);
        } else {
          pq.add(new Term(field, term));
        }
      }
      return pq;
    }
  }
}
// in core/src/java/org/apache/solr/schema/CollationField.java
private Collator createFromRules(String fileName, ResourceLoader loader) { InputStream input = null; try { input = loader.openResource(fileName); String rules = IOUtils.toString(input, "UTF-8"); return new RuleBasedCollator(rules); } catch (IOException e) { // io error throw new RuntimeException(e); } catch (ParseException e) { // invalid rules throw new RuntimeException(e); } finally { IOUtils.closeQuietly(input); } }
// in core/src/java/org/apache/solr/schema/CollationField.java
private BytesRef analyzeRangePart(String field, String part) { TokenStream source; try { source = analyzer.tokenStream(field, new StringReader(part)); source.reset(); } catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); } TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class); BytesRef bytes = termAtt.getBytesRef(); // we control the analyzer here: most errors are impossible try { if (!source.incrementToken()) throw new IllegalArgumentException("analyzer returned no terms for range part: " + part); termAtt.fillBytesRef(); assert !source.incrementToken(); } catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); } try { source.end(); source.close(); } catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); } return BytesRef.deepCopyOf(bytes); }
// in core/src/java/org/apache/solr/schema/FieldType.java
void setArgs(IndexSchema schema, Map<String,String> args) { // default to STORED, INDEXED, OMIT_TF_POSITIONS and MULTIVALUED depending on schema version properties = (STORED | INDEXED); float schemaVersion = schema.getVersion(); if (schemaVersion < 1.1f) properties |= MULTIVALUED; if (schemaVersion > 1.1f) properties |= OMIT_TF_POSITIONS; if (schemaVersion < 1.3) { args.remove("compressThreshold"); } this.args=args; Map<String,String> initArgs = new HashMap<String,String>(args); trueProperties = FieldProperties.parseProperties(initArgs,true); falseProperties = FieldProperties.parseProperties(initArgs,false); properties &= ~falseProperties; properties |= trueProperties; for (String prop : FieldProperties.propertyNames) initArgs.remove(prop); init(schema, initArgs); String positionInc = initArgs.get("positionIncrementGap"); if (positionInc != null) { Analyzer analyzer = getAnalyzer(); if (analyzer instanceof SolrAnalyzer) { ((SolrAnalyzer)analyzer).setPositionIncrementGap(Integer.parseInt(positionInc)); } else { throw new RuntimeException("Can't set positionIncrementGap on custom analyzer " + analyzer.getClass()); } analyzer = getQueryAnalyzer(); if (analyzer instanceof SolrAnalyzer) { ((SolrAnalyzer)analyzer).setPositionIncrementGap(Integer.parseInt(positionInc)); } else { throw new RuntimeException("Can't set positionIncrementGap on custom analyzer " + analyzer.getClass()); } initArgs.remove("positionIncrementGap"); } final String postingsFormat = initArgs.get("postingsFormat"); if (postingsFormat != null) { this.postingsFormat = postingsFormat; initArgs.remove("postingsFormat"); } if (initArgs.size() > 0) { throw new RuntimeException("schema fieldtype " + typeName + "("+ this.getClass().getName() + ")" + " invalid arguments:" + initArgs); } }
// in core/src/java/org/apache/solr/schema/FieldType.java
protected void restrictProps(int props) { if ((properties & props) != 0) { throw new RuntimeException("schema fieldtype " + typeName + "("+ this.getClass().getName() + ")" + " invalid properties:" + propertiesToString(properties & props)); } }
// in core/src/java/org/apache/solr/internal/csv/CSVStrategy.java
public Object clone() { try { return super.clone(); } catch (CloneNotSupportedException e) { throw new RuntimeException(e); // impossible } }
// in core/src/java/org/apache/solr/search/similarities/IBSimilarityFactory.java
private Distribution parseDistribution(String expr) { if ("LL".equals(expr)) { return new DistributionLL(); } else if ("SPL".equals(expr)) { return new DistributionSPL(); } else { throw new RuntimeException("Invalid distribution: " + expr); } }
// in core/src/java/org/apache/solr/search/similarities/IBSimilarityFactory.java
private Lambda parseLambda(String expr) { if ("DF".equals(expr)) { return new LambdaDF(); } else if ("TTF".equals(expr)) { return new LambdaTTF(); } else { throw new RuntimeException("Invalid lambda: " + expr); } }
// in core/src/java/org/apache/solr/search/similarities/DFRSimilarityFactory.java
private BasicModel parseBasicModel(String expr) { if ("Be".equals(expr)) { return new BasicModelBE(); } else if ("D".equals(expr)) { return new BasicModelD(); } else if ("G".equals(expr)) { return new BasicModelG(); } else if ("I(F)".equals(expr)) { return new BasicModelIF(); } else if ("I(n)".equals(expr)) { return new BasicModelIn(); } else if ("I(ne)".equals(expr)) { return new BasicModelIne(); } else if ("P".equals(expr)) { return new BasicModelP(); } else { throw new RuntimeException("Invalid basicModel: " + expr); } }
// in core/src/java/org/apache/solr/search/similarities/DFRSimilarityFactory.java
private AfterEffect parseAfterEffect(String expr) { if ("B".equals(expr)) { return new AfterEffectB(); } else if ("L".equals(expr)) { return new AfterEffectL(); } else if ("none".equals(expr)) { return new AfterEffect.NoAfterEffect(); } else { throw new RuntimeException("Invalid afterEffect: " + expr); } }
// in core/src/java/org/apache/solr/search/similarities/DFRSimilarityFactory.java
static Normalization parseNormalization(String expr, String c, String mu, String z) {
  if (mu != null && !"H3".equals(expr)) {
    throw new RuntimeException("parameter mu only makes sense for normalization H3");
  }
  if (z != null && !"Z".equals(expr)) {
    throw new RuntimeException("parameter z only makes sense for normalization Z");
  }
  if (c != null && !("H1".equals(expr) || "H2".equals(expr))) {
    throw new RuntimeException("parameter c only makes sense for normalizations H1 and H2");
  }
  if ("H1".equals(expr)) {
    return (c != null) ? new NormalizationH1(Float.parseFloat(c)) : new NormalizationH1();
  } else if ("H2".equals(expr)) {
    return (c != null) ? new NormalizationH2(Float.parseFloat(c)) : new NormalizationH2();
  } else if ("H3".equals(expr)) {
    return (mu != null) ? new NormalizationH3(Float.parseFloat(mu)) : new NormalizationH3();
  } else if ("Z".equals(expr)) {
    return (z != null) ? new NormalizationZ(Float.parseFloat(z)) : new NormalizationZ();
  } else if ("none".equals(expr)) {
    return new Normalization.NoNormalization();
  } else {
    throw new RuntimeException("Invalid normalization: " + expr);
  }
}
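The similarity factories all share this parse shape: match a configuration string against a fixed set of names, otherwise throw a RuntimeException naming the offending value. A table-driven sketch of the same validation; this is an illustrative alternative, not how Solr implements it:

    import java.util.HashMap;
    import java.util.Map;

    final class NameRegistry<T> {
      private final String kind;
      private final Map<String, T> byName = new HashMap<String, T>();

      NameRegistry(String kind) { this.kind = kind; }

      NameRegistry<T> register(String name, T value) {
        byName.put(name, value);
        return this;
      }

      /** Lookup that fails fast, mirroring parseDistribution/parseLambda. */
      T parse(String expr) {
        T value = byName.get(expr);
        if (value == null) {
          throw new RuntimeException("Invalid " + kind + ": " + expr);
        }
        return value;
      }

      public static void main(String[] args) {
        NameRegistry<String> r = new NameRegistry<String>("lambda")
            .register("DF", "LambdaDF").register("TTF", "LambdaTTF");
        System.out.println(r.parse("DF"));  // LambdaDF
        try { r.parse("XX"); }
        catch (RuntimeException e) { System.out.println(e.getMessage()); } // Invalid lambda: XX
      }
    }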
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static String toString(Query query, IndexSchema schema) { try { StringBuilder sb = new StringBuilder(); toString(query, schema, sb, 0); return sb.toString(); } catch (Exception e) { throw new RuntimeException(e); } }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
@Override
public String init(NamedList config, SolrCore core) {
  super.init(config, core);
  indexDir = (String) config.get(INDEX_DIR);
  String accuracy = (String) config.get(ACCURACY);
  // If indexDir is relative then create index inside core.getDataDir()
  if (indexDir != null) {
    if (!new File(indexDir).isAbsolute()) {
      indexDir = core.getDataDir() + File.separator + indexDir;
    }
  }
  sourceLocation = (String) config.get(LOCATION);
  String compClass = (String) config.get(COMPARATOR_CLASS);
  Comparator<SuggestWord> comp = null;
  if (compClass != null){
    if (compClass.equalsIgnoreCase(SCORE_COMP)){
      comp = SuggestWordQueue.DEFAULT_COMPARATOR;
    } else if (compClass.equalsIgnoreCase(FREQ_COMP)){
      comp = new SuggestWordFrequencyComparator();
    } else { // must be a FQCN
      comp = (Comparator<SuggestWord>) core.getResourceLoader().newInstance(compClass, Comparator.class);
    }
  } else {
    comp = SuggestWordQueue.DEFAULT_COMPARATOR;
  }
  String strDistanceName = (String)config.get(STRING_DISTANCE);
  if (strDistanceName != null) {
    sd = core.getResourceLoader().newInstance(strDistanceName, StringDistance.class);
    // TODO: Figure out how to configure options. Where's Spring when you need it? Or at least BeanUtils...
  } else {
    sd = new LevensteinDistance();
  }
  try {
    initIndex();
    spellChecker = new SpellChecker(index, sd, comp);
  } catch (IOException e) {
    throw new RuntimeException(e);
  }
  if (accuracy != null) {
    try {
      this.accuracy = Float.parseFloat(accuracy);
      spellChecker.setAccuracy(this.accuracy);
    } catch (NumberFormatException e) {
      throw new RuntimeException("Unparseable accuracy given for dictionary: " + name, e);
    }
  }
  return name;
}
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
@Override
public void build(SolrCore core, SolrIndexSearcher searcher) {
  try {
    loadExternalFileDictionary(core);
    spellChecker.clearIndex();
    // TODO: you should be able to specify the IWC params?
    // TODO: if we enable this, codec gets angry since field won't exist in the schema
    // config.setCodec(core.getCodec());
    spellChecker.indexDictionary(dictionary, new IndexWriterConfig(core.getSolrConfig().luceneMatchVersion, null), false);
  } catch (IOException e) {
    throw new RuntimeException(e);
  }
}
// in core/src/java/org/apache/solr/spelling/SuggestQueryConverter.java
@Override
public Collection<Token> convert(String original) {
  if (original == null) { // this can happen with q.alt = and no query
    return Collections.emptyList();
  }
  Collection<Token> result = new ArrayList<Token>();
  try {
    analyze(result, new StringReader(original), 0);
  } catch (IOException e) {
    throw new RuntimeException(e);
  }
  return result;
}
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
private void initSourceReader() { if (sourceLocation != null) { try { FSDirectory luceneIndexDir = FSDirectory.open(new File(sourceLocation)); this.reader = DirectoryReader.open(luceneIndexDir); } catch (IOException e) { throw new RuntimeException(e); } } }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
@Override
public void build(SolrCore core, SolrIndexSearcher searcher) {
  IndexReader reader = null;
  try {
    if (sourceLocation == null) {
      // Load from Solr's index
      reader = searcher.getIndexReader();
    } else {
      // Load from Lucene index at given sourceLocation
      reader = this.reader;
    }
    // Create the dictionary
    dictionary = new HighFrequencyDictionary(reader, field, threshold);
    // TODO: maybe whether or not to clear the index should be configurable?
    // an incremental update is faster (just adds new terms), but if you 'expunged'
    // old terms I think they might hang around.
    spellChecker.clearIndex();
    // TODO: you should be able to specify the IWC params?
    // TODO: if we enable this, codec gets angry since field won't exist in the schema
    // config.setCodec(core.getCodec());
    spellChecker.indexDictionary(dictionary, new IndexWriterConfig(core.getSolrConfig().luceneMatchVersion, null), false);
  } catch (IOException e) {
    throw new RuntimeException(e);
  }
}
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
@Override
public InputStream openResource(String resource) {
  InputStream is = null;
  String file = collectionZkPath + "/" + resource;
  try {
    if (zkController.pathExists(file)) {
      byte[] bytes = zkController.getZkClient().getData(collectionZkPath + "/" + resource, null, null, true);
      return new ByteArrayInputStream(bytes);
    }
  } catch (Exception e) {
    throw new RuntimeException("Error opening " + file, e);
  }
  try {
    // delegate to the class loader (looking into $INSTANCE_DIR/lib jars)
    is = classLoader.getResourceAsStream(resource);
  } catch (Exception e) {
    throw new RuntimeException("Error opening " + resource, e);
  }
  if (is == null) {
    throw new RuntimeException("Can't find resource '" + resource + "' in classpath or '" + collectionZkPath + "', cwd=" + System.getProperty("user.dir"));
  }
  return is;
}
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public void setClientPort(int clientPort) { if (clientPortAddress != null) { try { this.clientPortAddress = new InetSocketAddress( InetAddress.getByName(clientPortAddress.getHostName()), clientPort); } catch (UnknownHostException e) { throw new RuntimeException(e); } } else { this.clientPortAddress = new InetSocketAddress(clientPort); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private ZkCoreNodeProps getLeaderProps(final String collection, final String slice) throws KeeperException, InterruptedException { int iterCount = 60; while (iterCount-- > 0) try { byte[] data = zkClient.getData( ZkStateReader.getShardLeadersPath(collection, slice), null, null, true); ZkCoreNodeProps leaderProps = new ZkCoreNodeProps( ZkNodeProps.load(data)); return leaderProps; } catch (NoNodeException e) { Thread.sleep(500); } throw new RuntimeException("Could not get leader props"); }
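getLeaderProps is the one bounded retry loop in this set: a fixed number of attempts, a sleep on the expected NoNodeException, and a RuntimeException once the budget is exhausted. A generic sketch of the pattern with illustrative names:

    import java.util.concurrent.Callable;

    final class Retry {
      /** Retries a task that may fail transiently; sleeps between attempts
       *  and converts exhaustion into an unchecked exception. */
      static <T> T withRetries(Callable<T> task, int attempts, long sleepMs)
          throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
          try {
            return task.call();
          } catch (Exception expected) {
            Thread.sleep(sleepMs); // transient failure: wait and retry
          }
        }
        throw new RuntimeException("Could not complete task after " + attempts + " attempts");
      }

      public static void main(String[] args) throws InterruptedException {
        final int[] calls = {0};
        String v = withRetries(new Callable<String>() {
          public String call() throws Exception {
            if (++calls[0] < 3) throw new Exception("not ready");
            return "ok";
          }
        }, 60, 10);
        System.out.println(v + " after " + calls[0] + " calls"); // ok after 3 calls
      }
    }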
// in core/src/java/org/apache/solr/update/UpdateLog.java
public Future<RecoveryInfo> applyBufferedUpdates() { // recovery trips this assert under some race - even when // it checks the state first // assert state == State.BUFFERING; // block all updates to eliminate race conditions // reading state and acting on it in the update processor versionInfo.blockUpdates(); try { cancelApplyBufferUpdate = false; if (state != State.BUFFERING) return null; // handle case when no log was even created because no updates // were received. if (tlog == null) { state = State.ACTIVE; return null; } tlog.incref(); state = State.APPLYING_BUFFERED; operationFlags &= ~FLAG_GAP; } finally { versionInfo.unblockUpdates(); } if (recoveryExecutor.isShutdown()) { tlog.decref(); throw new RuntimeException("executor is not running..."); } ExecutorCompletionService<RecoveryInfo> cs = new ExecutorCompletionService<RecoveryInfo>(recoveryExecutor); LogReplayer replayer = new LogReplayer(Arrays.asList(new TransactionLog[]{tlog}), true); return cs.submit(replayer, recoveryInfo); }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
@Override
protected MessageDigest initialValue() {
  try {
    return MessageDigest.getInstance("MD5");
  } catch (NoSuchAlgorithmException e) {
    throw new RuntimeException(e);
  }
}
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
@Override
public void add(String content) {
  try {
    digester.update(content.getBytes("UTF-8"));
  } catch (UnsupportedEncodingException e) {
    // won't happen
    log.error("UTF-8 not supported", e);
    throw new RuntimeException(e);
  }
}
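MD5Signature combines two of the sheet's "cannot happen" wraps: NoSuchAlgorithmException for an algorithm every JRE must provide, and UnsupportedEncodingException for UTF-8. A compact sketch of the resulting thread-confined digester; the class name is illustrative:

    import java.io.UnsupportedEncodingException;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    final class Md5 {
      /** One MessageDigest per thread; MD5 is guaranteed by the JRE spec,
       *  so the checked exception is rethrown unchecked. */
      private static final ThreadLocal<MessageDigest> DIGESTER =
          new ThreadLocal<MessageDigest>() {
            @Override protected MessageDigest initialValue() {
              try {
                return MessageDigest.getInstance("MD5");
              } catch (NoSuchAlgorithmException e) {
                throw new RuntimeException(e); // every JRE provides MD5
              }
            }
          };

      static byte[] hash(String content) {
        MessageDigest d = DIGESTER.get();
        d.reset();
        try {
          return d.digest(content.getBytes("UTF-8"));
        } catch (UnsupportedEncodingException e) {
          throw new RuntimeException(e); // UTF-8 is mandatory
        }
      }

      public static void main(String[] args) {
        System.out.println(hash("hello").length); // 16 bytes
      }
    }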
// in core/src/java/org/apache/solr/core/Config.java
public Node getNode(String path, boolean errIfMissing) { XPath xpath = xpathFactory.newXPath(); Node nd = null; String xstr = normalize(path); try { nd = (Node)xpath.evaluate(xstr, doc, XPathConstants.NODE); if (nd==null) { if (errIfMissing) { throw new RuntimeException(name + " missing "+path); } else { log.debug(name + " missing optional " + path); return null; } } log.trace(name + ":" + path + "=" + nd); return nd; } catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); } catch (SolrException e) { throw(e); } catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); } }
// in core/src/java/org/apache/solr/core/SolrConfig.java
private void initLibs() {
  NodeList nodes = (NodeList) evaluate("lib", XPathConstants.NODESET);
  if (nodes==null || nodes.getLength()==0) return;

  log.info("Adding specified lib dirs to ClassLoader");

  for (int i=0; i<nodes.getLength(); i++) {
    Node node = nodes.item(i);

    String baseDir = DOMUtil.getAttr(node, "dir");
    String path = DOMUtil.getAttr(node, "path");
    if (null != baseDir) {
      // :TODO: add support for a simpler 'glob' mutually exclusive of regex
      String regex = DOMUtil.getAttr(node, "regex");
      FileFilter filter = (null == regex) ? null : new RegexFileFilter(regex);
      getResourceLoader().addToClassLoader(baseDir, filter);
    } else if (null != path) {
      getResourceLoader().addToClassLoader(path);
    } else {
      throw new RuntimeException("lib: missing mandatory attributes: 'dir' or 'path'");
    }
  }
}
// in core/src/java/org/apache/solr/core/SolrCore.java
void initIndex() {
  try {
    String indexDir = getNewIndexDir();
    boolean indexExists = getDirectoryFactory().exists(indexDir);
    boolean firstTime;
    synchronized (SolrCore.class) {
      firstTime = dirs.add(new File(indexDir).getCanonicalPath());
    }
    boolean removeLocks = solrConfig.unlockOnStartup;

    initIndexReaderFactory();

    if (indexExists && firstTime) {
      // to remove locks, the directory must already exist... so we create it
      // if it didn't exist already...
      Directory dir = directoryFactory.get(indexDir, getSolrConfig().indexConfig.lockType);
      if (dir != null) {
        if (IndexWriter.isLocked(dir)) {
          if (removeLocks) {
            log.warn(logid + "WARNING: Solr index directory '{}' is locked. Unlocking...", indexDir);
            IndexWriter.unlock(dir);
          } else {
            log.error(logid + "Solr index directory '{}' is locked. Throwing exception", indexDir);
            throw new LockObtainFailedException("Index locked for write for core " + name);
          }
        }
        directoryFactory.release(dir);
      }
    }

    // Create the index if it doesn't exist.
    if (!indexExists) {
      log.warn(logid + "Solr index directory '" + new File(indexDir) + "' doesn't exist." + " Creating new index...");
      SolrIndexWriter writer = new SolrIndexWriter("SolrCore.initIndex", indexDir, getDirectoryFactory(), true, schema, solrConfig.indexConfig, solrDelPolicy, codec, false);
      writer.close();
    }
  } catch (IOException e) {
    throw new RuntimeException(e);
  }
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public InputStream openResource(String resource) {
  InputStream is = null;
  try {
    File f0 = new File(resource);
    File f = f0;
    if (!f.isAbsolute()) {
      // try $CWD/$configDir/$resource
      f = new File(getConfigDir() + resource);
    }
    if (f.isFile() && f.canRead()) {
      return new FileInputStream(f);
    } else if (f != f0) { // no success with $CWD/$configDir/$resource
      if (f0.isFile() && f0.canRead())
        return new FileInputStream(f0);
    }
    // delegate to the class loader (looking into $INSTANCE_DIR/lib jars)
    is = classLoader.getResourceAsStream(resource);
    if (is == null)
      is = classLoader.getResourceAsStream(getConfigDir() + resource);
  } catch (Exception e) {
    throw new RuntimeException("Error opening " + resource, e);
  }
  if (is == null) {
    throw new RuntimeException("Can't find resource '" + resource + "' in classpath or '" + getConfigDir() + "', cwd=" + System.getProperty("user.dir"));
  }
  return is;
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore register(String name, SolrCore core, boolean returnPrevNotClosed) {
  if (core == null) {
    throw new RuntimeException("Can not register a null core.");
  }
  if (name == null || name.indexOf('/') >= 0 || name.indexOf('\\') >= 0) {
    throw new RuntimeException("Invalid core name: " + name);
  }

  if (zkController != null) {
    // this happens before we can receive requests
    zkController.preRegister(core.getCoreDescriptor());
  }

  SolrCore old = null;
  synchronized (cores) {
    if (isShutDown) {
      core.close();
      throw new IllegalStateException("This CoreContainer has been shutdown");
    }
    old = cores.put(name, core);
    /*
     * set both the name of the descriptor and the name of the
     * core, since the descriptor's name is used for persisting.
     */
    core.setName(name);
    core.getCoreDescriptor().name = name;
  }

  if (old == null || old == core) {
    log.info("registering core: " + name);
    registerInZk(core);
    return null;
  } else {
    log.info("replacing core: " + name);
    if (!returnPrevNotClosed) {
      old.close();
    }
    registerInZk(core);
    return old;
  }
}
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
private OpenType determineType(Class type) { try { for (Field field : SimpleType.class.getFields()) { if (field.getType().equals(SimpleType.class)) { SimpleType candidate = (SimpleType) field.get(SimpleType.class); if (candidate.getTypeName().equals(type.getName())) { return candidate; } } } } catch (Exception e) { throw new RuntimeException(e); } return null; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static void invokeSetters(Object bean, NamedList initArgs) { if (initArgs == null) return; Class clazz = bean.getClass(); Method[] methods = clazz.getMethods(); Iterator<Map.Entry<String, Object>> iterator = initArgs.iterator(); while (iterator.hasNext()) { Map.Entry<String, Object> entry = iterator.next(); String key = entry.getKey(); String setterName = "set" + String.valueOf(Character.toUpperCase(key.charAt(0))) + key.substring(1); Method method = null; try { for (Method m : methods) { if (m.getName().equals(setterName) && m.getParameterTypes().length == 1) { method = m; break; } } if (method == null) { throw new RuntimeException("no setter corresponding to '" + key + "' in " + clazz.getName()); } Class pClazz = method.getParameterTypes()[0]; Object val = entry.getValue(); method.invoke(bean, val); } catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); } catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); } } }
// in core/src/java/org/apache/solr/util/DOMUtil.java
public static String getAttr(NamedNodeMap attrs, String name, String missing_err) { Node attr = attrs==null? null : attrs.getNamedItem(name); if (attr==null) { if (missing_err==null) return null; throw new RuntimeException(missing_err + ": missing mandatory attribute '" + name + "'"); } String val = attr.getNodeValue(); return val; }
// in core/src/java/org/apache/solr/util/DOMUtil.java
private static void parsePropertyString(String value, List<String> fragments, List<String> propertyRefs) {
  int prev = 0;
  int pos;
  // search for the next instance of $ from the 'prev' position
  while ((pos = value.indexOf("$", prev)) >= 0) {
    // if there was any text before this, add it as a fragment
    // TODO, this check could be modified to go if pos>prev;
    // seems like this current version could stick empty strings
    // into the list
    if (pos > 0) {
      fragments.add(value.substring(prev, pos));
    }
    // if we are at the end of the string, we tack on a $
    // then move past it
    if (pos == (value.length() - 1)) {
      fragments.add("$");
      prev = pos + 1;
    } else if (value.charAt(pos + 1) != '{') {
      // peek ahead to see if the next char is a property or not
      // not a property: insert the char as a literal
      /*
      fragments.addElement(value.substring(pos + 1, pos + 2));
      prev = pos + 2;
      */
      if (value.charAt(pos + 1) == '$') {
        // backwards compatibility two $ map to one mode
        fragments.add("$");
        prev = pos + 2;
      } else {
        // new behaviour: $X maps to $X for all values of X!='$'
        fragments.add(value.substring(pos, pos + 2));
        prev = pos + 2;
      }
    } else {
      // property found, extract its name or bail on a typo
      int endName = value.indexOf('}', pos);
      if (endName < 0) {
        throw new RuntimeException("Syntax error in property: " + value);
      }
      String propertyName = value.substring(pos + 2, endName);
      fragments.add(null);
      propertyRefs.add(propertyName);
      prev = endName + 1;
    }
  }
  // no more $ signs found
  // if there is any tail to the string, append it
  if (prev < value.length()) {
    fragments.add(value.substring(prev));
  }
}
65 catch blocks rethrowing as RuntimeException:
              
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (UnsupportedEncodingException e) { /* can't happen - UTF-8 */ throw new RuntimeException(e); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (Exception e) { throw new RuntimeException("Problem pretty printing XML", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (IOException e) { throw new RuntimeException(e); /* should never happen w/o using real IO */ }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in solrj/src/java/org/apache/solr/common/params/MapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/MultiMapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/ModifiableSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
catch (IOException e) { throw new RuntimeException("Unable to write xml into a stream", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
catch (Exception e) { throw new RuntimeException("Couldn't load profile data, will return empty languages always!", e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (Exception e) { /* io error or invalid rules */ throw new RuntimeException(e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (FileNotFoundException e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch(Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/SolrParamResourceLoader.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch (Exception e) { /* TODO should our parent interface throw (IO)Exception? */ throw new RuntimeException("getTransformer fails in getContentType", e); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IOException e) { throw new RuntimeException("failed to open field cache for: " + f, e); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in core/src/java/org/apache/solr/schema/DateField.java
catch( ParseException ex ) { throw new RuntimeException( ex ); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { /* io error */ throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (ParseException e) { /* invalid rules */ throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/internal/csv/CSVStrategy.java
catch (CloneNotSupportedException e) { throw new RuntimeException(e); /* impossible */ }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/search/SolrCacheBase.java
catch (Exception e) { throw new RuntimeException("Can't parse autoWarm value: " + configValue, e); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (NumberFormatException e) { throw new RuntimeException( "Unparseable accuracy given for dictionary: " + name, e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/SuggestQueryConverter.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + file, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (UnknownHostException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (NoSuchAlgorithmException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (UnsupportedEncodingException e) { /* won't happen */ log.error("UTF-8 not supported", e); throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { /* Release the reference */ server = null; throw new RuntimeException("Could not start JMX monitoring ", e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (MalformedURLException e) { throw new RuntimeException ("WTF, how can JarFile.toURL() be malformed?", e); }
(Domain) DataImportHandlerException: 68 throws
              
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
@Override protected void firstInit(Context context) { try { String tikaConfigFile = context.getResolvedEntityAttribute("tikaConfig"); if (tikaConfigFile == null) { ClassLoader classLoader = context.getSolrCore().getResourceLoader().getClassLoader(); tikaConfig = new TikaConfig(classLoader); } else { File configFile = new File(tikaConfigFile); if (!configFile.isAbsolute()) { configFile = new File(context.getSolrCore().getResourceLoader().getConfigDir(), tikaConfigFile); } tikaConfig = new TikaConfig(configFile); } } catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to load Tika Config"); } format = context.getResolvedEntityAttribute("format"); if (format == null) format = "text"; if (!"html".equals(format) && !"xml".equals(format) && !"text".equals(format) && !"none".equals(format)) throw new DataImportHandlerException(SEVERE, "'format' can be one of text|html|xml|none"); parser = context.getResolvedEntityAttribute("parser"); if (parser == null) { parser = AUTO_PARSER; } done = false; }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
@Override
public void init(Context context) {
  super.init(context);
  // set attributes using getXXXFromContext(attribute, defaultValue);
  // applies the variable resolver and returns the default if the value is not found or null

  // REQUIRED: connection and folder info
  user = getStringFromContext("user", null);
  password = getStringFromContext("password", null);
  host = getStringFromContext("host", null);
  protocol = getStringFromContext("protocol", null);
  folderNames = getStringFromContext("folders", null);

  // validate
  if (host == null || protocol == null || user == null || password == null || folderNames == null)
    throw new DataImportHandlerException(DataImportHandlerException.SEVERE,
        "'user|password|protocol|host|folders' are required attributes");

  // OPTIONAL: these have defaults
  recurse = getBoolFromContext("recurse", true);
  String excludes = getStringFromContext("exclude", "");
  if (excludes != null && !excludes.trim().equals("")) {
    exclude = Arrays.asList(excludes.split(","));
  }
  String includes = getStringFromContext("include", "");
  if (includes != null && !includes.trim().equals("")) {
    include = Arrays.asList(includes.split(","));
  }
  batchSize = getIntFromContext("batchSize", 20);
  customFilter = getStringFromContext("customFilter", "");
  String s = getStringFromContext("fetchMailsSince", "");
  if (s != null)
    try {
      fetchMailsSince = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(s);
    } catch (ParseException e) {
      throw new DataImportHandlerException(DataImportHandlerException.SEVERE,
          "Invalid value for fetchMailSince: " + s, e);
    }
  fetchSize = getIntFromContext("fetchSize", 32 * 1024);
  cTimeout = getIntFromContext("connectTimeout", 30 * 1000);
  rTimeout = getIntFromContext("readTimeout", 60 * 1000);
  // prefer 'processAttachment'; fall back to the legacy misspelled 'processAttachement'
  processAttachment = getBoolFromContext(
      getStringFromContext("processAttachment", null) == null ? "processAttachement" : "processAttachment",
      true);
  tika = new Tika();
  logConfig();
}
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
private boolean connectToMailBox() { try { Properties props = new Properties(); props.setProperty("mail.store.protocol", protocol); props.setProperty("mail.imap.fetchsize", "" + fetchSize); props.setProperty("mail.imap.timeout", "" + rTimeout); props.setProperty("mail.imap.connectiontimeout", "" + cTimeout); Session session = Session.getDefaultInstance(props, null); mailbox = session.getStore(protocol); mailbox.connect(host, user, password); LOG.info("Connected to mailbox"); return true; } catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Connection failed", e); } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
private void createFilters() { if (fetchMailsSince != null) { filters.add(new MailsSinceLastCheckFilter(fetchMailsSince)); } if (customFilter != null && !customFilter.equals("")) { try { Class cf = Class.forName(customFilter); Object obj = cf.newInstance(); if (obj instanceof CustomFilter) { filters.add((CustomFilter) obj); } } catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Custom filter could not be created", e); } } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
private void getTopLevelFolders(Store mailBox) {
  if (folderNames != null)
    topLevelFolders = Arrays.asList(folderNames.split(","));
  for (int i = 0; topLevelFolders != null && i < topLevelFolders.size(); i++) {
    try {
      folders.add(mailbox.getFolder(topLevelFolders.get(i)));
    } catch (MessagingException e) {
      // skip bad ones unless it's the last one and still no good folder
      if (folders.size() == 0 && i == topLevelFolders.size() - 1)
        throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retrieval failed");
    }
  }
  if (topLevelFolders == null || topLevelFolders.size() == 0) {
    try {
      folders.add(mailBox.getDefaultFolder());
    } catch (MessagingException e) {
      throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retrieval failed");
    }
  }
}
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
public boolean hasNext() {
  boolean hasMore = current < messagesInCurBatch.length;
  if (!hasMore && doBatching && currentBatch * batchSize < totalInFolder) {
    // try next batch
    try {
      getNextBatch(batchSize, folder);
      hasMore = current < messagesInCurBatch.length;
    } catch (MessagingException e) {
      throw new DataImportHandlerException(DataImportHandlerException.SEVERE,
          "Message retrieval failed", e);
    }
  }
  return hasMore;
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
private void loadDataConfig(InputSource configFile) {
  try {
    DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();

    // only enable XInclude if a SolrCore and SystemId are present (makes no sense otherwise)
    if (core != null && configFile.getSystemId() != null) {
      try {
        dbf.setXIncludeAware(true);
        dbf.setNamespaceAware(true);
      } catch (UnsupportedOperationException e) {
        LOG.warn("XML parser doesn't support XInclude option");
      }
    }

    DocumentBuilder builder = dbf.newDocumentBuilder();
    if (core != null)
      builder.setEntityResolver(new SystemIdResolver(core.getResourceLoader()));
    builder.setErrorHandler(XMLLOG);
    Document document;
    try {
      document = builder.parse(configFile);
    } finally {
      // some XML parsers are broken and don't close the byte stream (but they should according to spec)
      IOUtils.closeQuietly(configFile.getByteStream());
    }

    config = readFromXml(document);
    LOG.info("Data Configuration loaded successfully");
  } catch (Exception e) {
    throw new DataImportHandlerException(SEVERE, "Exception occurred while initializing context", e);
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
public DIHConfiguration readFromXml(Document xmlDocument) {
  DIHConfiguration config;
  List<Map<String, String>> functions = new ArrayList<Map<String, String>>();
  Script script = null;
  Map<String, Properties> dataSources = new HashMap<String, Properties>();

  NodeList dataConfigTags = xmlDocument.getElementsByTagName("dataConfig");
  if (dataConfigTags == null || dataConfigTags.getLength() == 0) {
    throw new DataImportHandlerException(SEVERE, "the root node '<dataConfig>' is missing");
  }
  Element e = (Element) dataConfigTags.item(0);
  List<Element> documentTags = ConfigParseUtil.getChildNodes(e, "document");
  if (documentTags.isEmpty()) {
    throw new DataImportHandlerException(SEVERE, "DataImportHandler configuration file must have one <document> node.");
  }
  List<Element> scriptTags = ConfigParseUtil.getChildNodes(e, ConfigNameConstants.SCRIPT);
  if (!scriptTags.isEmpty()) {
    script = new Script(scriptTags.get(0));
  }
  // Add the provided evaluators
  List<Element> functionTags = ConfigParseUtil.getChildNodes(e, ConfigNameConstants.FUNCTION);
  if (!functionTags.isEmpty()) {
    for (Element element : functionTags) {
      String func = ConfigParseUtil.getStringAttribute(element, NAME, null);
      String clz = ConfigParseUtil.getStringAttribute(element, ConfigNameConstants.CLASS, null);
      if (func == null || clz == null) {
        throw new DataImportHandlerException(SEVERE, "<function> must have 'name' and 'class' attributes");
      } else {
        functions.add(ConfigParseUtil.getAllAttributes(element));
      }
    }
  }
  List<Element> dataSourceTags = ConfigParseUtil.getChildNodes(e, DATA_SRC);
  if (!dataSourceTags.isEmpty()) {
    for (Element element : dataSourceTags) {
      Properties p = new Properties();
      HashMap<String, String> attrs = ConfigParseUtil.getAllAttributes(element);
      for (Map.Entry<String, String> entry : attrs.entrySet()) {
        p.setProperty(entry.getKey(), entry.getValue());
      }
      dataSources.put(p.getProperty("name"), p);
    }
  }
  if (dataSources.get(null) == null) {
    for (Properties properties : dataSources.values()) {
      dataSources.put(null, properties);
      break;
    }
  }
  return new DIHConfiguration(documentTags.get(0), this, functions, script, dataSources);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
DataSource getDataSourceInstance(Entity key, String name, Context ctx) {
  Properties p = dataSourceProps.get(name);
  if (p == null)
    p = config.getDataSources().get(name);
  if (p == null)
    p = dataSourceProps.get(null); // for default data source
  if (p == null)
    p = config.getDataSources().get(null);
  if (p == null)
    throw new DataImportHandlerException(SEVERE, "No dataSource :" + name + " available for entity :" + key.getName());

  String type = p.getProperty(TYPE);
  DataSource dataSrc = null;
  if (type == null) {
    dataSrc = new JdbcDataSource();
  } else {
    try {
      dataSrc = (DataSource) DocBuilder.loadClass(type, getCore()).newInstance();
    } catch (Exception e) {
      wrapAndThrow(SEVERE, e, "Invalid type for data source: " + type);
    }
  }
  try {
    Properties copyProps = new Properties();
    copyProps.putAll(p);
    Map<String, Object> map = ctx.getRequestParameters();
    if (map.containsKey("rows")) {
      int rows = Integer.parseInt((String) map.get("rows"));
      if (map.containsKey("start")) {
        rows += Integer.parseInt((String) map.get("start"));
      }
      copyProps.setProperty("maxRows", String.valueOf(rows));
    }
    dataSrc.init(ctx, copyProps);
  } catch (Exception e) {
    wrapAndThrow(SEVERE, e, "Failed to initialize DataSource: " + key.getDataSourceName());
  }
  return dataSrc;
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
private void checkWritablePersistFile(SolrWriter writer) {
  // File persistFile = propWriter.getPersistFile();
  // boolean isWritable = persistFile.exists() ? persistFile.canWrite() : persistFile.getParentFile().canWrite();
  if (isDeltaImportSupported && !propWriter.isWritable()) {
    throw new DataImportHandlerException(SEVERE,
        "Properties is not writable. Delta imports are supported by data config but will not work.");
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
private void initXpathReader() {
  useSolrAddXml = Boolean.parseBoolean(context.getEntityAttribute(USE_SOLR_ADD_SCHEMA));
  streamRows = Boolean.parseBoolean(context.getEntityAttribute(STREAM));
  if (context.getResolvedEntityAttribute("batchSize") != null) {
    blockingQueueSize = Integer.parseInt(context.getEntityAttribute("batchSize"));
  }
  if (context.getResolvedEntityAttribute("readTimeOut") != null) {
    blockingQueueTimeOut = Integer.parseInt(context.getEntityAttribute("readTimeOut"));
  }
  String xslt = context.getEntityAttribute(XSL);
  if (xslt != null) {
    xslt = context.replaceTokens(xslt);
    try {
      // create an instance of TransformerFactory
      TransformerFactory transFact = TransformerFactory.newInstance();
      final SolrCore core = context.getSolrCore();
      final StreamSource xsltSource;
      if (core != null) {
        final ResourceLoader loader = core.getResourceLoader();
        transFact.setURIResolver(new SystemIdResolver(loader).asURIResolver());
        xsltSource = new StreamSource(loader.openResource(xslt),
            SystemIdResolver.createSystemIdFromResourceName(xslt));
      } else {
        // fallback for tests
        xsltSource = new StreamSource(xslt);
      }
      transFact.setErrorListener(xmllog);
      try {
        xslTransformer = transFact.newTransformer(xsltSource);
      } finally {
        // some XML parsers are broken and don't close the byte stream (but they should according to spec)
        IOUtils.closeQuietly(xsltSource.getInputStream());
      }
      LOG.info("Using xslTransformer: " + xslTransformer.getClass().getName());
    } catch (Exception e) {
      throw new DataImportHandlerException(SEVERE, "Error initializing XSL ", e);
    }
  }
  if (useSolrAddXml) {
    // Support solr add documents
    xpathReader = new XPathRecordReader("/add/doc");
    xpathReader.addField("name", "/add/doc/field/@name", true);
    xpathReader.addField("value", "/add/doc/field", true);
  } else {
    String forEachXpath = context.getEntityAttribute(FOR_EACH);
    if (forEachXpath == null)
      throw new DataImportHandlerException(SEVERE, "Entity : " + context.getEntityAttribute("name")
          + " must have a 'forEach' attribute");
    try {
      xpathReader = new XPathRecordReader(forEachXpath);
      for (Map<String, String> field : context.getAllEntityFields()) {
        if (field.get(XPATH) == null) continue;
        int flags = 0;
        if ("true".equals(field.get("flatten"))) {
          flags = XPathRecordReader.FLATTEN;
        }
        String xpath = field.get(XPATH);
        xpath = context.replaceTokens(xpath);
        xpathReader.addField(field.get(DataImporter.COLUMN), xpath,
            Boolean.parseBoolean(field.get(DataImporter.MULTI_VALUED)), flags);
      }
    } catch (RuntimeException e) {
      throw new DataImportHandlerException(SEVERE, "Exception while reading xpaths for fields", e);
    }
  }
  String url = context.getEntityAttribute(URL);
  List<String> l = url == null ? Collections.EMPTY_LIST : TemplateString.getVariables(url);
  for (String s : l) {
    if (s.startsWith(entityName + ".")) {
      if (placeHolderVariables == null)
        placeHolderVariables = new ArrayList<String>();
      placeHolderVariables.add(s.substring(entityName.length() + 1));
    }
  }
  for (Map<String, String> fld : context.getAllEntityFields()) {
    if (fld.get(COMMON_FIELD) != null && "true".equals(fld.get(COMMON_FIELD))) {
      if (commonFields == null)
        commonFields = new ArrayList<String>();
      commonFields.add(fld.get(DataImporter.COLUMN));
    }
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
@Override public DebugInfo pop() { if (size() == 1) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Stack is becoming empty"); return super.pop(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
@Override
public void init(Context context) {
  super.init(context);
  String s;
  // init a regex to locate files from the input we want to index
  s = context.getResolvedEntityAttribute(ACCEPT_LINE_REGEX);
  if (s != null) {
    acceptLineRegex = Pattern.compile(s);
  }
  // init a regex to locate files from the input to be skipped
  s = context.getResolvedEntityAttribute(SKIP_LINE_REGEX);
  if (s != null) {
    skipLineRegex = Pattern.compile(s);
  }
  // the FileName is required.
  url = context.getResolvedEntityAttribute(URL);
  if (url == null)
    throw new DataImportHandlerException(DataImportHandlerException.SEVERE,
        "'" + URL + "' is a required attribute");
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
@Override
public Map<String, Object> nextRow() {
  if (reader == null) {
    reader = new BufferedReader((Reader) context.getDataSource().getData(url));
  }
  String line;
  while (true) {
    // read a line from the input file
    try {
      line = reader.readLine();
    } catch (IOException exp) {
      throw new DataImportHandlerException(DataImportHandlerException.SEVERE,
          "Problem reading from input", exp);
    }
    if (line == null) return null; // end of input

    // first scan the whole line to see if we want it
    if (acceptLineRegex != null && !acceptLineRegex.matcher(line).find()) continue;
    if (skipLineRegex != null && skipLineRegex.matcher(line).find()) continue;
    // construct the 'row' of fields
    Map<String, Object> row = new HashMap<String, Object>();
    row.put("rawLine", line);
    return row;
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
private void initEngine(Context context) { String scriptText = context.getScript(); String scriptLang = context.getScriptLanguage(); if (scriptText == null) { throw new DataImportHandlerException(SEVERE, "<script> tag is not present under <dataConfig>"); } ScriptEngineManager scriptEngineMgr = new ScriptEngineManager(); ScriptEngine scriptEngine = scriptEngineMgr.getEngineByName(scriptLang); if (scriptEngine == null) { throw new DataImportHandlerException(SEVERE, "Cannot load Script Engine for language: " + scriptLang); } if (scriptEngine instanceof Invocable) { engine = (Invocable) scriptEngine; } else { throw new DataImportHandlerException(SEVERE, "The installed ScriptEngine for: " + scriptLang + " does not implement Invocable. Class is " + scriptEngine.getClass().getName()); } try { scriptEngine.eval(scriptText); } catch (ScriptException e) { wrapAndThrow(SEVERE, e, "'eval' failed with language: " + scriptLang + " and script: \n" + scriptText); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldStreamDataSource.java
@Override
public InputStream getData(String query) {
  Object o = wrapper.getVariableResolver().resolve(dataField);
  if (o == null) {
    throw new DataImportHandlerException(SEVERE, "No field available for name : " + dataField);
  }
  if (o instanceof Blob) {
    Blob blob = (Blob) o;
    try {
      // Most of the JDBC drivers have getBinaryStream defined as public
      // so let us just check it
      Method m = blob.getClass().getDeclaredMethod("getBinaryStream");
      if (Modifier.isPublic(m.getModifiers())) {
        return (InputStream) m.invoke(blob);
      } else {
        // force invoke
        m.setAccessible(true);
        return (InputStream) m.invoke(blob);
      }
    } catch (Exception e) {
      LOG.info("Unable to get data from BLOB");
      return null;
    }
  } else if (o instanceof byte[]) {
    byte[] bytes = (byte[]) o;
    return new ByteArrayInputStream(bytes);
  } else {
    throw new RuntimeException("unsupported type : " + o.getClass());
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
@Override public void persist(Properties p) { OutputStream propOutput = null; Properties props = readIndexerProperties(); try { props.putAll(p); String filePath = configDir; if (configDir != null && !configDir.endsWith(File.separator)) filePath += File.separator; filePath += persistFilename; propOutput = new FileOutputStream(filePath); props.store(propOutput, null); log.info("Wrote last indexed time to " + persistFilename); } catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to persist Index Start Time", e); } finally { try { if (propOutput != null) propOutput.close(); } catch (IOException e) { propOutput = null; } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
@Override
public String evaluate(String expression, Context context) {
  List l = parseParams(expression, context.getVariableResolver());
  if (l.size() != 1) {
    throw new DataImportHandlerException(SEVERE, "'escapeSql' must have at least one parameter ");
  }
  String s = l.get(0).toString();
  // escape single quote with two single quotes, double quote
  // with two double quotes, and backslash with double backslash.
  // See: http://dev.mysql.com/doc/refman/4.1/en/mysql-real-escape-string.html
  return s.replaceAll("'", "''").replaceAll("\"", "\"\"").replaceAll("\\\\", "\\\\\\\\");
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
@Override public String evaluate(String expression, Context context) { List l = parseParams(expression, context.getVariableResolver()); if (l.size() != 1) { throw new DataImportHandlerException(SEVERE, "'escapeQueryChars' must have at least one parameter "); } String s = l.get(0).toString(); return ClientUtils.escapeQueryChars(s); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
@Override public String evaluate(String expression, Context context) { List l = parseParams(expression, context.getVariableResolver()); if (l.size() != 1) { throw new DataImportHandlerException(SEVERE, "'encodeUrl' must have at least one parameter "); } String s = l.get(0).toString(); try { return URLEncoder.encode(s.toString(), "UTF-8"); } catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to encode expression: " + expression + " with value: " + s); return null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
@Override public String evaluate(String expression, Context context) { List l = parseParams(expression, context.getVariableResolver()); if (l.size() != 2) { throw new DataImportHandlerException(SEVERE, "'formatDate()' must have two parameters "); } Object o = l.get(0); Object format = l.get(1); if (format instanceof VariableWrapper) { VariableWrapper wrapper = (VariableWrapper) format; o = wrapper.resolve(); if (o == null) { format = wrapper.varName; LOG.warn("Deprecated syntax used. The syntax of formatDate has been changed to formatDate(<var>, '<date_format_string>'). " + "The old syntax will stop working in Solr 1.5"); } else { format = o.toString(); } } String dateFmt = format.toString(); SimpleDateFormat fmt = new SimpleDateFormat(dateFmt); Date date = null; if (o instanceof VariableWrapper) { VariableWrapper variableWrapper = (VariableWrapper) o; Object variableval = variableWrapper.resolve(); if (variableval instanceof Date) { date = (Date) variableval; } else { String s = variableval.toString(); try { date = DataImporter.DATE_TIME_FORMAT.get().parse(s); } catch (ParseException exp) { wrapAndThrow(SEVERE, exp, "Invalid expression for date"); } } } else { String datemathfmt = o.toString(); datemathfmt = datemathfmt.replaceAll("NOW", ""); try { date = dateMathParser.parseMath(datemathfmt); } catch (ParseException e) { wrapAndThrow(SEVERE, e, "Invalid expression for date"); } } return fmt.format(date); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
public static List parseParams(String expression, VariableResolver vr) { List result = new ArrayList(); expression = expression.trim(); String[] ss = expression.split(","); for (int i = 0; i < ss.length; i++) { ss[i] = ss[i].trim(); if (ss[i].startsWith("'")) { /* a string param has started */ StringBuilder sb = new StringBuilder(); while (true) { sb.append(ss[i]); if (ss[i].endsWith("'")) break; i++; if (i >= ss.length) throw new DataImportHandlerException(SEVERE, "invalid string at " + ss[i - 1] + " in function params: " + expression); sb.append(","); } String s = sb.substring(1, sb.length() - 1); s = s.replaceAll("\\\\'", "'"); result.add(s); } else { if (Character.isDigit(ss[i].charAt(0))) { try { Double doub = Double.parseDouble(ss[i]); result.add(doub); } catch (NumberFormatException e) { if (vr.resolve(ss[i]) == null) { wrapAndThrow( SEVERE, e, "Invalid number :" + ss[i] + "in parameters " + expression); } } } else { result.add(new VariableWrapper(ss[i], vr)); } } } return result; }
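A worked example makes parseParams' contract concrete: quoted strings may contain commas and are unescaped, a leading digit parses as a Double, and everything else becomes a lazily resolved variable reference. A quick check, assuming EvaluatorBag is on the classpath (a null resolver is safe here because "x" is only wrapped, not resolved):

import java.util.List;
import org.apache.solr.handler.dataimport.EvaluatorBag;

class ParseParamsDemo {
  public static void main(String[] args) {
    List params = EvaluatorBag.parseParams("'a,b',5,x", null);
    // params.get(0) -> "a,b"   (quoted string: comma kept, quotes stripped)
    // params.get(1) -> 5.0     (leading digit parses as Double)
    // params.get(2) -> VariableWrapper for "x" (resolved lazily)
    System.out.println(params.size()); // 3
  }
}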
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
protected Callable<Connection> createConnectionFactory(final Context context, final Properties initProps) {
  // final VariableResolver resolver = context.getVariableResolver();
  resolveVariables(context, initProps);
  final String jndiName = initProps.getProperty(JNDI_NAME);
  final String url = initProps.getProperty(URL);
  final String driver = initProps.getProperty(DRIVER);

  if (url == null && jndiName == null)
    throw new DataImportHandlerException(SEVERE, "JDBC URL or JNDI name has to be specified");

  if (driver != null) {
    try {
      DocBuilder.loadClass(driver, context.getSolrCore());
    } catch (ClassNotFoundException e) {
      wrapAndThrow(SEVERE, e, "Could not load driver: " + driver);
    }
  } else {
    if (jndiName == null) {
      throw new DataImportHandlerException(SEVERE, "One of driver or jndiName must be specified in the data source");
    }
  }

  String s = initProps.getProperty("maxRows");
  if (s != null) {
    maxRows = Integer.parseInt(s);
  }

  return factory = new Callable<Connection>() {
    public Connection call() throws Exception {
      LOG.info("Creating a connection for entity " + context.getEntityAttribute(DataImporter.NAME) + " with URL: " + url);
      long start = System.currentTimeMillis();
      Connection c = null;
      try {
        if (url != null) {
          c = DriverManager.getConnection(url, initProps);
        } else if (jndiName != null) {
          InitialContext ctx = new InitialContext();
          Object jndival = ctx.lookup(jndiName);
          if (jndival instanceof javax.sql.DataSource) {
            javax.sql.DataSource dataSource = (javax.sql.DataSource) jndival;
            String user = (String) initProps.get("user");
            String pass = (String) initProps.get("password");
            if (user == null || user.trim().equals("")) {
              c = dataSource.getConnection();
            } else {
              c = dataSource.getConnection(user, pass);
            }
          } else {
            throw new DataImportHandlerException(SEVERE,
                "the jndi name : '" + jndiName + "' is not a valid javax.sql.DataSource");
          }
        }
      } catch (SQLException e) {
        // DriverManager does not allow you to use a driver which is not loaded through
        // the class loader of the class which is trying to make the connection.
        // This is a workaround for cases where the user puts the driver jar in the
        // solr.home/lib or solr.home/core/lib directories.
        Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance();
        c = d.connect(url, initProps);
      }
      if (c != null) {
        if (Boolean.parseBoolean(initProps.getProperty("readOnly"))) {
          c.setReadOnly(true);
          // Add other sane defaults
          c.setAutoCommit(true);
          c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED);
          c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT);
        }
        if (!Boolean.parseBoolean(initProps.getProperty("autoCommit"))) {
          c.setAutoCommit(false);
        }
        String transactionIsolation = initProps.getProperty("transactionIsolation");
        if ("TRANSACTION_READ_UNCOMMITTED".equals(transactionIsolation)) {
          c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED);
        } else if ("TRANSACTION_READ_COMMITTED".equals(transactionIsolation)) {
          c.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED);
        } else if ("TRANSACTION_REPEATABLE_READ".equals(transactionIsolation)) {
          c.setTransactionIsolation(Connection.TRANSACTION_REPEATABLE_READ);
        } else if ("TRANSACTION_SERIALIZABLE".equals(transactionIsolation)) {
          c.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE);
        } else if ("TRANSACTION_NONE".equals(transactionIsolation)) {
          c.setTransactionIsolation(Connection.TRANSACTION_NONE);
        }
        String holdability = initProps.getProperty("holdability");
        if ("CLOSE_CURSORS_AT_COMMIT".equals(holdability)) {
          c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT);
        } else if ("HOLD_CURSORS_OVER_COMMIT".equals(holdability)) {
          c.setHoldability(ResultSet.HOLD_CURSORS_OVER_COMMIT);
        }
      }
      LOG.info("Time taken for getConnection(): " + (System.currentTimeMillis() - start));
      return c;
    }
  };
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
public Connection call() throws Exception { LOG.info("Creating a connection for entity " + context.getEntityAttribute(DataImporter.NAME) + " with URL: " + url); long start = System.currentTimeMillis(); Connection c = null; try { if(url != null){ c = DriverManager.getConnection(url, initProps); } else if(jndiName != null){ InitialContext ctx = new InitialContext(); Object jndival = ctx.lookup(jndiName); if (jndival instanceof javax.sql.DataSource) { javax.sql.DataSource dataSource = (javax.sql.DataSource) jndival; String user = (String) initProps.get("user"); String pass = (String) initProps.get("password"); if(user == null || user.trim().equals("")){ c = dataSource.getConnection(); } else { c = dataSource.getConnection(user, pass); } } else { throw new DataImportHandlerException(SEVERE, "the jndi name : '"+jndiName +"' is not a valid javax.sql.DataSource"); } } } catch (SQLException e) { /* DriverManager does not allow you to use a driver which is not loaded through the class loader of the class which is trying to make the connection. This is a workaround for cases where the user puts the driver jar in the solr.home/lib or solr.home/core/lib directories. */ Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); } if (c != null) { if (Boolean.parseBoolean(initProps.getProperty("readOnly"))) { c.setReadOnly(true); /* Add other sane defaults */ c.setAutoCommit(true); c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } if (!Boolean.parseBoolean(initProps.getProperty("autoCommit"))) { c.setAutoCommit(false); } String transactionIsolation = initProps.getProperty("transactionIsolation"); if ("TRANSACTION_READ_UNCOMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); } else if ("TRANSACTION_READ_COMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED); } else if ("TRANSACTION_REPEATABLE_READ".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_REPEATABLE_READ); } else if ("TRANSACTION_SERIALIZABLE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE); } else if ("TRANSACTION_NONE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_NONE); } String holdability = initProps.getProperty("holdability"); if ("CLOSE_CURSORS_AT_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } else if ("HOLD_CURSORS_OVER_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.HOLD_CURSORS_OVER_COMMIT); } } LOG.info("Time taken for getConnection(): " + (System.currentTimeMillis() - start)); return c; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
private DIHCache instantiateCache(Context context) { DIHCache cache = null; try { @SuppressWarnings("unchecked") Class<DIHCache> cacheClass = DocBuilder.loadClass(cacheImplName, context .getSolrCore()); Constructor<DIHCache> constr = cacheClass.getConstructor(); cache = constr.newInstance(); cache.open(context); } catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Cache implementation:" + cacheImplName, e); } return cache; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
protected Map<String,Object> getIdCacheData(Context context, String query, Iterator<Map<String,Object>> rowIterator) { Object key = context.resolve(cacheForeignKey); if (key == null) { throw new DataImportHandlerException(DataImportHandlerException.WARN, "The cache lookup value : " + cacheForeignKey + " is resolved to be null in the entity :" + context.getEntityAttribute("name")); } if (dataSourceRowCache == null) { DIHCache cache = queryVsCache.get(query); if (cache == null) { cache = instantiateCache(context); queryVsCache.put(query, cache); populateCache(query, rowIterator); } dataSourceRowCache = cache.iterator(key); } return getFromRowCacheTransformed(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandlerException.java
public static void wrapAndThrow(int err, Exception e) { if (e instanceof DataImportHandlerException) { throw (DataImportHandlerException) e; } else { throw new DataImportHandlerException(err, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandlerException.java
public static void wrapAndThrow(int err, Exception e, String msg) { if (e instanceof DataImportHandlerException) { throw (DataImportHandlerException) e; } else { throw new DataImportHandlerException(err, msg, e); } }
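wrapAndThrow guarantees that a DataImportHandlerException is never double-wrapped: an existing domain exception is rethrown as-is, and anything else is wrapped exactly once with a severity code. Typical call sites in the list above look like the following sketch, where loadProfile is a hypothetical operation that may fail:

import org.apache.solr.handler.dataimport.DataImportHandlerException;

class WrapAndThrowDemo {
  // Hypothetical operation that may fail with any exception.
  static void loadProfile() throws Exception {}

  static void init() {
    try {
      loadProfile();
    } catch (Exception e) {
      // Rethrown unchanged if e is already a DataImportHandlerException;
      // otherwise wrapped once with the SEVERE code and a message.
      DataImportHandlerException.wrapAndThrow(
          DataImportHandlerException.SEVERE, e, "Unable to load profile data");
    }
  }
}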
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinContentStreamDataSource.java
@Override public InputStream getData(String query) { contentStream = context.getDocBuilder().getReqParams().getContentStream(); if (contentStream == null) throw new DataImportHandlerException(SEVERE, "No stream available. The request has no body"); try { return in = contentStream.getStream(); } catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
@Override public void close() { try { processor.finish(); } catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to call finish() on UpdateRequestProcessor", e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
@Override public void doDeleteAll() { try { DeleteUpdateCommand deleteCommand = new DeleteUpdateCommand(req); deleteCommand.query = "*:*"; processor.processDelete(deleteCommand); } catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in full dump while deleting all documents.", e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/HTMLStripTransformer.java
private Object stripHTML(String value, String column) { StringBuilder out = new StringBuilder(); StringReader strReader = new StringReader(value); try { HTMLStripCharFilter html = new HTMLStripCharFilter(CharReader.get(strReader.markSupported() ? strReader : new BufferedReader(strReader))); char[] cbuf = new char[1024 * 10]; while (true) { int count = html.read(cbuf); if (count == -1) break; /* end of stream mark is -1 */ if (count > 0) out.append(cbuf, 0, count); } html.close(); } catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Failed stripping HTML for column: " + column, e); } return out.toString(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
@Override public Reader getData(String query) { URL url = null; try { if (URIMETHOD.matcher(query).find()) url = new URL(query); else url = new URL(baseUrl + query); LOG.debug("Accessing URL: " + url.toString()); URLConnection conn = url.openConnection(); conn.setConnectTimeout(connectionTimeout); conn.setReadTimeout(readTimeout); InputStream in = conn.getInputStream(); String enc = encoding; if (enc == null) { String cType = conn.getContentType(); if (cType != null) { Matcher m = CHARSET_PATTERN.matcher(cType); if (m.find()) { enc = m.group(1); } } } if (enc == null) enc = UTF_8; DataImporter.QUERY_COUNT.get().incrementAndGet(); return new InputStreamReader(in, enc); } catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
@Override public void init(Context context) { super.init(context); fileName = context.getEntityAttribute(FILE_NAME); if (fileName != null) { fileName = context.replaceTokens(fileName); fileNamePattern = Pattern.compile(fileName); } baseDir = context.getEntityAttribute(BASE_DIR); if (baseDir == null) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "'baseDir' is a required attribute"); baseDir = context.replaceTokens(baseDir); File dir = new File(baseDir); if (!dir.isDirectory()) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "'baseDir' value: " + baseDir + " is not a directory"); String r = context.getEntityAttribute(RECURSIVE); if (r != null) recursive = Boolean.parseBoolean(r); excludes = context.getEntityAttribute(EXCLUDES); if (excludes != null) { excludes = context.replaceTokens(excludes); excludesPattern = Pattern.compile(excludes); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
private Date getDate(String dateStr) { if (dateStr == null) return null; Matcher m = PLACE_HOLDER_PATTERN.matcher(dateStr); if (m.find()) { Object o = context.resolve(m.group(1)); if (o instanceof Date) return (Date)o; dateStr = (String) o; } else { dateStr = context.replaceTokens(dateStr); } m = EvaluatorBag.IN_SINGLE_QUOTES.matcher(dateStr); if (m.find()) { String expr = null; expr = m.group(1).replaceAll("NOW", ""); try { return EvaluatorBag.dateMathParser.parseMath(expr); } catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); } } try { return DataImporter.DATE_TIME_FORMAT.get().parse(dateStr); } catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
@Override protected void firstInit(Context context) { super.firstInit(context); try { String serverPath = context.getResolvedEntityAttribute(SOLR_SERVER); if (serverPath == null) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "SolrEntityProcessor: parameter 'url' is required"); } HttpClient client = getHttpClient(); URL url = new URL(serverPath); /* (wt="javabin|xml") default is javabin */ if ("xml".equals(context.getResolvedEntityAttribute(CommonParams.WT))) { solrServer = new HttpSolrServer(url.toExternalForm(), client, new XMLResponseParser()); LOG.info("using XMLResponseParser"); } else { solrServer = new HttpSolrServer(url.toExternalForm(), client); LOG.info("using BinaryResponseParser"); } } catch (MalformedURLException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
protected SolrDocumentList doQuery(int start) { this.queryString = context.getResolvedEntityAttribute(QUERY); if (this.queryString == null) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "SolrEntityProcessor: parameter 'query' is required" ); } String rowsP = context.getResolvedEntityAttribute(CommonParams.ROWS); if (rowsP != null) { rows = Integer.parseInt(rowsP); } String fqAsString = context.getResolvedEntityAttribute(CommonParams.FQ); if (fqAsString != null) { this.filterQueries = fqAsString.split(","); } String fieldsAsString = context.getResolvedEntityAttribute(CommonParams.FL); if (fieldsAsString != null) { this.fields = fieldsAsString.split(","); } this.queryType = context.getResolvedEntityAttribute(CommonParams.QT); String timeoutAsString = context.getResolvedEntityAttribute(TIMEOUT); if (timeoutAsString != null) { this.timeout = Integer.parseInt(timeoutAsString); } SolrQuery solrQuery = new SolrQuery(queryString); solrQuery.setRows(rows); solrQuery.setStart(start); if (fields != null) { for (String field : fields) { solrQuery.addField(field); } } solrQuery.setQueryType(queryType); solrQuery.setFilterQueries(filterQueries); solrQuery.setTimeAllowed(timeout * 1000); QueryResponse response = null; try { response = solrServer.query(solrQuery); } catch (SolrServerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP_ROW, e); } } return response == null ? null : response.getResults(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContentStreamDataSource.java
Override public Reader getData(String query) { contentStream = context.getDocBuilder().getReqParams().getContentStream(); if (contentStream == null) throw new DataImportHandlerException(SEVERE, "No stream available. The request has no body"); try { return reader = contentStream.getReader(); } catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
@Override
public Reader getData(String query) {
  Object o = entityProcessor.getVariableResolver().resolve(dataField);
  if (o == null) {
    throw new DataImportHandlerException(SEVERE, "No field available for name : " + dataField);
  }
  if (o instanceof String) {
    return new StringReader((String) o);
  } else if (o instanceof Clob) {
    Clob clob = (Clob) o;
    try {
      // Most of the JDBC drivers have getCharacterStream defined as public
      // so let us just check it
      return readCharStream(clob);
    } catch (Exception e) {
      LOG.info("Unable to get data from CLOB");
      return null;
    }
  } else if (o instanceof Blob) {
    Blob blob = (Blob) o;
    try {
      return getReader(blob);
    } catch (Exception e) {
      LOG.info("Unable to get data from BLOB");
      return null;
    }
  } else {
    return new StringReader(o.toString());
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
private void handleSpecialCommands(Map<String, Object> arow, DocWrapper doc) { Object value = arow.get("$deleteDocById"); if (value != null) { if (value instanceof Collection) { Collection collection = (Collection) value; for (Object o : collection) { writer.deleteDoc(o.toString()); importStatistics.deletedDocCount.incrementAndGet(); } } else { writer.deleteDoc(value); importStatistics.deletedDocCount.incrementAndGet(); } } value = arow.get("$deleteDocByQuery"); if (value != null) { if (value instanceof Collection) { Collection collection = (Collection) value; for (Object o : collection) { writer.deleteByQuery(o.toString()); importStatistics.deletedDocCount.incrementAndGet(); } } else { writer.deleteByQuery(value.toString()); importStatistics.deletedDocCount.incrementAndGet(); } } value = arow.get("$docBoost"); if (value != null) { float value1 = 1.0f; if (value instanceof Number) { value1 = ((Number) value).floatValue(); } else { value1 = Float.parseFloat(value.toString()); } doc.setDocumentBoost(value1); } value = arow.get("$skipDoc"); if (value != null) { if (Boolean.parseBoolean(value.toString())) { throw new DataImportHandlerException(DataImportHandlerException.SKIP, "Document skipped :" + arow); } } value = arow.get("$skipRow"); if (value != null) { if (Boolean.parseBoolean(value.toString())) { throw new DataImportHandlerException(DataImportHandlerException.SKIP_ROW); } } }
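handleSpecialCommands above shows SKIP and SKIP_ROW used as control flow: a magic $-field in the row abandons the document or the row by throwing. A hedged sketch of the same shape, with stand-in names and codes (not the Solr classes):

// Minimal sketch (stand-ins, not Solr API): special $-commands in a row drive
// control-flow exceptions rather than return values.
import java.util.Map;

public class SkipCommandSketch {
  static final int SKIP = 301, SKIP_ROW = 300;   // assumed codes, for illustration only

  static class DihException extends RuntimeException {
    final int errCode;
    DihException(int errCode, String msg) { super(msg); this.errCode = errCode; }
  }

  static void handleSpecialCommands(Map<String, Object> row) {
    Object v = row.get("$skipDoc");
    if (v != null && Boolean.parseBoolean(v.toString()))
      throw new DihException(SKIP, "Document skipped: " + row);   // abandon the whole document
    v = row.get("$skipRow");
    if (v != null && Boolean.parseBoolean(v.toString()))
      throw new DihException(SKIP_ROW, "Row skipped");            // abandon only this row
  }

  public static void main(String[] args) {
    try {
      handleSpecialCommands(Map.of("$skipDoc", "true"));
    } catch (DihException ex) {
      System.out.println(ex.errCode + ": " + ex.getMessage());    // caller treats SKIP as a signal, not an error
    }
  }
}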
Thrown from 26 catch blocks:
              
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (ParseException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid value for fetchMailSince: " + s, e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Connection failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Custom filter could not be created", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) {
  // skip bad ones unless its the last one and still no good folder
  if (folders.size() == 0 && i == topLevelFolders.size() - 1)
    throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed");
}
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Exception occurred while initializing context", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Error initializing XSL ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (RuntimeException e) { throw new DataImportHandlerException(SEVERE, "Exception while reading xpaths for fields", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
catch (IOException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Problem reading from input", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (Exception e) { LOG.error( "The query failed '" + q + "'", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to persist Index Start Time", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Cache implementation:" + cacheImplName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to call finish() on UpdateRequestProcessor", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in full dump while deleting all documents.", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("method invocation failed on transformer : " + trans, e); throw new DataImportHandlerException(WARN, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/HTMLStripTransformer.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Failed stripping HTML for column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (MalformedURLException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Writer implementation:" + writerClassStr, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
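Several of the catch blocks above (e.g. SqlEntityProcessor, URLDataSource) log with local context before wrapping, so detail survives even if the wrapper is caught far away. A minimal sketch of that log-then-wrap idiom, with stand-in names:

// Minimal sketch of the log-then-wrap idiom (names are stand-ins, not Solr API).
public class LogThenWrapSketch {
  static class DomainException extends RuntimeException {
    DomainException(Throwable cause) { super(cause); }
  }

  static void run(String q) {
    try {
      throw new java.sql.SQLException("syntax error");       // simulated driver failure
    } catch (Exception e) {
      // Log with the local context first; the wrapper may be caught far away.
      System.err.println("The query failed '" + q + "': " + e);
      throw new DomainException(e);
    }
  }

  public static void main(String[] args) {
    try { run("select *"); } catch (DomainException ex) {
      System.out.println("caught: " + ex.getCause().getMessage());
    }
  }
}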
(Lib) InitializationException 52
              
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
@Override
public void init(Map<String,String> args) {
  super.init( args );
  inject = getBoolean(INJECT, true);
  String name = args.get( ENCODER );
  if( name == null ) {
    throw new InitializationException("Missing required parameter: " + ENCODER + " [" + registry.keySet() + "]");
  }
  clazz = registry.get(name.toUpperCase(Locale.ENGLISH));
  if( clazz == null ) {
    clazz = resolveEncoder(name);
  }
  String v = args.get(MAX_CODE_LENGTH);
  if (v != null) {
    maxCodeLength = Integer.valueOf(v);
    try {
      setMaxCodeLenMethod = clazz.getMethod("setMaxCodeLen", int.class);
    } catch (Exception e) {
      throw new InitializationException("Encoder " + name + " / " + clazz + " does not support " + MAX_CODE_LENGTH, e);
    }
  }
  getEncoder(); // trigger initialization for potential problems to be thrown now
}
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
private Class<? extends Encoder> resolveEncoder(String name) { String lookupName = name; if (name.indexOf('.') == -1) { lookupName = PACKAGE_CONTAINING_ENCODERS + name; } try { return Class.forName(lookupName).asSubclass(Encoder.class); } catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); } catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); } }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
protected Encoder getEncoder() {
  // Unfortunately, Commons-Codec doesn't offer any thread-safe guarantees so we must play it safe and instantiate
  // every time. A simple benchmark showed this as negligible.
  try {
    Encoder encoder = clazz.newInstance();
    // Try to set the maxCodeLength
    if(maxCodeLength != null && setMaxCodeLenMethod != null) {
      setMaxCodeLenMethod.invoke(encoder, maxCodeLength);
    }
    return encoder;
  } catch (Exception e) {
    final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e;
    throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t);
  }
}
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
public void inform(ResourceLoader loader) { assureMatchVersion(); String dictionaryFiles[] = args.get(PARAM_DICTIONARY).split(","); String affixFile = args.get(PARAM_AFFIX); String pic = args.get(PARAM_IGNORE_CASE); if(pic != null) { if(pic.equalsIgnoreCase(TRUE)) ignoreCase = true; else if(pic.equalsIgnoreCase(FALSE)) ignoreCase = false; else throw new InitializationException("Unknown value for " + PARAM_IGNORE_CASE + ": " + pic + ". Must be true or false"); } try { List<InputStream> dictionaries = new ArrayList<InputStream>(); for (String file : dictionaryFiles) { dictionaries.add(loader.openResource(file)); } this.dictionary = new HunspellDictionary(loader.openResource(affixFile), dictionaries, luceneMatchVersion, ignoreCase); } catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); } }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
@Override
public void inform(ResourceLoader loader) {
  final boolean ignoreCase = getBoolean("ignoreCase", false);
  this.ignoreCase = ignoreCase;
  String tf = args.get("tokenizerFactory");
  final TokenizerFactory factory = tf == null ? null : loadTokenizerFactory(loader, tf);
  Analyzer analyzer = new Analyzer() {
    @Override
    protected TokenStreamComponents createComponents(String fieldName, Reader reader) {
      Tokenizer tokenizer = factory == null ? new WhitespaceTokenizer(Version.LUCENE_50, reader) : factory.create(reader);
      TokenStream stream = ignoreCase ? new LowerCaseFilter(Version.LUCENE_50, tokenizer) : tokenizer;
      return new TokenStreamComponents(tokenizer, stream);
    }
  };
  String format = args.get("format");
  try {
    if (format == null || format.equals("solr")) {
      // TODO: expose dedup as a parameter?
      map = loadSolrSynonyms(loader, true, analyzer);
    } else if (format.equals("wordnet")) {
      map = loadWordnetSynonyms(loader, true, analyzer);
    } else {
      // TODO: somehow make this more pluggable
      throw new InitializationException("Unrecognized synonyms format: " + format);
    }
  } catch (Exception e) {
    throw new InitializationException("Exception thrown while loading synonyms", e);
  }
  if (map.fst == null) {
    log.warn("Synonyms loaded with " + args + " has empty rule set!");
  }
}
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadSolrSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); SolrSynonymParser parser = new SolrSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadWordnetSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); WordnetSynonymParser parser = new WordnetSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/analysis/ElisionFilterFactory.java
public void inform(ResourceLoader loader) { String articlesFile = args.get("articles"); boolean ignoreCase = getBoolean("ignoreCase", false); if (articlesFile != null) { try { articles = getWordSet(loader, articlesFile, ignoreCase); } catch (IOException e) { throw new InitializationException("IOException thrown while loading articles", e); } } }
// in core/src/java/org/apache/solr/analysis/TrimFilterFactory.java
Override public void init(Map<String,String> args) { super.init( args ); String v = args.get( "updateOffsets" ); if( v != null ) { try { updateOffsets = Boolean.valueOf( v ); } catch( Exception ex ) { throw new InitializationException("Error reading updateOffsets value. Must be true or false.", ex); } } }
// in core/src/java/org/apache/solr/analysis/GreekLowerCaseFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); assureMatchVersion(); if (args.containsKey("charset")) throw new InitializationException( "The charset parameter is no longer supported. " + "Please process your documents as Unicode instead."); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); assureMatchVersion(); dictFile = args.get("dictionary"); if (args.containsKey("encoding")) encoding = args.get("encoding"); hypFile = args.get("hyphenator"); if (null == hypFile) { throw new InitializationException("Missing required parameter: hyphenator"); } minWordSize = getInt("minWordSize", CompoundWordTokenFilterBase.DEFAULT_MIN_WORD_SIZE); minSubwordSize = getInt("minSubwordSize", CompoundWordTokenFilterBase.DEFAULT_MIN_SUBWORD_SIZE); maxSubwordSize = getInt("maxSubwordSize", CompoundWordTokenFilterBase.DEFAULT_MAX_SUBWORD_SIZE); onlyLongestMatch = getBoolean("onlyLongestMatch", false); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
public void inform(ResourceLoader loader) {
  InputStream stream = null;
  try {
    if (dictFile != null) // the dictionary can be empty.
      dictionary = getWordSet(loader, dictFile, false);
    // TODO: Broken, because we cannot resolve real system id
    // ResourceLoader should also supply method like ClassLoader to get resource URL
    stream = loader.openResource(hypFile);
    final InputSource is = new InputSource(stream);
    is.setEncoding(encoding); // if it's null let xml parser decide
    is.setSystemId(hypFile);
    hyphenator = HyphenationCompoundWordTokenFilter.getHyphenationTree(is);
  } catch (Exception e) {
    // TODO: getHyphenationTree really shouldn't throw "Exception"
    throw new InitializationException("Exception thrown while loading dictionary and hyphenation file", e);
  } finally {
    IOUtils.closeQuietly(stream);
  }
}
// in core/src/java/org/apache/solr/analysis/StemmerOverrideFilterFactory.java
public void inform(ResourceLoader loader) { String dictionaryFiles = args.get("dictionary"); ignoreCase = getBoolean("ignoreCase", false); if (dictionaryFiles != null) { assureMatchVersion(); List<String> files = StrUtils.splitFileNames(dictionaryFiles); try { if (files.size() > 0) { dictionary = new CharArrayMap<String>(luceneMatchVersion, files.size() * 10, ignoreCase); for (String file : files) { List<String> list = loader.getLines(file.trim()); for (String line : list) { String[] mapping = line.split("\t", 2); dictionary.put(mapping[0], mapping[1]); } } } } catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); } } }
// in core/src/java/org/apache/solr/analysis/JapaneseKatakanaStemFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); minimumLength = getInt(MINIMUM_LENGTH_PARAM, JapaneseKatakanaStemFilter.DEFAULT_MINIMUM_LENGTH); if (minimumLength < 2) { throw new InitializationException("Illegal " + MINIMUM_LENGTH_PARAM + " " + minimumLength + " (must be 2 or greater)"); } }
// in core/src/java/org/apache/solr/analysis/KeywordMarkerFilterFactory.java
public void inform(ResourceLoader loader) { String wordFiles = args.get(PROTECTED_TOKENS); ignoreCase = getBoolean("ignoreCase", false); if (wordFiles != null) { try { protectedWords = getWordSet(loader, wordFiles, ignoreCase); } catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); } } }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
public void inform(ResourceLoader loader) { String wordFiles = args.get(PROTECTED_TOKENS); if (wordFiles != null) { try { protectedWords = getWordSet(loader, wordFiles, false); } catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); } } }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); final String cfgLanguage = args.get("language"); if(cfgLanguage!=null) language = cfgLanguage; try { stemClass = Class.forName("org.tartarus.snowball.ext." + language + "Stemmer"); } catch (ClassNotFoundException e) { throw new InitializationException("Can't find class for stemmer language " + language, e); } }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
public TokenFilter create(TokenStream input) { SnowballProgram program; try { program = (SnowballProgram)stemClass.newInstance(); } catch (Exception e) { throw new InitializationException("Error instantiating stemmer for language " + language + "from class " + stemClass, e); } if (protectedWords != null) input = new KeywordMarkerFilter(input, protectedWords); return new SnowballFilter(input, program); }
// in core/src/java/org/apache/solr/analysis/KeepWordFilterFactory.java
public void inform(ResourceLoader loader) { String wordFiles = args.get("words"); ignoreCase = getBoolean("ignoreCase", false); enablePositionIncrements = getBoolean("enablePositionIncrements",false); if (wordFiles != null) { try { words = getWordSet(loader, wordFiles, ignoreCase); } catch (IOException e) { throw new InitializationException("IOException thrown while loading words", e); } } }
// in core/src/java/org/apache/solr/analysis/ShingleFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); maxShingleSize = getInt("maxShingleSize", ShingleFilter.DEFAULT_MAX_SHINGLE_SIZE); if (maxShingleSize < 2) { throw new InitializationException("Invalid maxShingleSize (" + maxShingleSize + ") - must be at least 2"); } minShingleSize = getInt("minShingleSize", ShingleFilter.DEFAULT_MIN_SHINGLE_SIZE); if (minShingleSize < 2) { throw new InitializationException("Invalid minShingleSize (" + minShingleSize + ") - must be at least 2"); } if (minShingleSize > maxShingleSize) { throw new InitializationException("Invalid minShingleSize (" + minShingleSize + ") - must be no greater than maxShingleSize (" + maxShingleSize + ")"); } outputUnigrams = getBoolean("outputUnigrams", true); outputUnigramsIfNoShingles = getBoolean("outputUnigramsIfNoShingles", false); tokenSeparator = args.containsKey("tokenSeparator") ? args.get("tokenSeparator") : ShingleFilter.TOKEN_SEPARATOR; }
// in core/src/java/org/apache/solr/analysis/JapanesePartOfSpeechStopFilterFactory.java
public void inform(ResourceLoader loader) { String stopTagFiles = args.get("tags"); enablePositionIncrements = getBoolean("enablePositionIncrements", false); try { CharArraySet cas = getWordSet(loader, stopTagFiles, false); stopTags = new HashSet<String>(); for (Object element : cas) { char chars[] = (char[]) element; stopTags.add(new String(chars)); } } catch (IOException e) { throw new InitializationException("IOException thrown while loading tags", e); } }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
public void inform(ResourceLoader loader) { mapping = args.get( "mapping" ); if( mapping != null ){ List<String> wlist = null; try{ File mappingFile = new File( mapping ); if( mappingFile.exists() ){ wlist = loader.getLines( mapping ); } else{ List<String> files = StrUtils.splitFileNames( mapping ); wlist = new ArrayList<String>(); for( String file : files ){ List<String> lines = loader.getLines( file.trim() ); wlist.addAll( lines ); } } } catch( IOException e ){ throw new InitializationException("IOException thrown while loading mappings", e); } final NormalizeCharMap.Builder builder = new NormalizeCharMap.Builder(); parseRules( wlist, builder ); normMap = builder.build(); } }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
protected void parseRules( List<String> rules, NormalizeCharMap.Builder builder ){ for( String rule : rules ){ Matcher m = p.matcher( rule ); if( !m.find() ) throw new InitializationException("Invalid Mapping Rule : [" + rule + "], file = " + mapping); builder.add( parseString( m.group( 1 ) ), parseString( m.group( 2 ) ) ); } }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
protected String parseString( String s ){ int readPos = 0; int len = s.length(); int writePos = 0; while( readPos < len ){ char c = s.charAt( readPos++ ); if( c == '\\' ){ if( readPos >= len ) throw new InitializationException("Invalid escaped char in [" + s + "]"); c = s.charAt( readPos++ ); switch( c ) { case '\\' : c = '\\'; break; case '"' : c = '"'; break; case 'n' : c = '\n'; break; case 't' : c = '\t'; break; case 'r' : c = '\r'; break; case 'b' : c = '\b'; break; case 'f' : c = '\f'; break; case 'u' : if( readPos + 3 >= len ) throw new InitializationException("Invalid escaped char in [" + s + "]"); c = (char)Integer.parseInt( s.substring( readPos, readPos + 4 ), 16 ); readPos += 4; break; } } out[writePos++] = c; } return new String( out, 0, writePos ); }
// in core/src/java/org/apache/solr/analysis/JapaneseTokenizerFactory.java
Override public void inform(ResourceLoader loader) { mode = getMode(args); String userDictionaryPath = args.get(USER_DICT_PATH); try { if (userDictionaryPath != null) { InputStream stream = loader.openResource(userDictionaryPath); String encoding = args.get(USER_DICT_ENCODING); if (encoding == null) { encoding = IOUtils.UTF_8; } CharsetDecoder decoder = Charset.forName(encoding).newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); Reader reader = new InputStreamReader(stream, decoder); userDictionary = new UserDictionary(reader); } else { userDictionary = null; } } catch (Exception e) { throw new InitializationException("Exception thrown while loading dictionary", e); } }
// in core/src/java/org/apache/solr/analysis/DelimitedPayloadTokenFilterFactory.java
public void inform(ResourceLoader loader) { String encoderClass = args.get(ENCODER_ATTR); if (encoderClass.equals("float")){ encoder = new FloatEncoder(); } else if (encoderClass.equals("integer")){ encoder = new IntegerEncoder(); } else if (encoderClass.equals("identity")){ encoder = new IdentityEncoder(); } else { encoder = loader.newInstance(encoderClass, PayloadEncoder.class); } String delim = args.get(DELIMITER_ATTR); if (delim != null){ if (delim.length() == 1) { delimiter = delim.charAt(0); } else{ throw new InitializationException("Delimiter must be one character only"); } } }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
@Override
public void init(Map<String,String> args) {
  super.init(args);
  pattern = getPattern( PATTERN );
  group = -1; // use 'split'
  String g = args.get( GROUP );
  if( g != null ) {
    try {
      group = Integer.parseInt( g );
    } catch( Exception ex ) {
      throw new InitializationException("invalid group argument: " + g);
    }
  }
}
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
public Tokenizer create(final Reader in) { try { return new PatternTokenizer(in, pattern, group); } catch( IOException ex ) { throw new InitializationException("IOException thrown creating PatternTokenizer instance", ex); } }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); assureMatchVersion(); dictFile = args.get("dictionary"); if (null == dictFile) { throw new InitializationException("Missing required parameter: dictionary"); } minWordSize= getInt("minWordSize",CompoundWordTokenFilterBase.DEFAULT_MIN_WORD_SIZE); minSubwordSize= getInt("minSubwordSize",CompoundWordTokenFilterBase.DEFAULT_MIN_SUBWORD_SIZE); maxSubwordSize= getInt("maxSubwordSize",CompoundWordTokenFilterBase.DEFAULT_MAX_SUBWORD_SIZE); onlyLongestMatch = getBoolean("onlyLongestMatch",true); }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
public void inform(ResourceLoader loader) { try { dictionary = super.getWordSet(loader, dictFile, false); } catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); } }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
public void inform(ResourceLoader loader) { String wordFiles = args.get(PROTECTED_TOKENS); if (wordFiles != null) { try { protectedWords = getWordSet(loader, wordFiles, false); } catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); } } String types = args.get(TYPES); if (types != null) { try { List<String> files = StrUtils.splitFileNames( types ); List<String> wlist = new ArrayList<String>(); for( String file : files ){ List<String> lines = loader.getLines( file.trim() ); wlist.addAll( lines ); } typeTable = parseTypes(wlist); } catch (IOException e) { throw new InitializationException("IOException while loading types", e); } } }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
private byte[] parseTypes(List<String> rules) {
  SortedMap<Character,Byte> typeMap = new TreeMap<Character,Byte>();
  for( String rule : rules ){
    Matcher m = typePattern.matcher(rule);
    if( !m.find() )
      throw new InitializationException("Invalid Mapping Rule : [" + rule + "]");
    String lhs = parseString(m.group(1).trim());
    Byte rhs = parseType(m.group(2).trim());
    if (lhs.length() != 1)
      throw new InitializationException("Invalid Mapping Rule : [" + rule + "]. Only a single character is allowed.");
    if (rhs == null)
      throw new InitializationException("Invalid Mapping Rule : [" + rule + "]. Illegal type.");
    typeMap.put(lhs.charAt(0), rhs);
  }
  // ensure the table is always at least as big as DEFAULT_WORD_DELIM_TABLE for performance
  byte types[] = new byte[Math.max(typeMap.lastKey()+1, WordDelimiterIterator.DEFAULT_WORD_DELIM_TABLE.length)];
  for (int i = 0; i < types.length; i++)
    types[i] = WordDelimiterIterator.getType(i);
  for (Map.Entry<Character,Byte> mapping : typeMap.entrySet())
    types[mapping.getKey()] = mapping.getValue();
  return types;
}
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
private String parseString(String s){ int readPos = 0; int len = s.length(); int writePos = 0; while( readPos < len ){ char c = s.charAt( readPos++ ); if( c == '\\' ){ if( readPos >= len ) throw new InitializationException("Invalid escaped char in [" + s + "]"); c = s.charAt( readPos++ ); switch( c ) { case '\\' : c = '\\'; break; case 'n' : c = '\n'; break; case 't' : c = '\t'; break; case 'r' : c = '\r'; break; case 'b' : c = '\b'; break; case 'f' : c = '\f'; break; case 'u' : if( readPos + 3 >= len ) throw new InitializationException("Invalid escaped char in [" + s + "]"); c = (char)Integer.parseInt( s.substring( readPos, readPos + 4 ), 16 ); readPos += 4; break; } } out[writePos++] = c; } return new String( out, 0, writePos ); }
// in core/src/java/org/apache/solr/analysis/PathHierarchyTokenizerFactory.java
Override public void init(Map<String,String> args){ super.init( args ); String v = args.get( "delimiter" ); if( v != null ){ if( v.length() != 1 ){ throw new InitializationException("delimiter should be a char. \"" + v + "\" is invalid"); } else{ delimiter = v.charAt(0); } } else{ delimiter = PathHierarchyTokenizer.DEFAULT_DELIMITER; } v = args.get( "replace" ); if( v != null ){ if( v.length() != 1 ){ throw new InitializationException("replace should be a char. \"" + v + "\" is invalid"); } else{ replacement = v.charAt(0); } } else{ replacement = delimiter; } v = args.get( "reverse" ); if( v != null ){ reverse = "true".equals( v ); } v = args.get( "skip" ); if( v != null ){ skip = Integer.parseInt( v ); } }
// in core/src/java/org/apache/solr/analysis/PatternReplaceFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); p = getPattern("pattern"); replacement = args.get("replacement"); String r = args.get("replace"); if (null != r) { if (r.equals("all")) { all = true; } else { if (r.equals("first")) { all = false; } else { throw new InitializationException ("Configuration Error: 'replace' must be 'first' or 'all' in " + this.getClass().getName()); } } } }
// in core/src/java/org/apache/solr/analysis/TypeTokenFilterFactory.java
Override public void inform(ResourceLoader loader) { String stopTypesFiles = args.get("types"); enablePositionIncrements = getBoolean("enablePositionIncrements", false); useWhitelist = getBoolean("useWhitelist", false); if (stopTypesFiles != null) { try { List<String> files = StrUtils.splitFileNames(stopTypesFiles); if (files.size() > 0) { stopTypes = new HashSet<String>(); for (String file : files) { List<String> typesLines = loader.getLines(file.trim()); stopTypes.addAll(typesLines); } } } catch (IOException e) { throw new InitializationException("IOException thrown while loading types", e); } } else { throw new InitializationException("Missing required parameter: types."); } }
// in core/src/java/org/apache/solr/analysis/CommonGramsQueryFilterFactory.java
public void inform(ResourceLoader loader) { String commonWordFiles = args.get("words"); ignoreCase = getBoolean("ignoreCase", false); if (commonWordFiles != null) { try { if ("snowball".equalsIgnoreCase(args.get("format"))) { commonWords = getSnowballWordSet(loader, commonWordFiles, ignoreCase); } else { commonWords = getWordSet(loader, commonWordFiles, ignoreCase); } } catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); } } else { commonWords = StopAnalyzer.ENGLISH_STOP_WORDS_SET; } }
// in core/src/java/org/apache/solr/analysis/StopFilterFactory.java
Override public void inform(ResourceLoader loader) { String stopWordFiles = args.get("words"); ignoreCase = getBoolean("ignoreCase",false); enablePositionIncrements = getBoolean("enablePositionIncrements",false); if (stopWordFiles != null) { try { if ("snowball".equalsIgnoreCase(args.get("format"))) { stopWords = getSnowballWordSet(loader, stopWordFiles, ignoreCase); } else { stopWords = getWordSet(loader, stopWordFiles, ignoreCase); } } catch (IOException e) { throw new InitializationException("IOException thrown while loading stopwords", e); } } else { stopWords = new CharArraySet(luceneMatchVersion, StopAnalyzer.ENGLISH_STOP_WORDS_SET, ignoreCase); } }
// in core/src/java/org/apache/solr/analysis/CommonGramsFilterFactory.java
public void inform(ResourceLoader loader) { String commonWordFiles = args.get("words"); ignoreCase = getBoolean("ignoreCase", false); if (commonWordFiles != null) { try { if ("snowball".equalsIgnoreCase(args.get("format"))) { commonWords = getSnowballWordSet(loader, commonWordFiles, ignoreCase); } else { commonWords = getWordSet(loader, commonWordFiles, ignoreCase); } } catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); } } else { commonWords = StopAnalyzer.ENGLISH_STOP_WORDS_SET; } }
Thrown from 27 catch blocks:
              
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { throw new InitializationException("Encoder " + name + " / " + clazz + " does not support " + MAX_CODE_LENGTH, e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e; throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t); }
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading synonyms", e); }
// in core/src/java/org/apache/solr/analysis/ElisionFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading articles", e); }
// in core/src/java/org/apache/solr/analysis/TrimFilterFactory.java
catch( Exception ex ) { throw new InitializationException("Error reading updateOffsets value. Must be true or false.", ex); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
catch (Exception e) {
  // TODO: getHyphenationTree really shouldn't throw "Exception"
  throw new InitializationException("Exception thrown while loading dictionary and hyphenation file", e);
}
// in core/src/java/org/apache/solr/analysis/StemmerOverrideFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/KeywordMarkerFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (ClassNotFoundException e) { throw new InitializationException("Can't find class for stemmer language " + language, e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (Exception e) { throw new InitializationException("Error instantiating stemmer for language " + language + "from class " + stemClass, e); }
// in core/src/java/org/apache/solr/analysis/KeepWordFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading words", e); }
// in core/src/java/org/apache/solr/analysis/JapanesePartOfSpeechStopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading tags", e); }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
catch( IOException e ){ throw new InitializationException("IOException thrown while loading mappings", e); }
// in core/src/java/org/apache/solr/analysis/JapaneseTokenizerFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( Exception ex ) { throw new InitializationException("invalid group argument: " + g); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( IOException ex ) { throw new InitializationException("IOException thrown creating PatternTokenizer instance", ex); }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException while loading types", e); }
// in core/src/java/org/apache/solr/analysis/TypeTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading types", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsQueryFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/analysis/StopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading stopwords", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
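All 52 InitializationException sites follow a fail-fast pattern: validate configuration in init()/inform() and abort startup rather than limp along with a half-configured analyzer. A self-contained sketch of that pattern, assuming a stand-in exception class and using the minShingleSize rule quoted above:

// Sketch of the fail-fast init validation these factories share
// (InitializationException here is a stand-in, not the Solr class).
import java.util.Map;

public class InitValidationSketch {
  static class InitializationException extends RuntimeException {
    InitializationException(String msg) { super(msg); }
    InitializationException(String msg, Throwable cause) { super(msg, cause); }
  }

  int minShingleSize;

  void init(Map<String, String> args) {
    String v = args.getOrDefault("minShingleSize", "2");
    try {
      minShingleSize = Integer.parseInt(v);
    } catch (NumberFormatException e) {
      throw new InitializationException("minShingleSize must be an integer: " + v, e);
    }
    if (minShingleSize < 2)
      throw new InitializationException("Invalid minShingleSize (" + minShingleSize + ") - must be at least 2");
  }

  public static void main(String[] args) {
    try {
      new InitValidationSketch().init(Map.of("minShingleSize", "1"));
    } catch (InitializationException e) {
      System.out.println("startup aborted: " + e.getMessage());   // fails at init, not at query time
    }
  }
}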
(Domain) ZooKeeperException 52
              
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override
public void update(SolrZooKeeper zooKeeper) {
  SolrZooKeeper oldKeeper = keeper;
  keeper = zooKeeper;
  if (oldKeeper != null) {
    try {
      oldKeeper.close();
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void command() {
  try {
    ZkStateReader.this.createClusterStateWatchersAndUpdate();
  } catch (KeeperException e) {
    log.error("", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
    log.error("", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public synchronized void createClusterStateWatchersAndUpdate() throws KeeperException, InterruptedException {
  // We need to fetch the current cluster state and the set of live nodes
  synchronized (getUpdateLock()) {
    cmdExecutor.ensureExists(CLUSTER_STATE, zkClient);
    log.info("Updating cluster state from ZooKeeper... ");
    zkClient.exists(CLUSTER_STATE, new Watcher() {
      @Override
      public void process(WatchedEvent event) {
        log.info("A cluster state change has occurred");
        try {
          // delayed approach
          // ZkStateReader.this.updateCloudState(false, false);
          synchronized (ZkStateReader.this.getUpdateLock()) {
            // remake watch
            final Watcher thisWatch = this;
            byte[] data = zkClient.getData(CLUSTER_STATE, thisWatch, null, true);
            CloudState clusterState = CloudState.load(data, ZkStateReader.this.cloudState.getLiveNodes());
            // update volatile
            cloudState = clusterState;
          }
        } catch (KeeperException e) {
          if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
            log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
            return;
          }
          log.error("", e);
          throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
        } catch (InterruptedException e) {
          // Restore the interrupted status
          Thread.currentThread().interrupt();
          log.warn("", e);
          return;
        }
      }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void close() {
  if (closeClient) {
    try {
      zkClient.close();
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
}
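update, command and close above all share the restore-interrupt-then-wrap idiom: re-set the thread's interrupt flag before converting the checked InterruptedException into a domain ZooKeeperException. A runnable sketch with a stand-in exception class (not the Solr one):

// Sketch of the restore-interrupt-then-wrap idiom used throughout the ZooKeeper code.
public class InterruptWrapSketch {
  static class ZooKeeperException extends RuntimeException {
    ZooKeeperException(String msg, Throwable cause) { super(msg, cause); }
  }

  static void close(Thread worker) {
    try {
      worker.join();                        // blocking call that may be interrupted
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();   // restore the flag: never swallow interruption
      throw new ZooKeeperException("interrupted while closing", e);
    }
  }

  public static void main(String[] args) {
    Thread worker = new Thread(() -> {});
    worker.start();
    Thread.currentThread().interrupt();     // simulate an interrupt arriving mid-close
    try {
      close(worker);
    } catch (ZooKeeperException ex) {
      System.out.println("wrapped: " + ex.getCause() + ", flag restored: " + Thread.currentThread().isInterrupted());
    }
  }
}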
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public List<ZkCoreNodeProps> getReplicaProps(String collection, String shardId, String thisNodeName,
    String coreName, String mustMatchStateFilter, String mustNotMatchStateFilter) {
  CloudState cloudState = this.cloudState;
  if (cloudState == null) {
    return null;
  }
  Map<String,Slice> slices = cloudState.getSlices(collection);
  if (slices == null) {
    throw new ZooKeeperException(ErrorCode.BAD_REQUEST,
        "Could not find collection in zk: " + collection + " " + cloudState.getCollections());
  }
  Slice replicas = slices.get(shardId);
  if (replicas == null) {
    throw new ZooKeeperException(ErrorCode.BAD_REQUEST, "Could not find shardId in zk: " + shardId);
  }
  Map<String,ZkNodeProps> shardMap = replicas.getShards();
  List<ZkCoreNodeProps> nodes = new ArrayList<ZkCoreNodeProps>(shardMap.size());
  String filterNodeName = thisNodeName + "_" + coreName;
  for (Entry<String,ZkNodeProps> entry : shardMap.entrySet()) {
    ZkCoreNodeProps nodeProps = new ZkCoreNodeProps(entry.getValue());
    String coreNodeName = nodeProps.getNodeName() + "_" + nodeProps.getCoreName();
    if (cloudState.liveNodesContain(nodeProps.getNodeName()) && !coreNodeName.equals(filterNodeName)) {
      if (mustMatchStateFilter == null || mustMatchStateFilter.equals(nodeProps.getState())) {
        if (mustNotMatchStateFilter == null || !mustNotMatchStateFilter.equals(nodeProps.getState())) {
          nodes.add(nodeProps);
        }
      }
    }
  }
  if (nodes.size() == 0) {
    // no replicas - go local
    return null;
  }
  return nodes;
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
public void connect() { if (zkStateReader == null) { synchronized (this) { if (zkStateReader == null) { try { ZkStateReader zk = new ZkStateReader(zkHost, zkConnectTimeout, zkClientTimeout); zk.createClusterStateWatchersAndUpdate(); zkStateReader = zk; } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (TimeoutException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } } } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override
public void run() {
  while (amILeader()) {
    LinkedList<CloudStateUpdateRequest> requests = new LinkedList<Overseer.CloudStateUpdateRequest>();
    while (!fifo.isEmpty()) {
      // collect all queued requests
      CloudStateUpdateRequest req;
      req = fifo.poll();
      if (req == null) {
        break;
      }
      requests.add(req);
    }
    if (requests.size() > 0) {
      // process updates
      synchronized (reader.getUpdateLock()) {
        try {
          reader.updateCloudState(true);
          CloudState cloudState = reader.getCloudState();
          for (CloudStateUpdateRequest request : requests) {
            switch (request.operation) {
              case LeaderChange:
                cloudState = setShardLeader(cloudState, (String) request.args[0], (String) request.args[1], (String) request.args[2]);
                break;
              case StateChange:
                cloudState = updateState(cloudState, (String) request.args[0], (CoreState) request.args[1]);
                break;
              case CoreDeleted:
                cloudState = removeCore(cloudState, (String) request.args[0], (String) request.args[1]);
                break;
            }
          }
          log.info("Announcing new cluster state");
          zkClient.setData(ZkStateReader.CLUSTER_STATE, ZkStateReader.toJSON(cloudState), true);
        } catch (KeeperException e) {
          if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
            log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
            return;
          }
          SolrException.log(log, "", e);
          throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt();
          return;
        }
      }
    }
    try {
      Thread.sleep(STATE_UPDATE_DELAY);
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
    }
  }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override
public void process(WatchedEvent event) {
  try {
    List<String> leaderNodes = zkClient.getChildren(ZkStateReader.getShardLeadersPath(collection, null), this, true);
    processLeaderNodesChanged(collection, leaderNodes);
  } catch (KeeperException e) {
    if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
      log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
      return;
    }
    SolrException.log(log, "", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
  }
}
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override
public void process(WatchedEvent event) {
  try {
    List<String> liveNodes = zkClient.getChildren(ZkStateReader.LIVE_NODES_ZKNODE, this, true);
    synchronized (nodeStateWatches) {
      processLiveNodesChanged(nodeStateWatches.keySet(), liveNodes);
    }
  } catch (KeeperException e) {
    if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
      log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
      return;
    }
    SolrException.log(log, "", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
  }
}
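The two process watchers above triage KeeperException: SESSIONEXPIRED and CONNECTIONLOSS are tolerated (warn and return, since reconnect logic will re-register the watch), and everything else escalates to a domain exception. A sketch of that triage; the types below are stand-ins shadowing the ZooKeeper names, not the real client classes:

// Sketch of the watcher pattern: transient ZK errors are logged and tolerated,
// anything else becomes a domain exception (codes and names are stand-ins).
public class WatcherToleranceSketch {
  enum Code { SESSIONEXPIRED, CONNECTIONLOSS, NONODE }

  static class KeeperException extends Exception {
    final Code code;
    KeeperException(Code code) { this.code = code; }
  }
  static class ZooKeeperException extends RuntimeException {
    ZooKeeperException(Throwable cause) { super(cause); }
  }

  static void process(Code failure) {
    try {
      throw new KeeperException(failure);                 // simulated callback failure
    } catch (KeeperException e) {
      if (e.code == Code.SESSIONEXPIRED || e.code == Code.CONNECTIONLOSS) {
        System.out.println("ZooKeeper watch triggered, but Solr cannot talk to ZK");
        return;                                           // benign: a reconnect re-registers the watch
      }
      throw new ZooKeeperException(e);                    // everything else is a server error
    }
  }

  public static void main(String[] args) {
    process(Code.CONNECTIONLOSS);                         // tolerated
    try { process(Code.NONODE); } catch (ZooKeeperException ex) {
      System.out.println("escalated: " + ex.getCause());  // rethrown as a domain exception
    }
  }
}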
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
Override public String getConfigDir() { throw new ZooKeeperException( ErrorCode.SERVER_ERROR, "ZkSolrResourceLoader does not support getConfigDir() - likely, what you are trying to do is not supported in ZooKeeper mode"); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
@Override
public String[] listConfigDir() {
  List<String> list;
  try {
    list = zkController.getZkClient().getChildren(collectionZkPath, null, true);
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
    log.error("", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (KeeperException e) {
    log.error("", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
  return list.toArray(new String[0]);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void command() {
  try {
    // we need to create all of our lost watches
    // seems we dont need to do this again...
    //Overseer.createClientNodes(zkClient, getNodeName());
    ElectionContext context = new OverseerElectionContext(getNodeName(), zkClient, zkStateReader);
    overseerElector.joinElection(context);
    zkStateReader.createClusterStateWatchersAndUpdate();
    List<CoreDescriptor> descriptors = registerOnReconnect.getCurrentDescriptors();
    if (descriptors != null) {
      // before registering as live, make sure everyone is in a
      // down state
      for (CoreDescriptor descriptor : descriptors) {
        final String coreZkNodeName = getNodeName() + "_" + descriptor.getName();
        try {
          publishAsDown(getBaseUrl(), descriptor, coreZkNodeName, descriptor.getName());
          waitForLeaderToSeeDownState(descriptor, coreZkNodeName);
        } catch (Exception e) {
          SolrException.log(log, "", e);
        }
      }
    }
    // we have to register as live first to pick up docs in the buffer
    createEphemeralLiveNode();
    // re register all descriptors
    if (descriptors != null) {
      for (CoreDescriptor descriptor : descriptors) {
        // TODO: we need to think carefully about what happens when it was
        // a leader that was expired - as well as what to do about leaders/overseers
        // with connection loss
        register(descriptor.getName(), descriptor, true);
      }
    }
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (Exception e) {
    SolrException.log(log, "", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void close() {
  try {
    zkClient.close();
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
    log.warn("", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
public byte[] getConfigFileData(String zkConfigName, String fileName) throws KeeperException, InterruptedException { String zkPath = CONFIGS_ZKNODE + "/" + zkConfigName + "/" + fileName; byte[] bytes = zkClient.getData(zkPath, null, null, true); if (bytes == null) { log.error("Config file contains no data:" + zkPath); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Config file contains no data:" + zkPath); } return bytes; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private String getHostNameFromAddress(String addr) { Matcher m = URL_POST.matcher(addr); if (m.matches()) { return m.group(1); } else { log.error("Unrecognized host:" + addr); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Unrecognized host:" + addr); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void init() {
  try {
    // makes nodes zkNode
    cmdExecutor.ensureExists(ZkStateReader.LIVE_NODES_ZKNODE, zkClient);
    Overseer.createClientNodes(zkClient, getNodeName());
    createEphemeralLiveNode();
    cmdExecutor.ensureExists(ZkStateReader.COLLECTIONS_ZKNODE, zkClient);
    syncNodeState();
    overseerElector = new LeaderElector(zkClient);
    ElectionContext context = new OverseerElectionContext(getNodeName(), zkClient, zkStateReader);
    overseerElector.setup(context);
    overseerElector.joinElection(context);
    zkStateReader.createClusterStateWatchersAndUpdate();
  } catch (IOException e) {
    log.error("", e);
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e);
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
    log.error("", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (KeeperException e) {
    log.error("", e);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String readConfigName(String collection) throws KeeperException, InterruptedException, IOException { String configName = null; String path = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; if (log.isInfoEnabled()) { log.info("Load collection config from:" + path); } byte[] data = zkClient.getData(path, null, null, true); if(data != null) { ZkNodeProps props = ZkNodeProps.load(data); configName = props.get(CONFIGNAME_PROP); } if (configName != null && !zkClient.exists(CONFIGS_ZKNODE + "/" + configName, true)) { log.error("Specified config does not exist in ZooKeeper:" + configName); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Specified config does not exist in ZooKeeper:" + configName); } return configName; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String register(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores) throws Exception {
  final String baseUrl = getBaseUrl();
  final CloudDescriptor cloudDesc = desc.getCloudDescriptor();
  final String collection = cloudDesc.getCollectionName();
  final String coreZkNodeName = getNodeName() + "_" + coreName;
  String shardId = cloudDesc.getShardId();
  Map<String,String> props = new HashMap<String,String>();
  // we only put a subset of props into the leader node
  props.put(ZkStateReader.BASE_URL_PROP, baseUrl);
  props.put(ZkStateReader.CORE_NAME_PROP, coreName);
  props.put(ZkStateReader.NODE_NAME_PROP, getNodeName());
  if (log.isInfoEnabled()) {
    log.info("Register shard - core:" + coreName + " address:" + baseUrl + " shardId:" + shardId);
  }
  ZkNodeProps leaderProps = new ZkNodeProps(props);
  try {
    joinElection(desc);
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (KeeperException e) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (IOException e) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
  // rather than look in the cluster state file, we go straight to the zknodes
  // here, because on cluster restart there could be stale leader info in the
  // cluster state node that won't be updated for a moment
  String leaderUrl = getLeaderProps(collection, cloudDesc.getShardId()).getCoreUrl();
  // now wait until our currently cloud state contains the latest leader
  String cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000);
  int tries = 0;
  while (!leaderUrl.equals(cloudStateLeader)) {
    if (tries == 60) {
      throw new SolrException(ErrorCode.SERVER_ERROR,
          "There is conflicting information about the leader of shard: " + cloudDesc.getShardId());
    }
    Thread.sleep(1000);
    tries++;
    cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000);
  }
  String ourUrl = ZkCoreNodeProps.getCoreUrl(baseUrl, coreName);
  log.info("We are " + ourUrl + " and leader is " + leaderUrl);
  boolean isLeader = leaderUrl.equals(ourUrl);
  SolrCore core = null;
  if (cc != null) { // CoreContainer only null in tests
    try {
      core = cc.getCore(desc.getName());
      // recover from local transaction log and wait for it to complete before
      // going active
      // TODO: should this be moved to another thread? To recoveryStrat?
      // TODO: should this actually be done earlier, before (or as part of)
      // leader election perhaps?
      // TODO: if I'm the leader, ensure that a replica that is trying to recover waits until I'm
      // active (or don't make me the
      // leader until my local replay is done.
      UpdateLog ulog = core.getUpdateHandler().getUpdateLog();
      if (!core.isReloaded() && ulog != null) {
        Future<UpdateLog.RecoveryInfo> recoveryFuture = core.getUpdateHandler().getUpdateLog().recoverFromLog();
        if (recoveryFuture != null) {
          recoveryFuture.get(); // NOTE: this could potentially block for minutes or more!
          // TODO: public as recovering in the mean time?
          // TODO: in the future we could do peerync in parallel with recoverFromLog
        } else {
          log.info("No LogReplay needed for core=" + core.getName() + " baseURL=" + baseUrl);
        }
      }
      boolean didRecovery = checkRecovery(coreName, desc, recoverReloadedCores, isLeader, cloudDesc,
          collection, coreZkNodeName, shardId, leaderProps, core, cc);
      if (!didRecovery) {
        publishAsActive(baseUrl, desc, coreZkNodeName, coreName);
      }
    } finally {
      if (core != null) {
        core.close();
      }
    }
  } else {
    publishAsActive(baseUrl, desc, coreZkNodeName, coreName);
  }
  // make sure we have an update cluster state right away
  zkStateReader.updateCloudState(true);
  return shardId;
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void getConfName(String collection, String collectionPath, Map<String,String> collectionProps)
    throws KeeperException, InterruptedException {
  // check for configName
  log.info("Looking for collection configName");
  List<String> configNames = null;
  int retry = 1;
  int retryLimit = 6;
  for (; retry < retryLimit; retry++) {
    if (zkClient.exists(collectionPath, true)) {
      ZkNodeProps cProps = ZkNodeProps.load(zkClient.getData(collectionPath, null, null, true));
      if (cProps.containsKey(CONFIGNAME_PROP)) {
        break;
      }
    }
    // if there is only one conf, use that
    try {
      configNames = zkClient.getChildren(CONFIGS_ZKNODE, null, true);
    } catch (NoNodeException e) {
      // just keep trying
    }
    if (configNames != null && configNames.size() == 1) {
      // no config set named, but there is only 1 - use it
      log.info("Only one config set found in zk - using it:" + configNames.get(0));
      collectionProps.put(CONFIGNAME_PROP, configNames.get(0));
      break;
    }
    if (configNames != null && configNames.contains(collection)) {
      log.info("Could not find explicit collection configName, but found config name matching collection name - using that set.");
      collectionProps.put(CONFIGNAME_PROP, collection);
      break;
    }
    log.info("Could not find collection configName - pausing for 3 seconds and trying again - try: " + retry);
    Thread.sleep(3000);
  }
  if (retry == retryLimit) {
    log.error("Could not find configName for collection " + collection);
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR,
        "Could not find configName for collection " + collection + " found:" + configNames);
  }
}
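getConfName above is the clearest instance of the bounded retry-then-fail idiom in this list: poll ZooKeeper a fixed number of times, sleep between attempts, and only surface a domain exception once the budget is exhausted. A minimal sketch of the bare pattern, with illustrative names and not Solr code:

import java.util.concurrent.Callable;

// Hedged sketch: retry a readiness check a fixed number of times, sleeping
// between attempts, and fail loudly only after the budget is spent.
final class BoundedRetry {
  static void await(Callable<Boolean> check, int attempts, long sleepMs) throws Exception {
    for (int i = 0; i < attempts; i++) {
      if (check.call()) return; // condition met, stop polling
      Thread.sleep(sleepMs);
    }
    // in the Solr snippet this is a ZooKeeperException(SERVER_ERROR, ...)
    throw new IllegalStateException("condition not met after " + attempts + " attempts");
  }
}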
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void publishState() {
  final String nodePath = "/node_states/" + getNodeName();
  long version;
  byte[] coreStatesData;
  synchronized (coreStates) {
    version = ++coreStatesVersion;
    coreStatesData = ZkStateReader.toJSON(coreStates.values());
  }
  // if multiple threads are trying to publish state, make sure that we never write
  // an older version after a newer version.
  synchronized (coreStatesPublishLock) {
    try {
      if (version < coreStatesPublishedVersion) {
        log.info("Another thread already published a newer coreStates: ours=" + version
            + " lastPublished=" + coreStatesPublishedVersion);
      } else {
        zkClient.setData(nodePath, coreStatesData, true);
        coreStatesPublishedVersion = version; // put it after so it won't be set if there's an exception
      }
    } catch (KeeperException e) {
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e);
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e);
    }
  }
}
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public int joinElection(ElectionContext context) throws KeeperException, InterruptedException, IOException {
  final String shardsElectZkPath = context.electionPath + LeaderElector.ELECTION_NODE;
  long sessionId = zkClient.getSolrZooKeeper().getSessionId();
  String id = sessionId + "-" + context.id;
  String leaderSeqPath = null;
  boolean cont = true;
  int tries = 0;
  while (cont) {
    try {
      leaderSeqPath = zkClient.create(shardsElectZkPath + "/" + id + "-n_", null,
          CreateMode.EPHEMERAL_SEQUENTIAL, false);
      context.leaderSeqPath = leaderSeqPath;
      cont = false;
    } catch (ConnectionLossException e) {
      // we don't know if we made our node or not...
      List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true);
      boolean foundId = false;
      for (String entry : entries) {
        String nodeId = getNodeId(entry);
        if (id.equals(nodeId)) {
          // we did create our node...
          foundId = true;
          break;
        }
      }
      if (!foundId) {
        throw e;
      }
    } catch (KeeperException.NoNodeException e) {
      // we must have failed in creating the election node - someone else must
      // be working on it, let's try again
      if (tries++ > 9) {
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
      }
      cont = true;
      Thread.sleep(50);
    }
  }
  int seq = getSeq(leaderSeqPath);
  checkIfIamLeader(seq, context, false);
  return seq;
}
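joinElection above illustrates a recurring ZooKeeper difficulty: after a ConnectionLossException the client cannot tell whether its create() reached the server. The snippet disambiguates by embedding the session id in the node name and re-listing the children. A minimal sketch of that idiom against the plain ZooKeeper API (illustrative only, not Solr code):

import java.util.List;
import org.apache.zookeeper.CreateMode;
import org.apache.zookeeper.KeeperException;
import org.apache.zookeeper.ZooDefs;
import org.apache.zookeeper.ZooKeeper;

// Hedged sketch: create an EPHEMERAL_SEQUENTIAL node whose name carries our
// id; after a connection loss, list the children and look for that id to
// decide whether the create actually happened.
final class SequentialCreateOnce {
  static String createOnce(ZooKeeper zk, String electionPath, String id)
      throws KeeperException, InterruptedException {
    try {
      return zk.create(electionPath + "/" + id + "-n_", new byte[0],
          ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.EPHEMERAL_SEQUENTIAL);
    } catch (KeeperException.ConnectionLossException e) {
      // the create may or may not have happened - check for our id
      List<String> entries = zk.getChildren(electionPath, false);
      for (String entry : entries) {
        if (entry.startsWith(id)) {
          return electionPath + "/" + entry; // it did happen
        }
      }
      throw e; // it did not happen; the caller may retry
    }
  }
}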
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private List<Node> setupRequest(int hash) {
  List<Node> nodes = null;
  // if we are in zk mode...
  if (zkEnabled) {
    // the leader is...
    // TODO: if there is no leader, wait and look again
    // TODO: we are reading the leader from zk every time - we should cache
    // this and watch for changes?? Just pull it from ZkController cluster state probably?
    String shardId = getShard(hash, collection, zkController.getCloudState()); // get the right shard based on the hash...
    try {
      // TODO: if we find out we cannot talk to zk anymore, we should probably realize we are not
      // a leader anymore - we shouldn't accept updates at all??
      ZkCoreNodeProps leaderProps = new ZkCoreNodeProps(zkController.getZkStateReader().getLeaderProps(
          collection, shardId));
      String leaderNodeName = leaderProps.getCoreNodeName();
      String coreName = req.getCore().getName();
      String coreNodeName = zkController.getNodeName() + "_" + coreName;
      isLeader = coreNodeName.equals(leaderNodeName);
      DistribPhase phase = DistribPhase.parseParam(req.getParams().get(DISTRIB_UPDATE_PARAM));
      if (DistribPhase.FROMLEADER == phase) {
        // we are coming from the leader, just go local - add no urls
        forwardToLeader = false;
      } else if (isLeader) {
        // that means I want to forward onto my replicas...
        // so get the replicas...
        forwardToLeader = false;
        List<ZkCoreNodeProps> replicaProps = zkController.getZkStateReader()
            .getReplicaProps(collection, shardId, zkController.getNodeName(), coreName, null, ZkStateReader.DOWN);
        if (replicaProps != null) {
          nodes = new ArrayList<Node>(replicaProps.size());
          for (ZkCoreNodeProps props : replicaProps) {
            nodes.add(new StdNode(props));
          }
        }
      } else {
        // I need to forward onto the leader...
        nodes = new ArrayList<Node>(1);
        nodes.add(new RetryNode(leaderProps, zkController.getZkStateReader(), collection, shardId));
        forwardToLeader = true;
      }
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
  return nodes;
}
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private List<Node> setupRequest() {
  List<Node> nodes = null;
  String shardId = cloudDesc.getShardId();
  try {
    ZkCoreNodeProps leaderProps = new ZkCoreNodeProps(zkController.getZkStateReader().getLeaderProps(
        collection, shardId));
    String leaderNodeName = leaderProps.getCoreNodeName();
    String coreName = req.getCore().getName();
    String coreNodeName = zkController.getNodeName() + "_" + coreName;
    isLeader = coreNodeName.equals(leaderNodeName);
    // TODO: what if we are no longer the leader?
    forwardToLeader = false;
    List<ZkCoreNodeProps> replicaProps = zkController.getZkStateReader()
        .getReplicaProps(collection, shardId, zkController.getNodeName(), coreName);
    if (replicaProps != null) {
      nodes = new ArrayList<Node>(replicaProps.size());
      for (ZkCoreNodeProps props : replicaProps) {
        nodes.add(new StdNode(props));
      }
    }
  } catch (InterruptedException e) {
    Thread.currentThread().interrupt();
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
  return nodes;
}
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private List<Node> getCollectionUrls(SolrQueryRequest req, String collection, String shardZkNodeName) {
  CloudState cloudState = req.getCore().getCoreDescriptor()
      .getCoreContainer().getZkController().getCloudState();
  List<Node> urls = new ArrayList<Node>();
  Map<String,Slice> slices = cloudState.getSlices(collection);
  if (slices == null) {
    throw new ZooKeeperException(ErrorCode.BAD_REQUEST,
        "Could not find collection in zk: " + cloudState);
  }
  for (Map.Entry<String,Slice> sliceEntry : slices.entrySet()) {
    Slice replicas = slices.get(sliceEntry.getKey());
    Map<String,ZkNodeProps> shardMap = replicas.getShards();
    for (Entry<String,ZkNodeProps> entry : shardMap.entrySet()) {
      ZkCoreNodeProps nodeProps = new ZkCoreNodeProps(entry.getValue());
      if (cloudState.liveNodesContain(nodeProps.getNodeName())
          && !entry.getKey().equals(shardZkNodeName)) {
        urls.add(new StdNode(nodeProps));
      }
    }
  }
  if (urls.size() == 0) {
    return null;
  }
  return urls;
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
protected void initZooKeeper(String zkHost, int zkClientTimeout) {
  // if zkHost sys property is not set, we are not using ZooKeeper
  String zookeeperHost;
  if (zkHost == null) {
    zookeeperHost = System.getProperty("zkHost");
  } else {
    zookeeperHost = zkHost;
  }
  String zkRun = System.getProperty("zkRun");
  if (zkRun == null && zookeeperHost == null) return; // not in zk mode
  // zookeeper in quorum mode currently causes a failure when trying to
  // register log4j mbeans. See SOLR-2369
  // TODO: remove after updating to an slf4j based zookeeper
  System.setProperty("zookeeper.jmx.log4j.disable", "true");
  if (zkRun != null) {
    String zkDataHome = System.getProperty("zkServerDataDir", solrHome + "zoo_data");
    String zkConfHome = System.getProperty("zkServerConfDir", solrHome);
    zkServer = new SolrZkServer(zkRun, zookeeperHost, zkDataHome, zkConfHome, hostPort);
    zkServer.parseConfig();
    zkServer.start();
    // set client from server config if not already set
    if (zookeeperHost == null) {
      zookeeperHost = zkServer.getClientString();
    }
  }
  int zkClientConnectTimeout = 15000;
  if (zookeeperHost != null) {
    // we are ZooKeeper enabled
    try {
      // If this is an ensemble, allow for a long connect time for other servers to come up
      if (zkRun != null && zkServer.getServers().size() > 1) {
        zkClientConnectTimeout = 24 * 60 * 60 * 1000; // 1 day for embedded ensemble
        log.info("Zookeeper client=" + zookeeperHost + " Waiting for a quorum.");
      } else {
        log.info("Zookeeper client=" + zookeeperHost);
      }
      zkController = new ZkController(this, zookeeperHost, zkClientTimeout, zkClientConnectTimeout,
          host, hostPort, hostContext, new CurrentCoreDescriptorProvider() {
            @Override
            public List<CoreDescriptor> getCurrentDescriptors() {
              List<CoreDescriptor> descriptors = new ArrayList<CoreDescriptor>(getCoreNames().size());
              for (SolrCore core : getCores()) {
                descriptors.add(core.getCoreDescriptor());
              }
              return descriptors;
            }
          });
      String confDir = System.getProperty("bootstrap_confdir");
      if (confDir != null) {
        File dir = new File(confDir);
        if (!dir.isDirectory()) {
          throw new IllegalArgumentException("bootstrap_confdir must be a directory of configuration files");
        }
        String confName = System.getProperty(ZkController.COLLECTION_PARAM_PREFIX + ZkController.CONFIGNAME_PROP,
            "configuration1");
        zkController.uploadConfigDir(dir, confName);
      }
      boolean bootstrapConf = Boolean.getBoolean("bootstrap_conf");
      if (bootstrapConf) {
        ZkController.bootstrapConf(zkController.getZkClient(), cfg, solrHome);
      }
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (TimeoutException e) {
      log.error("Could not connect to ZooKeeper", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (IOException e) {
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (KeeperException e) {
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
private void registerInZk(SolrCore core) {
  if (zkController != null) {
    try {
      zkController.register(core.getName(), core.getCoreDescriptor());
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      SolrException.log(log, "", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (Exception e) {
      // if register fails, this is really bad - close the zkController to
      // minimize any damage we can cause
      zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN);
      SolrException.log(log, "", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore create(CoreDescriptor dcore) throws ParserConfigurationException, IOException, SAXException {
  // Make the instanceDir relative to the cores instanceDir if not absolute
  File idir = new File(dcore.getInstanceDir());
  if (!idir.isAbsolute()) {
    idir = new File(solrHome, dcore.getInstanceDir());
  }
  String instanceDir = idir.getPath();
  log.info("Creating SolrCore '{}' using instanceDir: {}", dcore.getName(), instanceDir);
  // Initialize the solr config
  SolrResourceLoader solrLoader = null;
  SolrConfig config = null;
  String zkConfigName = null;
  if (zkController == null) {
    solrLoader = new SolrResourceLoader(instanceDir, libLoader,
        getCoreProps(instanceDir, dcore.getPropertiesName(), dcore.getCoreProperties()));
    config = new SolrConfig(solrLoader, dcore.getConfigName(), null);
  } else {
    try {
      String collection = dcore.getCloudDescriptor().getCollectionName();
      zkController.createCollectionZkNode(dcore.getCloudDescriptor());
      zkConfigName = zkController.readConfigName(collection);
      if (zkConfigName == null) {
        log.error("Could not find config name for collection:" + collection);
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR,
            "Could not find config name for collection:" + collection);
      }
      solrLoader = new ZkSolrResourceLoader(instanceDir, zkConfigName, libLoader,
          getCoreProps(instanceDir, dcore.getPropertiesName(), dcore.getCoreProperties()), zkController);
      config = getSolrConfigFromZk(zkConfigName, dcore.getConfigName(), solrLoader);
    } catch (KeeperException e) {
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
  IndexSchema schema = null;
  if (indexSchemaCache != null) {
    if (zkController != null) {
      File schemaFile = new File(dcore.getSchemaName());
      if (!schemaFile.isAbsolute()) {
        schemaFile = new File(solrLoader.getInstanceDir() + "conf" + File.separator + dcore.getSchemaName());
      }
      if (schemaFile.exists()) {
        String key = schemaFile.getAbsolutePath() + ":"
            + new SimpleDateFormat("yyyyMMddHHmmss", Locale.US).format(new Date(schemaFile.lastModified()));
        schema = indexSchemaCache.get(key);
        if (schema == null) {
          log.info("creating new schema object for core: " + dcore.name);
          schema = new IndexSchema(config, dcore.getSchemaName(), null);
          indexSchemaCache.put(key, schema);
        } else {
          log.info("re-using schema object for core: " + dcore.name);
        }
      }
    } else {
      // TODO: handle caching from ZooKeeper - perhaps using ZooKeepers versioning
      // Don't like this cache though - how does it empty as last modified changes?
    }
  }
  if (schema == null) {
    if (zkController != null) {
      try {
        schema = getSchemaFromZk(zkConfigName, dcore.getSchemaName(), config, solrLoader);
      } catch (KeeperException e) {
        log.error("", e);
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
      } catch (InterruptedException e) {
        // Restore the interrupted status
        Thread.currentThread().interrupt();
        log.error("", e);
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
      }
    } else {
      schema = new IndexSchema(config, dcore.getSchemaName(), null);
    }
  }
  SolrCore core = new SolrCore(dcore.getName(), null, config, schema, dcore);
  if (zkController == null && core.getUpdateHandler().getUpdateLog() != null) {
    // always kick off recovery if we are in standalone mode.
    core.getUpdateHandler().getUpdateLog().recoverFromLog();
  }
  return core;
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void reload(String name) throws ParserConfigurationException, IOException, SAXException {
  name = checkDefault(name);
  SolrCore core;
  synchronized (cores) {
    core = cores.get(name);
  }
  if (core == null)
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name);
  CoreDescriptor cd = core.getCoreDescriptor();
  File instanceDir = new File(cd.getInstanceDir());
  if (!instanceDir.isAbsolute()) {
    instanceDir = new File(getSolrHome(), cd.getInstanceDir());
  }
  log.info("Reloading SolrCore '{}' using instanceDir: {}", cd.getName(), instanceDir.getAbsolutePath());
  SolrResourceLoader solrLoader;
  if (zkController == null) {
    solrLoader = new SolrResourceLoader(instanceDir.getAbsolutePath(), libLoader,
        getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(), cd.getCoreProperties()));
  } else {
    try {
      String collection = cd.getCloudDescriptor().getCollectionName();
      zkController.createCollectionZkNode(cd.getCloudDescriptor());
      String zkConfigName = zkController.readConfigName(collection);
      if (zkConfigName == null) {
        log.error("Could not find config name for collection:" + collection);
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR,
            "Could not find config name for collection:" + collection);
      }
      solrLoader = new ZkSolrResourceLoader(instanceDir.getAbsolutePath(), zkConfigName, libLoader,
          getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(), cd.getCoreProperties()), zkController);
    } catch (KeeperException e) {
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
  SolrCore newCore = core.reload(solrLoader);
  // keep core to orig name link
  String origName = coreToOrigName.remove(core);
  if (origName != null) {
    coreToOrigName.put(newCore, origName);
  }
  register(name, newCore, false);
}
42
              
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (InterruptedException e) {
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (TimeoutException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) {
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.warn("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e);
}
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) {
  // we must have failed in creating the election node - someone else must
  // be working on it, let's try again
  if (tries++ > 9) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
  cont = true;
  Thread.sleep(50);
}
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) {
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) {
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TimeoutException e) {
  log.error("Could not connect to ZooKeeper", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) {
  // if register fails, this is really bad - close the zkController to
  // minimize any damage we can cause
  zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN);
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
0
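Taken together, the ZooKeeperException sites above follow a single wrapping idiom: checked KeeperException, InterruptedException, IOException, and TimeoutException are converted into the runtime domain exception with ErrorCode.SERVER_ERROR, and every InterruptedException handler restores the interrupt flag before wrapping. A condensed sketch of that idiom (the helper class is hypothetical; ZooKeeperException and SolrException.ErrorCode are the Solr types used above):

import org.apache.solr.common.SolrException;
import org.apache.solr.common.cloud.ZooKeeperException;

// Hedged sketch: one place that captures the catch-block shape repeated above.
final class ZkErrorHandling {
  static RuntimeException wrap(Exception e) {
    if (e instanceof InterruptedException) {
      // restore the interrupted status first, as the snippets above do
      Thread.currentThread().interrupt();
    }
    return new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
}

A caller would write throw ZkErrorHandling.wrap(e); this is only a distillation of the pattern, not code that exists in Solr.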
(Lib) IllegalArgumentException 48
              
// in solrj/src/java/org/apache/solr/common/util/Base64.java
public static byte[] base64ToByteArray(String s) {
  byte[] alphaToInt = base64ToInt;
  int sLen = s.length();
  int numGroups = sLen / 4;
  if (4 * numGroups != sLen)
    throw new IllegalArgumentException("String length must be a multiple of four.");
  int missingBytesInLastGroup = 0;
  int numFullGroups = numGroups;
  if (sLen != 0) {
    if (s.charAt(sLen - 1) == '=') {
      missingBytesInLastGroup++;
      numFullGroups--;
    }
    if (s.charAt(sLen - 2) == '=')
      missingBytesInLastGroup++;
  }
  byte[] result = new byte[3 * numGroups - missingBytesInLastGroup];
  // Translate all full groups from base64 to byte array elements
  int inCursor = 0, outCursor = 0;
  for (int i = 0; i < numFullGroups; i++) {
    int ch0 = base64toInt(s.charAt(inCursor++), alphaToInt);
    int ch1 = base64toInt(s.charAt(inCursor++), alphaToInt);
    int ch2 = base64toInt(s.charAt(inCursor++), alphaToInt);
    int ch3 = base64toInt(s.charAt(inCursor++), alphaToInt);
    result[outCursor++] = (byte) ((ch0 << 2) | (ch1 >> 4));
    result[outCursor++] = (byte) ((ch1 << 4) | (ch2 >> 2));
    result[outCursor++] = (byte) ((ch2 << 6) | ch3);
  }
  // Translate partial group, if present
  if (missingBytesInLastGroup != 0) {
    int ch0 = base64toInt(s.charAt(inCursor++), alphaToInt);
    int ch1 = base64toInt(s.charAt(inCursor++), alphaToInt);
    result[outCursor++] = (byte) ((ch0 << 2) | (ch1 >> 4));
    if (missingBytesInLastGroup == 1) {
      int ch2 = base64toInt(s.charAt(inCursor++), alphaToInt);
      result[outCursor++] = (byte) ((ch1 << 4) | (ch2 >> 2));
    }
  }
  // assert inCursor == s.length()-missingBytesInLastGroup;
  // assert outCursor == result.length;
  return result;
}
// in solrj/src/java/org/apache/solr/common/util/Base64.java
private static int base64toInt(char c, byte[] alphaToInt) {
  int result = alphaToInt[c];
  if (result < 0)
    throw new IllegalArgumentException("Illegal character " + c);
  return result;
}
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate(String dateValue, Collection<String> dateFormats, Date startDate) throws ParseException {
  if (dateValue == null) {
    throw new IllegalArgumentException("dateValue is null");
  }
  if (dateFormats == null) {
    dateFormats = DEFAULT_HTTP_CLIENT_PATTERNS;
  }
  if (startDate == null) {
    startDate = DEFAULT_TWO_DIGIT_YEAR_START;
  }
  // trim single quotes around date if present
  // see issue #5279
  if (dateValue.length() > 1 && dateValue.startsWith("'") && dateValue.endsWith("'")) {
    dateValue = dateValue.substring(1, dateValue.length() - 1);
  }
  SimpleDateFormat dateParser = null;
  Iterator formatIter = dateFormats.iterator();
  while (formatIter.hasNext()) {
    String format = (String) formatIter.next();
    if (dateParser == null) {
      dateParser = new SimpleDateFormat(format, Locale.US);
      dateParser.setTimeZone(GMT);
      dateParser.set2DigitYearStart(startDate);
    } else {
      dateParser.applyPattern(format);
    }
    try {
      return dateParser.parse(dateValue);
    } catch (ParseException pe) {
      // ignore this exception, we will try the next format
    }
  }
  // we were unable to parse the date
  throw new ParseException("Unable to parse the date " + dateValue, 0);
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public void setAliveCheckInterval(int interval) {
  if (interval <= 0) {
    throw new IllegalArgumentException("Alive check interval must be positive, specified value = " + interval);
  }
  this.interval = interval;
}
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
@Override
public void init(Map<String,String> args) {
  super.init(args);
  String dictionaryName = args.get(DICTIONARY_SCHEMA_ATTRIBUTE);
  if (dictionaryName != null && !dictionaryName.isEmpty()) {
    try {
      DICTIONARY dictionary = DICTIONARY.valueOf(dictionaryName.toUpperCase(Locale.ENGLISH));
      assert dictionary != null;
      this.dictionary = dictionary;
    } catch (IllegalArgumentException e) {
      throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE
          + " attribute accepts the following constants: " + Arrays.toString(DICTIONARY.values())
          + ", this value is invalid: " + dictionaryName);
    }
  }
}
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
private BytesRef analyzeRangePart(String field, String part) {
  TokenStream source;
  try {
    source = analyzer.tokenStream(field, new StringReader(part));
    source.reset();
  } catch (IOException e) {
    throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e);
  }
  TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class);
  BytesRef bytes = termAtt.getBytesRef();
  // we control the analyzer here: most errors are impossible
  try {
    if (!source.incrementToken())
      throw new IllegalArgumentException("analyzer returned no terms for range part: " + part);
    termAtt.fillBytesRef();
    assert !source.incrementToken();
  } catch (IOException e) {
    throw new RuntimeException("error analyzing range part: " + part, e);
  }
  try {
    source.end();
    source.close();
  } catch (IOException e) {
    throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e);
  }
  return BytesRef.deepCopyOf(bytes);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
private String findMatchingPkColumn(String pk, Map<String, Object> row) {
  if (row.containsKey(pk))
    throw new IllegalArgumentException(
        String.format("deltaQuery returned a row with null for primary key %s", pk));
  String resolvedPk = null;
  for (String columnName : row.keySet()) {
    if (columnName.endsWith("." + pk) || pk.endsWith("." + columnName)) {
      if (resolvedPk != null)
        throw new IllegalArgumentException(String.format(
            "deltaQuery has more than one column (%s and %s) that might resolve to declared primary key pk='%s'",
            resolvedPk, columnName, pk));
      resolvedPk = columnName;
    }
  }
  if (resolvedPk == null)
    throw new IllegalArgumentException(
        String.format("deltaQuery has no column to resolve to declared primary key pk='%s'", pk));
  LOG.info(String.format("Resolving deltaQuery column '%s' to match entity's declared pk '%s'", resolvedPk, pk));
  return resolvedPk;
}
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override
public void writeMapOpener(int size) throws IOException, IllegalArgumentException {
  // negative size value indicates that something has gone wrong
  if (size < 0) {
    throw new IllegalArgumentException("Map size must not be negative");
  }
  writer.write("a:" + size + ":{");
}
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override
public void writeArrayOpener(int size) throws IOException, IllegalArgumentException {
  // negative size value indicates that something has gone wrong
  if (size < 0) {
    throw new IllegalArgumentException("Array size must not be negative");
  }
  writer.write("a:" + size + ":{");
}
// in core/src/java/org/apache/solr/schema/CollationField.java
private BytesRef analyzeRangePart(String field, String part) {
  TokenStream source;
  try {
    source = analyzer.tokenStream(field, new StringReader(part));
    source.reset();
  } catch (IOException e) {
    throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e);
  }
  TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class);
  BytesRef bytes = termAtt.getBytesRef();
  // we control the analyzer here: most errors are impossible
  try {
    if (!source.incrementToken())
      throw new IllegalArgumentException("analyzer returned no terms for range part: " + part);
    termAtt.fillBytesRef();
    assert !source.incrementToken();
  } catch (IOException e) {
    throw new RuntimeException("error analyzing range part: " + part, e);
  }
  try {
    source.end();
    source.close();
  } catch (IOException e) {
    throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e);
  }
  return BytesRef.deepCopyOf(bytes);
}
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
public static String[][] parse(String s) throws IOException {
  if (s == null) {
    throw new IllegalArgumentException("Null argument not allowed.");
  }
  String[][] result = (new CSVParser(new StringReader(s))).getAllValues();
  if (result == null) {
    // since CSVStrategy ignores empty lines an empty array is returned
    // (i.e. not "result = new String[][] {{""}};")
    result = EMPTY_DOUBLE_STRING_ARRAY;
  }
  return result;
}
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
public static String[] parseLine(String s) throws IOException {
  if (s == null) {
    throw new IllegalArgumentException("Null argument not allowed.");
  }
  // uh,jh: make sure that parseLine("").length == 0
  if (s.length() == 0) {
    return EMPTY_STRING_ARRAY;
  }
  return (new CSVParser(new StringReader(s))).getLine();
}
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skip(long n) throws IllegalArgumentException, IOException {
  if (lookaheadChar == UNDEFINED) {
    lookaheadChar = super.read();
  }
  // illegal argument
  if (n < 0) {
    throw new IllegalArgumentException("negative argument not supported");
  }
  // no skipping
  if (n == 0 || lookaheadChar == END_OF_STREAM) {
    return 0;
  }
  // skip and reread the lookahead-char
  long skipped = 0;
  if (n > 1) {
    skipped = super.skip(n - 1);
  }
  lookaheadChar = super.read();
  // FIXME uh: we should check the skipped sequence for line-terminations...
  lineCounter = Integer.MIN_VALUE;
  return skipped + 1;
}
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public QueryCommand setFilterList(List<Query> filterList) {
  if (filter != null) {
    throw new IllegalArgumentException("Either filter or filterList may be set in the QueryCommand, but not both.");
  }
  this.filterList = filterList;
  return this;
}
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public QueryCommand setFilterList(Query f) {
  if (filter != null) {
    throw new IllegalArgumentException("Either filter or filterList may be set in the QueryCommand, but not both.");
  }
  filterList = null;
  if (f != null) {
    filterList = new ArrayList<Query>(2);
    filterList.add(f);
  }
  return this;
}
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public QueryCommand setFilter(DocSet filter) {
  if (filterList != null) {
    throw new IllegalArgumentException("Either filter or filterList may be set in the QueryCommand, but not both.");
  }
  this.filter = filter;
  return this;
}
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public static Properties getProperties(String path) throws ConfigException {
  File configFile = new File(path);
  LOG.info("Reading configuration from: " + configFile);
  try {
    if (!configFile.exists()) {
      throw new IllegalArgumentException(configFile.toString() + " file is missing");
    }
    Properties cfg = new Properties();
    FileInputStream in = new FileInputStream(configFile);
    try {
      cfg.load(in);
    } finally {
      in.close();
    }
    return cfg;
  } catch (IOException e) {
    throw new ConfigException("Error processing " + path, e);
  } catch (IllegalArgumentException e) {
    throw new ConfigException("Error processing " + path, e);
  }
}
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
@Override
public void parseProperties(Properties zkProp) throws IOException, ConfigException {
  for (Entry<Object, Object> entry : zkProp.entrySet()) {
    String key = entry.getKey().toString().trim();
    String value = entry.getValue().toString().trim();
    if (key.equals("dataDir")) {
      dataDir = value;
    } else if (key.equals("dataLogDir")) {
      dataLogDir = value;
    } else if (key.equals("clientPort")) {
      setClientPort(Integer.parseInt(value));
    } else if (key.equals("tickTime")) {
      tickTime = Integer.parseInt(value);
    } else if (key.equals("initLimit")) {
      initLimit = Integer.parseInt(value);
    } else if (key.equals("syncLimit")) {
      syncLimit = Integer.parseInt(value);
    } else if (key.equals("electionAlg")) {
      electionAlg = Integer.parseInt(value);
    } else if (key.equals("maxClientCnxns")) {
      maxClientCnxns = Integer.parseInt(value);
    } else if (key.startsWith("server.")) {
      int dot = key.indexOf('.');
      long sid = Long.parseLong(key.substring(dot + 1));
      String parts[] = value.split(":");
      if ((parts.length != 2) && (parts.length != 3)) {
        LOG.error(value + " does not have the form host:port or host:port:port");
      }
      InetSocketAddress addr = new InetSocketAddress(parts[0], Integer.parseInt(parts[1]));
      if (parts.length == 2) {
        servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr));
      } else if (parts.length == 3) {
        InetSocketAddress electionAddr = new InetSocketAddress(parts[0], Integer.parseInt(parts[2]));
        servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr, electionAddr));
      }
    } else if (key.startsWith("group")) {
      int dot = key.indexOf('.');
      long gid = Long.parseLong(key.substring(dot + 1));
      numGroups++;
      String parts[] = value.split(":");
      for (String s : parts) {
        long sid = Long.parseLong(s);
        if (serverGroup.containsKey(sid))
          throw new ConfigException("Server " + sid + "is in multiple groups");
        else
          serverGroup.put(sid, gid);
      }
    } else if (key.startsWith("weight")) {
      int dot = key.indexOf('.');
      long sid = Long.parseLong(key.substring(dot + 1));
      serverWeight.put(sid, Long.parseLong(value));
    } else {
      System.setProperty("zookeeper." + key, value);
    }
  }
  if (dataDir == null) {
    throw new IllegalArgumentException("dataDir is not set");
  }
  if (dataLogDir == null) {
    dataLogDir = dataDir;
  } else {
    if (!new File(dataLogDir).isDirectory()) {
      throw new IllegalArgumentException("dataLogDir " + dataLogDir + " is missing.");
    }
  }
  if (tickTime == 0) {
    throw new IllegalArgumentException("tickTime is not set");
  }
  if (servers.size() > 1) {
    if (initLimit == 0) {
      throw new IllegalArgumentException("initLimit is not set");
    }
    if (syncLimit == 0) {
      throw new IllegalArgumentException("syncLimit is not set");
    }
    /*
     * If using FLE, then every server requires a separate election port.
     */
    if (electionAlg != 0) {
      for (QuorumPeer.QuorumServer s : servers.values()) {
        if (s.electionAddr == null)
          throw new IllegalArgumentException("Missing election port for server: " + s.id);
      }
    }
    /*
     * Default of quorum config is majority
     */
    if (serverGroup.size() > 0) {
      if (servers.size() != serverGroup.size())
        throw new ConfigException("Every server must be in exactly one group");
      /*
       * The default weight of a server is 1
       */
      for (QuorumPeer.QuorumServer s : servers.values()) {
        if (!serverWeight.containsKey(s.id))
          serverWeight.put(s.id, (long) 1);
      }
      /*
       * Set the quorumVerifier to be QuorumHierarchical
       */
      quorumVerifier = new QuorumHierarchical(numGroups, serverWeight, serverGroup);
    } else {
      /*
       * The default QuorumVerifier is QuorumMaj
       */
      LOG.info("Defaulting to majority quorums");
      quorumVerifier = new QuorumMaj(servers.size());
    }
    File myIdFile = new File(dataDir, "myid");
    if (!myIdFile.exists()) {
      ///////////////// ADDED FOR SOLR //////
      Long myid = getMySeverId();
      if (myid != null) {
        serverId = myid;
        return;
      }
      if (zkRun == null) return;
      //////////////// END ADDED FOR SOLR //////
      throw new IllegalArgumentException(myIdFile.toString() + " file is missing");
    }
    BufferedReader br = new BufferedReader(new FileReader(myIdFile));
    String myIdString;
    try {
      myIdString = br.readLine();
    } finally {
      br.close();
    }
    try {
      serverId = Long.parseLong(myIdString);
    } catch (NumberFormatException e) {
      throw new IllegalArgumentException("serverid " + myIdString + " is not a number");
    }
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadToZK(SolrZkClient zkClient, File dir, String zkPath)
    throws IOException, KeeperException, InterruptedException {
  File[] files = dir.listFiles();
  if (files == null) {
    throw new IllegalArgumentException("Illegal directory: " + dir);
  }
  for (File file : files) {
    if (!file.getName().startsWith(".")) {
      if (!file.isDirectory()) {
        zkClient.makePath(zkPath + "/" + file.getName(), file, false, true);
      } else {
        uploadToZK(zkClient, file, zkPath + "/" + file.getName());
      }
    }
  }
}
// in core/src/java/org/apache/solr/core/DefaultCodecFactory.java
@Override
public PostingsFormat getPostingsFormatForField(String field) {
  final SchemaField fieldOrNull = schema.getFieldOrNull(field);
  if (fieldOrNull == null) {
    throw new IllegalArgumentException("no such field " + field);
  }
  String postingsFormatName = fieldOrNull.getType().getPostingsFormat();
  if (postingsFormatName != null) {
    return PostingsFormat.forName(postingsFormatName);
  }
  return super.getPostingsFormatForField(field);
}
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
private void close(Directory directory) throws IOException {
  synchronized (this) {
    CacheValue cacheValue = byDirectoryCache.get(directory);
    if (cacheValue == null) {
      throw new IllegalArgumentException("Unknown directory: " + directory + " " + byDirectoryCache);
    }
    cacheValue.refCnt--;
    if (cacheValue.refCnt == 0 && cacheValue.doneWithDir) {
      directory.close();
      byDirectoryCache.remove(directory);
      byPathCache.remove(cacheValue.path);
    }
  }
}
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
public void incRef(Directory directory) {
  synchronized (this) {
    CacheValue cacheValue = byDirectoryCache.get(directory);
    if (cacheValue == null) {
      throw new IllegalArgumentException("Unknown directory: " + directory);
    }
    cacheValue.refCnt++;
  }
}
// in core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
@Override
public void init(NamedList args) {
  SolrParams params = SolrParams.toSolrParams(args);
  maxChunk = params.getInt("maxChunkSize", MMapDirectory.DEFAULT_MAX_BUFF);
  if (maxChunk <= 0) {
    throw new IllegalArgumentException("maxChunk must be greater than 0");
  }
  unmapHack = params.getBool("unmap", true);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
protected void initZooKeeper(String zkHost, int zkClientTimeout) {
  // if zkHost sys property is not set, we are not using ZooKeeper
  String zookeeperHost;
  if (zkHost == null) {
    zookeeperHost = System.getProperty("zkHost");
  } else {
    zookeeperHost = zkHost;
  }
  String zkRun = System.getProperty("zkRun");
  if (zkRun == null && zookeeperHost == null) return; // not in zk mode
  // zookeeper in quorum mode currently causes a failure when trying to
  // register log4j mbeans. See SOLR-2369
  // TODO: remove after updating to an slf4j based zookeeper
  System.setProperty("zookeeper.jmx.log4j.disable", "true");
  if (zkRun != null) {
    String zkDataHome = System.getProperty("zkServerDataDir", solrHome + "zoo_data");
    String zkConfHome = System.getProperty("zkServerConfDir", solrHome);
    zkServer = new SolrZkServer(zkRun, zookeeperHost, zkDataHome, zkConfHome, hostPort);
    zkServer.parseConfig();
    zkServer.start();
    // set client from server config if not already set
    if (zookeeperHost == null) {
      zookeeperHost = zkServer.getClientString();
    }
  }
  int zkClientConnectTimeout = 15000;
  if (zookeeperHost != null) {
    // we are ZooKeeper enabled
    try {
      // If this is an ensemble, allow for a long connect time for other servers to come up
      if (zkRun != null && zkServer.getServers().size() > 1) {
        zkClientConnectTimeout = 24 * 60 * 60 * 1000; // 1 day for embedded ensemble
        log.info("Zookeeper client=" + zookeeperHost + " Waiting for a quorum.");
      } else {
        log.info("Zookeeper client=" + zookeeperHost);
      }
      zkController = new ZkController(this, zookeeperHost, zkClientTimeout, zkClientConnectTimeout,
          host, hostPort, hostContext, new CurrentCoreDescriptorProvider() {
            @Override
            public List<CoreDescriptor> getCurrentDescriptors() {
              List<CoreDescriptor> descriptors = new ArrayList<CoreDescriptor>(getCoreNames().size());
              for (SolrCore core : getCores()) {
                descriptors.add(core.getCoreDescriptor());
              }
              return descriptors;
            }
          });
      String confDir = System.getProperty("bootstrap_confdir");
      if (confDir != null) {
        File dir = new File(confDir);
        if (!dir.isDirectory()) {
          throw new IllegalArgumentException("bootstrap_confdir must be a directory of configuration files");
        }
        String confName = System.getProperty(ZkController.COLLECTION_PARAM_PREFIX + ZkController.CONFIGNAME_PROP,
            "configuration1");
        zkController.uploadConfigDir(dir, confName);
      }
      boolean bootstrapConf = Boolean.getBoolean("bootstrap_conf");
      if (bootstrapConf) {
        ZkController.bootstrapConf(zkController.getZkClient(), cfg, solrHome);
      }
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (TimeoutException e) {
      log.error("Could not connect to ZooKeeper", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (IOException e) {
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (KeeperException e) {
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
}
// in core/src/java/org/apache/solr/core/CoreDescriptor.java
public void setConfigName(String name) {
  if (name == null || name.length() == 0)
    throw new IllegalArgumentException("name can not be null or empty");
  this.configName = name;
}
// in core/src/java/org/apache/solr/core/CoreDescriptor.java
public void setSchemaName(String name) {
  if (name == null || name.length() == 0)
    throw new IllegalArgumentException("name can not be null or empty");
  this.schemaName = name;
}
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public static String createSystemIdFromResourceName(String name) {
  name = name.replace(File.separatorChar, '/');
  final String authority;
  if (name.startsWith("/")) {
    // a hack to preserve absolute filenames and keep them absolute after resolving,
    // we set the URI's authority to "@" on absolute filenames:
    authority = RESOURCE_LOADER_AUTHORITY_ABSOLUTE;
  } else {
    authority = null;
    name = "/" + name;
  }
  try {
    return new URI(RESOURCE_LOADER_URI_SCHEME, authority, name, null, null).toASCIIString();
  } catch (URISyntaxException use) {
    throw new IllegalArgumentException("Invalid syntax of Solr Resource URI", use);
  }
}
// in core/src/java/org/apache/solr/util/DateMathParser.java
public static void add(Calendar c, int val, String unit) {
  Integer uu = CALENDAR_UNITS.get(unit);
  if (null == uu) {
    throw new IllegalArgumentException("Adding Unit not recognized: " + unit);
  }
  c.add(uu.intValue(), val);
}
// in core/src/java/org/apache/solr/util/DateMathParser.java
public static void round(Calendar c, String unit) {
  Integer uu = CALENDAR_UNITS.get(unit);
  if (null == uu) {
    throw new IllegalArgumentException("Rounding Unit not recognized: " + unit);
  }
  int u = uu.intValue();
  switch (u) {
    case Calendar.YEAR:
      c.clear(Calendar.MONTH);
      /* fall through */
    case Calendar.MONTH:
      c.clear(Calendar.DAY_OF_MONTH);
      c.clear(Calendar.DAY_OF_WEEK);
      c.clear(Calendar.DAY_OF_WEEK_IN_MONTH);
      c.clear(Calendar.DAY_OF_YEAR);
      c.clear(Calendar.WEEK_OF_MONTH);
      c.clear(Calendar.WEEK_OF_YEAR);
      /* fall through */
    case Calendar.DATE:
      c.clear(Calendar.HOUR_OF_DAY);
      c.clear(Calendar.HOUR);
      c.clear(Calendar.AM_PM);
      /* fall through */
    case Calendar.HOUR_OF_DAY:
      c.clear(Calendar.MINUTE);
      /* fall through */
    case Calendar.MINUTE:
      c.clear(Calendar.SECOND);
      /* fall through */
    case Calendar.SECOND:
      c.clear(Calendar.MILLISECOND);
      break;
    default:
      throw new IllegalStateException("No logic for rounding value (" + u + ") " + unit);
  }
}
3
              
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { throw new IllegalArgumentException("Invalid syntax of Solr Resource URI", use); }
7
              
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeMapOpener(int size) throws IOException, IllegalArgumentException { writer.write('{'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeArrayOpener(int size) throws IOException, IllegalArgumentException { writer.write('['); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override
public void writeMapOpener(int size) throws IOException, IllegalArgumentException {
  // negative size value indicates that something has gone wrong
  if (size < 0) {
    throw new IllegalArgumentException("Map size must not be negative");
  }
  writer.write("a:" + size + ":{");
}
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override
public void writeArrayOpener(int size) throws IOException, IllegalArgumentException {
  // negative size value indicates that something has gone wrong
  if (size < 0) {
    throw new IllegalArgumentException("Array size must not be negative");
  }
  writer.write("a:" + size + ":{");
}
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skip(long n) throws IllegalArgumentException, IOException {
  if (lookaheadChar == UNDEFINED) {
    lookaheadChar = super.read();
  }
  // illegal argument
  if (n < 0) {
    throw new IllegalArgumentException("negative argument not supported");
  }
  // no skipping
  if (n == 0 || lookaheadChar == END_OF_STREAM) {
    return 0;
  }
  // skip and reread the lookahead-char
  long skipped = 0;
  if (n > 1) {
    skipped = super.skip(n - 1);
  }
  lookaheadChar = super.read();
  // FIXME uh: we should check the skipped sequence for line-terminations...
  lineCounter = Integer.MIN_VALUE;
  return skipped + 1;
}
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skipUntil(char c) throws IllegalArgumentException, IOException {
  if (lookaheadChar == UNDEFINED) {
    lookaheadChar = super.read();
  }
  long counter = 0;
  while (lookaheadChar != c && lookaheadChar != END_OF_STREAM) {
    if (lookaheadChar == '\n') {
      lineCounter++;
    }
    lookaheadChar = super.read();
    counter++;
  }
  return counter;
}
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
public Document getDoc() throws IllegalArgumentException {
  // Check for all required fields -- Note, all fields with a
  // default value are defacto 'required' fields.
  List<String> missingFields = null;
  for (SchemaField field : schema.getRequiredFields()) {
    if (doc.getField(field.getName()) == null) {
      if (field.getDefaultValue() != null) {
        addField(doc, field, field.getDefaultValue(), 1.0f);
      } else {
        if (missingFields == null) {
          missingFields = new ArrayList<String>(1);
        }
        missingFields.add(field.getName());
      }
    }
  }
  if (missingFields != null) {
    StringBuilder builder = new StringBuilder();
    // add the uniqueKey if possible
    if (schema.getUniqueKeyField() != null) {
      String n = schema.getUniqueKeyField().getName();
      String v = doc.getField(n).stringValue();
      builder.append("Document [" + n + "=" + v + "] ");
    }
    builder.append("missing required fields: ");
    for (String field : missingFields) {
      builder.append(field);
      builder.append(" ");
    }
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, builder.toString());
  }
  Document ret = doc;
  doc = null;
  return ret;
}
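The IllegalArgumentException sites above fall into two styles: the fail-fast precondition check at the top of a method, and the catch-and-rethrow that replaces a low-level exception with a message the caller can act on (MorfologikFilterFactory, SolrZkServer, SystemIdResolver). A minimal sketch combining both styles (illustrative names, not Solr code):

// Hedged sketch of the two IllegalArgumentException styles catalogued above.
final class PreconditionSketch {
  static int parsePositive(String value, String name) {
    if (value == null || value.isEmpty()) {
      throw new IllegalArgumentException(name + " can not be null or empty"); // fail fast
    }
    final int n;
    try {
      n = Integer.parseInt(value);
    } catch (NumberFormatException e) {
      // rethrow with context, mirroring the SolrZkServer "serverid ... is not a number" case
      throw new IllegalArgumentException(name + " " + value + " is not a number", e);
    }
    if (n <= 0) {
      throw new IllegalArgumentException(name + " must be greater than 0, specified value = " + n);
    }
    return n;
  }
}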
(Lib) UnsupportedOperationException 48
              
// in solrj/src/java/org/apache/solr/common/SolrInputField.java
public void remove() { throw new UnsupportedOperationException(); }
// in solrj/src/java/org/apache/solr/common/util/NamedList.java
public void remove() { throw new UnsupportedOperationException(); }
// in solrj/src/java/org/apache/solr/common/util/IteratorChain.java
public void remove() {
  // we just need this class
  // to iterate in readonly mode
  throw new UnsupportedOperationException();
}
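The IteratorChain entry above, and the SolrDocument map methods that follow, all implement the same read-only-view idiom: the read path delegates while every mutator throws UnsupportedOperationException. A minimal sketch of that idiom (illustrative only, not Solr code):

import java.util.Iterator;

// Hedged sketch: expose the read path, stub the mutators.
final class ReadOnlyIterator<T> implements Iterator<T> {
  private final Iterator<T> delegate;

  ReadOnlyIterator(Iterator<T> delegate) { this.delegate = delegate; }

  public boolean hasNext() { return delegate.hasNext(); }
  public T next() { return delegate.next(); }

  public void remove() {
    // read-only mode, as in the snippets above
    throw new UnsupportedOperationException();
  }
}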
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public void clear() { throw new UnsupportedOperationException(); }
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public boolean containsValue(Object value) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Set<java.util.Map.Entry<String, Collection<Object>>> entrySet() {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public void putAll(Map<? extends String, ? extends Collection<Object>> t) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Collection<Object>> values() {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> put(String key, Collection<Object> value) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> remove(Object key) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public void clear() { throw new UnsupportedOperationException(); }
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public boolean containsValue(Object value) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Set<java.util.Map.Entry<String, Object>> entrySet() {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public void putAll(Map<? extends String, ? extends Object> t) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> values() {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> put(String key, Object value) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> remove(Object key) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public void setAllowCompression(boolean allowCompression) { if (httpClient instanceof DefaultHttpClient) { HttpClientUtil.setAllowCompression((DefaultHttpClient) httpClient, allowCompression); } else { throw new UnsupportedOperationException( "HttpClient instance was not of type DefaultHttpClient"); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public void setDefaultMaxConnectionsPerHost(int max) { if (internalClient) { HttpClientUtil.setMaxConnectionsPerHost(httpClient, max); } else { throw new UnsupportedOperationException( "Client was created outside of HttpSolrServer"); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public void setMaxTotalConnections(int max) { if (internalClient) { HttpClientUtil.setMaxConnections(httpClient, max); } else { throw new UnsupportedOperationException( "Client was created outside of HttpSolrServer"); } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
public void remove() { throw new UnsupportedOperationException("Its read only mode..."); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
public void remove() { throw new UnsupportedOperationException("Its read only mode..."); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SortedMapBackedCache.java
@Override public void remove() { throw new UnsupportedOperationException(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
@Override public void remove() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/schema/BCDIntField.java
@Override public ValueSource getValueSource(SchemaField field, QParser qparser) { throw new UnsupportedOperationException("ValueSource not implemented"); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public IndexableField createField(SchemaField field, Object value, float boost) { throw new UnsupportedOperationException("LatLonType uses multiple fields. field=" + field.getName()); }
// in core/src/java/org/apache/solr/schema/PointType.java
@Override public IndexableField createField(SchemaField field, Object value, float boost) { throw new UnsupportedOperationException("PointType uses multiple fields. field=" + field.getName()); }
// in core/src/java/org/apache/solr/schema/ExternalFileField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/schema/ExternalFileField.java
@Override public SortField getSortField(SchemaField field,boolean reverse) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/schema/AbstractSubTypeFieldType.java
@Override public Query getFieldQuery(QParser parser, SchemaField field, String externalVal) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
public void remove() { throw new UnsupportedOperationException("The remove operation is not supported by this Iterator."); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public int compare(int slot1, int slot2) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public void setBottom(int slot) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public int compareBottom(int doc) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public void copy(int slot, int doc) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public BytesRef value(int slot) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public int compareDocToValue(int doc, Comparable docValue) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/DocSlice.java
public void remove() { throw new UnsupportedOperationException("The remove operation is not supported by this Iterator."); }
// in core/src/java/org/apache/solr/logging/CircularList.java
@Override public void remove() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/spelling/SolrSpellChecker.java
protected float getAccuracy() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/spelling/SolrSpellChecker.java
protected StringDistance getStringDistance() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/spelling/PossibilityIterator.java
public void remove() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/core/AbstractSolrEventListener.java
public void postCommit() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/core/AbstractSolrEventListener.java
@Override public void postSoftCommit() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/core/AbstractSolrEventListener.java
public void newSearcher(SolrIndexSearcher newSearcher, SolrIndexSearcher currentSearcher) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public void setAttribute(Attribute attribute) throws AttributeNotFoundException, InvalidAttributeValueException, MBeanException, ReflectionException { throw new UnsupportedOperationException("Operation not Supported"); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public AttributeList setAttributes(AttributeList attributes) { throw new UnsupportedOperationException("Operation not Supported"); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public Object invoke(String actionName, Object[] params, String[] signature) throws MBeanException, ReflectionException { throw new UnsupportedOperationException("Operation not Supported"); }
Thrown from within a catch block: 0 times. Declared in throws clauses: 0 times.
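All 48 sites above follow a single convention: a read-only view or a partial implementation of an interface rejects the operations it cannot support by throwing a stateless UnsupportedOperationException, and the exception is never caught or declared anywhere in the code base. A minimal sketch of the convention; the wrapper class is illustrative, not taken from the Solr sources:

import java.util.Iterator;

// Illustrative read-only wrapper; not from the Solr sources.
public class ReadOnlyIterator<T> implements Iterator<T> {
  private final Iterator<T> delegate;

  public ReadOnlyIterator(Iterator<T> delegate) {
    this.delegate = delegate;
  }

  @Override public boolean hasNext() { return delegate.hasNext(); }

  @Override public T next() { return delegate.next(); }

  // Mutation is deliberately unsupported: the caller gets an immediate
  // unchecked failure instead of silently modifying the backing data.
  @Override public void remove() {
    throw new UnsupportedOperationException("read-only iterator");
  }
}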
(Domain) ParseException 37
              
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate( String dateValue, Collection<String> dateFormats, Date startDate ) throws ParseException {
  if (dateValue == null) {
    throw new IllegalArgumentException("dateValue is null");
  }
  if (dateFormats == null) {
    dateFormats = DEFAULT_HTTP_CLIENT_PATTERNS;
  }
  if (startDate == null) {
    startDate = DEFAULT_TWO_DIGIT_YEAR_START;
  }
  // trim single quotes around date if present
  // see issue #5279
  if (dateValue.length() > 1 && dateValue.startsWith("'") && dateValue.endsWith("'") ) {
    dateValue = dateValue.substring(1, dateValue.length() - 1);
  }
  SimpleDateFormat dateParser = null;
  Iterator formatIter = dateFormats.iterator();
  while (formatIter.hasNext()) {
    String format = (String) formatIter.next();
    if (dateParser == null) {
      dateParser = new SimpleDateFormat(format, Locale.US);
      dateParser.setTimeZone(GMT);
      dateParser.set2DigitYearStart(startDate);
    } else {
      dateParser.applyPattern(format);
    }
    try {
      return dateParser.parse(dateValue);
    } catch (ParseException pe) {
      // ignore this exception, we will try the next format
    }
  }
  // we were unable to parse the date
  throw new ParseException("Unable to parse the date " + dateValue, 0);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
private Number parseNumber(String val, NumberFormat numFormat) throws ParseException { ParsePosition parsePos = new ParsePosition(0); Number num = numFormat.parse(val, parsePos); if (parsePos.getIndex() != val.length()) { throw new ParseException("illegal number format", parsePos.getIndex()); } return num; }
// in core/src/java/org/apache/solr/search/QParser.java
private void checkRecurse() throws ParseException { if (recurseCount++ >= 100) { throw new ParseException("Infinite Recursion detected parsing query '" + qstr + "'"); } }
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
@Override
public Query parse() throws org.apache.lucene.queryparser.classic.ParseException {
  SrndQuery sq;
  String qstr = getString();
  if (qstr == null) return null;
  String mbqparam = getParam(MBQParam);
  if (mbqparam == null) {
    this.maxBasicQueries = DEFMAXBASICQUERIES;
  } else {
    try {
      this.maxBasicQueries = Integer.parseInt(mbqparam);
    } catch (Exception e) {
      LOG.warn("Couldn't parse maxBasicQueries value " + mbqparam +", using default of 1000");
      this.maxBasicQueries = DEFMAXBASICQUERIES;
    }
  }
  // ugh .. colliding ParseExceptions
  try {
    sq = org.apache.lucene.queryparser.surround.parser.QueryParser
        .parse(qstr);
  } catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) {
    throw new org.apache.lucene.queryparser.classic.ParseException(
        pe.getMessage());
  }
  // so what do we do with the SrndQuery ??
  // processing based on example in LIA Ch 9
  String defaultField = getParam(CommonParams.DF);
  if (defaultField == null) {
    defaultField = getReq().getSchema().getDefaultSearchFieldName();
  }
  BasicQueryFactory bqFactory = new BasicQueryFactory(this.maxBasicQueries);
  Query lquery = sq.makeLuceneQueryField(defaultField, bqFactory);
  return lquery;
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private void validateCyclicAliasing(String field) throws ParseException { Set<String> set = new HashSet<String>(); set.add(field); if(validateField(field, set)) { throw new ParseException("Field aliases lead to a cycle"); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static int parseLocalParams(String txt, int start, Map<String, String> target, SolrParams params, String startString, char endChar) throws ParseException {
  int off = start;
  if (!txt.startsWith(startString, off)) return start;
  StrParser p = new StrParser(txt, start, txt.length());
  p.pos += startString.length(); // skip over "{!"
  for (; ;) {
    /*
    if (p.pos>=txt.length()) {
      throw new ParseException("Missing '}' parsing local params '" + txt + '"');
    }
    */
    char ch = p.peek();
    if (ch == endChar) {
      return p.pos + 1;
    }
    String id = p.getId();
    if (id.length() == 0) {
      throw new ParseException("Expected ending character '" + endChar + "' parsing local params '" + txt + '"');
    }
    String val = null;
    ch = p.peek();
    if (ch != '=') {
      // single word... treat {!func} as type=func for easy lookup
      val = id;
      id = TYPE;
    } else {
      // saw equals, so read value
      p.pos++;
      ch = p.peek();
      boolean deref = false;
      if (ch == '$') {
        p.pos++;
        ch = p.peek();
        deref = true; // dereference whatever value is read by treating it as a variable name
      }
      if (ch == '\"' || ch == '\'') {
        val = p.getQuotedString();
      } else {
        // read unquoted literal ended by whitespace or endChar (normally '}')
        // there is no escaping.
        int valStart = p.pos;
        for (; ;) {
          if (p.pos >= p.end) {
            throw new ParseException("Missing end to unquoted value starting at " + valStart + " str='" + txt + "'");
          }
          char c = p.val.charAt(p.pos);
          if (c == endChar || Character.isWhitespace(c)) {
            val = p.val.substring(valStart, p.pos);
            break;
          }
          p.pos++;
        }
      }
      if (deref) { // dereference parameter
        if (params != null) {
          val = params.get(val);
        }
      }
    }
    if (target != null) target.put(id, val);
  }
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
void expect(String s) throws ParseException { eatws(); int slen = s.length(); if (val.regionMatches(pos, s, 0, slen)) { pos += slen; } else { throw new ParseException("Expected '" + s + "' at position " + pos + " in '" + val + "'"); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getId(String errMessage) throws ParseException {
  eatws();
  int id_start = pos;
  char ch;
  if (pos < end && (ch = val.charAt(pos)) != '$' && Character.isJavaIdentifierStart(ch)) {
    pos++;
    while (pos < end) {
      ch = val.charAt(pos);
      // if (!Character.isJavaIdentifierPart(ch) && ch != '.' && ch != ':') {
      if (!Character.isJavaIdentifierPart(ch) && ch != '.') {
        break;
      }
      pos++;
    }
    return val.substring(id_start, pos);
  }
  if (errMessage != null) {
    throw new ParseException(errMessage + " at pos " + pos + " str='" + val + "'");
  }
  return null;
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
public String getGlobbedId(String errMessage) throws ParseException { eatws(); int id_start = pos; char ch; if (pos < end && (ch = val.charAt(pos)) != '$' && (Character.isJavaIdentifierStart(ch) || ch=='?' || ch=='*')) { pos++; while (pos < end) { ch = val.charAt(pos); if (!(Character.isJavaIdentifierPart(ch) || ch=='?' || ch=='*') && ch != '.') { break; } pos++; } return val.substring(id_start, pos); } if (errMessage != null) { throw new ParseException(errMessage + " at pos " + pos + " str='" + val + "'"); } return null; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getQuotedString() throws ParseException {
  eatws();
  char delim = peekChar();
  if (!(delim == '\"' || delim == '\'')) {
    return null;
  }
  int val_start = ++pos;
  StringBuilder sb = new StringBuilder(); // needed for escaping
  for (; ;) {
    if (pos >= end) {
      throw new ParseException("Missing end quote for string at pos " + (val_start - 1) + " str='" + val + "'");
    }
    char ch = val.charAt(pos);
    if (ch == '\\') {
      pos++;
      if (pos >= end) break;
      ch = val.charAt(pos);
      switch (ch) {
        case 'n': ch = '\n'; break;
        case 't': ch = '\t'; break;
        case 'r': ch = '\r'; break;
        case 'b': ch = '\b'; break;
        case 'f': ch = '\f'; break;
        case 'u':
          if (pos + 4 >= end) {
            throw new ParseException("bad unicode escape \\uxxxx at pos" + (val_start - 1) + " str='" + val + "'");
          }
          ch = (char) Integer.parseInt(val.substring(pos + 1, pos + 5), 16);
          pos += 4;
          break;
      }
    } else if (ch == delim) {
      pos++; // skip over the quote
      break;
    }
    sb.append(ch);
    pos++;
  }
  return sb.toString();
}
// in core/src/java/org/apache/solr/search/LuceneQParserPlugin.java
@Override
public Query parse() throws ParseException {
  // handle legacy "query;sort" syntax
  if (getLocalParams() == null) {
    String qstr = getString();
    if (qstr == null || qstr.length() == 0) return null;
    sortStr = getParams().get(CommonParams.SORT);
    if (sortStr == null) {
      // sort may be legacy form, included in the query string
      List<String> commands = StrUtils.splitSmart(qstr,';');
      if (commands.size() == 2) {
        qstr = commands.get(0);
        sortStr = commands.get(1);
      } else if (commands.size() == 1) {
        // This is need to support the case where someone sends: "q=query;"
        qstr = commands.get(0);
      } else if (commands.size() > 2) {
        throw new ParseException("If you want to use multiple ';' in the query, use the 'sort' param.");
      }
    }
    setString(qstr);
  }
  return super.parse();
}
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
@Override
public ValueSource parse(FunctionQParser fp) throws ParseException {
  // TODO: dispatch through SpatialQueriable in the future?
  List<ValueSource> sources = fp.parseValueSourceList();
  // "m" is a multi-value source, "x" is a single-value source
  // allow (m,m) (m,x,x) (x,x,m) (x,x,x,x)
  // if not enough points are present, "pt" will be checked first, followed by "sfield".
  MultiValueSource mv1 = null;
  MultiValueSource mv2 = null;
  if (sources.size() == 0) {
    // nothing to do now
  } else if (sources.size() == 1) {
    ValueSource vs = sources.get(0);
    if (!(vs instanceof MultiValueSource)) {
      throw new ParseException("geodist - invalid parameters:" + sources);
    }
    mv1 = (MultiValueSource)vs;
  } else if (sources.size() == 2) {
    ValueSource vs1 = sources.get(0);
    ValueSource vs2 = sources.get(1);
    if (vs1 instanceof MultiValueSource && vs2 instanceof MultiValueSource) {
      mv1 = (MultiValueSource)vs1;
      mv2 = (MultiValueSource)vs2;
    } else {
      mv1 = makeMV(sources, sources);
    }
  } else if (sources.size()==3) {
    ValueSource vs1 = sources.get(0);
    ValueSource vs2 = sources.get(1);
    if (vs1 instanceof MultiValueSource) {
      // (m,x,x)
      mv1 = (MultiValueSource)vs1;
      mv2 = makeMV(sources.subList(1,3), sources);
    } else {
      // (x,x,m)
      mv1 = makeMV(sources.subList(0,2), sources);
      vs1 = sources.get(2);
      if (!(vs1 instanceof MultiValueSource)) {
        throw new ParseException("geodist - invalid parameters:" + sources);
      }
      mv2 = (MultiValueSource)vs1;
    }
  } else if (sources.size()==4) {
    mv1 = makeMV(sources.subList(0,2), sources);
    mv2 = makeMV(sources.subList(2,4), sources);
  } else if (sources.size() > 4) {
    throw new ParseException("geodist - invalid parameters:" + sources);
  }
  if (mv1 == null) {
    mv1 = parsePoint(fp);
    mv2 = parseSfield(fp);
  } else if (mv2 == null) {
    mv2 = parsePoint(fp);
    if (mv2 == null) mv2 = parseSfield(fp);
  }
  if (mv1 == null || mv2 == null) {
    throw new ParseException("geodist - not enough parameters:" + sources);
  }
  // We have all the parameters at this point, now check if one of the points is constant
  double[] constants;
  constants = getConstants(mv1);
  MultiValueSource other = mv2;
  if (constants == null) {
    constants = getConstants(mv2);
    other = mv1;
  }
  if (constants != null && other instanceof VectorValueSource) {
    return new HaversineConstFunction(constants[0], constants[1], (VectorValueSource)other);
  }
  return new HaversineFunction(mv1, mv2, DistanceUtils.EARTH_MEAN_RADIUS_KM, true);
}
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static VectorValueSource makeMV(List<ValueSource> sources, List<ValueSource> orig) throws ParseException { ValueSource vs1 = sources.get(0); ValueSource vs2 = sources.get(1); if (vs1 instanceof MultiValueSource || vs2 instanceof MultiValueSource) { throw new ParseException("geodist - invalid parameters:" + orig); } return new VectorValueSource(sources); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static MultiValueSource parsePoint(FunctionQParser fp) throws ParseException { String pt = fp.getParam(SpatialParams.POINT); if (pt == null) return null; double[] point = null; try { point = ParseUtils.parseLatitudeLongitude(pt); } catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); } return new VectorValueSource(Arrays.<ValueSource>asList(new DoubleConstValueSource(point[0]),new DoubleConstValueSource(point[1]))); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static MultiValueSource parseSfield(FunctionQParser fp) throws ParseException { String sfield = fp.getParam(SpatialParams.FIELD); if (sfield == null) return null; SchemaField sf = fp.getReq().getSchema().getField(sfield); ValueSource vs = sf.getType().getValueSource(sf, fp); if (!(vs instanceof MultiValueSource)) { throw new ParseException("Spatial field must implement MultiValueSource:" + sf); } return (MultiValueSource)vs; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
@Override
public Query parse() throws ParseException {
  sp = new QueryParsing.StrParser(getString());
  ValueSource vs = null;
  List<ValueSource> lst = null;
  for(;;) {
    ValueSource valsource = parseValueSource(false);
    sp.eatws();
    if (!parseMultipleSources) {
      vs = valsource;
      break;
    } else {
      if (lst != null) {
        lst.add(valsource);
      } else {
        vs = valsource;
      }
    }
    // check if there is a "," separator
    if (sp.peek() != ',') break;
    consumeArgumentDelimiter();
    if (lst == null) {
      lst = new ArrayList<ValueSource>(2);
      lst.add(valsource);
    }
  }
  if (parseToEnd && sp.pos < sp.end) {
    throw new ParseException("Unexpected text after function: " + sp.val.substring(sp.pos, sp.end));
  }
  if (lst != null) {
    vs = new VectorValueSource(lst);
  }
  return new FunctionQuery(vs);
}
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public String parseId() throws ParseException { String value = parseArg(); if (argWasQuoted) throw new ParseException("Expected identifier instead of quoted string:" + value); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public Float parseFloat() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected float instead of quoted string:" + str); float value = Float.parseFloat(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public double parseDouble() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected double instead of quoted string:" + str); double value = Double.parseDouble(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public int parseInt() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected double instead of quoted string:" + str); int value = Integer.parseInt(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public String parseArg() throws ParseException {
  argWasQuoted = false;
  sp.eatws();
  char ch = sp.peek();
  String val = null;
  switch (ch) {
    case ')': return null;
    case '$':
      sp.pos++;
      String param = sp.getId();
      val = getParam(param);
      break;
    case '\'':
    case '"':
      val = sp.getQuotedString();
      argWasQuoted = true;
      break;
    default:
      // read unquoted literal ended by whitespace ',' or ')'
      // there is no escaping.
      int valStart = sp.pos;
      for (;;) {
        if (sp.pos >= sp.end) {
          throw new ParseException("Missing end to unquoted value starting at " + valStart + " str='" + sp.val +"'");
        }
        char c = sp.val.charAt(sp.pos);
        if (c==')' || c==',' || Character.isWhitespace(c)) {
          val = sp.val.substring(valStart, sp.pos);
          break;
        }
        sp.pos++;
      }
  }
  sp.eatws();
  consumeArgumentDelimiter();
  return val;
}
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public Query parseNestedQuery() throws ParseException {
  Query nestedQuery;
  if (sp.opt("$")) {
    String param = sp.getId();
    String qstr = getParam(param);
    qstr = qstr==null ? "" : qstr;
    nestedQuery = subQuery(qstr, null).getQuery();
  } else {
    int start = sp.pos;
    String v = sp.val;
    String qs = v;
    HashMap nestedLocalParams = new HashMap<String,String>();
    int end = QueryParsing.parseLocalParams(qs, start, nestedLocalParams, getParams());
    QParser sub;
    if (end>start) {
      if (nestedLocalParams.get(QueryParsing.V) != null) {
        // value specified directly in local params... so the end of the
        // query should be the end of the local params.
        sub = subQuery(qs.substring(start, end), null);
      } else {
        // value here is *after* the local params... ask the parser.
        sub = subQuery(qs, null);
        // int subEnd = sub.findEnd(')');
        // TODO.. implement functions to find the end of a nested query
        throw new ParseException("Nested local params must have value in v parameter. got '" + qs + "'");
      }
    } else {
      throw new ParseException("Nested function query must use $param or {!v=value} forms. got '" + qs + "'");
    }
    sp.pos += end-start; // advance past nested query
    nestedQuery = sub.getQuery();
  }
  consumeArgumentDelimiter();
  return nestedQuery;
}
// in core/src/java/org/apache/solr/search/FunctionQParser.java
protected ValueSource parseValueSource(boolean doConsumeDelimiter) throws ParseException {
  ValueSource valueSource;
  int ch = sp.peek();
  if (ch>='0' && ch<='9' || ch=='.' || ch=='+' || ch=='-') {
    Number num = sp.getNumber();
    if (num instanceof Long) {
      valueSource = new LongConstValueSource(num.longValue());
    } else if (num instanceof Double) {
      valueSource = new DoubleConstValueSource(num.doubleValue());
    } else {
      // shouldn't happen
      valueSource = new ConstValueSource(num.floatValue());
    }
  } else if (ch == '"' || ch == '\''){
    valueSource = new LiteralValueSource(sp.getQuotedString());
  } else if (ch == '$') {
    sp.pos++;
    String param = sp.getId();
    String val = getParam(param);
    if (val == null) {
      throw new ParseException("Missing param " + param + " while parsing function '" + sp.val + "'");
    }
    QParser subParser = subQuery(val, "func");
    if (subParser instanceof FunctionQParser) {
      ((FunctionQParser)subParser).setParseMultipleSources(true);
    }
    Query subQuery = subParser.getQuery();
    if (subQuery instanceof FunctionQuery) {
      valueSource = ((FunctionQuery) subQuery).getValueSource();
    } else {
      valueSource = new QueryValueSource(subQuery, 0.0f);
    }
    /***
    // dereference *simple* argument (i.e., can't currently be a function)
    // In the future we could support full function dereferencing via a stack of ValueSource (or StringParser) objects
    ch = val.length()==0 ? '\0' : val.charAt(0);
    if (ch>='0' && ch<='9' || ch=='.' || ch=='+' || ch=='-') {
      QueryParsing.StrParser sp = new QueryParsing.StrParser(val);
      Number num = sp.getNumber();
      if (num instanceof Long) {
        valueSource = new LongConstValueSource(num.longValue());
      } else if (num instanceof Double) {
        valueSource = new DoubleConstValueSource(num.doubleValue());
      } else {
        // shouldn't happen
        valueSource = new ConstValueSource(num.floatValue());
      }
    } else if (ch == '"' || ch == '\'') {
      QueryParsing.StrParser sp = new QueryParsing.StrParser(val);
      val = sp.getQuotedString();
      valueSource = new LiteralValueSource(val);
    } else {
      if (val.length()==0) {
        valueSource = new LiteralValueSource(val);
      } else {
        String id = val;
        SchemaField f = req.getSchema().getField(id);
        valueSource = f.getType().getValueSource(f, this);
      }
    }
    ***/
  } else {
    String id = sp.getId();
    if (sp.opt("(")) {
      // a function... look it up.
      ValueSourceParser argParser = req.getCore().getValueSourceParser(id);
      if (argParser==null) {
        throw new ParseException("Unknown function " + id + " in FunctionQuery(" + sp + ")");
      }
      valueSource = argParser.parse(this);
      sp.expect(")");
    } else {
      if ("true".equals(id)) {
        valueSource = new BoolConstValueSource(true);
      } else if ("false".equals(id)) {
        valueSource = new BoolConstValueSource(false);
      } else {
        SchemaField f = req.getSchema().getField(id);
        valueSource = f.getType().getValueSource(f, this);
      }
    }
  }
  if (doConsumeDelimiter) consumeArgumentDelimiter();
  return valueSource;
}
// in core/src/java/org/apache/solr/util/DateMathParser.java
public Date parseMath(String math) throws ParseException { Calendar cal = Calendar.getInstance(zone, loc); cal.setTime(getNow()); /* check for No-Op */ if (0==math.length()) { return cal.getTime(); } String[] ops = splitter.split(math); int pos = 0; while ( pos < ops.length ) { if (1 != ops[pos].length()) { throw new ParseException ("Multi character command found: \"" + ops[pos] + "\"", pos); } char command = ops[pos++].charAt(0); switch (command) { case '/': if (ops.length < pos + 1) { throw new ParseException ("Need a unit after command: \"" + command + "\"", pos); } try { round(cal, ops[pos++]); } catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); } break; case '+': /* fall through */ case '-': if (ops.length < pos + 2) { throw new ParseException ("Need a value and unit for command: \"" + command + "\"", pos); } int val = 0; try { val = Integer.valueOf(ops[pos++]); } catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); } if ('-' == command) { val = 0 - val; } try { String unit = ops[pos++]; add(cal, val, unit); } catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); } break; default: throw new ParseException ("Unrecognized command: \"" + command + "\"", pos-1); } } return cal.getTime(); }
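parseMath above implements a small command language over the split ops: '/' rounds down to a unit, '+' and '-' add or subtract a value of some unit, and every malformed token is translated into a ParseException recording the offending position. A usage sketch built only from calls visible in the snippets above; the date-math string itself is an assumed example:

// Usage sketch for DateMathParser, using only the API shown above.
static java.util.Date sixMonthsAndThreeDaysFromNow() throws java.text.ParseException {
  DateMathParser dmp = new DateMathParser();   // default zone and locale
  dmp.setNow(new java.util.Date());            // anchor the computation at "now"
  // "+6MONTHS+3DAYS/DAY": add six months and three days, then round down to the day.
  return dmp.parseMath("+6MONTHS+3DAYS/DAY");
}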
Thrown from within a catch block: 5 times.
              
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
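Each of the five catch blocks above performs the same translation: a lower-level exception (the colliding surround-parser ParseException, InvalidShapeException, IllegalArgumentException, NumberFormatException) is caught and rethrown as the parser's own ParseException, carrying a message and, where available, a position, but not the original exception as a cause (java.text.ParseException has no cause-taking constructor). A minimal sketch of the translation pattern; the method is illustrative, not taken from the Solr sources:

// Illustrative exception translation; not from the Solr sources.
static int parseCount(String token, int pos) throws java.text.ParseException {
  try {
    return Integer.parseInt(token);
  } catch (NumberFormatException e) {
    // Re-express the low-level failure in the parser's own exception type,
    // keeping the offending token and its position for the caller.
    throw new java.text.ParseException("Not a Number: \"" + token + "\"", pos);
  }
}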
Declared in a method's throws clause: 161 times.
              
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate(String d) throws ParseException { return parseDate(d, DEFAULT_DATE_FORMATS); }
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate(String d, Collection<String> fmts) throws ParseException {
  // 2007-04-26T08:05:04Z
  if (d.endsWith("Z") && d.length() > 20) {
    return getThreadLocalDateFormat().parse(d);
  }
  return parseDate(d, fmts, null);
}
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate( String dateValue, Collection<String> dateFormats, Date startDate ) throws ParseException {
  if (dateValue == null) {
    throw new IllegalArgumentException("dateValue is null");
  }
  if (dateFormats == null) {
    dateFormats = DEFAULT_HTTP_CLIENT_PATTERNS;
  }
  if (startDate == null) {
    startDate = DEFAULT_TWO_DIGIT_YEAR_START;
  }
  // trim single quotes around date if present
  // see issue #5279
  if (dateValue.length() > 1 && dateValue.startsWith("'") && dateValue.endsWith("'") ) {
    dateValue = dateValue.substring(1, dateValue.length() - 1);
  }
  SimpleDateFormat dateParser = null;
  Iterator formatIter = dateFormats.iterator();
  while (formatIter.hasNext()) {
    String format = (String) formatIter.next();
    if (dateParser == null) {
      dateParser = new SimpleDateFormat(format, Locale.US);
      dateParser.setTimeZone(GMT);
      dateParser.set2DigitYearStart(startDate);
    } else {
      dateParser.applyPattern(format);
    }
    try {
      return dateParser.parse(dateValue);
    } catch (ParseException pe) {
      // ignore this exception, we will try the next format
    }
  }
  // we were unable to parse the date
  throw new ParseException("Unable to parse the date " + dateValue, 0);
}
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
@Deprecated public static Date parseDate( String d ) throws ParseException { return DateUtil.parseDate(d); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DateFormatTransformer.java
private Date process(Object value, String format, Locale locale) throws ParseException { if (value == null) return null; String strVal = value.toString().trim(); if (strVal.length() == 0) return null; SimpleDateFormat fmt = fmtCache.get(format); if (fmt == null) { fmt = new SimpleDateFormat(format, locale); fmtCache.put(format, fmt); } return fmt.parse(strVal); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
private Number process(String val, String style, Locale locale) throws ParseException { if (INTEGER.equals(style)) { return parseNumber(val, NumberFormat.getIntegerInstance(locale)); } else if (NUMBER.equals(style)) { return parseNumber(val, NumberFormat.getNumberInstance(locale)); } else if (CURRENCY.equals(style)) { return parseNumber(val, NumberFormat.getCurrencyInstance(locale)); } else if (PERCENT.equals(style)) { return parseNumber(val, NumberFormat.getPercentInstance(locale)); } return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
private Number parseNumber(String val, NumberFormat numFormat) throws ParseException { ParsePosition parsePos = new ParsePosition(0); Number num = numFormat.parse(val, parsePos); if (parsePos.getIndex() != val.length()) { throw new ParseException("illegal number format", parsePos.getIndex()); } return num; }
// in core/src/java/org/apache/solr/handler/component/SearchHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception, ParseException, InstantiationException, IllegalAccessException {
  // int sleep = req.getParams().getInt("sleep",0);
  // if (sleep > 0) {log.error("SLEEPING for " + sleep); Thread.sleep(sleep);}
  ResponseBuilder rb = new ResponseBuilder(req, rsp, components);
  if (rb.requestInfo != null) {
    rb.requestInfo.setResponseBuilder(rb);
  }
  boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false);
  rb.setDebug(dbg);
  if (dbg == false){//if it's true, we are doing everything anyway.
    SolrPluginUtils.getDebugInterests(req.getParams().getParams(CommonParams.DEBUG), rb);
  }
  final RTimer timer = rb.isDebug() ? new RTimer() : null;
  ShardHandler shardHandler1 = shardHandlerFactory.getShardHandler();
  shardHandler1.checkDistributed(rb);
  if (timer == null) {
    // non-debugging prepare phase
    for( SearchComponent c : components ) {
      c.prepare(rb);
    }
  } else {
    // debugging prepare phase
    RTimer subt = timer.sub( "prepare" );
    for( SearchComponent c : components ) {
      rb.setTimer( subt.sub( c.getName() ) );
      c.prepare(rb);
      rb.getTimer().stop();
    }
    subt.stop();
  }
  if (!rb.isDistrib) {
    // a normal non-distributed request
    // The semantics of debugging vs not debugging are different enough that
    // it makes sense to have two control loops
    if(!rb.isDebug()) {
      // Process
      for( SearchComponent c : components ) {
        c.process(rb);
      }
    } else {
      // Process
      RTimer subt = timer.sub( "process" );
      for( SearchComponent c : components ) {
        rb.setTimer( subt.sub( c.getName() ) );
        c.process(rb);
        rb.getTimer().stop();
      }
      subt.stop();
      timer.stop();
      // add the timing info
      if (rb.isDebugTimings()) {
        rb.addDebugInfo("timing", timer.asNamedList() );
      }
    }
  } else {
    // a distributed request
    if (rb.outgoing == null) {
      rb.outgoing = new LinkedList<ShardRequest>();
    }
    rb.finished = new ArrayList<ShardRequest>();
    int nextStage = 0;
    do {
      rb.stage = nextStage;
      nextStage = ResponseBuilder.STAGE_DONE;
      // call all components
      for( SearchComponent c : components ) {
        // the next stage is the minimum of what all components report
        nextStage = Math.min(nextStage, c.distributedProcess(rb));
      }
      // check the outgoing queue and send requests
      while (rb.outgoing.size() > 0) {
        // submit all current request tasks at once
        while (rb.outgoing.size() > 0) {
          ShardRequest sreq = rb.outgoing.remove(0);
          sreq.actualShards = sreq.shards;
          if (sreq.actualShards==ShardRequest.ALL_SHARDS) {
            sreq.actualShards = rb.shards;
          }
          sreq.responses = new ArrayList<ShardResponse>();
          // TODO: map from shard to address[]
          for (String shard : sreq.actualShards) {
            ModifiableSolrParams params = new ModifiableSolrParams(sreq.params);
            params.remove(ShardParams.SHARDS);      // not a top-level request
            params.set("distrib", "false");         // not a top-level request
            params.remove("indent");
            params.remove(CommonParams.HEADER_ECHO_PARAMS);
            params.set(ShardParams.IS_SHARD, true); // a sub (shard) request
            params.set(ShardParams.SHARD_URL, shard); // so the shard knows what was asked
            if (rb.requestInfo != null) {
              // we could try and detect when this is needed, but it could be tricky
              params.set("NOW", Long.toString(rb.requestInfo.getNOW().getTime()));
            }
            String shardQt = params.get(ShardParams.SHARDS_QT);
            if (shardQt == null) {
              params.remove(CommonParams.QT);
            } else {
              params.set(CommonParams.QT, shardQt);
            }
            shardHandler1.submit(sreq, shard, params);
          }
        }
        // now wait for replies, but if anyone puts more requests on
        // the outgoing queue, send them out immediately (by exiting
        // this loop)
        boolean tolerant = rb.req.getParams().getBool(ShardParams.SHARDS_TOLERANT, false);
        while (rb.outgoing.size() == 0) {
          ShardResponse srsp = tolerant ?
              shardHandler1.takeCompletedIncludingErrors():
              shardHandler1.takeCompletedOrError();
          if (srsp == null) break; // no more requests to wait for
          // Was there an exception?
          if (srsp.getException() != null) {
            // If things are not tolerant, abort everything and rethrow
            if(!tolerant) {
              shardHandler1.cancelAll();
              if (srsp.getException() instanceof SolrException) {
                throw (SolrException)srsp.getException();
              } else {
                throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, srsp.getException());
              }
            }
          }
          rb.finished.add(srsp.getShardRequest());
          // let the components see the responses to the request
          for(SearchComponent c : components) {
            c.handleResponses(rb, srsp.getShardRequest());
          }
        }
      }
      for(SearchComponent c : components) {
        c.finishStage(rb);
      }
      // we are done when the next stage is MAX_VALUE
    } while (nextStage != Integer.MAX_VALUE);
  }
}
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadSolrSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); SolrSynonymParser parser = new SolrSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadWordnetSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); WordnetSynonymParser parser = new WordnetSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void parseParams(String type, String param) throws ParseException, IOException {
  localParams = QueryParsing.getLocalParams(param, req.getParams());
  base = docs;
  facetValue = param;
  key = param;
  threads = -1;
  if (localParams == null) return;
  // remove local params unless it's a query
  if (type != FacetParams.FACET_QUERY) { // TODO Cut over to an Enum here
    facetValue = localParams.get(CommonParams.VALUE);
  }
  // reset set the default key now that localParams have been removed
  key = facetValue;
  // allow explicit set of the key
  key = localParams.get(CommonParams.OUTPUT_KEY, key);
  String threadStr = localParams.get(CommonParams.THREADS);
  if (threadStr != null) {
    threads = Integer.parseInt(threadStr);
  }
  // figure out if we need a new base DocSet
  String excludeStr = localParams.get(CommonParams.EXCLUDE);
  if (excludeStr == null) return;
  Map<?,?> tagMap = (Map<?,?>)req.getContext().get("tags");
  if (tagMap != null && rb != null) {
    List<String> excludeTagList = StrUtils.splitSmart(excludeStr,',');
    IdentityHashMap<Query,Boolean> excludeSet = new IdentityHashMap<Query,Boolean>();
    for (String excludeTag : excludeTagList) {
      Object olst = tagMap.get(excludeTag);
      // tagMap has entries of List<String,List<QParser>>, but subject to change in the future
      if (!(olst instanceof Collection)) continue;
      for (Object o : (Collection<?>)olst) {
        if (!(o instanceof QParser)) continue;
        QParser qp = (QParser)o;
        excludeSet.put(qp.getQuery(), Boolean.TRUE);
      }
    }
    if (excludeSet.size() == 0) return;
    List<Query> qlist = new ArrayList<Query>();
    // add the base query
    if (!excludeSet.containsKey(rb.getQuery())) {
      qlist.add(rb.getQuery());
    }
    // add the filters
    if (rb.getFilters() != null) {
      for (Query q : rb.getFilters()) {
        if (!excludeSet.containsKey(q)) {
          qlist.add(q);
        }
      }
    }
    // get the new base docset for this facet
    DocSet base = searcher.getDocSet(qlist);
    if (rb.grouping() && rb.getGroupingSpec().isTruncateGroups()) {
      Grouping grouping = new Grouping(searcher, null, rb.getQueryCommand(), false, 0, false);
      if (rb.getGroupingSpec().getFields().length > 0) {
        grouping.addFieldCommand(rb.getGroupingSpec().getFields()[0], req);
      } else if (rb.getGroupingSpec().getFunctions().length > 0) {
        grouping.addFunctionCommand(rb.getGroupingSpec().getFunctions()[0], req);
      } else {
        this.base = base;
        return;
      }
      AbstractAllGroupHeadsCollector allGroupHeadsCollector = grouping.getCommands().get(0).createAllGroupCollector();
      searcher.search(new MatchAllDocsQuery(), base.getTopFilter(), allGroupHeadsCollector);
      int maxDoc = searcher.maxDoc();
      FixedBitSet fixedBitSet = allGroupHeadsCollector.retrieveGroupHeads(maxDoc);
      long[] bits = fixedBitSet.getBits();
      this.base = new BitDocSet(new OpenBitSet(bits, bits.length));
    } else {
      this.base = base;
    }
  }
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getFacetQueryCounts() throws IOException,ParseException {
  NamedList<Integer> res = new SimpleOrderedMap<Integer>();
  /* Ignore CommonParams.DF - could have init param facet.query assuming
   * the schema default with query param DF intented to only affect Q.
   * If user doesn't want schema default for facet.query, they should be
   * explicit.
   */
  // SolrQueryParser qp = searcher.getSchema().getSolrQueryParser(null);
  String[] facetQs = params.getParams(FacetParams.FACET_QUERY);
  if (null != facetQs && 0 != facetQs.length) {
    for (String q : facetQs) {
      parseParams(FacetParams.FACET_QUERY, q);
      // TODO: slight optimization would prevent double-parsing of any localParams
      Query qobj = QParser.getParser(q, null, req).getQuery();
      res.add(key, searcher.numDocs(qobj, base));
    }
  }
  return res;
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetFieldCounts() throws IOException, ParseException { NamedList<Object> res = new SimpleOrderedMap<Object>(); String[] facetFs = params.getParams(FacetParams.FACET_FIELD); if (null != facetFs) { for (String f : facetFs) { parseParams(FacetParams.FACET_FIELD, f); String termList = localParams == null ? null : localParams.get(CommonParams.TERMS); if (termList != null) { res.add(key, getListedTermCounts(facetValue, termList)); } else { res.add(key, getTermCounts(facetValue)); } } } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated public NamedList<Object> getFacetDateCounts() throws IOException, ParseException { final NamedList<Object> resOuter = new SimpleOrderedMap<Object>(); final String[] fields = params.getParams(FacetParams.FACET_DATE); if (null == fields || 0 == fields.length) return resOuter; for (String f : fields) { getFacetDateCounts(f, resOuter); } return resOuter; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated
public void getFacetDateCounts(String dateFacet, NamedList<Object> resOuter) throws IOException, ParseException {
  final IndexSchema schema = searcher.getSchema();
  parseParams(FacetParams.FACET_DATE, dateFacet);
  String f = facetValue;
  final NamedList<Object> resInner = new SimpleOrderedMap<Object>();
  resOuter.add(key, resInner);
  final SchemaField sf = schema.getField(f);
  if (! (sf.getType() instanceof DateField)) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can not date facet on a field which is not a DateField: " + f);
  }
  final DateField ft = (DateField) sf.getType();
  final String startS = required.getFieldParam(f,FacetParams.FACET_DATE_START);
  final Date start;
  try {
    start = ft.parseMath(null, startS);
  } catch (SolrException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e);
  }
  final String endS = required.getFieldParam(f,FacetParams.FACET_DATE_END);
  Date end; // not final, hardend may change this
  try {
    end = ft.parseMath(null, endS);
  } catch (SolrException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e);
  }
  if (end.before(start)) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' comes before 'start': "+endS+" < "+startS);
  }
  final String gap = required.getFieldParam(f,FacetParams.FACET_DATE_GAP);
  final DateMathParser dmp = new DateMathParser();
  final int minCount = params.getFieldInt(f,FacetParams.FACET_MINCOUNT, 0);
  String[] iStrs = params.getFieldParams(f,FacetParams.FACET_DATE_INCLUDE);
  // Legacy support for default of [lower,upper,edge] for date faceting
  // this is not handled by FacetRangeInclude.parseParam because
  // range faceting has differnet defaults
  final EnumSet<FacetRangeInclude> include =
    (null == iStrs || 0 == iStrs.length)
    ? EnumSet.of(FacetRangeInclude.LOWER, FacetRangeInclude.UPPER, FacetRangeInclude.EDGE)
    : FacetRangeInclude.parseParam(iStrs);
  try {
    Date low = start;
    while (low.before(end)) {
      dmp.setNow(low);
      String label = ft.toExternal(low);
      Date high = dmp.parseMath(gap);
      if (end.before(high)) {
        if (params.getFieldBool(f,FacetParams.FACET_DATE_HARD_END,false)) {
          high = end;
        } else {
          end = high;
        }
      }
      if (high.before(low)) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "date facet infinite loop (is gap negative?)");
      }
      final boolean includeLower =
        (include.contains(FacetRangeInclude.LOWER) ||
         (include.contains(FacetRangeInclude.EDGE) && low.equals(start)));
      final boolean includeUpper =
        (include.contains(FacetRangeInclude.UPPER) ||
         (include.contains(FacetRangeInclude.EDGE) && high.equals(end)));
      final int count = rangeCount(sf,low,high,includeLower,includeUpper);
      if (count >= minCount) {
        resInner.add(label, count);
      }
      low = high;
    }
  } catch (java.text.ParseException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e);
  }
  // explicitly return the gap and end so all the counts
  // (including before/after/between) are meaningful - even if mincount
  // has removed the neighboring ranges
  resInner.add("gap", gap);
  resInner.add("start", start);
  resInner.add("end", end);
  final String[] othersP = params.getFieldParams(f,FacetParams.FACET_DATE_OTHER);
  if (null != othersP && 0 < othersP.length) {
    final Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class);
    for (final String o : othersP) {
      others.add(FacetRangeOther.get(o));
    }
    // no matter what other values are listed, we don't do
    // anything if "none" is specified.
    if (! others.contains(FacetRangeOther.NONE)) {
      boolean all = others.contains(FacetRangeOther.ALL);
      if (all || others.contains(FacetRangeOther.BEFORE)) {
        // include upper bound if "outer" or if first gap doesn't already include it
        resInner.add(FacetRangeOther.BEFORE.toString(),
                     rangeCount(sf,null,start, false,
                                (include.contains(FacetRangeInclude.OUTER) ||
                                 (! (include.contains(FacetRangeInclude.LOWER) ||
                                     include.contains(FacetRangeInclude.EDGE))))));
      }
      if (all || others.contains(FacetRangeOther.AFTER)) {
        // include lower bound if "outer" or if last gap doesn't already include it
        resInner.add(FacetRangeOther.AFTER.toString(),
                     rangeCount(sf,end,null,
                                (include.contains(FacetRangeInclude.OUTER) ||
                                 (! (include.contains(FacetRangeInclude.UPPER) ||
                                     include.contains(FacetRangeInclude.EDGE)))),
                                false));
      }
      if (all || others.contains(FacetRangeOther.BETWEEN)) {
        resInner.add(FacetRangeOther.BETWEEN.toString(),
                     rangeCount(sf,start,end,
                                (include.contains(FacetRangeInclude.LOWER) ||
                                 include.contains(FacetRangeInclude.EDGE)),
                                (include.contains(FacetRangeInclude.UPPER) ||
                                 include.contains(FacetRangeInclude.EDGE))));
      }
    }
  }
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetRangeCounts() throws IOException, ParseException { final NamedList<Object> resOuter = new SimpleOrderedMap<Object>(); final String[] fields = params.getParams(FacetParams.FACET_RANGE); if (null == fields || 0 == fields.length) return resOuter; for (String f : fields) { getFacetRangeCounts(f, resOuter); } return resOuter; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void getFacetRangeCounts(String facetRange, NamedList<Object> resOuter) throws IOException, ParseException { final IndexSchema schema = searcher.getSchema(); parseParams(FacetParams.FACET_RANGE, facetRange); String f = facetValue; final SchemaField sf = schema.getField(f); final FieldType ft = sf.getType(); RangeEndpointCalculator<?> calc = null; if (ft instanceof TrieField) { final TrieField trie = (TrieField)ft; switch (trie.getType()) { case FLOAT: calc = new FloatRangeEndpointCalculator(sf); break; case DOUBLE: calc = new DoubleRangeEndpointCalculator(sf); break; case INTEGER: calc = new IntegerRangeEndpointCalculator(sf); break; case LONG: calc = new LongRangeEndpointCalculator(sf); break; default: throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Unable to range facet on tried field of unexpected type:" + f); } } else if (ft instanceof DateField) { calc = new DateRangeEndpointCalculator(sf, null); } else if (ft instanceof SortableIntField) { calc = new IntegerRangeEndpointCalculator(sf); } else if (ft instanceof SortableLongField) { calc = new LongRangeEndpointCalculator(sf); } else if (ft instanceof SortableFloatField) { calc = new FloatRangeEndpointCalculator(sf); } else if (ft instanceof SortableDoubleField) { calc = new DoubleRangeEndpointCalculator(sf); } else { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Unable to range facet on field:" + sf); } resOuter.add(key, getFacetRangeCounts(sf, calc)); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
protected Object parseGap(final String rawval) throws java.text.ParseException { return parseVal(rawval); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Override public Date parseAndAddGap(Date value, String gap) throws java.text.ParseException { final DateMathParser dmp = new DateMathParser(); dmp.setNow(value); return dmp.parseMath(gap); }
// in core/src/java/org/apache/solr/schema/DateField.java
public Date toObject(String indexedForm) throws java.text.ParseException { return parseDate(indexedToReadable(indexedForm)); }
// in core/src/java/org/apache/solr/schema/DateField.java
public static Date parseDate(String s) throws ParseException { return fmtThreadLocal.get().parse(s); }
// in core/src/java/org/apache/solr/schema/DateField.java
public Date parseDateLenient(String s, SolrQueryRequest req) throws ParseException {
  // request could define timezone in the future
  try {
    return fmtThreadLocal.get().parse(s);
  } catch (Exception e) {
    return DateUtil.parseDate(s);
  }
}
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
@Override
public Query parse() throws ParseException {
  SolrParams solrParams = SolrParams.wrapDefaults(localParams, params);
  queryFields = SolrPluginUtils.parseFieldBoosts(solrParams.getParams(DisMaxParams.QF));
  if (0 == queryFields.size()) {
    queryFields.put(req.getSchema().getDefaultSearchFieldName(), 1.0f);
  }
  /* the main query we will execute. we disable the coord because
   * this query is an artificial construct */
  BooleanQuery query = new BooleanQuery(true);
  boolean notBlank = addMainQuery(query, solrParams);
  if (!notBlank) return null;
  addBoostQuery(query, solrParams);
  addBoostFunctions(query, solrParams);
  return query;
}
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected void addBoostFunctions(BooleanQuery query, SolrParams solrParams) throws ParseException { String[] boostFuncs = solrParams.getParams(DisMaxParams.BF); if (null != boostFuncs && 0 != boostFuncs.length) { for (String boostFunc : boostFuncs) { if (null == boostFunc || "".equals(boostFunc)) continue; Map<String, Float> ff = SolrPluginUtils.parseFieldBoosts(boostFunc); for (String f : ff.keySet()) { Query fq = subQuery(f, FunctionQParserPlugin.NAME).getQuery(); Float b = ff.get(f); if (null != b) { fq.setBoost(b); } query.add(fq, BooleanClause.Occur.SHOULD); } } } }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected void addBoostQuery(BooleanQuery query, SolrParams solrParams) throws ParseException {
  boostParams = solrParams.getParams(DisMaxParams.BQ);
  //List<Query> boostQueries = SolrPluginUtils.parseQueryStrings(req, boostParams);
  boostQueries = null;
  if (boostParams != null && boostParams.length > 0) {
    boostQueries = new ArrayList<Query>();
    for (String qs : boostParams) {
      if (qs.trim().length() == 0) continue;
      Query q = subQuery(qs, null).getQuery();
      boostQueries.add(q);
    }
  }
  if (null != boostQueries) {
    if (1 == boostQueries.size() && 1 == boostParams.length) {
      /* legacy logic */
      Query f = boostQueries.get(0);
      if (1.0f == f.getBoost() && f instanceof BooleanQuery) {
        /* if the default boost was used, and we've got a BooleanQuery
         * extract the subqueries out and use them directly
         */
        for (Object c : ((BooleanQuery) f).clauses()) {
          query.add((BooleanClause) c);
        }
      } else {
        query.add(f, BooleanClause.Occur.SHOULD);
      }
    } else {
      for (Query f : boostQueries) {
        query.add(f, BooleanClause.Occur.SHOULD);
      }
    }
  }
}
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected boolean addMainQuery(BooleanQuery query, SolrParams solrParams) throws ParseException {
  Map<String, Float> phraseFields = SolrPluginUtils.parseFieldBoosts(solrParams.getParams(DisMaxParams.PF));
  float tiebreaker = solrParams.getFloat(DisMaxParams.TIE, 0.0f);
  /* a parser for dealing with user input, which will convert
   * things to DisjunctionMaxQueries */
  SolrPluginUtils.DisjunctionMaxQueryParser up = getParser(queryFields, DisMaxParams.QS, solrParams, tiebreaker);
  /* for parsing sloppy phrases using DisjunctionMaxQueries */
  SolrPluginUtils.DisjunctionMaxQueryParser pp = getParser(phraseFields, DisMaxParams.PS, solrParams, tiebreaker);
  /* * * Main User Query * * */
  parsedUserQuery = null;
  String userQuery = getString();
  altUserQuery = null;
  if (userQuery == null || userQuery.trim().length() < 1) {
    // If no query is specified, we may have an alternate
    altUserQuery = getAlternateUserQuery(solrParams);
    if (altUserQuery == null) return false;
    query.add(altUserQuery, BooleanClause.Occur.MUST);
  } else {
    // There is a valid query string
    userQuery = SolrPluginUtils.partialEscape(SolrPluginUtils.stripUnbalancedQuotes(userQuery)).toString();
    userQuery = SolrPluginUtils.stripIllegalOperators(userQuery).toString();
    parsedUserQuery = getUserQuery(userQuery, up, solrParams);
    query.add(parsedUserQuery, BooleanClause.Occur.MUST);
    Query phrase = getPhraseQuery(userQuery, pp);
    if (null != phrase) {
      query.add(phrase, BooleanClause.Occur.SHOULD);
    }
  }
  return true;
}
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected Query getAlternateUserQuery(SolrParams solrParams) throws ParseException { String altQ = solrParams.get(DisMaxParams.ALTQ); if (altQ != null) { QParser altQParser = subQuery(altQ, null); return altQParser.getQuery(); } else { return null; } }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected Query getPhraseQuery(String userQuery, SolrPluginUtils.DisjunctionMaxQueryParser pp) throws ParseException { /* * * Add on Phrases for the Query * * */ /* build up phrase boosting queries */ /* if the userQuery already has some quotes, strip them out. * we've already done the phrases they asked for in the main * part of the query, this is to boost docs that may not have * matched those phrases but do match looser phrases. */ String userPhraseQuery = userQuery.replace("\"", ""); return pp.parse("\"" + userPhraseQuery + "\""); }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected Query getUserQuery(String userQuery, SolrPluginUtils.DisjunctionMaxQueryParser up, SolrParams solrParams) throws ParseException { String minShouldMatch = parseMinShouldMatch(req.getSchema(), solrParams); Query dis = up.parse(userQuery); Query query = dis; if (dis instanceof BooleanQuery) { BooleanQuery t = new BooleanQuery(); SolrPluginUtils.flattenBooleanQuery(t, (BooleanQuery) dis); SolrPluginUtils.setMinShouldMatch(t, minShouldMatch); query = t; } return query; }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
@Override public Query getHighlightQuery() throws ParseException { return parsedUserQuery == null ? altUserQuery : parsedUserQuery; }
// in core/src/java/org/apache/solr/search/SpatialFilterQParser.java
@Override public Query parse() throws ParseException {
  //if more than one, we need to treat them as a point...
  //TODO: Should we accept multiple fields
  String[] fields = localParams.getParams("f");
  if (fields == null || fields.length == 0) {
    String field = getParam(SpatialParams.FIELD);
    if (field == null) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, " missing sfield for spatial request");
    fields = new String[] {field};
  }
  String pointStr = getParam(SpatialParams.POINT);
  if (pointStr == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, SpatialParams.POINT + " missing.");
  }
  double dist = -1;
  String distS = getParam(SpatialParams.DISTANCE);
  if (distS != null) dist = Double.parseDouble(distS);
  if (dist < 0) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, SpatialParams.DISTANCE + " must be >= 0");
  }
  String measStr = localParams.get(SpatialParams.MEASURE);
  //TODO: Need to do something with Measures
  Query result = null;
  //fields is valid at this point
  if (fields.length == 1) {
    SchemaField sf = req.getSchema().getField(fields[0]);
    FieldType type = sf.getType();
    if (type instanceof SpatialQueryable) {
      double radius = localParams.getDouble(SpatialParams.SPHERE_RADIUS, DistanceUtils.EARTH_MEAN_RADIUS_KM);
      SpatialOptions opts = new SpatialOptions(pointStr, dist, sf, measStr, radius, DistanceUnits.KILOMETERS);
      opts.bbox = bbox;
      result = ((SpatialQueryable)type).createSpatialQuery(this, opts);
    } else {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "The field " + fields[0] + " does not support spatial filtering");
    }
  } else { // fields.length > 1
    //TODO: Not sure about this just yet, is there a way to delegate, or do we just have a helper class?
    //Seems like we could just use FunctionQuery, but then what about scoring
    /*List<ValueSource> sources = new ArrayList<ValueSource>(fields.length);
    for (String field : fields) {
      SchemaField sf = schema.getField(field);
      sources.add(sf.getType().getValueSource(sf, this));
    }
    MultiValueSource vs = new VectorValueSource(sources);
    ValueSourceRangeFilter rf = new ValueSourceRangeFilter(vs, "0", String.valueOf(dist), true, true);
    result = new SolrConstantScoreQuery(rf);*/
  }
  return result;
}
// in core/src/java/org/apache/solr/search/TermQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { @Override public Query parse() throws ParseException { String fname = localParams.get(QueryParsing.F); FieldType ft = req.getSchema().getFieldTypeNoEx(fname); String val = localParams.get(QueryParsing.V); BytesRef term = new BytesRef(); if (ft != null) { ft.readableToIndexed(val, term); } else { term.copyChars(val); } return new TermQuery(new Term(fname, term)); } }; }
// in core/src/java/org/apache/solr/search/TermQParserPlugin.java
@Override public Query parse() throws ParseException { String fname = localParams.get(QueryParsing.F); FieldType ft = req.getSchema().getFieldTypeNoEx(fname); String val = localParams.get(QueryParsing.V); BytesRef term = new BytesRef(); if (ft != null) { ft.readableToIndexed(val, term); } else { term.copyChars(val); } return new TermQuery(new Term(fname, term)); }
// in core/src/java/org/apache/solr/search/QParser.java
public Query getQuery() throws ParseException { if (query==null) { query=parse(); if (localParams != null) { String cacheStr = localParams.get(CommonParams.CACHE); if (cacheStr != null) { if (CommonParams.FALSE.equals(cacheStr)) { extendedQuery().setCache(false); } else if (CommonParams.TRUE.equals(cacheStr)) { extendedQuery().setCache(true); } else if ("sep".equals(cacheStr)) { extendedQuery().setCacheSep(true); } } int cost = localParams.getInt(CommonParams.COST, Integer.MIN_VALUE); if (cost != Integer.MIN_VALUE) { extendedQuery().setCost(cost); } } } return query; }
// in core/src/java/org/apache/solr/search/QParser.java
private void checkRecurse() throws ParseException { if (recurseCount++ >= 100) { throw new ParseException("Infinite Recursion detected parsing query '" + qstr + "'"); } }
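The guard above caps query-parser nesting at 100 levels, so a self-referencing query (e.g. q1={!query v=$q1}) fails fast with a ParseException instead of overflowing the stack. A minimal, self-contained sketch of the same counter idiom (NestingGuard and MAX_DEPTH are illustrative names, not Solr's; an unchecked exception stands in for ParseException):

    // Illustrative sketch only; in Solr the counter lives on the QParser
    // instance and is handed down to nested parsers by subQuery().
    public class NestingGuard {
      private static final int MAX_DEPTH = 100; // same limit checkRecurse() uses
      private int recurseCount;

      void enter(String qstr) {
        if (recurseCount++ >= MAX_DEPTH) {
          throw new IllegalStateException("Infinite Recursion detected parsing query '" + qstr + "'");
        }
      }

      public static void main(String[] args) {
        NestingGuard g = new NestingGuard();
        for (int depth = 0; depth <= MAX_DEPTH; depth++) {
          g.enter("{!query v=$q1}"); // the 101st nested parse throws
        }
      }
    }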
// in core/src/java/org/apache/solr/search/QParser.java
public QParser subQuery(String q, String defaultType) throws ParseException {
  checkRecurse();
  if (defaultType == null && localParams != null) {
    // if not passed, try and get the defaultType from local params
    defaultType = localParams.get(QueryParsing.DEFTYPE);
  }
  QParser nestedParser = getParser(q, defaultType, getReq());
  nestedParser.recurseCount = recurseCount;
  recurseCount--;
  return nestedParser;
}
// in core/src/java/org/apache/solr/search/QParser.java
public ScoreDoc getPaging() throws ParseException { return null; /*** This is not ready for prime-time... see SOLR-1726 String pageScoreS = null; String pageDocS = null; pageScoreS = params.get(CommonParams.PAGESCORE); pageDocS = params.get(CommonParams.PAGEDOC); if (pageScoreS == null || pageDocS == null) return null; int pageDoc = pageDocS != null ? Integer.parseInt(pageDocS) : -1; float pageScore = pageScoreS != null ? new Float(pageScoreS) : -1; if(pageDoc != -1 && pageScore != -1){ return new ScoreDoc(pageDoc, pageScore); } else { return null; } ***/ }
// in core/src/java/org/apache/solr/search/QParser.java
public SortSpec getSort(boolean useGlobalParams) throws ParseException {
  getQuery(); // ensure query is parsed first
  String sortStr = null;
  String startS = null;
  String rowsS = null;
  if (localParams != null) {
    sortStr = localParams.get(CommonParams.SORT);
    startS = localParams.get(CommonParams.START);
    rowsS = localParams.get(CommonParams.ROWS);
    // if any of these parameters are present, don't go back to the global params
    if (sortStr != null || startS != null || rowsS != null) {
      useGlobalParams = false;
    }
  }
  if (useGlobalParams) {
    if (sortStr == null) {
      sortStr = params.get(CommonParams.SORT);
    }
    if (startS == null) {
      startS = params.get(CommonParams.START);
    }
    if (rowsS == null) {
      rowsS = params.get(CommonParams.ROWS);
    }
  }
  int start = startS != null ? Integer.parseInt(startS) : 0;
  int rows = rowsS != null ? Integer.parseInt(rowsS) : 10;
  Sort sort = null;
  if (sortStr != null) {
    sort = QueryParsing.parseSort(sortStr, req);
  }
  return new SortSpec(sort, start, rows);
}
// in core/src/java/org/apache/solr/search/QParser.java
public Query getHighlightQuery() throws ParseException { Query query = getQuery(); return query instanceof WrappedQuery ? ((WrappedQuery)query).getWrappedQuery() : query; }
// in core/src/java/org/apache/solr/search/QParser.java
public static QParser getParser(String qstr, String defaultType, SolrQueryRequest req) throws ParseException {
  // SolrParams localParams = QueryParsing.getLocalParams(qstr, req.getParams());
  String stringIncludingLocalParams = qstr;
  SolrParams localParams = null;
  SolrParams globalParams = req.getParams();
  boolean valFollowedParams = true;
  int localParamsEnd = -1;
  if (qstr != null && qstr.startsWith(QueryParsing.LOCALPARAM_START)) {
    Map<String, String> localMap = new HashMap<String, String>();
    localParamsEnd = QueryParsing.parseLocalParams(qstr, 0, localMap, globalParams);
    String val = localMap.get(QueryParsing.V);
    if (val != null) {
      // val was directly specified in localParams via v=<something> or v=$arg
      valFollowedParams = false;
    } else {
      // use the remainder of the string as the value
      valFollowedParams = true;
      val = qstr.substring(localParamsEnd);
      localMap.put(QueryParsing.V, val);
    }
    localParams = new MapSolrParams(localMap);
  }
  String type;
  if (localParams == null) {
    type = defaultType;
  } else {
    type = localParams.get(QueryParsing.TYPE, defaultType);
    qstr = localParams.get("v");
  }
  type = type == null ? QParserPlugin.DEFAULT_QTYPE : type;
  QParserPlugin qplug = req.getCore().getQueryPlugin(type);
  QParser parser = qplug.createParser(qstr, localParams, req.getParams(), req);
  parser.stringIncludingLocalParams = stringIncludingLocalParams;
  parser.valFollowedParams = valFollowedParams;
  parser.localParamsEnd = localParamsEnd;
  return parser;
}
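getParser() treats a leading {!...} block as local parameters: key=value pairs configure the parser (the type key in particular), and whatever follows the closing brace becomes the value unless v= was given explicitly. A hand-rolled sketch of just that splitting convention (the real parseLocalParams also handles quoting, bare {!func}-style type shortcuts and $param dereferencing):

    // Hypothetical splitter for illustration; not Solr's implementation.
    public class LocalParamsDemo {
      public static void main(String[] args) {
        String qstr = "{!type=dismax qf=name}solr rocks";
        int end = qstr.indexOf('}');            // end of the {!...} block
        String params = qstr.substring(2, end); // "type=dismax qf=name"
        String v = qstr.substring(end + 1);     // "solr rocks" becomes the v parameter
        System.out.println("local params: " + params);
        System.out.println("v:            " + v);
      }
    }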
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
@Override public Query parse() throws org.apache.lucene.queryparser.classic.ParseException {
  SrndQuery sq;
  String qstr = getString();
  if (qstr == null) return null;
  String mbqparam = getParam(MBQParam);
  if (mbqparam == null) {
    this.maxBasicQueries = DEFMAXBASICQUERIES;
  } else {
    try {
      this.maxBasicQueries = Integer.parseInt(mbqparam);
    } catch (Exception e) {
      LOG.warn("Couldn't parse maxBasicQueries value " + mbqparam + ", using default of 1000");
      this.maxBasicQueries = DEFMAXBASICQUERIES;
    }
  }
  // ugh .. colliding ParseExceptions
  try {
    sq = org.apache.lucene.queryparser.surround.parser.QueryParser.parse(qstr);
  } catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) {
    throw new org.apache.lucene.queryparser.classic.ParseException(pe.getMessage());
  }
  // so what do we do with the SrndQuery ??
  // processing based on example in LIA Ch 9
  String defaultField = getParam(CommonParams.DF);
  if (defaultField == null) {
    defaultField = getReq().getSchema().getDefaultSearchFieldName();
  }
  BasicQueryFactory bqFactory = new BasicQueryFactory(this.maxBasicQueries);
  Query lquery = sq.makeLuceneQueryField(defaultField, bqFactory);
  return lquery;
}
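The catch block above is a textbook wrap-and-rethrow: the surround parser's ParseException is translated into the classic-parser ParseException that callers of parse() expect. A self-contained sketch of the idiom (LibAError and LibBError are hypothetical stand-ins for the two colliding exception types):

    class LibAError extends Exception { LibAError(String m) { super(m); } }
    class LibBError extends Exception { LibBError(String m) { super(m); } }

    public class TranslateDemo {
      // callers only ever see LibBError, whatever the inner parser threw
      static void parse(String q) throws LibBError {
        try {
          throw new LibAError("cannot parse '" + q + "'");
        } catch (LibAError e) {
          // only the message survives; the original stack trace is lost
          // unless the source exception is attached as a cause
          throw new LibBError(e.getMessage());
        }
      }
      public static void main(String[] args) {
        try { parse("foo AND"); } catch (LibBError e) { System.out.println(e); }
      }
    }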
// in core/src/java/org/apache/solr/search/Grouping.java
public void addFieldCommand(String field, SolrQueryRequest request) throws ParseException {
  SchemaField schemaField = searcher.getSchema().getField(field); // Throws an exception when field doesn't exist. Bad request.
  FieldType fieldType = schemaField.getType();
  ValueSource valueSource = fieldType.getValueSource(schemaField, null);
  if (!(valueSource instanceof StrFieldSource)) {
    addFunctionCommand(field, request);
    return;
  }
  Grouping.CommandField gc = new CommandField();
  gc.groupSort = groupSort;
  gc.groupBy = field;
  gc.key = field;
  gc.numGroups = limitDefault;
  gc.docsPerGroup = docsPerGroupDefault;
  gc.groupOffset = groupOffsetDefault;
  gc.offset = cmd.getOffset();
  gc.sort = sort;
  gc.format = defaultFormat;
  gc.totalCount = defaultTotalCount;
  if (main) {
    gc.main = true;
    gc.format = Grouping.Format.simple;
  }
  if (gc.format == Grouping.Format.simple) {
    gc.groupOffset = 0; // doesn't make sense
  }
  commands.add(gc);
}
// in core/src/java/org/apache/solr/search/Grouping.java
public void addFunctionCommand(String groupByStr, SolrQueryRequest request) throws ParseException {
  QParser parser = QParser.getParser(groupByStr, "func", request);
  Query q = parser.getQuery();
  final Grouping.Command gc;
  if (q instanceof FunctionQuery) {
    ValueSource valueSource = ((FunctionQuery) q).getValueSource();
    if (valueSource instanceof StrFieldSource) {
      String field = ((StrFieldSource) valueSource).getField();
      CommandField commandField = new CommandField();
      commandField.groupBy = field;
      gc = commandField;
    } else {
      CommandFunc commandFunc = new CommandFunc();
      commandFunc.groupBy = valueSource;
      gc = commandFunc;
    }
  } else {
    CommandFunc commandFunc = new CommandFunc();
    commandFunc.groupBy = new QueryValueSource(q, 0.0f);
    gc = commandFunc;
  }
  gc.groupSort = groupSort;
  gc.key = groupByStr;
  gc.numGroups = limitDefault;
  gc.docsPerGroup = docsPerGroupDefault;
  gc.groupOffset = groupOffsetDefault;
  gc.offset = cmd.getOffset();
  gc.sort = sort;
  gc.format = defaultFormat;
  gc.totalCount = defaultTotalCount;
  if (main) {
    gc.main = true;
    gc.format = Grouping.Format.simple;
  }
  if (gc.format == Grouping.Format.simple) {
    gc.groupOffset = 0; // doesn't make sense
  }
  commands.add(gc);
}
// in core/src/java/org/apache/solr/search/Grouping.java
public void addQueryCommand(String groupByStr, SolrQueryRequest request) throws ParseException {
  QParser parser = QParser.getParser(groupByStr, null, request);
  Query gq = parser.getQuery();
  Grouping.CommandQuery gc = new CommandQuery();
  gc.query = gq;
  gc.groupSort = groupSort;
  gc.key = groupByStr;
  gc.numGroups = limitDefault;
  gc.docsPerGroup = docsPerGroupDefault;
  gc.groupOffset = groupOffsetDefault;
  // these two params will only be used if this is for the main result set
  gc.offset = cmd.getOffset();
  gc.numGroups = limitDefault;
  gc.format = defaultFormat;
  if (main) {
    gc.main = true;
    gc.format = Grouping.Format.simple;
  }
  if (gc.format == Grouping.Format.simple) {
    gc.docsPerGroup = gc.numGroups; // doesn't make sense to limit to one
    gc.groupOffset = gc.offset;
  }
  commands.add(gc);
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override public Query parse() throws ParseException {
  SolrParams localParams = getLocalParams();
  SolrParams params = getParams();
  solrParams = SolrParams.wrapDefaults(localParams, params);
  final String minShouldMatch = DisMaxQParser.parseMinShouldMatch(req.getSchema(), solrParams);
  userFields = new UserFields(U.parseFieldBoosts(solrParams.getParams(DMP.UF)));
  queryFields = SolrPluginUtils.parseFieldBoosts(solrParams.getParams(DisMaxParams.QF));
  if (0 == queryFields.size()) {
    queryFields.put(req.getSchema().getDefaultSearchFieldName(), 1.0f);
  }
  // Boosted phrase of the full query string
  List<FieldParams> phraseFields = U.parseFieldBoostsAndSlop(solrParams.getParams(DMP.PF), 0);
  // Boosted Bi-Term Shingles from the query string
  List<FieldParams> phraseFields2 = U.parseFieldBoostsAndSlop(solrParams.getParams("pf2"), 2);
  // Boosted Tri-Term Shingles from the query string
  List<FieldParams> phraseFields3 = U.parseFieldBoostsAndSlop(solrParams.getParams("pf3"), 3);
  float tiebreaker = solrParams.getFloat(DisMaxParams.TIE, 0.0f);
  int pslop = solrParams.getInt(DisMaxParams.PS, 0);
  int qslop = solrParams.getInt(DisMaxParams.QS, 0);
  // remove stopwords from mandatory "matching" component?
  boolean stopwords = solrParams.getBool("stopwords", true);
  /* the main query we will execute. we disable the coord because
   * this query is an artificial construct */
  BooleanQuery query = new BooleanQuery(true);
  /* * * Main User Query * * */
  parsedUserQuery = null;
  String userQuery = getString();
  altUserQuery = null;
  if (userQuery == null || userQuery.trim().length() == 0) {
    // If no query is specified, we may have an alternate
    String altQ = solrParams.get(DisMaxParams.ALTQ);
    if (altQ != null) {
      altQParser = subQuery(altQ, null);
      altUserQuery = altQParser.getQuery();
      query.add(altUserQuery, BooleanClause.Occur.MUST);
    } else {
      return null;
      // throw new ParseException("missing query string" );
    }
  } else {
    // There is a valid query string
    // userQuery = partialEscape(U.stripUnbalancedQuotes(userQuery)).toString();
    boolean lowercaseOperators = solrParams.getBool("lowercaseOperators", true);
    String mainUserQuery = userQuery;
    ExtendedSolrQueryParser up = new ExtendedSolrQueryParser(this, IMPOSSIBLE_FIELD_NAME);
    up.addAlias(IMPOSSIBLE_FIELD_NAME, tiebreaker, queryFields);
    addAliasesFromRequest(up, tiebreaker);
    up.setPhraseSlop(qslop); // slop for explicit user phrase queries
    up.setAllowLeadingWildcard(true);
    // defer escaping and only do if lucene parsing fails, or we need phrases
    // parsing fails. Need to sloppy phrase queries anyway though.
    List<Clause> clauses = null;
    int numPluses = 0;
    int numMinuses = 0;
    int numOR = 0;
    int numNOT = 0;
    clauses = splitIntoClauses(userQuery, false);
    for (Clause clause : clauses) {
      if (clause.must == '+') numPluses++;
      if (clause.must == '-') numMinuses++;
      if (clause.isBareWord()) {
        String s = clause.val;
        if ("OR".equals(s)) {
          numOR++;
        } else if ("NOT".equals(s)) {
          numNOT++;
        } else if (lowercaseOperators && "or".equals(s)) {
          numOR++;
        }
      }
    }
    // Always rebuild mainUserQuery from clauses to catch modifications from splitIntoClauses
    // This was necessary for userFields modifications to get propagated into the query.
    // Convert lower or mixed case operators to uppercase if we saw them.
    // only do this for the lucene query part and not for phrase query boosting
    // since some fields might not be case insensitive.
    // We don't use a regex for this because it might change and AND or OR in
    // a phrase query in a case sensitive field.
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < clauses.size(); i++) {
      Clause clause = clauses.get(i);
      String s = clause.raw;
      // and and or won't be operators at the start or end
      if (i > 0 && i + 1 < clauses.size()) {
        if ("AND".equalsIgnoreCase(s)) {
          s = "AND";
        } else if ("OR".equalsIgnoreCase(s)) {
          s = "OR";
        }
      }
      sb.append(s);
      sb.append(' ');
    }
    mainUserQuery = sb.toString();
    // For correct lucene queries, turn off mm processing if there
    // were explicit operators (except for AND).
    boolean doMinMatched = (numOR + numNOT + numPluses + numMinuses) == 0;
    try {
      up.setRemoveStopFilter(!stopwords);
      up.exceptions = true;
      parsedUserQuery = up.parse(mainUserQuery);
      if (stopwords && isEmpty(parsedUserQuery)) {
        // if the query was all stop words, remove none of them
        up.setRemoveStopFilter(true);
        parsedUserQuery = up.parse(mainUserQuery);
      }
    } catch (Exception e) {
      // ignore failure and reparse later after escaping reserved chars
      up.exceptions = false;
    }
    if (parsedUserQuery != null && doMinMatched) {
      if (parsedUserQuery instanceof BooleanQuery) {
        SolrPluginUtils.setMinShouldMatch((BooleanQuery)parsedUserQuery, minShouldMatch);
      }
    }
    if (parsedUserQuery == null) {
      sb = new StringBuilder();
      for (Clause clause : clauses) {
        boolean doQuote = clause.isPhrase;
        String s = clause.val;
        if (!clause.isPhrase && ("OR".equals(s) || "AND".equals(s) || "NOT".equals(s))) {
          doQuote = true;
        }
        if (clause.must != 0) {
          sb.append(clause.must);
        }
        if (clause.field != null) {
          sb.append(clause.field);
          sb.append(':');
        }
        if (doQuote) {
          sb.append('"');
        }
        sb.append(clause.val);
        if (doQuote) {
          sb.append('"');
        }
        if (clause.field != null) {
          // Add the default user field boost, if any
          Float boost = userFields.getBoost(clause.field);
          if (boost != null) sb.append("^").append(boost);
        }
        sb.append(' ');
      }
      String escapedUserQuery = sb.toString();
      parsedUserQuery = up.parse(escapedUserQuery);
      if (parsedUserQuery instanceof BooleanQuery) {
        BooleanQuery t = new BooleanQuery();
        SolrPluginUtils.flattenBooleanQuery(t, (BooleanQuery)parsedUserQuery);
        SolrPluginUtils.setMinShouldMatch(t, minShouldMatch);
        parsedUserQuery = t;
      }
    }
    query.add(parsedUserQuery, BooleanClause.Occur.MUST);
    // sloppy phrase queries for proximity
    List<FieldParams> allPhraseFields = new ArrayList<FieldParams>();
    allPhraseFields.addAll(phraseFields);
    allPhraseFields.addAll(phraseFields2);
    allPhraseFields.addAll(phraseFields3);
    if (allPhraseFields.size() > 0) {
      // find non-field clauses
      List<Clause> normalClauses = new ArrayList<Clause>(clauses.size());
      for (Clause clause : clauses) {
        if (clause.field != null || clause.isPhrase) continue;
        // check for keywords "AND,OR,TO"
        if (clause.isBareWord()) {
          String s = clause.val.toString();
          // avoid putting explicit operators in the phrase query
          if ("OR".equals(s) || "AND".equals(s) || "NOT".equals(s) || "TO".equals(s)) continue;
        }
        normalClauses.add(clause);
      }
      // full phrase and shingles
      for (FieldParams phraseField : allPhraseFields) {
        int slop = (phraseField.getSlop() == 0) ? pslop : phraseField.getSlop();
        Map<String, Float> pf = new HashMap<String, Float>(1);
        pf.put(phraseField.getField(), phraseField.getBoost());
        addShingledPhraseQueries(query, normalClauses, pf, phraseField.getWordGrams(), tiebreaker, slop);
      }
    }
  }
  /* * * Boosting Query * * */
  boostParams = solrParams.getParams(DisMaxParams.BQ);
  boostQueries = null;
  if (boostParams != null && boostParams.length > 0) {
    Map<String, Float> bqBoosts = SolrPluginUtils.parseFieldBoosts(boostParams);
    boostQueries = new ArrayList<Query>();
    for (Map.Entry<String, Float> bqs : bqBoosts.entrySet()) {
      if (bqs.getKey().trim().length() == 0) continue;
      Query q = subQuery(bqs.getKey(), null).getQuery();
      Float b = bqs.getValue();
      if (b != null) {
        q.setBoost(b);
      }
      boostQueries.add(q);
    }
  }
  if (null != boostQueries) {
    for (Query f : boostQueries) {
      query.add(f, BooleanClause.Occur.SHOULD);
    }
  }
  /* * * Boosting Functions * * */
  String[] boostFuncs = solrParams.getParams(DisMaxParams.BF);
  if (null != boostFuncs && 0 != boostFuncs.length) {
    for (String boostFunc : boostFuncs) {
      if (null == boostFunc || "".equals(boostFunc)) continue;
      Map<String, Float> ff = SolrPluginUtils.parseFieldBoosts(boostFunc);
      for (String f : ff.keySet()) {
        Query fq = subQuery(f, FunctionQParserPlugin.NAME).getQuery();
        Float b = ff.get(f);
        if (null != b) {
          fq.setBoost(b);
        }
        query.add(fq, BooleanClause.Occur.SHOULD);
      }
    }
  }
  //
  // create a boosted query (scores multiplied by boosts)
  //
  Query topQuery = query;
  multBoosts = solrParams.getParams("boost");
  if (multBoosts != null && multBoosts.length > 0) {
    List<ValueSource> boosts = new ArrayList<ValueSource>();
    for (String boostStr : multBoosts) {
      if (boostStr == null || boostStr.length() == 0) continue;
      Query boost = subQuery(boostStr, FunctionQParserPlugin.NAME).getQuery();
      ValueSource vs;
      if (boost instanceof FunctionQuery) {
        vs = ((FunctionQuery)boost).getValueSource();
      } else {
        vs = new QueryValueSource(boost, 1.0f);
      }
      boosts.add(vs);
    }
    if (boosts.size() > 1) {
      ValueSource prod = new ProductFloatFunction(boosts.toArray(new ValueSource[boosts.size()]));
      topQuery = new BoostedQuery(query, prod);
    } else if (boosts.size() == 1) {
      topQuery = new BoostedQuery(query, boosts.get(0));
    }
  }
  return topQuery;
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private void addAliasesFromRequest(ExtendedSolrQueryParser up, float tiebreaker) throws ParseException {
  Iterator<String> it = solrParams.getParameterNamesIterator();
  while (it.hasNext()) {
    String param = it.next();
    if (param.startsWith("f.") && param.endsWith(".qf")) {
      // Add the alias
      String fname = param.substring(2, param.length() - 3);
      String qfReplacement = solrParams.get(param);
      Map<String, Float> parsedQf = SolrPluginUtils.parseFieldBoosts(qfReplacement);
      if (parsedQf.size() == 0) return;
      up.addAlias(fname, tiebreaker, parsedQf);
    }
  }
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private void addShingledPhraseQueries(final BooleanQuery mainQuery, final List<Clause> clauses, final Map<String, Float> fields, int shingleSize, final float tiebreaker, final int slop) throws ParseException {
  if (null == fields || fields.isEmpty() || null == clauses || clauses.size() < shingleSize) return;
  if (0 == shingleSize) shingleSize = clauses.size();
  final int goat = shingleSize - 1; // :TODO: better name for var?
  StringBuilder userPhraseQuery = new StringBuilder();
  for (int i = 0; i < clauses.size() - goat; i++) {
    userPhraseQuery.append('"');
    for (int j = 0; j <= goat; j++) {
      userPhraseQuery.append(clauses.get(i + j).val);
      userPhraseQuery.append(' ');
    }
    userPhraseQuery.append('"');
    userPhraseQuery.append(' ');
  }
  /* for parsing sloppy phrases using DisjunctionMaxQueries */
  ExtendedSolrQueryParser pp = new ExtendedSolrQueryParser(this, IMPOSSIBLE_FIELD_NAME);
  pp.addAlias(IMPOSSIBLE_FIELD_NAME, tiebreaker, fields);
  pp.setPhraseSlop(slop);
  pp.setRemoveStopFilter(true); // remove stop filter and keep stopwords
  /* :TODO: reevaluate using makeDismax=true vs false...
   *
   * The DismaxQueryParser always used DisjunctionMaxQueries for the
   * pf boost, for the same reasons it used them for the qf fields.
   * When Yonik first wrote the ExtendedDismaxQParserPlugin, he added
   * the "makeDismax=false" property to use BooleanQueries instead, but
   * when asked why his response was "I honestly don't recall" ...
   *
   * https://issues.apache.org/jira/browse/SOLR-1553?focusedCommentId=12793813#action_12793813
   *
   * so for now, we continue to use dismax style queries because it
   * seems the most logical and is back compatible, but we should
   * try to figure out what Yonik was thinking at the time (because he
   * rarely does things for no reason) */
  pp.makeDismax = true;
  // minClauseSize is independent of the shingleSize because of stop words
  // (if they are removed from the middle, so be it, but we need at least
  // two or there shouldn't be a boost)
  pp.minClauseSize = 2;
  // TODO: perhaps we shouldn't use synonyms either...
  Query phrase = pp.parse(userPhraseQuery.toString());
  if (phrase != null) {
    mainQuery.add(phrase, BooleanClause.Occur.SHOULD);
  }
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override public Query getHighlightQuery() throws ParseException { return parsedUserQuery == null ? altUserQuery : parsedUserQuery; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getBooleanQuery(List clauses, boolean disableCoord) throws ParseException { Query q = super.getBooleanQuery(clauses, disableCoord); if (q != null) { q = QueryUtils.makeQueryable(q); } return q; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getFieldQuery(String field, String val, boolean quoted) throws ParseException {
  //System.out.println("getFieldQuery: val="+val);
  this.type = QType.FIELD;
  this.field = field;
  this.val = val;
  this.slop = getPhraseSlop(); // unspecified
  return getAliasedQuery();
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getFieldQuery(String field, String val, int slop) throws ParseException {
  //System.out.println("getFieldQuery: val="+val+" slop="+slop);
  this.type = QType.PHRASE;
  this.field = field;
  this.val = val;
  this.slop = slop;
  return getAliasedQuery();
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getPrefixQuery(String field, String val) throws ParseException {
  //System.out.println("getPrefixQuery: val="+val);
  if (val.equals("") && field.equals("*")) {
    return new MatchAllDocsQuery();
  }
  this.type = QType.PREFIX;
  this.field = field;
  this.val = val;
  return getAliasedQuery();
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query newFieldQuery(Analyzer analyzer, String field, String queryText, boolean quoted) throws ParseException { Analyzer actualAnalyzer; if (removeStopFilter) { if (nonStopFilterAnalyzerPerField == null) { nonStopFilterAnalyzerPerField = new HashMap<String, Analyzer>(); } actualAnalyzer = nonStopFilterAnalyzerPerField.get(field); if (actualAnalyzer == null) { actualAnalyzer = noStopwordFilterAnalyzer(field); } } else { actualAnalyzer = parser.getReq().getSchema().getFieldType(field).getQueryAnalyzer(); } return super.newFieldQuery(actualAnalyzer, field, queryText, quoted); }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getRangeQuery(String field, String a, String b, boolean startInclusive, boolean endInclusive) throws ParseException {
  //System.out.println("getRangeQuery:");
  this.type = QType.RANGE;
  this.field = field;
  this.val = a;
  this.val2 = b;
  this.bool = startInclusive;
  this.bool2 = endInclusive;
  return getAliasedQuery();
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getWildcardQuery(String field, String val) throws ParseException {
  //System.out.println("getWildcardQuery: val="+val);
  if (val.equals("*")) {
    if (field.equals("*")) {
      return new MatchAllDocsQuery();
    } else {
      return getPrefixQuery(field, "");
    }
  }
  this.type = QType.WILDCARD;
  this.field = field;
  this.val = val;
  return getAliasedQuery();
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getFuzzyQuery(String field, String val, float minSimilarity) throws ParseException {
  //System.out.println("getFuzzyQuery: val="+val);
  this.type = QType.FUZZY;
  this.field = field;
  this.val = val;
  this.flt = minSimilarity;
  return getAliasedQuery();
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
protected Query getAliasedQuery() throws ParseException {
  Alias a = aliases.get(field);
  this.validateCyclicAliasing(field);
  if (a != null) {
    List<Query> lst = getQueries(a);
    if (lst == null || lst.size() == 0) return getQuery();
    // make a DisjunctionMaxQuery in this case too... it will stop
    // the "mm" processing from making everything required in the case
    // that the query expanded to multiple clauses.
    // DisMaxQuery.rewrite() removes itself if there is just a single clause anyway.
    // if (lst.size()==1) return lst.get(0);
    if (makeDismax) {
      DisjunctionMaxQuery q = new DisjunctionMaxQuery(lst, a.tie);
      return q;
    } else {
      // should we disable coord?
      BooleanQuery q = new BooleanQuery(disableCoord);
      for (Query sub : lst) {
        q.add(sub, BooleanClause.Occur.SHOULD);
      }
      return q;
    }
  } else {
    // verify that a fielded query is actually on a field that exists... if not,
    // then throw an exception to get us out of here, and we'll treat it like a
    // literal when we try the escape+re-parse.
    if (exceptions) {
      FieldType ft = schema.getFieldTypeNoEx(field);
      if (ft == null && null == MagicFieldName.get(field)) {
        throw unknownField;
      }
    }
    return getQuery();
  }
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private void validateCyclicAliasing(String field) throws ParseException { Set<String> set = new HashSet<String>(); set.add(field); if(validateField(field, set)) { throw new ParseException("Field aliases lead to a cycle"); } }
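validateCyclicAliasing seeds a visited set with the starting field and lets validateField (not shown in this listing) walk the alias graph; encountering a field twice means the aliases loop. A hand-rolled sketch of the same depth-first cycle check, with an illustrative alias map standing in for Solr's Alias objects:

    import java.util.*;

    public class AliasCycleCheck {
      static Map<String, List<String>> aliases = new HashMap<>();

      // returns true if expanding 'field' ever revisits a field on the current path
      static boolean hasCycle(String field, Set<String> seen) {
        for (String target : aliases.getOrDefault(field, Collections.<String>emptyList())) {
          if (!seen.add(target)) return true; // already on this path: cycle
          if (hasCycle(target, seen)) return true;
          seen.remove(target);
        }
        return false;
      }

      public static void main(String[] args) {
        aliases.put("who", Arrays.asList("name"));
        aliases.put("name", Arrays.asList("who")); // who -> name -> who
        Set<String> seen = new HashSet<>(Arrays.asList("who"));
        System.out.println(hasCycle("who", seen)); // true: would raise "Field aliases lead to a cycle"
      }
    }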
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
protected List<Query> getQueries(Alias a) throws ParseException { if (a == null) return null; if (a.fields.size()==0) return null; List<Query> lst= new ArrayList<Query>(4); for (String f : a.fields.keySet()) { this.field = f; Query sub = getAliasedQuery(); if (sub != null) { Float boost = a.fields.get(f); if (boost != null) { sub.setBoost(boost); } lst.add(sub); } } return lst; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private Query getQuery() throws ParseException {
  try {
    switch (type) {
      case FIELD: // fallthrough
      case PHRASE:
        Query query = super.getFieldQuery(field, val, type == QType.PHRASE);
        if (query instanceof PhraseQuery) {
          PhraseQuery pq = (PhraseQuery)query;
          if (minClauseSize > 1 && pq.getTerms().length < minClauseSize) return null;
          ((PhraseQuery)query).setSlop(slop);
        } else if (query instanceof MultiPhraseQuery) {
          MultiPhraseQuery pq = (MultiPhraseQuery)query;
          if (minClauseSize > 1 && pq.getTermArrays().size() < minClauseSize) return null;
          ((MultiPhraseQuery)query).setSlop(slop);
        } else if (minClauseSize > 1) {
          // if it's not a type of phrase query, it doesn't meet the minClauseSize requirements
          return null;
        }
        return query;
      case PREFIX: return super.getPrefixQuery(field, val);
      case WILDCARD: return super.getWildcardQuery(field, val);
      case FUZZY: return super.getFuzzyQuery(field, val, flt);
      case RANGE: return super.getRangeQuery(field, val, val2, bool, bool2);
    }
    return null;
  } catch (Exception e) {
    // an exception here is due to the field query not being compatible with the input text
    // for example, passing a string to a numeric field.
    return null;
  }
}
// in core/src/java/org/apache/solr/search/ReturnFields.java
String getFieldName(QueryParsing.StrParser sp) throws ParseException { sp.eatws(); int id_start = sp.pos; char ch; if (sp.pos < sp.end && (ch = sp.val.charAt(sp.pos)) != '$' && Character.isJavaIdentifierStart(ch)) { sp.pos++; while (sp.pos < sp.end) { ch = sp.val.charAt(sp.pos); if (!Character.isJavaIdentifierPart(ch) && ch != '.' && ch != '-') { break; } sp.pos++; } return sp.val.substring(id_start, sp.pos); } return null; }
// in core/src/java/org/apache/solr/search/NestedQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) {
  return new QParser(qstr, localParams, params, req) {
    QParser baseParser;
    ValueSource vs;
    String b;
    @Override public Query parse() throws ParseException {
      baseParser = subQuery(localParams.get(QueryParsing.V), null);
      return baseParser.getQuery();
    }
    @Override public String[] getDefaultHighlightFields() {
      return baseParser.getDefaultHighlightFields();
    }
    @Override public Query getHighlightQuery() throws ParseException {
      return baseParser.getHighlightQuery();
    }
    @Override public void addDebugInfo(NamedList<Object> debugInfo) {
      // encapsulate base debug info in a sub-list?
      baseParser.addDebugInfo(debugInfo);
    }
  };
}
// in core/src/java/org/apache/solr/search/NestedQParserPlugin.java
@Override public Query parse() throws ParseException { baseParser = subQuery(localParams.get(QueryParsing.V), null); return baseParser.getQuery(); }
// in core/src/java/org/apache/solr/search/NestedQParserPlugin.java
@Override public Query getHighlightQuery() throws ParseException { return baseParser.getHighlightQuery(); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static int parseLocalParams(String txt, int start, Map<String, String> target, SolrParams params) throws ParseException { return parseLocalParams(txt, start, target, params, LOCALPARAM_START, LOCALPARAM_END); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static int parseLocalParams(String txt, int start, Map<String, String> target, SolrParams params, String startString, char endChar) throws ParseException {
  int off = start;
  if (!txt.startsWith(startString, off)) return start;
  StrParser p = new StrParser(txt, start, txt.length());
  p.pos += startString.length(); // skip over "{!"
  for (; ;) {
    /*
    if (p.pos>=txt.length()) {
      throw new ParseException("Missing '}' parsing local params '" + txt + '"');
    }
    */
    char ch = p.peek();
    if (ch == endChar) {
      return p.pos + 1;
    }
    String id = p.getId();
    if (id.length() == 0) {
      throw new ParseException("Expected ending character '" + endChar + "' parsing local params '" + txt + '"');
    }
    String val = null;
    ch = p.peek();
    if (ch != '=') {
      // single word... treat {!func} as type=func for easy lookup
      val = id;
      id = TYPE;
    } else {
      // saw equals, so read value
      p.pos++;
      ch = p.peek();
      boolean deref = false;
      if (ch == '$') {
        p.pos++;
        ch = p.peek();
        deref = true; // dereference whatever value is read by treating it as a variable name
      }
      if (ch == '\"' || ch == '\'') {
        val = p.getQuotedString();
      } else {
        // read unquoted literal ended by whitespace or endChar (normally '}')
        // there is no escaping.
        int valStart = p.pos;
        for (; ;) {
          if (p.pos >= p.end) {
            throw new ParseException("Missing end to unquoted value starting at " + valStart + " str='" + txt + "'");
          }
          char c = p.val.charAt(p.pos);
          if (c == endChar || Character.isWhitespace(c)) {
            val = p.val.substring(valStart, p.pos);
            break;
          }
          p.pos++;
        }
      }
      if (deref) {
        // dereference parameter
        if (params != null) {
          val = params.get(val);
        }
      }
    }
    if (target != null) target.put(id, val);
  }
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static SolrParams getLocalParams(String txt, SolrParams params) throws ParseException {
  if (txt == null || !txt.startsWith(LOCALPARAM_START)) {
    return null;
  }
  Map<String, String> localParams = new HashMap<String, String>();
  int start = QueryParsing.parseLocalParams(txt, 0, localParams, params);
  String val = localParams.get(V);
  if (val == null) {
    val = txt.substring(start);
    localParams.put(V, val);
  } else {
    // localParams.put(VAL_EXPLICIT, "true");
  }
  return new MapSolrParams(localParams);
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
void expect(String s) throws ParseException { eatws(); int slen = s.length(); if (val.regionMatches(pos, s, 0, slen)) { pos += slen; } else { throw new ParseException("Expected '" + s + "' at position " + pos + " in '" + val + "'"); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
float getFloat() throws ParseException { eatws(); char[] arr = new char[end - pos]; int i; for (i = 0; i < arr.length; i++) { char ch = val.charAt(pos); if ((ch >= '0' && ch <= '9') || ch == '+' || ch == '-' || ch == '.' || ch == 'e' || ch == 'E' ) { pos++; arr[i] = ch; } else { break; } } return Float.parseFloat(new String(arr, 0, i)); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
Number getNumber() throws ParseException { eatws(); int start = pos; boolean flt = false; while (pos < end) { char ch = val.charAt(pos); if ((ch >= '0' && ch <= '9') || ch == '+' || ch == '-') { pos++; } else if (ch == '.' || ch =='e' || ch=='E') { flt = true; pos++; } else { break; } } String v = val.substring(start,pos); if (flt) { return Double.parseDouble(v); } else { return Long.parseLong(v); } }
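getNumber() makes a single scan and only remembers whether it saw '.', 'e' or 'E'; that one flag decides between Long.parseLong and Double.parseDouble, so "42" comes back as a Long and "4.2" as a Double. The same dispatch in isolation (a simplified sketch, not Solr's scanner; it assumes the numeric token has already been extracted):

    public class NumberDispatch {
      static Number parse(String v) {
        boolean flt = v.indexOf('.') >= 0 || v.indexOf('e') >= 0 || v.indexOf('E') >= 0;
        return flt ? (Number) Double.parseDouble(v) : (Number) Long.parseLong(v);
      }
      public static void main(String[] args) {
        System.out.println(parse("42").getClass().getSimpleName());  // Long
        System.out.println(parse("4.2").getClass().getSimpleName()); // Double
      }
    }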
// in core/src/java/org/apache/solr/search/QueryParsing.java
double getDouble() throws ParseException { eatws(); char[] arr = new char[end - pos]; int i; for (i = 0; i < arr.length; i++) { char ch = val.charAt(pos); if ((ch >= '0' && ch <= '9') || ch == '+' || ch == '-' || ch == '.' || ch == 'e' || ch == 'E' ) { pos++; arr[i] = ch; } else { break; } } return Double.parseDouble(new String(arr, 0, i)); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
int getInt() throws ParseException { eatws(); char[] arr = new char[end - pos]; int i; for (i = 0; i < arr.length; i++) { char ch = val.charAt(pos); if ((ch >= '0' && ch <= '9') || ch == '+' || ch == '-' ) { pos++; arr[i] = ch; } else { break; } } return Integer.parseInt(new String(arr, 0, i)); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getId() throws ParseException { return getId("Expected identifier"); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getId(String errMessage) throws ParseException {
  eatws();
  int id_start = pos;
  char ch;
  if (pos < end && (ch = val.charAt(pos)) != '$' && Character.isJavaIdentifierStart(ch)) {
    pos++;
    while (pos < end) {
      ch = val.charAt(pos);
      // if (!Character.isJavaIdentifierPart(ch) && ch != '.' && ch != ':') {
      if (!Character.isJavaIdentifierPart(ch) && ch != '.') {
        break;
      }
      pos++;
    }
    return val.substring(id_start, pos);
  }
  if (errMessage != null) {
    throw new ParseException(errMessage + " at pos " + pos + " str='" + val + "'");
  }
  return null;
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
public String getGlobbedId(String errMessage) throws ParseException { eatws(); int id_start = pos; char ch; if (pos < end && (ch = val.charAt(pos)) != '$' && (Character.isJavaIdentifierStart(ch) || ch=='?' || ch=='*')) { pos++; while (pos < end) { ch = val.charAt(pos); if (!(Character.isJavaIdentifierPart(ch) || ch=='?' || ch=='*') && ch != '.') { break; } pos++; } return val.substring(id_start, pos); } if (errMessage != null) { throw new ParseException(errMessage + " at pos " + pos + " str='" + val + "'"); } return null; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
Boolean getSortDirection() throws ParseException {
  final int startPos = pos;
  final String order = getId(null);
  Boolean top = null;
  if (null != order) {
    if ("desc".equals(order) || "top".equals(order)) {
      top = true;
    } else if ("asc".equals(order) || "bottom".equals(order)) {
      top = false;
    }
    // it's not a legal direction if more stuff comes after it
    eatws();
    final char c = ch();
    if (0 == c) {
      // :NOOP
    } else if (',' == c) {
      pos++;
    } else {
      top = null;
    }
  }
  if (null == top) pos = startPos; // no direction, reset
  return top;
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getQuotedString() throws ParseException {
  eatws();
  char delim = peekChar();
  if (!(delim == '\"' || delim == '\'')) {
    return null;
  }
  int val_start = ++pos;
  StringBuilder sb = new StringBuilder(); // needed for escaping
  for (; ;) {
    if (pos >= end) {
      throw new ParseException("Missing end quote for string at pos " + (val_start - 1) + " str='" + val + "'");
    }
    char ch = val.charAt(pos);
    if (ch == '\\') {
      pos++;
      if (pos >= end) break;
      ch = val.charAt(pos);
      switch (ch) {
        case 'n': ch = '\n'; break;
        case 't': ch = '\t'; break;
        case 'r': ch = '\r'; break;
        case 'b': ch = '\b'; break;
        case 'f': ch = '\f'; break;
        case 'u':
          if (pos + 4 >= end) {
            throw new ParseException("bad unicode escape \\uxxxx at pos" + (val_start - 1) + " str='" + val + "'");
          }
          ch = (char) Integer.parseInt(val.substring(pos + 1, pos + 5), 16);
          pos += 4;
          break;
      }
    } else if (ch == delim) {
      pos++; // skip over the quote
      break;
    }
    sb.append(ch);
    pos++;
  }
  return sb.toString();
}
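Worth noting in getQuotedString(): a backslash-u escape consumes exactly four hex digits, which are decoded with Integer.parseInt(hex, 16) and narrowed to char. The decode step on its own:

    public class UnicodeEscapeDemo {
      public static void main(String[] args) {
        String raw = "\\u0041"; // six characters: a backslash, 'u', and four hex digits
        char decoded = (char) Integer.parseInt(raw.substring(2, 6), 16);
        System.out.println(decoded); // prints: A
      }
    }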
// in core/src/java/org/apache/solr/search/grouping/distributed/command/QueryCommand.java
public Builder setQuery(String groupQueryString, SolrQueryRequest request) throws ParseException { QParser parser = QParser.getParser(groupQueryString, null, request); this.queryString = groupQueryString; return setQuery(parser.getQuery()); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { public Query parse() throws ParseException { String fromField = getParam("from"); String fromIndex = getParam("fromIndex"); String toField = getParam("to"); String v = localParams.get("v"); Query fromQuery; long fromCoreOpenTime = 0; if (fromIndex != null && !fromIndex.equals(req.getCore().getCoreDescriptor().getName()) ) { CoreContainer container = req.getCore().getCoreDescriptor().getCoreContainer(); final SolrCore fromCore = container.getCore(fromIndex); RefCounted<SolrIndexSearcher> fromHolder = null; if (fromCore == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cross-core join: no such core " + fromIndex); } LocalSolrQueryRequest otherReq = new LocalSolrQueryRequest(fromCore, params); try { QParser parser = QParser.getParser(v, "lucene", otherReq); fromQuery = parser.getQuery(); fromHolder = fromCore.getRegisteredSearcher(); if (fromHolder != null) fromCoreOpenTime = fromHolder.get().getOpenTime(); } finally { otherReq.close(); fromCore.close(); if (fromHolder != null) fromHolder.decref(); } } else { QParser fromQueryParser = subQuery(v, null); fromQuery = fromQueryParser.getQuery(); } JoinQuery jq = new JoinQuery(fromField, toField, fromIndex, fromQuery); jq.fromCoreOpenTime = fromCoreOpenTime; return jq; } }; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public Query parse() throws ParseException { String fromField = getParam("from"); String fromIndex = getParam("fromIndex"); String toField = getParam("to"); String v = localParams.get("v"); Query fromQuery; long fromCoreOpenTime = 0; if (fromIndex != null && !fromIndex.equals(req.getCore().getCoreDescriptor().getName()) ) { CoreContainer container = req.getCore().getCoreDescriptor().getCoreContainer(); final SolrCore fromCore = container.getCore(fromIndex); RefCounted<SolrIndexSearcher> fromHolder = null; if (fromCore == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cross-core join: no such core " + fromIndex); } LocalSolrQueryRequest otherReq = new LocalSolrQueryRequest(fromCore, params); try { QParser parser = QParser.getParser(v, "lucene", otherReq); fromQuery = parser.getQuery(); fromHolder = fromCore.getRegisteredSearcher(); if (fromHolder != null) fromCoreOpenTime = fromHolder.get().getOpenTime(); } finally { otherReq.close(); fromCore.close(); if (fromHolder != null) fromHolder.decref(); } } else { QParser fromQueryParser = subQuery(v, null); fromQuery = fromQueryParser.getQuery(); } JoinQuery jq = new JoinQuery(fromField, toField, fromIndex, fromQuery); jq.fromCoreOpenTime = fromCoreOpenTime; return jq; }
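The cross-core branch above is also a good example of ref-counted cleanup under exceptions: the borrowed SolrCore, the temporary request and the registered searcher are all released in a finally block, so a ParseException inside getQuery() cannot leak them. The bare shape of that pattern (Resource is a stand-in class, not a Solr type):

    class Resource implements AutoCloseable {
      final String name;
      Resource(String name) { this.name = name; System.out.println("open " + name); }
      @Override public void close() { System.out.println("close " + name); }
    }

    public class CleanupDemo {
      public static void main(String[] args) {
        Resource core = new Resource("fromCore");
        Resource searcher = null;
        try {
          searcher = new Resource("searcher");
          throw new RuntimeException("parse failed"); // simulate a failing sub-parse
        } catch (RuntimeException e) {
          System.out.println("caught: " + e.getMessage());
        } finally {
          // released even when parsing throws, mirroring the finally above
          if (searcher != null) searcher.close();
          core.close();
        }
      }
    }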
// in core/src/java/org/apache/solr/search/PrefixQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { @Override public Query parse() throws ParseException { return new PrefixQuery(new Term(localParams.get(QueryParsing.F), localParams.get(QueryParsing.V))); } }; }
// in core/src/java/org/apache/solr/search/PrefixQParserPlugin.java
@Override public Query parse() throws ParseException { return new PrefixQuery(new Term(localParams.get(QueryParsing.F), localParams.get(QueryParsing.V))); }
// in core/src/java/org/apache/solr/search/LuceneQParserPlugin.java
@Override public Query parse() throws ParseException { String qstr = getString(); if (qstr == null || qstr.length()==0) return null; String defaultField = getParam(CommonParams.DF); if (defaultField==null) { defaultField = getReq().getSchema().getDefaultSearchFieldName(); } lparser = new SolrQueryParser(this, defaultField); lparser.setDefaultOperator (QueryParsing.getQueryParserDefaultOperator(getReq().getSchema(), getParam(QueryParsing.OP))); return lparser.parse(qstr); }
// in core/src/java/org/apache/solr/search/LuceneQParserPlugin.java
@Override public Query parse() throws ParseException {
  // handle legacy "query;sort" syntax
  if (getLocalParams() == null) {
    String qstr = getString();
    if (qstr == null || qstr.length() == 0) return null;
    sortStr = getParams().get(CommonParams.SORT);
    if (sortStr == null) {
      // sort may be legacy form, included in the query string
      List<String> commands = StrUtils.splitSmart(qstr, ';');
      if (commands.size() == 2) {
        qstr = commands.get(0);
        sortStr = commands.get(1);
      } else if (commands.size() == 1) {
        // This is needed to support the case where someone sends: "q=query;"
        qstr = commands.get(0);
      } else if (commands.size() > 2) {
        throw new ParseException("If you want to use multiple ';' in the query, use the 'sort' param.");
      }
    }
    setString(qstr);
  }
  return super.parse();
}
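The legacy syntax handled above packs the sort after the query, separated by a semicolon. A stripped-down demonstration of the split (StrUtils.splitSmart additionally respects quoting and escapes, which a plain split() does not):

    import java.util.*;

    public class LegacySortDemo {
      public static void main(String[] args) {
        String qstr = "name:solr;price desc";
        List<String> commands = Arrays.asList(qstr.split(";"));
        System.out.println("q:    " + commands.get(0)); // name:solr
        System.out.println("sort: " + commands.get(1)); // price desc
      }
    }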
// in core/src/java/org/apache/solr/search/LuceneQParserPlugin.java
@Override public SortSpec getSort(boolean useGlobal) throws ParseException { SortSpec sort = super.getSort(useGlobal); if (sortStr != null && sortStr.length()>0 && sort.getSort()==null) { Sort oldSort = QueryParsing.parseSort(sortStr, getReq()); if( oldSort != null ) { sort.sort = oldSort; } } return sort; }
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getFieldQuery(String field, String queryText, boolean quoted) throws ParseException {
  checkNullField(field);
  // intercept magic field name of "_" to use as a hook for our
  // own functions.
  if (field.charAt(0) == '_' && parser != null) {
    MagicFieldName magic = MagicFieldName.get(field);
    if (null != magic) {
      QParser nested = parser.subQuery(queryText, magic.subParser);
      return nested.getQuery();
    }
  }
  SchemaField sf = schema.getFieldOrNull(field);
  if (sf != null) {
    FieldType ft = sf.getType();
    // delegate to type for everything except tokenized fields
    if (ft.isTokenized()) {
      return super.getFieldQuery(field, queryText, quoted || (ft instanceof TextField && ((TextField)ft).getAutoGeneratePhraseQueries()));
    } else {
      return sf.getType().getFieldQuery(parser, sf, queryText);
    }
  }
  // default to a normal field query
  return super.getFieldQuery(field, queryText, quoted);
}
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getRangeQuery(String field, String part1, String part2, boolean startInclusive, boolean endInclusive) throws ParseException { checkNullField(field); SchemaField sf = schema.getField(field); return sf.getType().getRangeQuery(parser, sf, part1, part2, startInclusive, endInclusive); }
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getPrefixQuery(String field, String termStr) throws ParseException {
  checkNullField(field);
  termStr = analyzeIfMultitermTermText(field, termStr, schema.getFieldType(field));
  // Solr has always used constant scoring for prefix queries. This should return constant scoring by default.
  return newPrefixQuery(new Term(field, termStr));
}
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getWildcardQuery(String field, String termStr) throws ParseException {
  // *:* -> MatchAllDocsQuery
  if ("*".equals(field) && "*".equals(termStr)) {
    return newMatchAllDocsQuery();
  }
  FieldType fieldType = schema.getFieldType(field);
  termStr = analyzeIfMultitermTermText(field, termStr, fieldType);
  // can we use reversed wildcards in this field?
  ReversedWildcardFilterFactory factory = getReversedWildcardFilterFactory(fieldType);
  if (factory != null) {
    Term term = new Term(field, termStr);
    // fsa representing the query
    Automaton automaton = WildcardQuery.toAutomaton(term);
    // TODO: we should likely use the automaton to calculate shouldReverse, too.
    if (factory.shouldReverse(termStr)) {
      automaton = BasicOperations.concatenate(automaton, BasicAutomata.makeChar(factory.getMarkerChar()));
      SpecialOperations.reverse(automaton);
    } else {
      // reverse wildcardfilter is active: remove false positives
      // fsa representing false positives (markerChar*)
      Automaton falsePositives = BasicOperations.concatenate(BasicAutomata.makeChar(factory.getMarkerChar()), BasicAutomata.makeAnyString());
      // subtract these away
      automaton = BasicOperations.minus(automaton, falsePositives);
    }
    return new AutomatonQuery(term, automaton) {
      // override toString so it's completely transparent
      @Override public String toString(String field) {
        StringBuilder buffer = new StringBuilder();
        if (!getField().equals(field)) {
          buffer.append(getField());
          buffer.append(":");
        }
        buffer.append(term.text());
        buffer.append(ToStringUtils.boost(getBoost()));
        return buffer.toString();
      }
    };
  }
  // Solr has always used constant scoring for wildcard queries. This should return constant scoring by default.
  return newWildcardQuery(new Term(field, termStr));
}
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getRegexpQuery(String field, String termStr) throws ParseException { termStr = analyzeIfMultitermTermText(field, termStr, schema.getFieldType(field)); return newRegexpQuery(new Term(field, termStr)); }
// in core/src/java/org/apache/solr/search/BoostQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) {
  return new QParser(qstr, localParams, params, req) {
    QParser baseParser;
    ValueSource vs;
    String b;
    @Override public Query parse() throws ParseException {
      b = localParams.get(BOOSTFUNC);
      baseParser = subQuery(localParams.get(QueryParsing.V), null);
      Query q = baseParser.getQuery();
      if (b == null) return q;
      Query bq = subQuery(b, FunctionQParserPlugin.NAME).getQuery();
      if (bq instanceof FunctionQuery) {
        vs = ((FunctionQuery)bq).getValueSource();
      } else {
        vs = new QueryValueSource(bq, 0.0f);
      }
      return new BoostedQuery(q, vs);
    }
    @Override public String[] getDefaultHighlightFields() {
      return baseParser.getDefaultHighlightFields();
    }
    @Override public Query getHighlightQuery() throws ParseException {
      return baseParser.getHighlightQuery();
    }
    @Override public void addDebugInfo(NamedList<Object> debugInfo) {
      // encapsulate base debug info in a sub-list?
      baseParser.addDebugInfo(debugInfo);
      debugInfo.add("boost_str", b);
      debugInfo.add("boost_parsed", vs);
    }
  };
}
// in core/src/java/org/apache/solr/search/BoostQParserPlugin.java
@Override public Query parse() throws ParseException { b = localParams.get(BOOSTFUNC); baseParser = subQuery(localParams.get(QueryParsing.V), null); Query q = baseParser.getQuery(); if (b == null) return q; Query bq = subQuery(b, FunctionQParserPlugin.NAME).getQuery(); if (bq instanceof FunctionQuery) { vs = ((FunctionQuery)bq).getValueSource(); } else { vs = new QueryValueSource(bq, 0.0f); } return new BoostedQuery(q, vs); }
// in core/src/java/org/apache/solr/search/BoostQParserPlugin.java
@Override public Query getHighlightQuery() throws ParseException { return baseParser.getHighlightQuery(); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException {
  // TODO: dispatch through SpatialQueriable in the future?
  List<ValueSource> sources = fp.parseValueSourceList();
  // "m" is a multi-value source, "x" is a single-value source
  // allow (m,m) (m,x,x) (x,x,m) (x,x,x,x)
  // if not enough points are present, "pt" will be checked first, followed by "sfield".
  MultiValueSource mv1 = null;
  MultiValueSource mv2 = null;
  if (sources.size() == 0) {
    // nothing to do now
  } else if (sources.size() == 1) {
    ValueSource vs = sources.get(0);
    if (!(vs instanceof MultiValueSource)) {
      throw new ParseException("geodist - invalid parameters:" + sources);
    }
    mv1 = (MultiValueSource)vs;
  } else if (sources.size() == 2) {
    ValueSource vs1 = sources.get(0);
    ValueSource vs2 = sources.get(1);
    if (vs1 instanceof MultiValueSource && vs2 instanceof MultiValueSource) {
      mv1 = (MultiValueSource)vs1;
      mv2 = (MultiValueSource)vs2;
    } else {
      mv1 = makeMV(sources, sources);
    }
  } else if (sources.size() == 3) {
    ValueSource vs1 = sources.get(0);
    ValueSource vs2 = sources.get(1);
    if (vs1 instanceof MultiValueSource) {
      // (m,x,x)
      mv1 = (MultiValueSource)vs1;
      mv2 = makeMV(sources.subList(1, 3), sources);
    } else {
      // (x,x,m)
      mv1 = makeMV(sources.subList(0, 2), sources);
      vs1 = sources.get(2);
      if (!(vs1 instanceof MultiValueSource)) {
        throw new ParseException("geodist - invalid parameters:" + sources);
      }
      mv2 = (MultiValueSource)vs1;
    }
  } else if (sources.size() == 4) {
    mv1 = makeMV(sources.subList(0, 2), sources);
    mv2 = makeMV(sources.subList(2, 4), sources);
  } else if (sources.size() > 4) {
    throw new ParseException("geodist - invalid parameters:" + sources);
  }
  if (mv1 == null) {
    mv1 = parsePoint(fp);
    mv2 = parseSfield(fp);
  } else if (mv2 == null) {
    mv2 = parsePoint(fp);
    if (mv2 == null) mv2 = parseSfield(fp);
  }
  if (mv1 == null || mv2 == null) {
    throw new ParseException("geodist - not enough parameters:" + sources);
  }
  // We have all the parameters at this point, now check if one of the points is constant
  double[] constants;
  constants = getConstants(mv1);
  MultiValueSource other = mv2;
  if (constants == null) {
    constants = getConstants(mv2);
    other = mv1;
  }
  if (constants != null && other instanceof VectorValueSource) {
    return new HaversineConstFunction(constants[0], constants[1], (VectorValueSource)other);
  }
  return new HaversineFunction(mv1, mv2, DistanceUtils.EARTH_MEAN_RADIUS_KM, true);
}
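Once the two endpoints are resolved, the function this factory returns evaluates the standard haversine great-circle distance per document. The formula in isolation, for reference (6371.0087 km approximates the EARTH_MEAN_RADIUS_KM constant used above):

    public class HaversineDemo {
      static double distKm(double lat1, double lon1, double lat2, double lon2) {
        double r = 6371.0087; // mean earth radius in km
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.pow(Math.sin(dLat / 2), 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.pow(Math.sin(dLon / 2), 2);
        return 2 * r * Math.asin(Math.sqrt(a));
      }
      public static void main(String[] args) {
        // Paris to Berlin, roughly 880 km
        System.out.printf("%.0f km%n", distKm(48.8566, 2.3522, 52.52, 13.405));
      }
    }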
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static VectorValueSource makeMV(List<ValueSource> sources, List<ValueSource> orig) throws ParseException { ValueSource vs1 = sources.get(0); ValueSource vs2 = sources.get(1); if (vs1 instanceof MultiValueSource || vs2 instanceof MultiValueSource) { throw new ParseException("geodist - invalid parameters:" + orig); } return new VectorValueSource(sources); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static MultiValueSource parsePoint(FunctionQParser fp) throws ParseException { String pt = fp.getParam(SpatialParams.POINT); if (pt == null) return null; double[] point = null; try { point = ParseUtils.parseLatitudeLongitude(pt); } catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); } return new VectorValueSource(Arrays.<ValueSource>asList(new DoubleConstValueSource(point[0]),new DoubleConstValueSource(point[1]))); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static MultiValueSource parseSfield(FunctionQParser fp) throws ParseException { String sfield = fp.getParam(SpatialParams.FIELD); if (sfield == null) return null; SchemaField sf = fp.getReq().getSchema().getField(sfield); ValueSource vs = sf.getType().getValueSource(sf, fp); if (!(vs instanceof MultiValueSource)) { throw new ParseException("Spatial field must implement MultiValueSource:" + sf); } return (MultiValueSource)vs; }
// in core/src/java/org/apache/solr/search/FunctionRangeQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) {
  return new QParser(qstr, localParams, params, req) {
    ValueSource vs;
    String funcStr;
    @Override public Query parse() throws ParseException {
      funcStr = localParams.get(QueryParsing.V, null);
      Query funcQ = subQuery(funcStr, FunctionQParserPlugin.NAME).getQuery();
      if (funcQ instanceof FunctionQuery) {
        vs = ((FunctionQuery)funcQ).getValueSource();
      } else {
        vs = new QueryValueSource(funcQ, 0.0f);
      }
      String l = localParams.get("l");
      String u = localParams.get("u");
      boolean includeLower = localParams.getBool("incl", true);
      boolean includeUpper = localParams.getBool("incu", true);
      // TODO: add a score=val option to allow score to be the value
      ValueSourceRangeFilter rf = new ValueSourceRangeFilter(vs, l, u, includeLower, includeUpper);
      FunctionRangeQuery frq = new FunctionRangeQuery(rf);
      return frq;
    }
  };
}
// in core/src/java/org/apache/solr/search/FunctionRangeQParserPlugin.java
@Override public Query parse() throws ParseException {
  funcStr = localParams.get(QueryParsing.V, null);
  Query funcQ = subQuery(funcStr, FunctionQParserPlugin.NAME).getQuery();
  if (funcQ instanceof FunctionQuery) {
    vs = ((FunctionQuery)funcQ).getValueSource();
  } else {
    vs = new QueryValueSource(funcQ, 0.0f);
  }
  String l = localParams.get("l");
  String u = localParams.get("u");
  boolean includeLower = localParams.getBool("incl", true);
  boolean includeUpper = localParams.getBool("incu", true);
  // TODO: add a score=val option to allow score to be the value
  ValueSourceRangeFilter rf = new ValueSourceRangeFilter(vs, l, u, includeLower, includeUpper);
  FunctionRangeQuery frq = new FunctionRangeQuery(rf);
  return frq;
}
// in core/src/java/org/apache/solr/search/RawQParserPlugin.java
Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { @Override public Query parse() throws ParseException { return new TermQuery(new Term(localParams.get(QueryParsing.F), localParams.get(QueryParsing.V))); } }; }
// in core/src/java/org/apache/solr/search/RawQParserPlugin.java
Override public Query parse() throws ParseException { return new TermQuery(new Term(localParams.get(QueryParsing.F), localParams.get(QueryParsing.V))); }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
Override public Query parse() throws ParseException { sp = new QueryParsing.StrParser(getString()); ValueSource vs = null; List<ValueSource> lst = null; for(;;) { ValueSource valsource = parseValueSource(false); sp.eatws(); if (!parseMultipleSources) { vs = valsource; break; } else { if (lst != null) { lst.add(valsource); } else { vs = valsource; } } // check if there is a "," separator if (sp.peek() != ',') break; consumeArgumentDelimiter(); if (lst == null) { lst = new ArrayList<ValueSource>(2); lst.add(valsource); } } if (parseToEnd && sp.pos < sp.end) { throw new ParseException("Unexpected text after function: " + sp.val.substring(sp.pos, sp.end)); } if (lst != null) { vs = new VectorValueSource(lst); } return new FunctionQuery(vs); }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public boolean hasMoreArguments() throws ParseException { int ch = sp.peek(); /* determine whether the function is ending with a paren or end of str */ return (! (ch == 0 || ch == ')') ); }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public String parseId() throws ParseException { String value = parseArg(); if (argWasQuoted) throw new ParseException("Expected identifier instead of quoted string:" + value); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public Float parseFloat() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected float instead of quoted string:" + str); float value = Float.parseFloat(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public double parseDouble() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected double instead of quoted string:" + str); double value = Double.parseDouble(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public int parseInt() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected int instead of quoted string:" + str); int value = Integer.parseInt(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public String parseArg() throws ParseException { argWasQuoted = false; sp.eatws(); char ch = sp.peek(); String val = null; switch (ch) { case ')': return null; case '$': sp.pos++; String param = sp.getId(); val = getParam(param); break; case '\'': case '"': val = sp.getQuotedString(); argWasQuoted = true; break; default: // read unquoted literal ended by whitespace ',' or ')' // there is no escaping. int valStart = sp.pos; for (;;) { if (sp.pos >= sp.end) { throw new ParseException("Missing end to unquoted value starting at " + valStart + " str='" + sp.val +"'"); } char c = sp.val.charAt(sp.pos); if (c==')' || c==',' || Character.isWhitespace(c)) { val = sp.val.substring(valStart, sp.pos); break; } sp.pos++; } } sp.eatws(); consumeArgumentDelimiter(); return val; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public List<ValueSource> parseValueSourceList() throws ParseException { List<ValueSource> sources = new ArrayList<ValueSource>(3); while (hasMoreArguments()) { sources.add(parseValueSource(true)); } return sources; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public ValueSource parseValueSource() throws ParseException { /* consume the delimiter afterward for an external call to parseValueSource */ return parseValueSource(true); }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public Query parseNestedQuery() throws ParseException { Query nestedQuery; if (sp.opt("$")) { String param = sp.getId(); String qstr = getParam(param); qstr = qstr==null ? "" : qstr; nestedQuery = subQuery(qstr, null).getQuery(); } else { int start = sp.pos; String v = sp.val; String qs = v; HashMap nestedLocalParams = new HashMap<String,String>(); int end = QueryParsing.parseLocalParams(qs, start, nestedLocalParams, getParams()); QParser sub; if (end>start) { if (nestedLocalParams.get(QueryParsing.V) != null) { // value specified directly in local params... so the end of the // query should be the end of the local params. sub = subQuery(qs.substring(start, end), null); } else { // value here is *after* the local params... ask the parser. sub = subQuery(qs, null); // int subEnd = sub.findEnd(')'); // TODO.. implement functions to find the end of a nested query throw new ParseException("Nested local params must have value in v parameter. got '" + qs + "'"); } } else { throw new ParseException("Nested function query must use $param or {!v=value} forms. got '" + qs + "'"); } sp.pos += end-start; // advance past nested query nestedQuery = sub.getQuery(); } consumeArgumentDelimiter(); return nestedQuery; }
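As the two ParseException messages enforce, a nested query must carry its text either through a $param dereference or through an explicit v= local param. Hedged examples (qq is a hypothetical request parameter):

    public class NestedQueryForms {
      public static void main(String[] args) {
        System.out.println("query($qq)");                      // $param form
        System.out.println("query({!dismax v='solr rocks'})"); // explicit v= form
        // rejected: local params with trailing text, e.g. query({!dismax}solr rocks)
      }
    }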
// in core/src/java/org/apache/solr/search/FunctionQParser.java
protected ValueSource parseValueSource(boolean doConsumeDelimiter) throws ParseException { ValueSource valueSource; int ch = sp.peek(); if (ch>='0' && ch<='9' || ch=='.' || ch=='+' || ch=='-') { Number num = sp.getNumber(); if (num instanceof Long) { valueSource = new LongConstValueSource(num.longValue()); } else if (num instanceof Double) { valueSource = new DoubleConstValueSource(num.doubleValue()); } else { // shouldn't happen valueSource = new ConstValueSource(num.floatValue()); } } else if (ch == '"' || ch == '\''){ valueSource = new LiteralValueSource(sp.getQuotedString()); } else if (ch == '$') { sp.pos++; String param = sp.getId(); String val = getParam(param); if (val == null) { throw new ParseException("Missing param " + param + " while parsing function '" + sp.val + "'"); } QParser subParser = subQuery(val, "func"); if (subParser instanceof FunctionQParser) { ((FunctionQParser)subParser).setParseMultipleSources(true); } Query subQuery = subParser.getQuery(); if (subQuery instanceof FunctionQuery) { valueSource = ((FunctionQuery) subQuery).getValueSource(); } else { valueSource = new QueryValueSource(subQuery, 0.0f); } /*** // dereference *simple* argument (i.e., can't currently be a function) // In the future we could support full function dereferencing via a stack of ValueSource (or StringParser) objects ch = val.length()==0 ? '\0' : val.charAt(0); if (ch>='0' && ch<='9' || ch=='.' || ch=='+' || ch=='-') { QueryParsing.StrParser sp = new QueryParsing.StrParser(val); Number num = sp.getNumber(); if (num instanceof Long) { valueSource = new LongConstValueSource(num.longValue()); } else if (num instanceof Double) { valueSource = new DoubleConstValueSource(num.doubleValue()); } else { // shouldn't happen valueSource = new ConstValueSource(num.floatValue()); } } else if (ch == '"' || ch == '\'') { QueryParsing.StrParser sp = new QueryParsing.StrParser(val); val = sp.getQuotedString(); valueSource = new LiteralValueSource(val); } else { if (val.length()==0) { valueSource = new LiteralValueSource(val); } else { String id = val; SchemaField f = req.getSchema().getField(id); valueSource = f.getType().getValueSource(f, this); } } ***/ } else { String id = sp.getId(); if (sp.opt("(")) { // a function... look it up. ValueSourceParser argParser = req.getCore().getValueSourceParser(id); if (argParser==null) { throw new ParseException("Unknown function " + id + " in FunctionQuery(" + sp + ")"); } valueSource = argParser.parse(this); sp.expect(")"); } else { if ("true".equals(id)) { valueSource = new BoolConstValueSource(true); } else if ("false".equals(id)) { valueSource = new BoolConstValueSource(false); } else { SchemaField f = req.getSchema().getField(id); valueSource = f.getType().getValueSource(f, this); } } } if (doConsumeDelimiter) consumeArgumentDelimiter(); return valueSource; }
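parseValueSource() dispatches on the first character: digits and signs become numeric constants, quotes become literals, $ dereferences a parameter, an id followed by ( is looked up as a registered function, and a bare id resolves to the field's ValueSource. A sketch of inputs reaching each branch (myfield, the sum arguments and the qq parameter are illustrative):

    public class ValueSourceForms {
      public static void main(String[] args) {
        String[] forms = {
          "3.14",           // number: Long/DoubleConstValueSource
          "'text'",         // quoted: LiteralValueSource
          "$qq",            // parameter dereference, parsed as a sub-function
          "sum(1,myfield)", // registered function via getValueSourceParser
          "true",           // BoolConstValueSource
          "myfield"         // plain id: the field type's ValueSource
        };
        for (String f : forms) System.out.println(f);
      }
    }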
// in core/src/java/org/apache/solr/search/FunctionQParser.java
protected boolean consumeArgumentDelimiter() throws ParseException { /* if a list of args is ending, don't expect the comma */ if (hasMoreArguments()) { sp.expect(","); return true; } return false; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { final ValueSource source = fp.parseValueSource(); return new TestValueSource(source); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { String field = fp.parseId(); return new OrdFieldSource(field); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new LiteralValueSource(fp.getString()); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { String field = fp.parseId(); return new ReverseOrdFieldSource(field); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { // top(vs) is now a no-op ValueSource source = fp.parseValueSource(); return source; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); float slope = fp.parseFloat(); float intercept = fp.parseFloat(); return new LinearFloatFunction(source, slope, intercept); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); float m = fp.parseFloat(); float a = fp.parseFloat(); float b = fp.parseFloat(); return new ReciprocalFloatFunction(source, m, a, b); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); float min = fp.parseFloat(); float max = fp.parseFloat(); return new ScaleFloatFunction(source, min, max); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource a = fp.parseValueSource(); ValueSource b = fp.parseValueSource(); return new DivFloatFunction(a, b); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); float min = fp.parseFloat(); float max = fp.parseFloat(); float target = fp.parseFloat(); Float def = fp.hasMoreArguments() ? fp.parseFloat() : null; return new RangeMapFloatFunction(source, min, max, target, def); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); return new SimpleFloatFunction(source) { @Override protected String name() { return "abs"; } @Override protected float func(int doc, FunctionValues vals) { return Math.abs(vals.floatVal(doc)); } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new SumFloatFunction(sources.toArray(new ValueSource[sources.size()])); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new ProductFloatFunction(sources.toArray(new ValueSource[sources.size()])); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource a = fp.parseValueSource(); ValueSource b = fp.parseValueSource(); return new DualFloatFunction(a, b) { @Override protected String name() { return "sub"; } @Override protected float func(int doc, FunctionValues aVals, FunctionValues bVals) { return aVals.floatVal(doc) - bVals.floatVal(doc); } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException{ return new VectorValueSource(fp.parseValueSourceList()); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { Query q = fp.parseNestedQuery(); float defVal = 0.0f; if (fp.hasMoreArguments()) { defVal = fp.parseFloat(); } return new QueryValueSource(q, defVal); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { Query q = fp.parseNestedQuery(); ValueSource vs = fp.parseValueSource(); BoostedQuery bq = new BoostedQuery(q, vs); return new QueryValueSource(bq, 0.0f); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { String f0 = fp.parseArg(); String qf = fp.parseArg(); return new JoinDocFreqValueSource( f0, qf ); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { double radius = fp.parseDouble(); //SOLR-2114, make the convert flag required, since the parser doesn't support much in the way of lookahead or the ability to convert a String into a ValueSource boolean convert = Boolean.parseBoolean(fp.parseArg()); MultiValueSource pv1; MultiValueSource pv2; ValueSource one = fp.parseValueSource(); ValueSource two = fp.parseValueSource(); if (fp.hasMoreArguments()) { List<ValueSource> s1 = new ArrayList<ValueSource>(); s1.add(one); s1.add(two); pv1 = new VectorValueSource(s1); ValueSource x2 = fp.parseValueSource(); ValueSource y2 = fp.parseValueSource(); List<ValueSource> s2 = new ArrayList<ValueSource>(); s2.add(x2); s2.add(y2); pv2 = new VectorValueSource(s2); } else { //check to see if we have multiValue source if (one instanceof MultiValueSource && two instanceof MultiValueSource){ pv1 = (MultiValueSource) one; pv2 = (MultiValueSource) two; } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Input must either be 2 MultiValueSources, or there must be 4 ValueSources"); } } return new HaversineFunction(pv1, pv2, radius, convert); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { double radius = fp.parseDouble(); ValueSource gh1 = fp.parseValueSource(); ValueSource gh2 = fp.parseValueSource(); return new GeohashHaversineFunction(gh1, gh2, radius); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource lat = fp.parseValueSource(); ValueSource lon = fp.parseValueSource(); return new GeohashFunction(lat, lon); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource str1 = fp.parseValueSource(); ValueSource str2 = fp.parseValueSource(); String distClass = fp.parseArg(); StringDistance dist = null; if (distClass.equalsIgnoreCase("jw")) { dist = new JaroWinklerDistance(); } else if (distClass.equalsIgnoreCase("edit")) { dist = new LevensteinDistance(); } else if (distClass.equalsIgnoreCase("ngram")) { int ngram = 2; if (fp.hasMoreArguments()) { ngram = fp.parseInt(); } dist = new NGramDistance(ngram); } else { dist = fp.req.getCore().getResourceLoader().newInstance(distClass, StringDistance.class); } return new StringDistanceFunction(str1, str2, dist); }
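The third argument selects the StringDistance implementation: jw, edit, ngram (with an optional gram size), or any class name resolvable by the resource loader. Hedged examples of the resulting function-query forms (myfield is hypothetical):

    public class StrdistForms {
      public static void main(String[] args) {
        String[] forms = {
          "strdist(\"SOLR\", myfield, jw)",      // JaroWinklerDistance
          "strdist(\"SOLR\", myfield, edit)",    // LevensteinDistance
          "strdist(\"SOLR\", myfield, ngram, 3)" // NGramDistance, size 3
        };
        for (String f : forms) System.out.println(f);
      }
    }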
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { String fieldName = fp.parseArg(); SchemaField f = fp.getReq().getSchema().getField(fieldName); return f.getType().getValueSource(f, fp); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MaxFloatFunction(sources.toArray(new ValueSource[sources.size()])); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MinFloatFunction(sources.toArray(new ValueSource[sources.size()])); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); MVResult mvr = getMultiValueSources(sources); return new SquaredEuclideanFunction(mvr.mv1, mvr.mv2); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { float power = fp.parseFloat(); List<ValueSource> sources = fp.parseValueSourceList(); MVResult mvr = getMultiValueSources(sources); return new VectorDistanceFunction(power, mvr.mv1, mvr.mv2); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new DoubleConstValueSource(Math.PI); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new DoubleConstValueSource(Math.E); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new DocFreqValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new TotalTermFreqValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { String field = fp.parseArg(); return new SumTotalTermFreqValueSource(field); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new IDFValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new TermFreqValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new TFValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { String field = fp.parseArg(); return new NormValueSource(field); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new MaxDocValueSource(); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new NumDocsValueSource(); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new BoolConstValueSource(true); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new BoolConstValueSource(false); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource vs = fp.parseValueSource(); return new SimpleBoolFunction(vs) { @Override protected String name() { return "exists"; } @Override protected boolean func(int doc, FunctionValues vals) { return vals.exists(doc); } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource vs = fp.parseValueSource(); return new SimpleBoolFunction(vs) { @Override protected boolean func(int doc, FunctionValues vals) { return !vals.boolVal(doc); } @Override protected String name() { return "not"; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MultiBoolFunction(sources) { @Override protected String name() { return "and"; } @Override protected boolean func(int doc, FunctionValues[] vals) { for (FunctionValues dv : vals) if (!dv.boolVal(doc)) return false; return true; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MultiBoolFunction(sources) { @Override protected String name() { return "or"; } @Override protected boolean func(int doc, FunctionValues[] vals) { for (FunctionValues dv : vals) if (dv.boolVal(doc)) return true; return false; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MultiBoolFunction(sources) { @Override protected String name() { return "xor"; } @Override protected boolean func(int doc, FunctionValues[] vals) { int nTrue=0, nFalse=0; for (FunctionValues dv : vals) { if (dv.boolVal(doc)) nTrue++; else nFalse++; } return nTrue != 0 && nFalse != 0; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource ifValueSource = fp.parseValueSource(); ValueSource trueValueSource = fp.parseValueSource(); ValueSource falseValueSource = fp.parseValueSource(); return new IfFunction(ifValueSource, trueValueSource, falseValueSource); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new DefFunction(fp.parseValueSourceList()); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
private static TInfo parseTerm(FunctionQParser fp) throws ParseException { TInfo tinfo = new TInfo(); tinfo.indexedField = tinfo.field = fp.parseArg(); tinfo.val = fp.parseArg(); tinfo.indexedBytes = new BytesRef(); FieldType ft = fp.getReq().getSchema().getFieldTypeNoEx(tinfo.field); if (ft == null) ft = new StrField(); if (ft instanceof TextField) { // need to do analysis on the term String indexedVal = tinfo.val; Query q = ft.getFieldQuery(fp, fp.getReq().getSchema().getFieldOrNull(tinfo.field), tinfo.val); if (q instanceof TermQuery) { Term term = ((TermQuery)q).getTerm(); tinfo.indexedField = term.field(); indexedVal = term.text(); } UnicodeUtil.UTF16toUTF8(indexedVal, 0, indexedVal.length(), tinfo.indexedBytes); } else { ft.readableToIndexed(tinfo.val, tinfo.indexedBytes); } return tinfo; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { String first = fp.parseArg(); String second = fp.parseArg(); if (first == null) first = "NOW"; Date d1 = getDate(fp, first); ValueSource v1 = d1 == null ? getValueSource(fp, first) : null; Date d2 = getDate(fp, second); ValueSource v2 = d2 == null ? getValueSource(fp, second) : null; // d constant // v field // dd constant // dv subtract field from constant // vd subtract constant from field // vv subtract fields final long ms1 = (d1 == null) ? 0 : d1.getTime(); final long ms2 = (d2 == null) ? 0 : d2.getTime(); // "d,dd" handle both constant cases if (d1 != null && v2 == null) { return new LongConstValueSource(ms1 - ms2); } // "v" just the date field if (v1 != null && v2 == null && d2 == null) { return v1; } // "dv" if (d1 != null && v2 != null) return new DualFloatFunction(new LongConstValueSource(ms1), v2) { @Override protected String name() { return "ms"; } @Override protected float func(int doc, FunctionValues aVals, FunctionValues bVals) { return ms1 - bVals.longVal(doc); } }; // "vd" if (v1 != null && d2 != null) return new DualFloatFunction(v1, new LongConstValueSource(ms2)) { @Override protected String name() { return "ms"; } @Override protected float func(int doc, FunctionValues aVals, FunctionValues bVals) { return aVals.longVal(doc) - ms2; } }; // "vv" if (v1 != null && v2 != null) return new DualFloatFunction(v1, v2) { @Override protected String name() { return "ms"; } @Override protected float func(int doc, FunctionValues aVals, FunctionValues bVals) { return aVals.longVal(doc) - bVals.longVal(doc); } }; return null; // shouldn't happen }
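The comment matrix above (d = constant date, v = date field) enumerates the operand combinations the ms() function supports; hedged examples with made-up field names:

    public class MsForms {
      public static void main(String[] args) {
        String[] forms = {
          "ms(NOW)",                               // "d": a long constant
          "ms(mydatefield)",                       // "v": just the field
          "ms(NOW, mydatefield)",                  // "dv": constant minus field
          "ms(mydatefield, 2000-01-01T00:00:00Z)", // "vd": field minus constant
          "ms(a_date, b_date)"                     // "vv": field minus field
        };
        for (String f : forms) System.out.println(f);
      }
    }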
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new Function(fp.parseValueSource()); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new Function(fp.parseValueSource(), fp.parseValueSource()); }
// in core/src/java/org/apache/solr/search/FieldQParserPlugin.java
Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { @Override public Query parse() throws ParseException { String field = localParams.get(QueryParsing.F); String queryText = localParams.get(QueryParsing.V); SchemaField sf = req.getSchema().getField(field); FieldType ft = sf.getType(); return ft.getFieldQuery(this, sf, queryText); } }; }
// in core/src/java/org/apache/solr/search/FieldQParserPlugin.java
Override public Query parse() throws ParseException { String field = localParams.get(QueryParsing.F); String queryText = localParams.get(QueryParsing.V); SchemaField sf = req.getSchema().getField(field); FieldType ft = sf.getType(); return ft.getFieldQuery(this, sf, queryText); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
Override protected Query getFieldQuery(String field, String queryText, boolean quoted) throws ParseException { if (aliases.containsKey(field)) { Alias a = aliases.get(field); DisjunctionMaxQuery q = new DisjunctionMaxQuery(a.tie); /* we might not get any valid queries from delegation, * in which case we should return null */ boolean ok = false; for (String f : a.fields.keySet()) { Query sub = getFieldQuery(f,queryText,quoted); if (null != sub) { if (null != a.fields.get(f)) { sub.setBoost(a.fields.get(f)); } q.add(sub); ok = true; } } return ok ? q : null; } else { try { return super.getFieldQuery(field, queryText, quoted); } catch (Exception e) { return null; } } }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static List<Query> parseQueryStrings(SolrQueryRequest req, String[] queries) throws ParseException { if (null == queries || 0 == queries.length) return null; List<Query> out = new ArrayList<Query>(queries.length); for (String q : queries) { if (null != q && 0 != q.trim().length()) { out.add(QParser.getParser(q, null, req).getQuery()); } } return out; }
// in core/src/java/org/apache/solr/util/DateMathParser.java
public Date parseMath(String math) throws ParseException { Calendar cal = Calendar.getInstance(zone, loc); cal.setTime(getNow()); /* check for No-Op */ if (0==math.length()) { return cal.getTime(); } String[] ops = splitter.split(math); int pos = 0; while ( pos < ops.length ) { if (1 != ops[pos].length()) { throw new ParseException ("Multi character command found: \"" + ops[pos] + "\"", pos); } char command = ops[pos++].charAt(0); switch (command) { case '/': if (ops.length < pos + 1) { throw new ParseException ("Need a unit after command: \"" + command + "\"", pos); } try { round(cal, ops[pos++]); } catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); } break; case '+': /* fall through */ case '-': if (ops.length < pos + 2) { throw new ParseException ("Need a value and unit for command: \"" + command + "\"", pos); } int val = 0; try { val = Integer.valueOf(ops[pos++]); } catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); } if ('-' == command) { val = 0 - val; } try { String unit = ops[pos++]; add(cal, val, unit); } catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); } break; default: throw new ParseException ("Unrecognized command: \"" + command + "\"", pos-1); } } return cal.getTime(); }
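The command loop recognizes /UNIT rounding plus signed VALUE+UNIT arithmetic, applied left to right. A minimal usage sketch, assuming this version's (TimeZone, Locale) constructor:

    import java.util.Date;
    import java.util.Locale;
    import java.util.TimeZone;
    import org.apache.solr.util.DateMathParser;

    public class DateMathExample {
      public static void main(String[] args) throws Exception {
        DateMathParser p = new DateMathParser(TimeZone.getTimeZone("UTC"), Locale.ROOT);
        Date startOfDay  = p.parseMath("/DAY");       // round down to the day
        Date inTwoMonths = p.parseMath("+2MONTHS");   // add two months to "now"
        Date lastHour    = p.parseMath("-1DAY/HOUR"); // subtract, then round
        System.out.println(startOfDay + " " + inTwoMonths + " " + lastHour);
      }
    }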
(Lib) IOException: thrown 25 times from application code
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { if (!(request instanceof UpdateRequest)) { return server.request(request); } UpdateRequest req = (UpdateRequest) request; // this happens for commit... if (req.getDocuments() == null || req.getDocuments().isEmpty()) { blockUntilFinished(); return server.request(request); } SolrParams params = req.getParams(); if (params != null) { // check if it is waiting for the searcher if (params.getBool(UpdateParams.WAIT_SEARCHER, false)) { log.info("blocking for commit/optimize"); blockUntilFinished(); // empty the queue return server.request(request); } } try { CountDownLatch tmpLock = lock; if (tmpLock != null) { tmpLock.await(); } boolean success = queue.offer(req); for (;;) { synchronized (runners) { if (runners.isEmpty() || (queue.remainingCapacity() < queue.size() // queue // is // half // full // and // we // can // add // more // runners && runners.size() < threadCount)) { // We need more runners, so start a new one. Runner r = new Runner(); runners.add(r); scheduler.execute(r); } else { // break out of the retry loop if we added the element to the queue // successfully, *and* // while we are still holding the runners lock to prevent race // conditions. // race conditions. if (success) break; } } // Retry to add to the queue w/o the runners lock held (else we risk // temporary deadlock) // This retry could also fail because // 1) existing runners were not able to take off any new elements in the // queue // 2) the queue was filled back up since our last try // If we succeed, the queue may have been completely emptied, and all // runners stopped. // In all cases, we should loop back to the top to see if we need to // start more runners. // if (!success) { success = queue.offer(req, 100, TimeUnit.MILLISECONDS); } } } catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); } // RETURN A DUMMY result NamedList<Object> dummy = new NamedList<Object>(); dummy.add("NOTE", "the request is processed in a background stream"); return dummy; }
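The comments in request() describe an offer-then-retry protocol: try a non-blocking offer, start new runners while holding the runners lock, then retry the offer with a timeout outside the lock so a full queue cannot deadlock against the runners. A stripped-down sketch of that shape (not the Solr class; needMoreRunners and startRunner are hypothetical):

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.TimeUnit;

    abstract class EnqueueWithRunners<T> {
      private final Object runners = new Object();

      abstract boolean needMoreRunners();
      abstract void startRunner();

      void enqueue(BlockingQueue<T> queue, T item) throws InterruptedException {
        boolean success = queue.offer(item); // non-blocking first attempt
        for (;;) {
          synchronized (runners) {
            if (needMoreRunners()) {
              startRunner();
            } else if (success) {
              break; // enqueued, and enough runners are alive
            }
          }
          // retry outside the lock; the queue may drain or fill meanwhile
          if (!success) {
            success = queue.offer(item, 100, TimeUnit.MILLISECONDS);
          }
        }
      }
    }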
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { VelocityEngine engine = getEngine(request); // TODO: have HTTP headers available for configuring engine Template template = getTemplate(engine, request); VelocityContext context = new VelocityContext(); context.put("request", request); // Turn the SolrQueryResponse into a SolrResponse. // QueryResponse has lots of conveniences suitable for a view // Problem is, which SolrResponse class to use? // One patch to SOLR-620 solved this by passing in a class name as // as a parameter and using reflection and Solr's class loader to // create a new instance. But for now the implementation simply // uses QueryResponse, and if it chokes in a known way, fall back // to bare bones SolrResponseBase. // TODO: Can this writer know what the handler class is? With echoHandler=true it can get its string name at least SolrResponse rsp = new QueryResponse(); NamedList<Object> parsedResponse = BinaryResponseWriter.getParsedResponse(request, response); try { rsp.setResponse(parsedResponse); // page only injected if QueryResponse works context.put("page", new PageTool(request, response)); // page tool only makes sense for a SearchHandler request... *sigh* } catch (ClassCastException e) { // known edge case where QueryResponse's extraction assumes "response" is a SolrDocumentList // (AnalysisRequestHandler emits a "response") e.printStackTrace(); rsp = new SolrResponseBase(); rsp.setResponse(parsedResponse); } context.put("response", rsp); // Velocity context tools - TODO: make these pluggable context.put("esc", new EscapeTool()); context.put("date", new ComparisonDateTool()); context.put("list", new ListTool()); context.put("math", new MathTool()); context.put("number", new NumberTool()); context.put("sort", new SortTool()); context.put("engine", engine); // for $engine.resourceExists(...) String layout_template = request.getParams().get("v.layout"); String json_wrapper = request.getParams().get("v.json"); boolean wrap_response = (layout_template != null) || (json_wrapper != null); // create output, optionally wrap it into a json object if (wrap_response) { StringWriter stringWriter = new StringWriter(); template.merge(context, stringWriter); if (layout_template != null) { context.put("content", stringWriter.toString()); stringWriter = new StringWriter(); try { engine.getTemplate(layout_template + ".vm").merge(context, stringWriter); } catch (Exception e) { throw new IOException(e.getMessage()); } } if (json_wrapper != null) { writer.write(request.getParams().get("v.json") + "("); writer.write(getJSONWrap(stringWriter.toString())); writer.write(')'); } else { // using a layout, but not JSON wrapping writer.write(stringWriter.toString()); } } else { template.merge(context, writer); } }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
private Template getTemplate(VelocityEngine engine, SolrQueryRequest request) throws IOException { Template template; String template_name = request.getParams().get("v.template"); String qt = request.getParams().get("qt"); String path = (String) request.getContext().get("path"); if (template_name == null && path != null) { template_name = path; } // TODO: path is never null, so qt won't get picked up maybe special case for '/select' to use qt, otherwise use path? if (template_name == null && qt != null) { template_name = qt; } if (template_name == null) template_name = "index"; try { template = engine.getTemplate(template_name + ".vm"); } catch (Exception e) { throw new IOException(e.getMessage()); } return template; }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
public void copyFile(File source, File destination, boolean preserveFileDate) throws IOException { // check source exists if (!source.exists()) { String message = "File " + source + " does not exist"; throw new FileNotFoundException(message); } // does destinations directory exist ? if (destination.getParentFile() != null && !destination.getParentFile().exists()) { destination.getParentFile().mkdirs(); } // make sure we can write to destination if (destination.exists() && !destination.canWrite()) { String message = "Unable to open file " + destination + " for writing."; throw new IOException(message); } FileInputStream input = null; FileOutputStream output = null; try { input = new FileInputStream(source); output = new FileOutputStream(destination); int count = 0; int n = 0; int rcnt = 0; while (-1 != (n = input.read(buffer))) { output.write(buffer, 0, n); count += n; rcnt++; /*** // reserve every 4.6875 MB if (rcnt == 150) { rcnt = 0; delPolicy.setReserveDuration(indexCommit.getVersion(), reserveTime); } ***/ } } finally { try { IOUtils.closeQuietly(input); } finally { IOUtils.closeQuietly(output); } } if (source.length() != destination.length()) { String message = "Failed to copy full contents from " + source + " to " + destination; throw new IOException(message); } if (preserveFileDate) { // file copy should preserve file date destination.setLastModified(source.lastModified()); } }
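For comparison only: on Java 7+ the same copy-with-attributes behavior is available through NIO. This is a swapped-in alternative, not what SnapShooter uses (paths are hypothetical):

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class NioCopy {
      public static void main(String[] args) throws Exception {
        Path source = Paths.get("index/segments_1");
        Path destination = Paths.get("snapshot/segments_1");
        Files.createDirectories(destination.getParent());
        Files.copy(source, destination,
            StandardCopyOption.REPLACE_EXISTING,
            StandardCopyOption.COPY_ATTRIBUTES); // preserves the file date
      }
    }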
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
protected Transformer getTransformer(SolrQueryRequest request) throws IOException { final String xslt = request.getParams().get(CommonParams.TR,null); if(xslt==null) { throw new IOException("'" + CommonParams.TR + "' request parameter is required to use the XSLTResponseWriter"); } // not the cleanest way to achieve this SolrConfig solrConfig = request.getCore().getSolrConfig(); // no need to synchronize access to context, right? // Nothing else happens with it at the same time final Map<Object,Object> ctx = request.getContext(); Transformer result = (Transformer)ctx.get(CONTEXT_TRANSFORMER_KEY); if(result==null) { result = TransformerProvider.instance.getTransformer(solrConfig, xslt,xsltCacheLifetimeSeconds.intValue()); result.setErrorListener(xmllog); ctx.put(CONTEXT_TRANSFORMER_KEY,result); } return result; }
// in core/src/java/org/apache/solr/response/RawResponseWriter.java
public void write(OutputStream out, SolrQueryRequest request, SolrQueryResponse response) throws IOException { Object obj = response.getValues().get( CONTENT ); if( obj != null && (obj instanceof ContentStream ) ) { // copy the contents to the writer... ContentStream content = (ContentStream)obj; java.io.InputStream in = content.getStream(); try { IOUtils.copy( in, out ); } finally { in.close(); } } else { //getBaseWriter( request ).write( writer, request, response ); throw new IOException("did not find a CONTENT object"); } }
// in core/src/java/org/apache/solr/schema/SimplePreAnalyzedParser.java
Override public ParseResult parse(Reader reader, AttributeSource parent) throws IOException { ParseResult res = new ParseResult(); StringBuilder sb = new StringBuilder(); char[] buf = new char[128]; int cnt; while ((cnt = reader.read(buf)) > 0) { sb.append(buf, 0, cnt); } String val = sb.toString(); // empty string - accept even without version number if (val.length() == 0) { return res; } // first consume the version int idx = val.indexOf(' '); if (idx == -1) { throw new IOException("Missing VERSION token"); } String version = val.substring(0, idx); if (!VERSION.equals(version)) { throw new IOException("Unknown VERSION " + version); } val = val.substring(idx + 1); // then consume the optional stored part int tsStart = 0; boolean hasStored = false; StringBuilder storedBuf = new StringBuilder(); if (val.charAt(0) == '=') { hasStored = true; if (val.length() > 1) { for (int i = 1; i < val.length(); i++) { char c = val.charAt(i); if (c == '\\') { if (i < val.length() - 1) { c = val.charAt(++i); if (c == '=') { // we recognize only \= escape in the stored part storedBuf.append('='); } else { storedBuf.append('\\'); storedBuf.append(c); continue; } } else { storedBuf.append(c); continue; } } else if (c == '=') { // end of stored text tsStart = i + 1; break; } else { storedBuf.append(c); } } if (tsStart == 0) { // missing end-of-stored marker throw new IOException("Missing end marker of stored part"); } } else { throw new IOException("Unexpected end of stored field"); } } if (hasStored) { res.str = storedBuf.toString(); } Tok tok = new Tok(); StringBuilder attName = new StringBuilder(); StringBuilder attVal = new StringBuilder(); // parser state S s = S.UNDEF; int lastPos = 0; for (int i = tsStart; i < val.length(); i++) { char c = val.charAt(i); if (c == ' ') { // collect leftovers switch (s) { case VALUE : if (attVal.length() == 0) { throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute."); } if (attName.length() > 0) { tok.attr.put(attName.toString(), attVal.toString()); } break; case NAME: // attr name without a value ? if (attName.length() > 0) { throw new IOException("Unexpected character '" + c + "' at position " + i + " - missing attribute value."); } else { // accept missing att name and value } break; case TOKEN: case UNDEF: // do nothing, advance to next token } attName.setLength(0); attVal.setLength(0); if (!tok.isEmpty() || s == S.NAME) { AttributeSource.State state = createState(parent, tok, lastPos); if (state != null) res.states.add(state.clone()); } // reset tok s = S.UNDEF; tok.reset(); // skip lastPos++; continue; } StringBuilder tgt = null; switch (s) { case TOKEN: tgt = tok.token; break; case NAME: tgt = attName; break; case VALUE: tgt = attVal; break; case UNDEF: tgt = tok.token; s = S.TOKEN; } if (c == '\\') { if (s == S.TOKEN) lastPos++; if (i >= val.length() - 1) { // end tgt.append(c); continue; } else { c = val.charAt(++i); switch (c) { case '\\' : case '=' : case ',' : case ' ' : tgt.append(c); break; case 'n': tgt.append('\n'); break; case 'r': tgt.append('\r'); break; case 't': tgt.append('\t'); break; default: tgt.append('\\'); tgt.append(c); lastPos++; } } } else { // state switch if (c == ',') { if (s == S.TOKEN) { s = S.NAME; } else if (s == S.VALUE) { // end of value, start of next attr if (attVal.length() == 0) { throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute."); } if (attName.length() > 0 && attVal.length() > 0) { tok.attr.put(attName.toString(), attVal.toString()); } // reset attName.setLength(0); attVal.setLength(0); s = S.NAME; } else { throw new IOException("Unexpected character '" + c + "' at position " + i + " - missing attribute value."); } } else if (c == '=') { if (s == S.NAME) { s = S.VALUE; } else { throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute."); } } else { tgt.append(c); if (s == S.TOKEN) lastPos++; } } } // collect leftovers if (!tok.isEmpty() || s == S.NAME || s == S.VALUE) { // remaining attrib? if (s == S.VALUE) { if (attName.length() > 0 && attVal.length() > 0) { tok.attr.put(attName.toString(), attVal.toString()); } } AttributeSource.State state = createState(parent, tok, lastPos); if (state != null) res.states.add(state.clone()); } return res; }
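The grammar parsed above is: a VERSION token, an optional stored section delimited by = characters, then whitespace-separated tokens, each optionally followed by comma-separated name=value attribute pairs, with backslash escapes throughout. Hedged examples of well-formed input (the s/e offset attribute names follow Solr's pre-analyzed field convention and are illustrative):

    public class PreAnalyzedExamples {
      public static void main(String[] args) {
        String[] inputs = {
          "1 one two three",           // version plus three bare tokens
          "1 =stored text= one two",   // with a stored part
          "1 one,s=0,e=3 two,s=4,e=7", // start/end offset attributes
          "1 a\\,b"                    // escaped comma inside a token
        };
        for (String in : inputs) System.out.println(in);
      }
    }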
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String nextValue() throws IOException { Token tkn = nextToken(); String ret = null; switch (tkn.type) { case TT_TOKEN: case TT_EORECORD: ret = tkn.content.toString(); break; case TT_EOF: ret = null; break; case TT_INVALID: default: // error no token available (or error) throw new IOException( "(line " + getLineNumber() + ") invalid parse sequence"); // unreachable: break; } return ret; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String[] getLine() throws IOException { String[] ret = EMPTY_STRING_ARRAY; record.clear(); while (true) { reusableToken.reset(); nextToken(reusableToken); switch (reusableToken.type) { case TT_TOKEN: record.add(reusableToken.content.toString()); break; case TT_EORECORD: record.add(reusableToken.content.toString()); break; case TT_EOF: if (reusableToken.isReady) { record.add(reusableToken.content.toString()); } else { ret = null; } break; case TT_INVALID: default: // error: throw IOException throw new IOException("(line " + getLineNumber() + ") invalid parse sequence"); // unreachable: break; } if (reusableToken.type != TT_TOKEN) { break; } } if (!record.isEmpty()) { ret = (String[]) record.toArray(new String[record.size()]); } return ret; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private Token encapsulatedTokenLexer(Token tkn, int c) throws IOException { // save current line int startLineNumber = getLineNumber(); // ignore the given delimiter // assert c == delimiter; for (;;) { c = in.read(); if (c == '\\' && strategy.getUnicodeEscapeInterpretation() && in.lookAhead()=='u') { tkn.content.append((char) unicodeEscapeLexer(c)); } else if (c == strategy.getEscape()) { tkn.content.append((char)readEscape(c)); } else if (c == strategy.getEncapsulator()) { if (in.lookAhead() == strategy.getEncapsulator()) { // double or escaped encapsulator -> add single encapsulator to token c = in.read(); tkn.content.append((char) c); } else { // token finish mark (encapsulator) reached: ignore whitespace till delimiter for (;;) { c = in.read(); if (c == strategy.getDelimiter()) { tkn.type = TT_TOKEN; tkn.isReady = true; return tkn; } else if (isEndOfFile(c)) { tkn.type = TT_EOF; tkn.isReady = true; return tkn; } else if (isEndOfLine(c)) { // ok eo token reached tkn.type = TT_EORECORD; tkn.isReady = true; return tkn; } else if (!isWhitespace(c)) { // error invalid char between token and next delimiter throw new IOException( "(line " + getLineNumber() + ") invalid char between encapsulated token end delimiter" ); } } } } else if (isEndOfFile(c)) { // error condition (end of file before end of token) throw new IOException( "(startline " + startLineNumber + ")" + "eof reached before encapsulated token finished" ); } else { // consume character tkn.content.append((char) c); } } }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected int unicodeEscapeLexer(int c) throws IOException { int ret = 0; // ignore 'u' (assume c==\ now) and read 4 hex digits c = in.read(); code.clear(); try { for (int i = 0; i < 4; i++) { c = in.read(); if (isEndOfFile(c) || isEndOfLine(c)) { throw new NumberFormatException("number too short"); } code.append((char) c); } ret = Integer.parseInt(code.toString(), 16); } catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); } return ret; }
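The \uXXXX decoding above reduces to a base-16 parse of the four digits following the escape; for instance:

    public class UnicodeEscapeDemo {
      public static void main(String[] args) {
        int codePoint = Integer.parseInt("0041", 16); // the four hex digits
        System.out.println((char) codePoint);         // prints A
      }
    }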
Thrown in catch blocks: 5
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (RuntimeException re) { // unfortunately XInclude fallback only works with IOException, but openResource() never throws that one throw (IOException) (new IOException(re.getMessage()).initCause(re)); }
Declared in method signatures (throws): 1072
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
Override public void connect(String serverAddress, int timeout, Watcher watcher, ZkUpdate updater) throws IOException, InterruptedException, TimeoutException { updater.update(new SolrZooKeeper(serverAddress, timeout, watcher)); }
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
Override public void reconnect(final String serverAddress, final int zkClientTimeout, final Watcher watcher, final ZkUpdate updater) throws IOException { log.info("Connection expired - starting a new one..."); try { updater .update(new SolrZooKeeper(serverAddress, zkClientTimeout, watcher)); log.info("Reconnected to ZooKeeper"); } catch (Exception e) { SolrException.log(log, "Reconnect to ZooKeeper failed", e); log.info("Reconnect to ZooKeeper failed"); } }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean failOnExists, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), CreateMode.PERSISTENT, null, failOnExists, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void setData(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { if (log.isInfoEnabled()) { log.info("Write to ZooKeepeer " + file.getAbsolutePath() + " to " + path); } String data = FileUtils.readFileToString(file); setData(path, data.getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void process(WatchedEvent event) { if (log.isInfoEnabled()) { log.info("Watcher " + this + " name:" + name + " got event " + event + " path:" + event.getPath() + " type:" + event.getType()); } state = event.getState(); if (state == KeeperState.SyncConnected) { connected = true; clientConnected.countDown(); } else if (state == KeeperState.Expired) { connected = false; log.info("Attempting to reconnect to recover relationship with ZooKeeper..."); try { connectionStrategy.reconnect(zkServerAddress, zkClientTimeout, this, new ZkClientConnectionStrategy.ZkUpdate() { @Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } } }); } catch (Exception e) { SolrException.log(log, "", e); } log.info("Connected:" + connected); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForConnected(long waitForConnection) throws InterruptedException, TimeoutException, IOException { long expire = System.currentTimeMillis() + waitForConnection; long left = waitForConnection; while (!connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (!connected) { throw new TimeoutException("Could not connect to ZooKeeper " + zkServerAddress + " within " + waitForConnection + " ms"); } }
// in solrj/src/java/org/apache/solr/common/util/StrUtils.java
public static void partialURLEncodeVal(Appendable dest, String val) throws IOException { for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); if (ch < 32) { dest.append('%'); if (ch < 0x10) dest.append('0'); dest.append(Integer.toHexString(ch)); } else { switch (ch) { case ' ': dest.append('+'); break; case '&': dest.append("%26"); break; case '%': dest.append("%25"); break; case '=': dest.append("%3D"); break; case '+': dest.append("%2B"); break; default : dest.append(ch); break; } } } }
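Only a handful of reserved characters (space, &, %, =, + and control characters) are rewritten; everything else passes through untouched. A usage sketch:

    import org.apache.solr.common.util.StrUtils;

    public class PartialEncodeDemo {
      public static void main(String[] args) throws Exception {
        StringBuilder dest = new StringBuilder();
        StrUtils.partialURLEncodeVal(dest, "q=a b&c");
        System.out.println(dest); // q%3Da+b%26c
      }
    }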
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
public InputStream getStream() throws IOException { URLConnection conn = this.url.openConnection(); contentType = conn.getContentType(); name = url.toExternalForm(); size = new Long( conn.getContentLength() ); return conn.getInputStream(); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
public InputStream getStream() throws IOException { return new FileInputStream( file ); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
Override public Reader getReader() throws IOException { String charset = getCharsetFromContentType( contentType ); return charset == null ? new FileReader( file ) : new InputStreamReader( getStream(), charset ); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
public InputStream getStream() throws IOException { return new ByteArrayInputStream( str.getBytes(DEFAULT_CHARSET) ); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
Override public Reader getReader() throws IOException { String charset = getCharsetFromContentType( contentType ); return charset == null ? new StringReader( str ) : new InputStreamReader( getStream(), charset ); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
public Reader getReader() throws IOException { String charset = getCharsetFromContentType( getContentType() ); return charset == null ? new InputStreamReader( getStream(), DEFAULT_CHARSET ) : new InputStreamReader( getStream(), charset ); }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public static void escapeCharData(String str, Writer out) throws IOException { escape(str, out, chardata_escapes); }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public static void escapeAttributeValue(String str, Writer out) throws IOException { escape(str, out, attribute_escapes); }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public static void escapeAttributeValue(char [] chars, int start, int length, Writer out) throws IOException { escape(chars, start, length, out, attribute_escapes); }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public final static void writeXML(Writer out, String tag, String val) throws IOException { out.write('<'); out.write(tag); if (val == null) { out.write('/'); out.write('>'); } else { out.write('>'); escapeCharData(val,out); out.write('<'); out.write('/'); out.write(tag); out.write('>'); } }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public final static void writeUnescapedXML(Writer out, String tag, String val, Object... attrs) throws IOException { out.write('<'); out.write(tag); for (int i=0; i<attrs.length; i++) { out.write(' '); out.write(attrs[i++].toString()); out.write('='); out.write('"'); out.write(attrs[i].toString()); out.write('"'); } if (val == null) { out.write('/'); out.write('>'); } else { out.write('>'); out.write(val); out.write('<'); out.write('/'); out.write(tag); out.write('>'); } }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public final static void writeXML(Writer out, String tag, String val, Object... attrs) throws IOException { out.write('<'); out.write(tag); for (int i=0; i<attrs.length; i++) { out.write(' '); out.write(attrs[i++].toString()); out.write('='); out.write('"'); escapeAttributeValue(attrs[i].toString(), out); out.write('"'); } if (val == null) { out.write('/'); out.write('>'); } else { out.write('>'); escapeCharData(val,out); out.write('<'); out.write('/'); out.write(tag); out.write('>'); } }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public static void writeXML(Writer out, String tag, String val, Map<String, String> attrs) throws IOException { out.write('<'); out.write(tag); for (Map.Entry<String, String> entry : attrs.entrySet()) { out.write(' '); out.write(entry.getKey()); out.write('='); out.write('"'); escapeAttributeValue(entry.getValue(), out); out.write('"'); } if (val == null) { out.write('/'); out.write('>'); } else { out.write('>'); escapeCharData(val,out); out.write('<'); out.write('/'); out.write(tag); out.write('>'); } }
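A usage sketch of the varargs writeXML variant above; attribute values and character data go through separate escape tables, so the output shown in the comment is an assumption about those tables:

    import java.io.StringWriter;
    import org.apache.solr.common.util.XML;

    public class WriteXmlDemo {
      public static void main(String[] args) throws Exception {
        StringWriter out = new StringWriter();
        XML.writeXML(out, "field", "1 & 2", "name", "id");
        System.out.println(out); // <field name="id">1 &amp; 2</field>
      }
    }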
// in solrj/src/java/org/apache/solr/common/util/XML.java
private static void escape(char [] chars, int offset, int length, Writer out, String [] escapes) throws IOException{ for (int i=offset; i<length; i++) { char ch = chars[i]; if (ch<escapes.length) { String replacement = escapes[ch]; if (replacement != null) { out.write(replacement); continue; } } out.write(ch); } }
// in solrj/src/java/org/apache/solr/common/util/XML.java
private static void escape(String str, Writer out, String[] escapes) throws IOException { for (int i=0; i<str.length(); i++) { char ch = str.charAt(i); if (ch<escapes.length) { String replacement = escapes[ch]; if (replacement != null) { out.write(replacement); continue; } } out.write(ch); } }
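Both escape helpers above are table-driven: a character's code point indexes into an array of replacement strings, and a null entry (or a code point beyond the table's length) passes the character through unchanged. A minimal usage sketch; the exact output assumes chardata_escapes maps '<' and '&' to the usual entities:

  StringWriter out = new StringWriter();
  XML.escapeCharData("a < b & c", out);
  // out.toString() -> "a &lt; b &amp; c"   (assuming '<' and '&' are in the table)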
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
@Override public int read() throws IOException { if (pos >= end) { refill(); if (pos >= end) return -1; } return buf[pos++] & 0xff; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int peek() throws IOException { if (pos >= end) { refill(); if (pos >= end) return -1; } return buf[pos] & 0xff; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readUnsignedByte() throws IOException { if (pos >= end) { refill(); if (pos >= end) { throw new EOFException(); } } return buf[pos++] & 0xff; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readWrappedStream(byte[] target, int offset, int len) throws IOException { return in.read(target, offset, len); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public void refill() throws IOException { /* this will set end to -1 at EOF */ end = readWrappedStream(buf, 0, buf.length); if (end > 0) readFromStream += end; pos = 0; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
@Override public int available() throws IOException { return end - pos; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
@Override public int read(byte b[], int off, int len) throws IOException { int r=0; /* number of bytes we have read */ /* first read from our buffer */ if (end-pos > 0) { r = Math.min(end-pos, len); System.arraycopy(buf, pos, b, off, r); pos += r; } if (r == len) return r; /* amount left to read is >= buffer size */ if (len-r >= buf.length) { int ret = readWrappedStream(b, off+r, len-r); if (ret >= 0) { readFromStream += ret; r += ret; return r; } else { /* negative return code */ return r > 0 ? r : -1; } } refill(); /* read rest from our buffer */ if (end-pos > 0) { int toRead = Math.min(end-pos, len-r); System.arraycopy(buf, pos, b, off+r, toRead); pos += toRead; r += toRead; return r; } return r > 0 ? r : -1; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
@Override public void close() throws IOException { in.close(); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public void readFully(byte b[]) throws IOException { readFully(b, 0, b.length); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public void readFully(byte b[], int off, int len) throws IOException { while (len>0) { int ret = read(b, off, len); if (ret==-1) { throw new EOFException(); } off += ret; len -= ret; } }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int skipBytes(int n) throws IOException { if (end-pos >= n) { pos += n; return n; } if (end-pos<0) return -1; int r = end-pos; pos = end; while (r < n) { refill(); if (end-pos <= 0) return r; int toRead = Math.min(end-pos, n-r); r += toRead; pos += toRead; } return r; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public boolean readBoolean() throws IOException { return readByte()==1; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public byte readByte() throws IOException { if (pos >= end) { refill(); if (pos >= end) throw new EOFException(); } return buf[pos++]; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public short readShort() throws IOException { return (short)((readUnsignedByte() << 8) | readUnsignedByte()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readUnsignedShort() throws IOException { return (readUnsignedByte() << 8) | readUnsignedByte(); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public char readChar() throws IOException { return (char)((readUnsignedByte() << 8) | readUnsignedByte()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readInt() throws IOException { return ((readUnsignedByte() << 24) |(readUnsignedByte() << 16) |(readUnsignedByte() << 8) | readUnsignedByte()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public long readLong() throws IOException { return (((long)readUnsignedByte()) << 56) | (((long)readUnsignedByte()) << 48) | (((long)readUnsignedByte()) << 40) | (((long)readUnsignedByte()) << 32) | (((long)readUnsignedByte()) << 24) | (readUnsignedByte() << 16) | (readUnsignedByte() << 8) | (readUnsignedByte()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public float readFloat() throws IOException { return Float.intBitsToFloat(readInt()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public double readDouble() throws IOException { return Double.longBitsToDouble(readLong()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public String readLine() throws IOException { return new DataInputStream(this).readLine(); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public String readUTF() throws IOException { return new DataInputStream(this).readUTF(); }
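All of the multi-byte readers above are composed from readUnsignedByte in big-endian order, with refill() pulling the next chunk from the wrapped stream on demand. A small sketch using the FastInputStream.wrap factory seen in JavaBinCodec.unmarshal below:

  byte[] data = {0, 0, 1, 44};   // 300 as four big-endian bytes
  FastInputStream fis = FastInputStream.wrap(new java.io.ByteArrayInputStream(data));
  int v = fis.readInt();         // (0<<24) | (0<<16) | (1<<8) | 44 == 300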
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void write(int b) throws IOException { write((byte)b); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void write(byte b[]) throws IOException { write(b,0,b.length); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void write(byte b) throws IOException { if (pos >= buf.length) { out.write(buf); written += pos; pos=0; } buf[pos++] = b; }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void write(byte arr[], int off, int len) throws IOException { int space = buf.length - pos; if (len < space) { System.arraycopy(arr, off, buf, pos, len); pos += len; } else if (len<buf.length) { /* if the data to write is small enough, buffer it */ System.arraycopy(arr, off, buf, pos, space); out.write(buf); written += buf.length; pos = len-space; System.arraycopy(arr, off+space, buf, 0, pos); } else { if (pos>0) { out.write(buf,0,pos); /* flush */ written += pos; pos=0; } /* don't buffer, just write to sink */ out.write(arr, off, len); written += len; } }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void reserve(int len) throws IOException { if (len > (buf.length - pos)) flushBuffer(); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeBoolean(boolean v) throws IOException { write(v ? 1:0); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeByte(int v) throws IOException { write((byte)v); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeShort(int v) throws IOException { write((byte)(v >>> 8)); write((byte)v); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeChar(int v) throws IOException { writeShort(v); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeInt(int v) throws IOException { reserve(4); buf[pos] = (byte)(v>>>24); buf[pos+1] = (byte)(v>>>16); buf[pos+2] = (byte)(v>>>8); buf[pos+3] = (byte)(v); pos+=4; }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeLong(long v) throws IOException { reserve(8); buf[pos] = (byte)(v>>>56); buf[pos+1] = (byte)(v>>>48); buf[pos+2] = (byte)(v>>>40); buf[pos+3] = (byte)(v>>>32); buf[pos+4] = (byte)(v>>>24); buf[pos+5] = (byte)(v>>>16); buf[pos+6] = (byte)(v>>>8); buf[pos+7] = (byte)(v); pos+=8; }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeFloat(float v) throws IOException { writeInt(Float.floatToRawIntBits(v)); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeDouble(double v) throws IOException { writeLong(Double.doubleToRawLongBits(v)); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeBytes(String s) throws IOException { /* non-optimized version, but this shouldn't be used anyway */ for (int i=0; i<s.length(); i++) write((byte)s.charAt(i)); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeChars(String s) throws IOException { /* non-optimized version */ for (int i=0; i<s.length(); i++) writeChar(s.charAt(i)); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeUTF(String s) throws IOException { /* non-optimized version, but this shouldn't be used anyway */ DataOutputStream daos = new DataOutputStream(this); daos.writeUTF(s); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void flush() throws IOException { flushBuffer(); out.flush(); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void close() throws IOException { flushBuffer(); out.close(); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void flushBuffer() throws IOException { if (pos > 0) { out.write(buf, 0, pos); written += pos; pos=0; } }
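FastOutputStream mirrors the input side: writes accumulate in buf and only reach the sink when the buffer fills or flushBuffer() is called. A sketch using the FastOutputStream.wrap factory that JavaBinCodec.marshal below relies on:

  java.io.ByteArrayOutputStream sink = new java.io.ByteArrayOutputStream();
  FastOutputStream fos = FastOutputStream.wrap(sink);
  fos.writeInt(300);   // buffered; nothing has reached sink yet
  fos.flushBuffer();   // sink now holds {0, 0, 1, 44}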
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void marshal(Object nl, OutputStream os) throws IOException { init(FastOutputStream.wrap(os)); try { daos.writeByte(VERSION); writeVal(nl); } finally { daos.flushBuffer(); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Object unmarshal(InputStream is) throws IOException { FastInputStream dis = FastInputStream.wrap(is); version = dis.readByte(); if (version != VERSION) { throw new RuntimeException("Invalid version (expected " + VERSION + ", but " + version + ") or the data is not in 'javabin' format"); } return readVal(dis); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public SimpleOrderedMap<Object> readOrderedMap(FastInputStream dis) throws IOException { int sz = readSize(dis); SimpleOrderedMap<Object> nl = new SimpleOrderedMap<Object>(); for (int i = 0; i < sz; i++) { String name = (String) readVal(dis); Object val = readVal(dis); nl.add(name, val); } return nl; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public NamedList<Object> readNamedList(FastInputStream dis) throws IOException { int sz = readSize(dis); NamedList<Object> nl = new NamedList<Object>(); for (int i = 0; i < sz; i++) { String name = (String) readVal(dis); Object val = readVal(dis); nl.add(name, val); } return nl; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeNamedList(NamedList<?> nl) throws IOException { writeTag(nl instanceof SimpleOrderedMap ? ORDERED_MAP : NAMED_LST, nl.size()); for (int i = 0; i < nl.size(); i++) { String name = nl.getName(i); writeExternString(name); Object val = nl.getVal(i); writeVal(val); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeVal(Object val) throws IOException { if (writeKnownType(val)) { return; } else { Object tmpVal = val; if (resolver != null) { tmpVal = resolver.resolve(val, this); if (tmpVal == null) return; /* null means the resolver took care of it fully */ if (writeKnownType(tmpVal)) return; } } writeVal(val.getClass().getName() + ':' + val.toString()); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Object readVal(FastInputStream dis) throws IOException { tagByte = dis.readByte(); /* if ((tagByte & 0xe0) == 0) { -- if top 3 bits are clear, this is a normal tag. OK, try type + size in single byte */ switch (tagByte >>> 5) { case STR >>> 5: return readStr(dis); case SINT >>> 5: return readSmallInt(dis); case SLONG >>> 5: return readSmallLong(dis); case ARR >>> 5: return readArray(dis); case ORDERED_MAP >>> 5: return readOrderedMap(dis); case NAMED_LST >>> 5: return readNamedList(dis); case EXTERN_STRING >>> 5: return readExternString(dis); } switch (tagByte) { case NULL: return null; case DATE: return new Date(dis.readLong()); case INT: return dis.readInt(); case BOOL_TRUE: return Boolean.TRUE; case BOOL_FALSE: return Boolean.FALSE; case FLOAT: return dis.readFloat(); case DOUBLE: return dis.readDouble(); case LONG: return dis.readLong(); case BYTE: return dis.readByte(); case SHORT: return dis.readShort(); case MAP: return readMap(dis); case SOLRDOC: return readSolrDocument(dis); case SOLRDOCLST: return readSolrDocumentList(dis); case BYTEARR: return readByteArray(dis); case ITERATOR: return readIterator(dis); case END: return END_OBJ; case SOLRINPUTDOC: return readSolrInputDocument(dis); } throw new RuntimeException("Unknown type " + tagByte); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public boolean writeKnownType(Object val) throws IOException { if (writePrimitive(val)) return true; if (val instanceof NamedList) { writeNamedList((NamedList<?>) val); return true; } if (val instanceof SolrDocumentList) { /* SolrDocumentList is a List, so must come before List check */ writeSolrDocumentList((SolrDocumentList) val); return true; } if (val instanceof Collection) { writeArray((Collection) val); return true; } if (val instanceof Object[]) { writeArray((Object[]) val); return true; } if (val instanceof SolrDocument) { /* this needs special treatment to know which fields are to be written */ if (resolver == null) { writeSolrDocument((SolrDocument) val); } else { Object retVal = resolver.resolve(val, this); if (retVal != null) { if (retVal instanceof SolrDocument) { writeSolrDocument((SolrDocument) retVal); } else { writeVal(retVal); } } } return true; } if (val instanceof SolrInputDocument) { writeSolrInputDocument((SolrInputDocument)val); return true; } if (val instanceof Map) { writeMap((Map) val); return true; } if (val instanceof Iterator) { writeIterator((Iterator) val); return true; } if (val instanceof Iterable) { writeIterator(((Iterable) val).iterator()); return true; } return false; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeTag(byte tag) throws IOException { daos.writeByte(tag); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeTag(byte tag, int size) throws IOException { if ((tag & 0xe0) != 0) { if (size < 0x1f) { daos.writeByte(tag | size); } else { daos.writeByte(tag | 0x1f); writeVInt(size - 0x1f, daos); } } else { daos.writeByte(tag); writeVInt(size, daos); } }
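writeTag packs the type into the top three bits of the tag byte and, when it fits, the size into the low five bits; a size field of 0x1f signals that the remainder of the size follows as a vint (readSize above reverses this). A worked example using the STR constant dispatched on in readVal (its numeric value is an implementation detail not shown here):

  byte tag = (byte) (STR | 5);      // a 5-byte string: type and size share one byte
  int type = (tag & 0xff) >>> 5;    // == STR >>> 5, the case label used in readVal
  int size = tag & 0x1f;            // == 5
  // For size >= 0x1f, writeTag emits (tag | 0x1f) followed by writeVInt(size - 0x1f).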
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeByteArray(byte[] arr, int offset, int len) throws IOException { writeTag(BYTEARR, len); daos.write(arr, offset, len); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public byte[] readByteArray(FastInputStream dis) throws IOException { byte[] arr = new byte[readVInt(dis)]; dis.readFully(arr); return arr; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeSolrDocument(SolrDocument doc) throws IOException { writeTag(SOLRDOC); writeTag(ORDERED_MAP, doc.size()); for (Map.Entry<String, Object> entry : doc) { String name = entry.getKey(); writeExternString(name); Object val = entry.getValue(); writeVal(val); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public SolrDocument readSolrDocument(FastInputStream dis) throws IOException { NamedList nl = (NamedList) readVal(dis); SolrDocument doc = new SolrDocument(); for (int i = 0; i < nl.size(); i++) { String name = nl.getName(i); Object val = nl.getVal(i); doc.setField(name, val); } return doc; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException { SolrDocumentList solrDocs = new SolrDocumentList(); List list = (List) readVal(dis); solrDocs.setNumFound((Long) list.get(0)); solrDocs.setStart((Long) list.get(1)); solrDocs.setMaxScore((Float) list.get(2)); @SuppressWarnings("unchecked") List<SolrDocument> l = (List<SolrDocument>) readVal(dis); solrDocs.addAll(l); return solrDocs; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { writeTag(SOLRDOCLST); List<Number> l = new ArrayList<Number>(3); l.add(docs.getNumFound()); l.add(docs.getStart()); l.add(docs.getMaxScore()); writeArray(l); writeArray(docs); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public SolrInputDocument readSolrInputDocument(FastInputStream dis) throws IOException { int sz = readVInt(dis); float docBoost = (Float)readVal(dis); SolrInputDocument sdoc = new SolrInputDocument(); sdoc.setDocumentBoost(docBoost); for (int i = 0; i < sz; i++) { float boost = 1.0f; String fieldName; Object boostOrFieldName = readVal(dis); if (boostOrFieldName instanceof Float) { boost = (Float)boostOrFieldName; fieldName = (String)readVal(dis); } else { fieldName = (String)boostOrFieldName; } Object fieldVal = readVal(dis); sdoc.setField(fieldName, fieldVal, boost); } return sdoc; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeSolrInputDocument(SolrInputDocument sdoc) throws IOException { writeTag(SOLRINPUTDOC, sdoc.size()); writeFloat(sdoc.getDocumentBoost()); for (SolrInputField inputField : sdoc.values()) { if (inputField.getBoost() != 1.0f) { writeFloat(inputField.getBoost()); } writeExternString(inputField.getName()); writeVal(inputField.getValue()); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Map<Object,Object> readMap(FastInputStream dis) throws IOException { int sz = readVInt(dis); Map<Object,Object> m = new LinkedHashMap<Object,Object>(); for (int i = 0; i < sz; i++) { Object key = readVal(dis); Object val = readVal(dis); m.put(key, val); } return m; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeIterator(Iterator iter) throws IOException { writeTag(ITERATOR); while (iter.hasNext()) { writeVal(iter.next()); } writeVal(END_OBJ); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public List<Object> readIterator(FastInputStream fis) throws IOException { ArrayList<Object> l = new ArrayList<Object>(); while (true) { Object o = readVal(fis); if (o == END_OBJ) break; l.add(o); } return l; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeArray(List l) throws IOException { writeTag(ARR, l.size()); for (int i = 0; i < l.size(); i++) { writeVal(l.get(i)); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeArray(Collection coll) throws IOException { writeTag(ARR, coll.size()); for (Object o : coll) { writeVal(o); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeArray(Object[] arr) throws IOException { writeTag(ARR, arr.length); for (int i = 0; i < arr.length; i++) { Object o = arr[i]; writeVal(o); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public List<Object> readArray(FastInputStream dis) throws IOException { int sz = readSize(dis); ArrayList<Object> l = new ArrayList<Object>(sz); for (int i = 0; i < sz; i++) { l.add(readVal(dis)); } return l; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeStr(String s) throws IOException { if (s == null) { writeTag(NULL); return; } int end = s.length(); int maxSize = end * 4; if (bytes == null || bytes.length < maxSize) bytes = new byte[maxSize]; int sz = ByteUtils.UTF16toUTF8(s, 0, end, bytes, 0); writeTag(STR, sz); daos.write(bytes, 0, sz); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public String readStr(FastInputStream dis) throws IOException { int sz = readSize(dis); if (bytes == null || bytes.length < sz) bytes = new byte[sz]; dis.readFully(bytes, 0, sz); arr.reset(); ByteUtils.UTF8toUTF16(bytes, 0, sz, arr); return arr.toString(); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeInt(int val) throws IOException { if (val > 0) { int b = SINT | (val & 0x0f); if (val >= 0x0f) { b |= 0x10; daos.writeByte(b); writeVInt(val >>> 4, daos); } else { daos.writeByte(b); } } else { daos.writeByte(INT); daos.writeInt(val); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public int readSmallInt(FastInputStream dis) throws IOException { int v = tagByte & 0x0F; if ((tagByte & 0x10) != 0) v = (readVInt(dis) << 4) | v; return v; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeLong(long val) throws IOException { if ((val & 0xff00000000000000L) == 0) { int b = SLONG | ((int) val & 0x0f); if (val >= 0x0f) { b |= 0x10; daos.writeByte(b); writeVLong(val >>> 4, daos); } else { daos.writeByte(b); } } else { daos.writeByte(LONG); daos.writeLong(val); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public long readSmallLong(FastInputStream dis) throws IOException { long v = tagByte & 0x0F; if ((tagByte & 0x10) != 0) v = (readVLong(dis) << 4) | v; return v; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeFloat(float val) throws IOException { daos.writeByte(FLOAT); daos.writeFloat(val); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public boolean writePrimitive(Object val) throws IOException { if (val == null) { daos.writeByte(NULL); return true; } else if (val instanceof String) { writeStr((String) val); return true; } else if (val instanceof Number) { if (val instanceof Integer) { writeInt(((Integer) val).intValue()); return true; } else if (val instanceof Long) { writeLong(((Long) val).longValue()); return true; } else if (val instanceof Float) { writeFloat(((Float) val).floatValue()); return true; } else if (val instanceof Double) { daos.writeByte(DOUBLE); daos.writeDouble(((Double) val).doubleValue()); return true; } else if (val instanceof Byte) { daos.writeByte(BYTE); daos.writeByte(((Byte) val).intValue()); return true; } else if (val instanceof Short) { daos.writeByte(SHORT); daos.writeShort(((Short) val).intValue()); return true; } return false; } else if (val instanceof Date) { daos.writeByte(DATE); daos.writeLong(((Date) val).getTime()); return true; } else if (val instanceof Boolean) { if ((Boolean) val) daos.writeByte(BOOL_TRUE); else daos.writeByte(BOOL_FALSE); return true; } else if (val instanceof byte[]) { writeByteArray((byte[]) val, 0, ((byte[]) val).length); return true; } else if (val instanceof ByteBuffer) { ByteBuffer buf = (ByteBuffer) val; writeByteArray(buf.array(),buf.position(),buf.limit() - buf.position()); return true; } else if (val == END_OBJ) { writeTag(END); return true; } return false; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeMap(Map<?,?> val) throws IOException { writeTag(MAP, val.size()); for (Map.Entry<?,?> entry : val.entrySet()) { Object key = entry.getKey(); if (key instanceof String) { writeExternString((String) key); } else { writeVal(key); } writeVal(entry.getValue()); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public int readSize(FastInputStream in) throws IOException { int sz = tagByte & 0x1f; if (sz == 0x1f) sz += readVInt(in); return sz; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public static void writeVInt(int i, FastOutputStream out) throws IOException { while ((i & ~0x7F) != 0) { out.writeByte((byte) ((i & 0x7f) | 0x80)); i >>>= 7; } out.writeByte((byte) i); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public static int readVInt(FastInputStream in) throws IOException { byte b = in.readByte(); int i = b & 0x7F; for (int shift = 7; (b & 0x80) != 0; shift += 7) { b = in.readByte(); i |= (b & 0x7F) << shift; } return i; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public static void writeVLong(long i, FastOutputStream out) throws IOException { while ((i & ~0x7F) != 0) { out.writeByte((byte) ((i & 0x7f) | 0x80)); i >>>= 7; } out.writeByte((byte) i); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public static long readVLong(FastInputStream in) throws IOException { byte b = in.readByte(); long i = b & 0x7F; for (int shift = 7; (b & 0x80) != 0; shift += 7) { b = in.readByte(); i |= (long) (b & 0x7F) << shift; } return i; }
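writeVInt/readVInt (and the long variants) use the standard 7-bits-per-byte varint scheme: each byte carries seven payload bits and sets the high bit when more bytes follow. A round-trip sketch built only from methods shown above:

  java.io.ByteArrayOutputStream sink = new java.io.ByteArrayOutputStream();
  FastOutputStream fos = FastOutputStream.wrap(sink);
  JavaBinCodec.writeVInt(300, fos);  // 300 = 0b10_0101100 -> 0xAC (low 7 bits + continue bit), then 0x02
  fos.flushBuffer();                 // sink now holds {(byte) 0xAC, 0x02}
  int back = JavaBinCodec.readVInt(
      FastInputStream.wrap(new java.io.ByteArrayInputStream(sink.toByteArray())));  // == 300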
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeExternString(String s) throws IOException { if (s == null) { writeTag(NULL); return; } Integer idx = stringsMap == null ? null : stringsMap.get(s); if (idx == null) idx = 0; writeTag(EXTERN_STRING, idx); if (idx == 0) { writeStr(s); if (stringsMap == null) stringsMap = new HashMap<String, Integer>(); stringsMap.put(s, ++stringsCount); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public String readExternString(FastInputStream fis) throws IOException { int idx = readSize(fis); if (idx != 0) { /* idx != 0 is the index of the extern string */ return stringsList.get(idx - 1); } else { /* idx == 0 means it has a string value */ String s = (String) readVal(fis); if (stringsList == null) stringsList = new ArrayList<String>(); stringsList.add(s); return s; } }
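writeExternString/readExternString implement a per-stream string dictionary: the first occurrence of a string is written with index 0 followed by its bytes, and both sides append it to their table; every later occurrence sends only the 1-based index. A trace of the protocol above:

  // writeExternString("id")  -> EXTERN_STRING tag with size 0, then the STR bytes of "id";
  //                             stringsMap now maps "id" -> 1
  // writeExternString("id")  -> EXTERN_STRING tag with size 1; no string bytes this time
  // readExternString         -> idx == 1 != 0, so it returns stringsList.get(0), i.e. "id"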
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Calendar formatDate(Date date, Calendar cal, Appendable out) throws IOException { /* using a StringBuilder for numbers can be nice since a temporary string isn't used (it's added directly to the builder's buffer) */ StringBuilder sb = out instanceof StringBuilder ? (StringBuilder)out : new StringBuilder(); if (cal==null) cal = Calendar.getInstance(TimeZone.getTimeZone("GMT"), Locale.US); cal.setTime(date); int i = cal.get(Calendar.YEAR); sb.append(i); sb.append('-'); i = cal.get(Calendar.MONTH) + 1; /* 0 based, so add 1 */ if (i<10) sb.append('0'); sb.append(i); sb.append('-'); i=cal.get(Calendar.DAY_OF_MONTH); if (i<10) sb.append('0'); sb.append(i); sb.append('T'); i=cal.get(Calendar.HOUR_OF_DAY); /* 24 hour time format */ if (i<10) sb.append('0'); sb.append(i); sb.append(':'); i=cal.get(Calendar.MINUTE); if (i<10) sb.append('0'); sb.append(i); sb.append(':'); i=cal.get(Calendar.SECOND); if (i<10) sb.append('0'); sb.append(i); i=cal.get(Calendar.MILLISECOND); if (i != 0) { sb.append('.'); if (i<100) sb.append('0'); if (i<10) sb.append('0'); sb.append(i); /* handle canonical format specifying fractional seconds shall not end in '0'. Given the slowness of integer div/mod, simply checking the last character is probably the fastest way to check. */ int lastIdx = sb.length()-1; if (sb.charAt(lastIdx)=='0') { lastIdx--; if (sb.charAt(lastIdx)=='0') { lastIdx--; } sb.setLength(lastIdx+1); } } sb.append('Z'); if (out != sb) out.append(sb); return cal; }
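formatDate emits an ISO-8601 timestamp in GMT and trims trailing zeros from the fractional seconds, omitting the fraction entirely when the millisecond field is 0. For example:

  StringBuilder sb = new StringBuilder();
  DateUtil.formatDate(new java.util.Date(0L), null, sb);
  // sb -> "1970-01-01T00:00:00Z"      (millis == 0, so no fraction is written)
  StringBuilder sb2 = new StringBuilder();
  DateUtil.formatDate(new java.util.Date(500L), null, sb2);
  // sb2 -> "1970-01-01T00:00:00.5Z"   (".500" trimmed to ".5")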
// in solrj/src/java/org/apache/solr/client/solrj/request/DirectXmlRequest.java
@Override public UpdateResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); UpdateResponse res = new UpdateResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/SolrPing.java
@Override public SolrPingResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); SolrPingResponse res = new SolrPingResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/AbstractUpdateRequest.java
@Override public UpdateResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); UpdateResponse res = new UpdateResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/ContentStreamUpdateRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return contentStreams; }
// in solrj/src/java/org/apache/solr/client/solrj/request/ContentStreamUpdateRequest.java
public void addFile(File file, String contentType) throws IOException { ContentStreamBase cs = new ContentStreamBase.FileStream(file); cs.setContentType(contentType); addContentStream(cs); }
// in solrj/src/java/org/apache/solr/client/solrj/request/FieldAnalysisRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return null; }
// in solrj/src/java/org/apache/solr/client/solrj/request/FieldAnalysisRequest.java
@Override public FieldAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { if (fieldTypes == null && fieldNames == null) { throw new IllegalStateException("At least one field type or field name needs to be specified"); } if (fieldValue == null) { throw new IllegalStateException("The field value must be set"); } long startTime = System.currentTimeMillis(); FieldAnalysisResponse res = new FieldAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public Collection<ContentStream> getContentStreams(SolrRequest req) throws IOException { if (req instanceof UpdateRequest) { UpdateRequest updateRequest = (UpdateRequest) req; if (isEmpty(updateRequest)) return null; List<ContentStream> l = new ArrayList<ContentStream>(); l.add(new LazyContentStream(updateRequest)); return l; } return req.getContentStreams(); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public ContentStream getContentStream(UpdateRequest req) throws IOException { return new ContentStreamBase.StringStream(req.getXML()); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public void write(SolrRequest request, OutputStream os) throws IOException { if (request instanceof UpdateRequest) { UpdateRequest updateRequest = (UpdateRequest) request; OutputStreamWriter writer = new OutputStreamWriter(os, UTF_8); updateRequest.writeXML(writer); writer.flush(); } }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public InputStream getStream() throws IOException { return getDelegate().getStream(); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public Reader getReader() throws IOException { return getDelegate().getReader(); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public void writeTo(OutputStream os) throws IOException { write(req, os); }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return ClientUtils.toContentStreams( getXML(), ClientUtils.TEXT_XML ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequest.java
public String getXML() throws IOException { StringWriter writer = new StringWriter(); writeXML( writer ); writer.flush(); /* If action is COMMIT or OPTIMIZE, it is sent with params */ String xml = writer.toString(); /* System.out.println( "SEND:"+xml ); */ return (xml.length() > 0) ? xml : null; }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequest.java
public void writeXML( Writer writer ) throws IOException { if( (documents != null && documents.size() > 0) || docIterator != null) { if( commitWithin > 0 ) { writer.write("<add commitWithin=\""+commitWithin+"\">"); } else { writer.write("<add>"); } if(documents != null) { for (SolrInputDocument doc : documents) { if (doc != null) { ClientUtils.writeXML(doc, writer); } } } if (docIterator != null) { while (docIterator.hasNext()) { SolrInputDocument doc = docIterator.next(); if (doc != null) { ClientUtils.writeXML(doc, writer); } } } writer.write("</add>"); } /* Add the delete commands */ boolean deleteI = deleteById != null && deleteById.size() > 0; boolean deleteQ = deleteQuery != null && deleteQuery.size() > 0; if( deleteI || deleteQ ) { if(commitWithin>0) { writer.append( "<delete commitWithin=\"" + commitWithin + "\">" ); } else { writer.append( "<delete>" ); } if( deleteI ) { for( String id : deleteById ) { writer.append( "<id>" ); XML.escapeCharData( id, writer ); writer.append( "</id>" ); } } if( deleteQ ) { for( String q : deleteQuery ) { writer.append( "<query>" ); XML.escapeCharData( q, writer ); writer.append( "</query>" ); } } writer.append( "</delete>" ); } }
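A sketch of the XML writeXML produces: a request carrying only one delete-by-id (the id value is illustrative) serializes as a bare <delete> block:

  UpdateRequest req = new UpdateRequest();
  req.deleteById("doc1");
  java.io.StringWriter w = new java.io.StringWriter();
  req.writeXML(w);
  // w.toString() -> "<delete><id>doc1</id></delete>"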
// in solrj/src/java/org/apache/solr/client/solrj/request/LukeRequest.java
@Override public LukeResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); LukeResponse res = new LukeResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequestExt.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return ClientUtils.toContentStreams(getXML(), ClientUtils.TEXT_XML); }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequestExt.java
public String getXML() throws IOException { StringWriter writer = new StringWriter(); writeXML(writer); writer.flush(); String xml = writer.toString(); return (xml.length() > 0) ? xml : null; }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequestExt.java
public void writeXML(Writer writer) throws IOException { List<List<SolrDoc>> getDocLists = getDocLists(documents); for (List<SolrDoc> docs : getDocLists) { if ((docs != null && docs.size() > 0)) { SolrDoc firstDoc = docs.get(0); int commitWithin = firstDoc.commitWithin != -1 ? firstDoc.commitWithin : this.commitWithin; boolean overwrite = firstDoc.overwrite; if (commitWithin > -1 || overwrite != true) { writer.write("<add commitWithin=\"" + commitWithin + "\" " + "overwrite=\"" + overwrite + "\">"); } else { writer.write("<add>"); } if (documents != null) { for (SolrDoc doc : documents) { if (doc != null) { ClientUtils.writeXML(doc.document, writer); } } } writer.write("</add>"); } } /* Add the delete commands */ boolean deleteI = deleteById != null && deleteById.size() > 0; boolean deleteQ = deleteQuery != null && deleteQuery.size() > 0; if (deleteI || deleteQ) { writer.append("<delete>"); if (deleteI) { for (Map.Entry<String,Long> entry : deleteById.entrySet()) { writer.append("<id"); Long version = entry.getValue(); if (version != null) { writer.append(" version=\"" + version + "\""); } writer.append(">"); XML.escapeCharData(entry.getKey(), writer); writer.append("</id>"); } } if (deleteQ) { for (String q : deleteQuery) { writer.append("<query>"); XML.escapeCharData(q, writer); writer.append("</query>"); } } writer.append("</delete>"); } }
// in solrj/src/java/org/apache/solr/client/solrj/request/DocumentAnalysisRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return ClientUtils.toContentStreams(getXML(), ClientUtils.TEXT_XML); }
// in solrj/src/java/org/apache/solr/client/solrj/request/DocumentAnalysisRequest.java
@Override public DocumentAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); DocumentAnalysisResponse res = new DocumentAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/DocumentAnalysisRequest.java
String getXML() throws IOException { StringWriter writer = new StringWriter(); writer.write("<docs>"); for (SolrInputDocument document : documents) { ClientUtils.writeXML(document, writer); } writer.write("</docs>"); writer.flush(); String xml = writer.toString(); return (xml.length() > 0) ? xml : null; }
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
public void marshal(UpdateRequest updateRequest, OutputStream os) throws IOException { NamedList nl = new NamedList(); NamedList params = solrParamsToNamedList(updateRequest.getParams()); if (updateRequest.getCommitWithin() != -1) { params.add("commitWithin", updateRequest.getCommitWithin()); } Iterator<SolrInputDocument> docIter = null; if (updateRequest.getDocuments() != null) { docIter = updateRequest.getDocuments().iterator(); } if(updateRequest.getDocIterator() != null){ docIter = updateRequest.getDocIterator(); } nl.add("params", params); /* 0: params */ nl.add("delById", updateRequest.getDeleteById()); nl.add("delByQ", updateRequest.getDeleteQuery()); nl.add("docs", docIter); JavaBinCodec codec = new JavaBinCodec(); codec.marshal(nl, os); }
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
public UpdateRequest unmarshal(InputStream is, final StreamingUpdateHandler handler) throws IOException { final UpdateRequest updateRequest = new UpdateRequest(); List<List<NamedList>> doclist; List<String> delById; List<String> delByQ; final NamedList[] namedList = new NamedList[1]; JavaBinCodec codec = new JavaBinCodec() { /* NOTE: this only works because this is an anonymous inner class which will only ever be used on a single stream -- if this class is ever refactored, this will not work. */ private boolean seenOuterMostDocIterator = false; @Override public NamedList readNamedList(FastInputStream dis) throws IOException { int sz = readSize(dis); NamedList nl = new NamedList(); if (namedList[0] == null) { namedList[0] = nl; } for (int i = 0; i < sz; i++) { String name = (String) readVal(dis); Object val = readVal(dis); nl.add(name, val); } return nl; } @Override public List readIterator(FastInputStream fis) throws IOException { /* default behavior for reading any regular Iterator in the stream */ if (seenOuterMostDocIterator) return super.readIterator(fis); /* special treatment for first outermost Iterator (the list of documents) */ seenOuterMostDocIterator = true; return readOuterMostDocIterator(fis); } private List readOuterMostDocIterator(FastInputStream fis) throws IOException { NamedList params = (NamedList) namedList[0].getVal(0); updateRequest.setParams(new ModifiableSolrParams(SolrParams.toSolrParams(params))); if (handler == null) return super.readIterator(fis); while (true) { Object o = readVal(fis); if (o == END_OBJ) break; SolrInputDocument sdoc = null; if (o instanceof List) { sdoc = listToSolrInputDocument((List<NamedList>) o); } else if (o instanceof NamedList) { UpdateRequest req = new UpdateRequest(); req.setParams(new ModifiableSolrParams(SolrParams.toSolrParams((NamedList) o))); handler.update(null, req); } else { sdoc = (SolrInputDocument) o; } handler.update(sdoc, updateRequest); } return Collections.EMPTY_LIST; } }; codec.unmarshal(is); /* NOTE: if the update request contains only delete commands the params must be loaded now */ if(updateRequest.getParams()==null) { NamedList params = (NamedList) namedList[0].get("params"); if(params!=null) { updateRequest.setParams(new ModifiableSolrParams(SolrParams.toSolrParams(params))); } } delById = (List<String>) namedList[0].get("delById"); delByQ = (List<String>) namedList[0].get("delByQ"); doclist = (List) namedList[0].get("docs"); if (doclist != null && !doclist.isEmpty()) { List<SolrInputDocument> solrInputDocs = new ArrayList<SolrInputDocument>(); for (Object o : doclist) { if (o instanceof List) { solrInputDocs.add(listToSolrInputDocument((List<NamedList>)o)); } else { solrInputDocs.add((SolrInputDocument)o); } } updateRequest.add(solrInputDocs); } if (delById != null) { for (String s : delById) { updateRequest.deleteById(s); } } if (delByQ != null) { for (String s : delByQ) { updateRequest.deleteByQuery(s); } } return updateRequest; }
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
@Override public NamedList readNamedList(FastInputStream dis) throws IOException { int sz = readSize(dis); NamedList nl = new NamedList(); if (namedList[0] == null) { namedList[0] = nl; } for (int i = 0; i < sz; i++) { String name = (String) readVal(dis); Object val = readVal(dis); nl.add(name, val); } return nl; }
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
@Override public List readIterator(FastInputStream fis) throws IOException { /* default behavior for reading any regular Iterator in the stream */ if (seenOuterMostDocIterator) return super.readIterator(fis); /* special treatment for first outermost Iterator (the list of documents) */ seenOuterMostDocIterator = true; return readOuterMostDocIterator(fis); }
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
private List readOuterMostDocIterator(FastInputStream fis) throws IOException { NamedList params = (NamedList) namedList[0].getVal(0); updateRequest.setParams(new ModifiableSolrParams(SolrParams.toSolrParams(params))); if (handler == null) return super.readIterator(fis); while (true) { Object o = readVal(fis); if (o == END_OBJ) break; SolrInputDocument sdoc = null; if (o instanceof List) { sdoc = listToSolrInputDocument((List<NamedList>) o); } else if (o instanceof NamedList) { UpdateRequest req = new UpdateRequest(); req.setParams(new ModifiableSolrParams(SolrParams.toSolrParams((NamedList) o))); handler.update(null, req); } else { sdoc = (SolrInputDocument) o; } handler.update(sdoc, updateRequest); } return Collections.EMPTY_LIST; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return null; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public CoreAdminResponse process(SolrServer server) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); CoreAdminResponse res = new CoreAdminResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse reloadCore( String name, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName( name ); req.setAction( CoreAdminAction.RELOAD ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse unloadCore( String name, SolrServer server ) throws SolrServerException, IOException { return unloadCore(name, false, server); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse unloadCore( String name, boolean deleteIndex, SolrServer server ) throws SolrServerException, IOException { Unload req = new Unload(deleteIndex); req.setCoreName( name ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse renameCore(String coreName, String newName, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName(coreName); req.setOtherCoreName(newName); req.setAction( CoreAdminAction.RENAME ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse getStatus( String name, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName( name ); req.setAction( CoreAdminAction.STATUS ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse createCore( String name, String instanceDir, SolrServer server ) throws SolrServerException, IOException { return CoreAdminRequest.createCore(name, instanceDir, server, null, null); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse createCore( String name, String instanceDir, SolrServer server, String configFile, String schemaFile ) throws SolrServerException, IOException { CoreAdminRequest.Create req = new CoreAdminRequest.Create(); req.setCoreName( name ); req.setInstanceDir(instanceDir); if(configFile != null){ req.setConfigName(configFile); } if(schemaFile != null){ req.setSchemaName(schemaFile); } return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse persist(String fileName, SolrServer server) throws SolrServerException, IOException { CoreAdminRequest.Persist req = new CoreAdminRequest.Persist(); req.setFileName(fileName); return req.process(server); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse mergeIndexes(String name, String[] indexDirs, String[] srcCores, SolrServer server) throws SolrServerException, IOException { CoreAdminRequest.MergeIndexes req = new CoreAdminRequest.MergeIndexes(); req.setCoreName(name); req.setIndexDirs(Arrays.asList(indexDirs)); req.setSrcCores(Arrays.asList(srcCores)); return req.process(server); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public void run() { runnerLock.lock(); /* info is ok since this should only happen once for each thread */ log.info("starting runner: {}", this); HttpPost method = null; HttpResponse response = null; try { while (!queue.isEmpty()) { try { final UpdateRequest updateRequest = queue.poll(250, TimeUnit.MILLISECONDS); if (updateRequest == null) break; String contentType = server.requestWriter.getUpdateContentType(); final boolean isXml = ClientUtils.TEXT_XML.equals(contentType); final ModifiableSolrParams origParams = new ModifiableSolrParams(updateRequest.getParams()); EntityTemplate template = new EntityTemplate(new ContentProducer() { public void writeTo(OutputStream out) throws IOException { try { if (isXml) { out.write("<stream>".getBytes("UTF-8")); /* can be anything */ } UpdateRequest req = updateRequest; while (req != null) { SolrParams currentParams = new ModifiableSolrParams(req.getParams()); if (!origParams.toNamedList().equals(currentParams.toNamedList())) { queue.add(req); /* params are different, push back to queue */ break; } server.requestWriter.write(req, out); if (isXml) { /* check for commit or optimize */ SolrParams params = req.getParams(); if (params != null) { String fmt = null; if (params.getBool(UpdateParams.OPTIMIZE, false)) { fmt = "<optimize waitSearcher=\"%s\" />"; } else if (params.getBool(UpdateParams.COMMIT, false)) { fmt = "<commit waitSearcher=\"%s\" />"; } if (fmt != null) { byte[] content = String.format( fmt, params.getBool(UpdateParams.WAIT_SEARCHER, false) + "").getBytes("UTF-8"); out.write(content); } } } out.flush(); req = queue.poll(250, TimeUnit.MILLISECONDS); } if (isXml) { out.write("</stream>".getBytes("UTF-8")); } } catch (InterruptedException e) { e.printStackTrace(); } } }); /* The parser 'wt=' and 'version=' params are used instead of the original params */ ModifiableSolrParams requestParams = new ModifiableSolrParams(origParams); requestParams.set(CommonParams.WT, server.parser.getWriterType()); requestParams.set(CommonParams.VERSION, server.parser.getVersion()); method = new HttpPost(server.getBaseURL() + "/update" + ClientUtils.toQueryString(requestParams, false)); method.setEntity(template); method.addHeader("User-Agent", HttpSolrServer.AGENT); method.addHeader("Content-Type", contentType); response = server.getHttpClient().execute(method); int statusCode = response.getStatusLine().getStatusCode(); log.info("Status for: " + updateRequest.getDocuments().get(0).getFieldValue("id") + " is " + statusCode); if (statusCode != HttpStatus.SC_OK) { StringBuilder msg = new StringBuilder(); msg.append(response.getStatusLine().getReasonPhrase()); msg.append("\n\n"); msg.append("request: ").append(method.getURI()); handleError(new Exception(msg.toString())); } } finally { try { if (response != null) { response.getEntity().getContent().close(); } } catch (Exception ex) { } } } } catch (Throwable e) { handleError(e); } finally { /* remove it from the list of running things unless we are the last runner and the queue is full... in which case, the next queue.put() would block and there would be no runners to handle it. This case has been further handled by using offer instead of put, and using a retry loop to avoid blocking forever (see request()). */ synchronized (runners) { if (runners.size() == 1 && queue.remainingCapacity() == 0) { /* keep this runner alive */ scheduler.execute(this); } else { runners.remove(this); } } log.info("finished: {}", this); runnerLock.unlock(); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public void writeTo(OutputStream out) throws IOException { try { if (isXml) { out.write("<stream>".getBytes("UTF-8")); /* can be anything */ } UpdateRequest req = updateRequest; while (req != null) { SolrParams currentParams = new ModifiableSolrParams(req.getParams()); if (!origParams.toNamedList().equals(currentParams.toNamedList())) { queue.add(req); /* params are different, push back to queue */ break; } server.requestWriter.write(req, out); if (isXml) { /* check for commit or optimize */ SolrParams params = req.getParams(); if (params != null) { String fmt = null; if (params.getBool(UpdateParams.OPTIMIZE, false)) { fmt = "<optimize waitSearcher=\"%s\" />"; } else if (params.getBool(UpdateParams.COMMIT, false)) { fmt = "<commit waitSearcher=\"%s\" />"; } if (fmt != null) { byte[] content = String.format( fmt, params.getBool(UpdateParams.WAIT_SEARCHER, false) + "").getBytes("UTF-8"); out.write(content); } } } out.flush(); req = queue.poll(250, TimeUnit.MILLISECONDS); } if (isXml) { out.write("</stream>".getBytes("UTF-8")); } } catch (InterruptedException e) { e.printStackTrace(); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { if (!(request instanceof UpdateRequest)) { return server.request(request); } UpdateRequest req = (UpdateRequest) request; /* this happens for commit... */ if (req.getDocuments() == null || req.getDocuments().isEmpty()) { blockUntilFinished(); return server.request(request); } SolrParams params = req.getParams(); if (params != null) { /* check if it is waiting for the searcher */ if (params.getBool(UpdateParams.WAIT_SEARCHER, false)) { log.info("blocking for commit/optimize"); blockUntilFinished(); /* empty the queue */ return server.request(request); } } try { CountDownLatch tmpLock = lock; if (tmpLock != null) { tmpLock.await(); } boolean success = queue.offer(req); for (;;) { synchronized (runners) { if (runners.isEmpty() || (queue.remainingCapacity() < queue.size() /* queue is half full and we can add more runners */ && runners.size() < threadCount)) { /* We need more runners, so start a new one. */ Runner r = new Runner(); runners.add(r); scheduler.execute(r); } else { /* break out of the retry loop if we added the element to the queue successfully, *and* while we are still holding the runners lock to prevent race conditions. */ if (success) break; } } /* Retry to add to the queue w/o the runners lock held (else we risk temporary deadlock). This retry could also fail because 1) existing runners were not able to take off any new elements in the queue, or 2) the queue was filled back up since our last try. If we succeed, the queue may have been completely emptied, and all runners stopped. In all cases, we should loop back to the top to see if we need to start more runners. */ if (!success) { success = queue.offer(req, 100, TimeUnit.MILLISECONDS); } } } catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); } /* RETURN A DUMMY result */ NamedList<Object> dummy = new NamedList<Object>(); dummy.add("NOTE", "the request is processed in a background stream"); return dummy; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
@Override public Collection<ContentStream> getContentStreams(SolrRequest req) throws IOException { if (req instanceof UpdateRequest) { UpdateRequest updateRequest = (UpdateRequest) req; if (isNull(updateRequest.getDocuments()) && isNull(updateRequest.getDeleteById()) && isNull(updateRequest.getDeleteQuery()) && (updateRequest.getDocIterator() == null) ) { return null; } List<ContentStream> l = new ArrayList<ContentStream>(); l.add(new LazyContentStream(updateRequest)); return l; } else { return super.getContentStreams(req); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
@Override public ContentStream getContentStream(final UpdateRequest request) throws IOException { final BAOS baos = new BAOS(); new JavaBinUpdateRequestCodec().marshal(request, baos); return new ContentStream() { public String getName() { return null; } public String getSourceInfo() { return "javabin"; } public String getContentType() { return "application/javabin"; } public Long getSize() /* size if we know it, otherwise null */ { return new Long(baos.size()); } public InputStream getStream() throws IOException { return new ByteArrayInputStream(baos.getbuf(), 0, baos.size()); } public Reader getReader() throws IOException { throw new RuntimeException("No reader available. This is a binary stream."); } }; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
public InputStream getStream() throws IOException { return new ByteArrayInputStream(baos.getbuf(), 0, baos.size()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
public Reader getReader() throws IOException { throw new RuntimeException("No reader available. This is a binary stream."); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
@Override public void write(SolrRequest request, OutputStream os) throws IOException { if (request instanceof UpdateRequest) { UpdateRequest updateRequest = (UpdateRequest) request; new JavaBinUpdateRequestCodec().marshal(updateRequest, os); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
@Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { connect(); /* TODO: if you can hash here, you could favor the shard leader */ CloudState cloudState = zkStateReader.getCloudState(); SolrParams reqParams = request.getParams(); if (reqParams == null) { reqParams = new ModifiableSolrParams(); } String collection = reqParams.get("collection", defaultCollection); if (collection == null) { throw new SolrServerException("No collection param specified on request and no default collection has been set."); } /* Extract each comma separated collection name and store in a List. */ List<String> collectionList = StrUtils.splitSmart(collection, ",", true); /* Retrieve slices from the cloud state and, for each collection specified, add it to the Map of slices. */ Map<String,Slice> slices = new HashMap<String,Slice>(); for (int i = 0; i < collectionList.size(); i++) { String coll = collectionList.get(i); ClientUtils.appendMap(coll, slices, cloudState.getSlices(coll)); } Set<String> liveNodes = cloudState.getLiveNodes(); /* IDEA: have versions on various things... like a global cloudState version or shardAddressVersion (which only changes when the shards change) to allow caching. Build a map of unique nodes. TODO: allow filtering by group, role, etc */ Map<String,ZkNodeProps> nodes = new HashMap<String,ZkNodeProps>(); List<String> urlList = new ArrayList<String>(); for (Slice slice : slices.values()) { for (ZkNodeProps nodeProps : slice.getShards().values()) { ZkCoreNodeProps coreNodeProps = new ZkCoreNodeProps(nodeProps); String node = coreNodeProps.getNodeName(); if (!liveNodes.contains(coreNodeProps.getNodeName()) || !coreNodeProps.getState().equals( ZkStateReader.ACTIVE)) continue; if (nodes.put(node, nodeProps) == null) { String url = coreNodeProps.getCoreUrl(); urlList.add(url); } } } Collections.shuffle(urlList, rand); /* System.out.println("########################## MAKING REQUEST TO " + urlList); */ LBHttpSolrServer.Req req = new LBHttpSolrServer.Req(request, urlList); LBHttpSolrServer.Rsp rsp = lbServer.request(req); return rsp.getResponse(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
@Override public NamedList<Object> processResponse(InputStream body, String encoding) { try { JavaBinCodec codec = new JavaBinCodec() { @Override public SolrDocument readSolrDocument(FastInputStream dis) throws IOException { SolrDocument doc = super.readSolrDocument(dis); callback.streamSolrDocument( doc ); return null; } @Override public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException { SolrDocumentList solrDocs = new SolrDocumentList(); List list = (List) readVal(dis); solrDocs.setNumFound((Long) list.get(0)); solrDocs.setStart((Long) list.get(1)); solrDocs.setMaxScore((Float) list.get(2)); callback.streamDocListInfo( solrDocs.getNumFound(), solrDocs.getStart(), solrDocs.getMaxScore() ); // Read the Array tagByte = dis.readByte(); if( (tagByte >>> 5) != (ARR >>> 5) ) { throw new RuntimeException( "doclist must have an array" ); } int sz = readSize(dis); for (int i = 0; i < sz; i++) { // must be a SolrDocument readVal( dis ); } return solrDocs; } }; return (NamedList<Object>) codec.unmarshal(body); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); } }
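The catch block above illustrates the dominant conversion pattern in this code base: a checked IOException from a lower layer is wrapped in the domain runtime exception SolrException so that callers need not declare it. A minimal sketch of the pattern, where decode() is a hypothetical checked-exception source, not a Solr API:

    import java.io.IOException;
    import java.io.InputStream;
    import org.apache.solr.common.SolrException;

    class WrapSketch {
      Object processResponse(InputStream body) {
        try {
          return decode(body); // hypothetical decoder that may throw IOException
        } catch (IOException e) {
          // convert the checked exception into the domain runtime exception,
          // keeping the original as the cause for diagnostics
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e);
        }
      }
      // placeholder standing in for the real parsing work
      private Object decode(InputStream in) throws IOException { return null; }
    }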
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
@Override public SolrDocument readSolrDocument(FastInputStream dis) throws IOException { SolrDocument doc = super.readSolrDocument(dis); callback.streamSolrDocument( doc ); return null; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
@Override public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException { SolrDocumentList solrDocs = new SolrDocumentList(); List list = (List) readVal(dis); solrDocs.setNumFound((Long) list.get(0)); solrDocs.setStart((Long) list.get(1)); solrDocs.setMaxScore((Float) list.get(2)); callback.streamDocListInfo( solrDocs.getNumFound(), solrDocs.getStart(), solrDocs.getMaxScore() ); // Read the Array tagByte = dis.readByte(); if( (tagByte >>> 5) != (ARR >>> 5) ) { throw new RuntimeException( "doclist must have an array" ); } int sz = readSize(dis); for (int i = 0; i < sz; i++) { // must be a SolrDocument readVal( dis ); } return solrDocs; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
@Override public void process(HttpRequest request, HttpContext context) throws HttpException, IOException { if (!request.containsHeader("Accept-Encoding")) { request.addHeader("Accept-Encoding", "gzip, deflate"); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public void process(final HttpResponse response, final HttpContext context) throws HttpException, IOException { HttpEntity entity = response.getEntity(); Header ceheader = entity.getContentEncoding(); if (ceheader != null) { HeaderElement[] codecs = ceheader.getElements(); for (int i = 0; i < codecs.length; i++) { if (codecs[i].getName().equalsIgnoreCase("gzip")) { response .setEntity(new GzipDecompressingEntity(response.getEntity())); return; } if (codecs[i].getName().equalsIgnoreCase("deflate")) { response.setEntity(new DeflateDecompressingEntity(response .getEntity())); return; } } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public InputStream getContent() throws IOException, IllegalStateException { return new GZIPInputStream(wrappedEntity.getContent()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public InputStream getContent() throws IOException, IllegalStateException { return new InflaterInputStream(wrappedEntity.getContent()); }
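Taken together, the four HttpClientUtil snippets implement transparent compression: the request interceptor advertises gzip/deflate, and the response interceptor swaps in a decompressing entity. A plausible wiring sketch, assuming the HttpClient 4.1-era DefaultHttpClient that this version of solrj builds on:

    import java.io.IOException;
    import org.apache.http.HttpException;
    import org.apache.http.HttpRequest;
    import org.apache.http.HttpRequestInterceptor;
    import org.apache.http.impl.client.DefaultHttpClient;
    import org.apache.http.protocol.HttpContext;

    class GzipWiringSketch {
      static DefaultHttpClient newClient() {
        DefaultHttpClient client = new DefaultHttpClient();
        // ask servers for compressed bodies; a response interceptor like the
        // one in HttpClientUtil above would then unwrap them
        client.addRequestInterceptor(new HttpRequestInterceptor() {
          public void process(HttpRequest request, HttpContext context)
              throws HttpException, IOException {
            if (!request.containsHeader("Accept-Encoding")) {
              request.addHeader("Accept-Encoding", "gzip, deflate");
            }
          }
        });
        return client;
      }
    }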
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
@Override public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { ResponseParser responseParser = request.getResponseParser(); if (responseParser == null) { responseParser = parser; } return request(request, responseParser); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public NamedList<Object> request(final SolrRequest request, final ResponseParser processor) throws SolrServerException, IOException { HttpRequestBase method = null; InputStream is = null; SolrParams params = request.getParams(); Collection<ContentStream> streams = requestWriter.getContentStreams(request); String path = requestWriter.getPath(request); if (path == null || !path.startsWith("/")) { path = DEFAULT_PATH; } ResponseParser parser = request.getResponseParser(); if (parser == null) { parser = this.parser; } // The parser 'wt=' and 'version=' params are used instead of the original // params ModifiableSolrParams wparams = new ModifiableSolrParams(params); wparams.set(CommonParams.WT, parser.getWriterType()); wparams.set(CommonParams.VERSION, parser.getVersion()); if (invariantParams != null) { wparams.add(invariantParams); } params = wparams; int tries = maxRetries + 1; try { while( tries-- > 0 ) { // Note: since we aren't doing intermittent time keeping // ourselves, the potential non-timeout latency could be as // much as tries-times (plus scheduling effects) the given // timeAllowed. try { if( SolrRequest.METHOD.GET == request.getMethod() ) { if( streams != null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "GET can't send streams!" ); } method = new HttpGet( baseUrl + path + ClientUtils.toQueryString( params, false ) ); } else if( SolrRequest.METHOD.POST == request.getMethod() ) { String url = baseUrl + path; boolean isMultipart = ( streams != null && streams.size() > 1 ); LinkedList<NameValuePair> postParams = new LinkedList<NameValuePair>(); if (streams == null || isMultipart) { HttpPost post = new HttpPost(url); post.setHeader("Content-Charset", "UTF-8"); if (!this.useMultiPartPost && !isMultipart) { post.addHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8"); } List<FormBodyPart> parts = new LinkedList<FormBodyPart>(); Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String p = iter.next(); String[] vals = params.getParams(p); if (vals != null) { for (String v : vals) { if (this.useMultiPartPost || isMultipart) { parts.add(new FormBodyPart(p, new StringBody(v, Charset.forName("UTF-8")))); } else { postParams.add(new BasicNameValuePair(p, v)); } } } } if (isMultipart) { for (ContentStream content : streams) { String contentType = content.getContentType(); if(contentType==null) { contentType = "application/octet-stream"; // default } parts.add(new FormBodyPart(content.getName(), new InputStreamBody( content.getStream(), contentType, content.getName()))); } } if (parts.size() > 0) { MultipartEntity entity = new MultipartEntity(HttpMultipartMode.STRICT); for(FormBodyPart p: parts) { entity.addPart(p); } post.setEntity(entity); } else { //not using multipart post.setEntity(new UrlEncodedFormEntity(postParams, "UTF-8")); } method = post; } // If it has one stream, it is the post body; put the params in the URL else { String pstr = ClientUtils.toQueryString(params, false); HttpPost post = new HttpPost(url + pstr); // Single stream as body // Using a loop just to get the first one final ContentStream[] contentStream = new ContentStream[1]; for (ContentStream content : streams) { contentStream[0] = content; break; } if (contentStream[0] instanceof RequestWriter.LazyContentStream) { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } else { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } method = post; } } else { throw new SolrServerException("Unsupported method: "+request.getMethod() ); } } catch( NoHttpResponseException r ) { method = null; if(is != null) { is.close(); } // If out of tries then just rethrow (as normal error). if (tries < 1) { throw r; } } }
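The while loop above retries only on NoHttpResponseException and rethrows once the retry budget is spent. Reduced to just its exception handling (execute() is a hypothetical network call):

    import java.io.IOException;
    import org.apache.http.NoHttpResponseException;

    class RetrySketch {
      Object request(int maxRetries) throws IOException {
        int tries = maxRetries + 1;
        while (tries-- > 0) {
          try {
            return execute(); // hypothetical HTTP round trip
          } catch (NoHttpResponseException e) {
            if (tries < 1) throw e; // budget exhausted: propagate as a normal error
            // otherwise fall through and retry
          }
        }
        throw new IllegalStateException("unreachable for maxRetries >= 0");
      }
      private Object execute() throws IOException { return new Object(); }
    }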
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public UpdateResponse add(Iterator<SolrInputDocument> docIterator) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.setDocIterator(docIterator); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public UpdateResponse addBeans(final Iterator<?> beanIterator) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.setDocIterator(new Iterator<SolrInputDocument>() { public boolean hasNext() { return beanIterator.hasNext(); } public SolrInputDocument next() { Object o = beanIterator.next(); if (o == null) return null; return getBinder().toSolrInputDocument(o); } public void remove() { beanIterator.remove(); } }); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public Rsp request(Req req) throws SolrServerException, IOException { Rsp rsp = new Rsp(); Exception ex = null; List<ServerWrapper> skipped = new ArrayList<ServerWrapper>(req.getNumDeadServersToTry()); for (String serverStr : req.getServers()) { serverStr = normalize(serverStr); // if the server is currently a zombie, just skip to the next one ServerWrapper wrapper = zombieServers.get(serverStr); if (wrapper != null) { // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr); if (skipped.size() < req.getNumDeadServersToTry()) skipped.add(wrapper); continue; } rsp.server = serverStr; HttpSolrServer server = makeServer(serverStr); try { rsp.rsp = server.request(req.getRequest()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404, 403, 503 or 500 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = addZombie(server, e); } else { // Server is alive but the request was likely malformed or invalid throw e; } // TODO: consider using below above - currently does cause a problem with distrib updates: // seems to match up against a failed forward to leader exception as well... // || e.getMessage().contains("java.net.SocketException") // || e.getMessage().contains("java.net.ConnectException") } catch (SocketException e) { ex = addZombie(server, e); } catch (SocketTimeoutException e) { ex = addZombie(server, e); } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } // try the servers we previously skipped for (ServerWrapper wrapper : skipped) { try { rsp.rsp = wrapper.solrServer.request(req.getRequest()); zombieServers.remove(wrapper.getKey()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404, 403, 503 or 500 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = e; // already a zombie, no need to re-add } else { // Server is alive but the request was malformed or invalid zombieServers.remove(wrapper.getKey()); throw e; } } catch (SocketException e) { ex = e; } catch (SocketTimeoutException e) { ex = e; } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; // already a zombie, no need to re-add } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } if (ex == null) { throw new SolrServerException("No live SolrServers available to handle this request"); } else { throw new SolrServerException("No live SolrServers available to handle this request:" + zombieServers.keySet(), ex); } }
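The load balancer's request(Req) above triages exceptions into "server is down, try the next one" (selected SolrException codes, socket errors, IO-rooted SolrServerExceptions) versus "the request itself is bad, abort". The policy in isolation, with sendTo() and markZombie() as illustrative placeholders:

    import java.io.IOException;
    import java.util.List;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.common.SolrException;

    class FailoverSketch {
      Object request(List<String> servers) throws SolrServerException {
        for (String url : servers) {
          try {
            return sendTo(url); // hypothetical round trip to one server
          } catch (SolrException e) {
            // codes seen when a server is shutting down: mark it dead, move on
            if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
              markZombie(url, e);
            } else {
              throw e; // request itself is malformed or invalid
            }
          } catch (SolrServerException e) {
            if (e.getRootCause() instanceof IOException) markZombie(url, e);
            else throw e;
          }
        }
        throw new SolrServerException("No live SolrServers available to handle this request");
      }
      private Object sendTo(String url) throws SolrServerException { return new Object(); }
      private void markZombie(String url, Exception cause) { /* record for later retry */ }
    }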
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
@Override public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { Exception ex = null; ServerWrapper[] serverList = aliveServerList; int maxTries = serverList.length; Map<String,ServerWrapper> justFailed = null; for (int attempts=0; attempts<maxTries; attempts++) { int count = counter.incrementAndGet(); ServerWrapper wrapper = serverList[count % serverList.length]; wrapper.lastUsed = System.currentTimeMillis(); try { return wrapper.solrServer.request(request); } catch (SolrException e) { // Server is alive but the request was malformed or invalid throw e; } catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } // try other standard servers that we didn't try just now for (ServerWrapper wrapper : zombieServers.values()) { if (wrapper.standard==false || justFailed!=null && justFailed.containsKey(wrapper.getKey())) continue; try { NamedList<Object> rsp = wrapper.solrServer.request(request); // remove from zombie list *before* adding to alive to avoid a race that could lose a server zombieServers.remove(wrapper.getKey()); addToAlive(wrapper); return rsp; } catch (SolrException e) { // Server is alive but the request was malformed or invalid throw e; } catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; // still dead } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } if (ex == null) { throw new SolrServerException("No live SolrServers available to handle this request"); } else { throw new SolrServerException("No live SolrServers available to handle this request", ex); } }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(Collection<SolrInputDocument> docs) throws SolrServerException, IOException { return add(docs, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(Collection<SolrInputDocument> docs, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.add(docs); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBeans(Collection<?> beans ) throws SolrServerException, IOException { return addBeans(beans, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBeans(Collection<?> beans, int commitWithinMs) throws SolrServerException, IOException { DocumentObjectBinder binder = this.getBinder(); ArrayList<SolrInputDocument> docs = new ArrayList<SolrInputDocument>(beans.size()); for (Object bean : beans) { docs.add(binder.toSolrInputDocument(bean)); } return add(docs, commitWithinMs); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(SolrInputDocument doc ) throws SolrServerException, IOException { return add(doc, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(SolrInputDocument doc, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.add(doc); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBean(Object obj) throws IOException, SolrServerException { return addBean(obj, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBean(Object obj, int commitWithinMs) throws IOException, SolrServerException { return add(getBinder().toSolrInputDocument(obj),commitWithinMs); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( ) throws SolrServerException, IOException { return commit(true, true); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize( ) throws SolrServerException, IOException { return optimize(true, true, 1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( boolean waitFlush, boolean waitSearcher ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.COMMIT, waitFlush, waitSearcher ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( boolean waitFlush, boolean waitSearcher, boolean softCommit ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.COMMIT, waitFlush, waitSearcher, softCommit ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize( boolean waitFlush, boolean waitSearcher ) throws SolrServerException, IOException { return optimize(waitFlush, waitSearcher, 1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize(boolean waitFlush, boolean waitSearcher, int maxSegments ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.OPTIMIZE, waitFlush, waitSearcher, maxSegments ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse rollback() throws SolrServerException, IOException { return new UpdateRequest().rollback().process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(String id) throws SolrServerException, IOException { return deleteById(id, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(String id, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteById(id); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(List<String> ids) throws SolrServerException, IOException { return deleteById(ids, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(List<String> ids, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteById(ids); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteByQuery(String query) throws SolrServerException, IOException { return deleteByQuery(query, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteByQuery(String query, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteByQuery(query); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public SolrPingResponse ping() throws SolrServerException, IOException { return new SolrPing().process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public QueryResponse queryAndStreamResponse( SolrParams params, StreamingResponseCallback callback ) throws SolrServerException, IOException { ResponseParser parser = new StreamingBinaryResponseParser( callback ); QueryRequest req = new QueryRequest( params ); req.setStreamingResponseCallback( callback ); req.setResponseParser( parser ); return req.process(this); }
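A plausible client-side use of queryAndStreamResponse(...): each document is pushed to the callback as it is parsed (which is why readSolrDocument in StreamingBinaryResponseParser returns null instead of accumulating), so large result sets never materialize in memory. The URL below is the usual example address, not anything prescribed by the API:

    import java.io.IOException;
    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.client.solrj.StreamingResponseCallback;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.common.SolrDocument;

    class StreamSketch {
      public static void main(String[] args) throws SolrServerException, IOException {
        SolrServer server = new HttpSolrServer("http://localhost:8983/solr");
        server.queryAndStreamResponse(new SolrQuery("*:*"), new StreamingResponseCallback() {
          @Override
          public void streamDocListInfo(long numFound, long start, Float maxScore) {
            System.out.println("expecting " + numFound + " documents");
          }
          @Override
          public void streamSolrDocument(SolrDocument doc) {
            System.out.println(doc.getFieldValue("id")); // handle one doc at a time
          }
        });
      }
    }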
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
public static void writeXML( SolrInputDocument doc, Writer writer ) throws IOException { writer.write("<doc boost=\""+doc.getDocumentBoost()+"\">"); for( SolrInputField field : doc ) { float boost = field.getBoost(); String name = field.getName(); for( Object v : field ) { String update = null; if (v instanceof Map) { // currently only supports a single value for (Entry<Object,Object> entry : ((Map<Object,Object>)v).entrySet()) { update = entry.getKey().toString(); Object fieldVal = entry.getValue(); v = fieldVal; } } if (v instanceof Date) { v = DateUtil.getThreadLocalDateFormat().format( (Date)v ); } else if (v instanceof byte[]) { byte[] bytes = (byte[]) v; v = Base64.byteArrayToBase64(bytes, 0,bytes.length); } else if (v instanceof ByteBuffer) { ByteBuffer bytes = (ByteBuffer) v; v = Base64.byteArrayToBase64(bytes.array(), bytes.position(),bytes.limit() - bytes.position()); } if (update == null) { if( boost != 1.0f ) { XML.writeXML(writer, "field", v.toString(), "name", name, "boost", boost ); } else if (v != null) { XML.writeXML(writer, "field", v.toString(), "name", name ); } } else { if( boost != 1.0f ) { XML.writeXML(writer, "field", v.toString(), "name", name, "boost", boost, "update", update); } else if (v != null) { XML.writeXML(writer, "field", v.toString(), "name", name, "update", update); } } // only write the boost for the first multi-valued field // otherwise, the used boost is the product of all the boost values boost = 1.0f; } } writer.write("</doc>"); }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int read() throws IOException { if (start>=end) return -1; return buf[start++]; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int read(CharBuffer cb) throws IOException { /*** int sz = size(); if (sz<=0) return -1; if (sz>0) cb.put(buf, start, sz); return -1; ***/ int sz = size(); if (sz>0) cb.put(buf, start, sz); start=end; while (true) { fill(); int s = size(); if (s==0) return sz==0 ? -1 : sz; sz += s; cb.put(buf, start, s); } }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int fill() throws IOException { return 0; /* or -1? */ }
// in solrj/src/java/org/apache/noggit/CharArr.java
public final Appendable append(CharSequence csq) throws IOException { return append(csq, 0, csq.length()); }
// in solrj/src/java/org/apache/noggit/CharArr.java
public Appendable append(CharSequence csq, int start, int end) throws IOException { write(csq.subSequence(start, end).toString()); return null; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public final Appendable append(char c) throws IOException { write(c); return this; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public Appendable append(CharSequence csq, int start, int end) throws IOException { return this; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int read() throws IOException { if (start>=end) fill(); return start>=end ? -1 : buf[start++]; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int read(CharBuffer cb) throws IOException { // empty the buffer and then read direct int sz = size(); if (sz>0) cb.put(buf,start,end); int sz2 = in.read(cb); if (sz2>=0) return sz+sz2; return sz>0 ? sz : -1; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int fill() throws IOException { if (start>=end) { reset(); } else if (start>0) { System.arraycopy(buf, start, buf, 0, size()); end=size(); start=0; } /*** // fill fully or not??? do { int sz = in.read(buf,end,buf.length-end); if (sz==-1) return; end+=sz; } while (end < buf.length); ***/ int sz = in.read(buf,end,buf.length-end); if (sz>0) end+=sz; return sz; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
protected void fill() throws IOException { if (in!=null) { gpos += end; start=0; int num = in.read(buf,0,buf.length); end = num>=0 ? num : 0; } if (start>=end) eof=true; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void getMore() throws IOException { fill(); if (start>=end) { throw err(null); } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
protected int getChar() throws IOException { if (start>=end) { fill(); if (start>=end) return -1; } return buf[start++]; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int getCharNWS() throws IOException { for (;;) { int ch = getChar(); if (!(ch==' ' || ch=='\t' || ch=='\n' || ch=='\r')) return ch; } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void expect(char[] arr) throws IOException { for (int i=1; i<arr.length; i++) { int ch = getChar(); if (ch != arr[i]) { throw err("Expected " + new String(arr)); } } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private long readNumber(int firstChar, boolean isNeg) throws IOException { out.unsafeWrite(firstChar); // unsafe OK since we know output is big enough // We build up the number in the negative plane since it's larger (by one) than // the positive plane. long v = '0' - firstChar; // can't overflow a long in 18 decimal digits (i.e. 17 additional after the first). // we also need 22 additional to handle double so we'll handle in 2 separate loops. int i; for (i=0; i<17; i++) { int ch = getChar(); // TODO: is this switch faster as an if-then-else? switch(ch) { case '0': case '1': case '2': case '3': case '4': case '5': case '6': case '7': case '8': case '9': v = v*10 - (ch-'0'); out.unsafeWrite(ch); continue; case '.': out.unsafeWrite('.'); valstate = readFrac(out,22-i); return 0; case 'e': case 'E': out.unsafeWrite(ch); nstate=0; valstate = readExp(out,22-i); return 0; default: // return the number, relying on nextEvent() to return an error // for invalid chars following the number. if (ch!=-1) --start; // push back last char if not EOF valstate = LONG; return isNeg ? v : -v; } } // after this, we could overflow a long and need to do extra checking boolean overflow = false; long maxval = isNeg ? Long.MIN_VALUE : -Long.MAX_VALUE; for (; i<22; i++) { int ch = getChar(); switch(ch) { case '0': case '1': case '2': case '3': case '4': case '5': case '6': case '7': case '8': case '9': if (v < (0x8000000000000000L/10)) overflow=true; // can't multiply by 10 w/o overflowing v *= 10; int digit = ch - '0'; if (v < maxval + digit) overflow=true; // can't add digit w/o overflowing v -= digit; out.unsafeWrite(ch); continue; case '.': out.unsafeWrite('.'); valstate = readFrac(out,22-i); return 0; case 'e': case 'E': out.unsafeWrite(ch); nstate=0; valstate = readExp(out,22-i); return 0; default: // return the number, relying on nextEvent() to return an error // for invalid chars following the number. if (ch!=-1) --start; // push back last char if not EOF valstate = overflow ? BIGNUMBER : LONG; return isNeg ? v : -v; } } nstate=0; valstate = BIGNUMBER; return 0; }
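readNumber accumulates the value in the negative plane because |Long.MIN_VALUE| is one larger than Long.MAX_VALUE: a positive running total would overflow on the last digit of "-9223372036854775808". A small demonstration of the same accumulation step:

    class NegativePlaneSketch {
      public static void main(String[] args) {
        long v = 0;
        for (char c : "9223372036854775808".toCharArray()) {
          v = v * 10 - (c - '0'); // same step as readNumber: build the negative value
        }
        // the negative total lands exactly on Long.MIN_VALUE without overflowing;
        // the positive equivalent is not representable in a long
        System.out.println(v == Long.MIN_VALUE); // true
      }
    }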
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int readFrac(CharArr arr, int lim) throws IOException { nstate = HAS_FRACTION; // deliberate set instead of '|' while(--lim>=0) { int ch = getChar(); if (ch>='0' && ch<='9') { arr.write(ch); } else if (ch=='e' || ch=='E') { arr.write(ch); return readExp(arr,lim); } else { if (ch!=-1) start--; // back up return NUMBER; } } return BIGNUMBER; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int readExp(CharArr arr, int lim) throws IOException { nstate |= HAS_EXPONENT; int ch = getChar(); lim--; if (ch=='+' || ch=='-') { arr.write(ch); ch = getChar(); lim--; } // make sure at least one digit is read. if (ch<'0' || ch>'9') { throw err("missing exponent number"); } arr.write(ch); return readExpDigits(arr,lim); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int readExpDigits(CharArr arr, int lim) throws IOException { while (--lim>=0) { int ch = getChar(); if (ch>='0' && ch<='9') { arr.write(ch); } else { if (ch!=-1) start--; // back up return NUMBER; } } return BIGNUMBER; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void continueNumber(CharArr arr) throws IOException { if (arr != out) arr.write(out); if ((nstate & HAS_EXPONENT)!=0){ readExpDigits(arr, Integer.MAX_VALUE); return; } if (nstate != 0) { readFrac(arr, Integer.MAX_VALUE); return; } for(;;) { int ch = getChar(); if (ch>='0' && ch <='9') { arr.write(ch); } else if (ch=='.') { arr.write(ch); readFrac(arr,Integer.MAX_VALUE); return; } else if (ch=='e' || ch=='E') { arr.write(ch); readExp(arr,Integer.MAX_VALUE); return; } else { if (ch!=-1) start--; return; } } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private char readEscapedChar() throws IOException { switch (getChar()) { case '"' : return '"'; case '\\' : return '\\'; case '/' : return '/'; case 'n' : return '\n'; case 'r' : return '\r'; case 't' : return '\t'; case 'f' : return '\f'; case 'b' : return '\b'; case 'u' : return (char)( (hexval(getChar()) << 12) | (hexval(getChar()) << 8) | (hexval(getChar()) << 4) | (hexval(getChar()))); } throw err("Invalid character escape in string"); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private CharArr readStringChars() throws IOException { char c=0; int i; for (i=start; i<end; i++) { c = buf[i]; if (c=='"') { tmp.set(buf,start,i); // directly use input buffer start=i+1; // advance past last '"' return tmp; } else if (c=='\\') { break; } } out.reset(); readStringChars2(out, i); return out; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void readStringChars2(CharArr arr, int middle) throws IOException { for (;;) { if (middle>=end) { arr.write(buf,start,middle-start); start=middle; getMore(); middle=start; } int ch = buf[middle++]; if (ch=='"') { int len = middle-start-1; if (len>0) arr.write(buf,start,len); start=middle; return; } else if (ch=='\\') { int len = middle-start-1; if (len>0) arr.write(buf,start,len); start=middle; arr.write(readEscapedChar()); middle=start; } } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int next(int ch) throws IOException { for(;;) { switch (ch) { case ' ': case '\t': break; case '\r': case '\n': break; // try and keep track of linecounts? case '"' : valstate = STRING; return STRING; case '{' : push(); state= DID_OBJSTART; return OBJECT_START; case '[': push(); state=DID_ARRSTART; return ARRAY_START; case '0' : out.reset(); //special case '0'? If next char isn't '.' val=0 ch=getChar(); if (ch=='.') { start--; ch='0'; readNumber('0',false); return valstate; } else if (ch>'9' || ch<'0') { out.unsafeWrite('0'); if (ch!=-1) start--; lval = 0; valstate=LONG; return LONG; } else { throw err("Leading zeros not allowed"); } case '1' : case '2' : case '3' : case '4' : case '5' : case '6' : case '7' : case '8' : case '9' : out.reset(); lval = readNumber(ch,false); return valstate; case '-' : out.reset(); out.unsafeWrite('-'); ch = getChar(); if (ch<'0' || ch>'9') throw err("expected digit after '-'"); lval = readNumber(ch,true); return valstate; case 't': valstate=BOOLEAN; // TODO: test performance of this non-branching inline version. // if ((('r'-getChar())|('u'-getChar())|('e'-getChar())) != 0) err(""); expect(JSONUtil.TRUE_CHARS); bool=true; return BOOLEAN; case 'f': valstate=BOOLEAN; expect(JSONUtil.FALSE_CHARS); bool=false; return BOOLEAN; case 'n': valstate=NULL; expect(JSONUtil.NULL_CHARS); return NULL; case -1: if (getLevel()>0) throw err("Premature EOF"); return EOF; default: throw err(null); } ch = getChar(); } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public int nextEvent() throws IOException { if (valstate==STRING) { readStringChars2(devNull,start); } else if (valstate==BIGNUMBER) { continueNumber(devNull); } valstate=0; int ch; // TODO: factor out getCharNWS() to here and check speed switch (state) { case 0: return event = next(getCharNWS()); case DID_OBJSTART: ch = getCharNWS(); if (ch=='}') { pop(); return event = OBJECT_END; } if (ch != '"') { throw err("Expected string"); } state = DID_MEMNAME; valstate = STRING; return event = STRING; case DID_MEMNAME: ch = getCharNWS(); if (ch!=':') { throw err("Expected key,value separator ':'"); } state = DID_MEMVAL; // set state first because it might be pushed... return event = next(getChar()); case DID_MEMVAL: ch = getCharNWS(); if (ch=='}') { pop(); return event = OBJECT_END; } else if (ch!=',') { throw err("Expected ',' or '}'"); } ch = getCharNWS(); if (ch != '"') { throw err("Expected string"); } state = DID_MEMNAME; valstate = STRING; return event = STRING; case DID_ARRSTART: ch = getCharNWS(); if (ch==']') { pop(); return event = ARRAY_END; } state = DID_ARRELEM; // set state first, might be pushed... return event = next(ch); case DID_ARRELEM: ch = getCharNWS(); if (ch==']') { pop(); return event = ARRAY_END; } else if (ch!=',') { throw err("Expected ',' or ']'"); } // state = DID_ARRELEM; return event = next(getChar()); } return 0; }
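nextEvent() drives the parser's state machine, and the typed getters (getString, getLong, getBoolean, getNull) consume the current value. A hedged sketch of using the event API directly:

    import java.io.IOException;
    import org.apache.noggit.JSONParser;

    class EventLoopSketch {
      public static void main(String[] args) throws IOException {
        JSONParser p = new JSONParser("{\"a\":1,\"b\":[true,null]}");
        for (int ev = p.nextEvent(); ev != JSONParser.EOF; ev = p.nextEvent()) {
          switch (ev) {
            case JSONParser.STRING:  System.out.println("string  " + p.getString());  break;
            case JSONParser.LONG:    System.out.println("long    " + p.getLong());    break;
            case JSONParser.BOOLEAN: System.out.println("boolean " + p.getBoolean()); break;
            case JSONParser.NULL:    p.getNull(); System.out.println("null");         break;
            default: break; // OBJECT_START/END, ARRAY_START/END carry no value
          }
        }
      }
    }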
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void goTo(int what) throws IOException { if (valstate==what) { valstate=0; return; } if (valstate==0) { int ev = nextEvent(); // TODO if (valstate!=what) { throw err("type mismatch"); } valstate=0; } else { throw err("type mismatch"); } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public String getString() throws IOException { return getStringChars().toString(); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public CharArr getStringChars() throws IOException { goTo(STRING); return readStringChars(); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public void getString(CharArr output) throws IOException { goTo(STRING); readStringChars2(output,start); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public long getLong() throws IOException { goTo(LONG); return lval; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public double getDouble() throws IOException { return Double.parseDouble(getNumberChars().toString()); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public CharArr getNumberChars() throws IOException { int ev=0; if (valstate==0) ev = nextEvent(); if (valstate == LONG || valstate == NUMBER) { valstate=0; return out; } else if (valstate==BIGNUMBER) { continueNumber(out); valstate=0; return out; } else { throw err("Unexpected " + ev); } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public void getNumberChars(CharArr output) throws IOException { int ev=0; if (valstate==0) ev=nextEvent(); if (valstate == LONG || valstate == NUMBER) output.write(this.out); else if (valstate==BIGNUMBER) { continueNumber(output); } else { throw err("Unexpected " + ev); } valstate=0; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public boolean getBoolean() throws IOException { goTo(BOOLEAN); return bool; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public void getNull() throws IOException { goTo(NULL); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public static Object fromJSON(String json) throws IOException { JSONParser p = new JSONParser(json); return getVal(p); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public static Object getVal(JSONParser parser) throws IOException { return new ObjectBuilder(parser).getVal(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getVal() throws IOException { int ev = parser.lastEvent(); switch(ev) { case JSONParser.STRING: return getString(); case JSONParser.LONG: return getLong(); case JSONParser.NUMBER: return getNumber(); case JSONParser.BIGNUMBER: return getBigNumber(); case JSONParser.BOOLEAN: return getBoolean(); case JSONParser.NULL: return getNull(); case JSONParser.OBJECT_START: return getObject(); case JSONParser.OBJECT_END: return null; // OR ERROR? case JSONParser.ARRAY_START: return getArray(); case JSONParser.ARRAY_END: return null; // OR ERROR? case JSONParser.EOF: return null; // OR ERROR? default: return null; // OR ERROR? } }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getString() throws IOException { return parser.getString(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getLong() throws IOException { return Long.valueOf(parser.getLong()); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getNumber() throws IOException { CharArr num = parser.getNumberChars(); String numstr = num.toString(); double d = Double.parseDouble(numstr); if (!Double.isInfinite(d)) return Double.valueOf(d); // TODO: use more efficient constructor in Java5 return new BigDecimal(numstr); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getBigNumber() throws IOException { CharArr num = parser.getNumberChars(); String numstr = num.toString(); for(int ch; (ch=num.read())!=-1;) { if (ch=='.' || ch=='e' || ch=='E') return new BigDecimal(numstr); } return new BigInteger(numstr); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getBoolean() throws IOException { return parser.getBoolean(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getNull() throws IOException { parser.getNull(); return null; }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object newObject() throws IOException { return new LinkedHashMap(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getKey() throws IOException { return parser.getString(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public void addKeyVal(Object map, Object key, Object val) throws IOException { Object prev = ((Map)map).put(key,val); // TODO: test for repeated value? }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getObject() throws IOException { Object m = newObject(); for(;;) { int ev = parser.nextEvent(); if (ev==JSONParser.OBJECT_END) return objectEnd(m); Object key = getKey(); ev = parser.nextEvent(); Object val = getVal(); addKeyVal(m, key, val); } }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public void addArrayVal(Object arr, Object val) throws IOException { ((List)arr).add(val); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getArray() throws IOException { Object arr = newArray(); for(;;) { int ev = parser.nextEvent(); if (ev==JSONParser.ARRAY_END) return endArray(arr); Object val = getVal(); addArrayVal(arr, val); } }
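End to end, the builder turns JSON objects into LinkedHashMaps (newObject() above) and arrays into Lists. A usage sketch:

    import java.io.IOException;
    import java.util.Map;
    import org.apache.noggit.ObjectBuilder;

    class BuilderSketch {
      public static void main(String[] args) throws IOException {
        Object o = ObjectBuilder.fromJSON("{\"name\":\"solr\",\"ports\":[8983,8984]}");
        Map<?, ?> m = (Map<?, ?>) o; // objects parse to LinkedHashMap
        System.out.println(m.get("name"));  // solr
        System.out.println(m.get("ports")); // [8983, 8984]
      }
    }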
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
void doAdd(SolrContentHandler handler, AddUpdateCommand template) throws IOException { template.solrDoc = handler.newDocument(); processor.processAdd(template); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
void addDoc(SolrContentHandler handler) throws IOException { templateAdd.clear(); doAdd(handler, templateAdd); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
private TikaConfig getDefaultConfig(ClassLoader classLoader) throws MimeTypeException, IOException { return new TikaConfig(classLoader); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { String text = null; try { /* get Solr document */ SolrInputDocument solrInputDocument = cmd.getSolrInputDocument(); /* get the fields to analyze */ String[] texts = getTextsToAnalyze(solrInputDocument); for (int i = 0; i < texts.length; i++) { text = texts[i]; if (text != null && text.length()>0) { /* process the text value */ JCas jcas = processText(text); UIMAToSolrMapper uimaToSolrMapper = new UIMAToSolrMapper(solrInputDocument, jcas); /* get field mapping from config */ Map<String, Map<String, MapField>> typesAndFeaturesFieldsMap = solrUIMAConfiguration .getTypesFeaturesFieldsMapping(); /* map type features on fields */ for (String typeFQN : typesAndFeaturesFieldsMap.keySet()) { uimaToSolrMapper.map(typeFQN, typesAndFeaturesFieldsMap.get(typeFQN)); } } } } catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } } super.processAdd(cmd); }
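The catch block above decides per document whether a failure is survivable: with ignoreErrors set, the error is demoted to a warning and indexing continues; otherwise it is escalated as a SolrException carrying the original cause. The policy in isolation (analyze() and the field are illustrative, not the UIMA API):

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    class IgnoreErrorsSketch {
      private static final Logger log = LoggerFactory.getLogger(IgnoreErrorsSketch.class);
      private final boolean ignoreErrors;

      IgnoreErrorsSketch(boolean ignoreErrors) { this.ignoreErrors = ignoreErrors; }

      void process(String text) {
        try {
          analyze(text); // hypothetical analysis step that may fail
        } catch (Exception e) {
          if (ignoreErrors) {
            // degrade to a warning and keep indexing
            log.warn("skip the text processing due to " + e.getLocalizedMessage());
          } else {
            // escalate, preserving the original exception as the cause
            throw new SolrException(ErrorCode.SERVER_ERROR,
                "processing error: " + e.getLocalizedMessage(), e);
          }
        }
      }
      private void analyze(String text) throws Exception { /* placeholder */ }
    }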
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
public short nextToken() throws IOException { final boolean hasNextToken = wordTokenFilter.incrementToken(); if (hasNextToken) { short flags = 0; final char[] image = term.buffer(); final int length = term.length(); tempCharSequence.reset(image, 0, length); if (length == 1 && image[0] == ',') { // ChineseTokenizer seems to convert all punctuation to ',' // characters flags = ITokenizer.TT_PUNCTUATION; } else if (numeric.matcher(tempCharSequence).matches()) { flags = ITokenizer.TT_NUMERIC; } else { flags = ITokenizer.TT_TERM; } return flags; } return ITokenizer.TT_EOF; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
public void reset(Reader input) throws IOException { try { sentenceTokenizer.reset(input); wordTokenFilter = (TokenStream) tokenFilterClass.getConstructor( TokenStream.class).newInstance(sentenceTokenizer); term = wordTokenFilter.addAttribute(CharTermAttribute.class); } catch (Exception e) { throw ExceptionUtils.wrapAsRuntimeException(e); } }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
@Override public IResource[] getAll(final String resource) { final String resourceName = carrot2ResourcesDir + "/" + resource; log.debug("Looking for Solr resource: " + resourceName); InputStream resourceStream = null; final byte [] asBytes; try { resourceStream = resourceLoader.openResource(resourceName); asBytes = IOUtils.toByteArray(resourceStream); } catch (RuntimeException e) { log.debug("Resource not found in Solr's config: " + resourceName + ". Using the default " + resource + " from Carrot JAR."); return new IResource[] {}; } catch (IOException e) { log.warn("Could not read Solr resource " + resourceName); return new IResource[] {}; } finally { if (resourceStream != null) Closeables.closeQuietly(resourceStream); } log.info("Loaded Solr resource: " + resourceName); final IResource foundResource = new IResource() { @Override public InputStream open() throws IOException { return new ByteArrayInputStream(asBytes); } @Override public int hashCode() { // In case multiple resources are found they will be deduped, but we don't use it in Solr, // so simply rely on instance equivalence. return super.hashCode(); } @Override public boolean equals(Object obj) { // In case multiple resources are found they will be deduped, but we don't use it in Solr, // so simply rely on instance equivalence. return super.equals(obj); } @Override public String toString() { return "Solr config resource: " + resourceName; } }; return new IResource[] { foundResource }; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
@Override public InputStream open() throws IOException { return new ByteArrayInputStream(asBytes); }
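getAll above treats a missing resource as a normal outcome rather than an error: both the RuntimeException from the resource loader and an IOException during reading collapse to an empty result, letting Carrot2 fall back to the defaults bundled in its JAR. Schematically (readResource() is illustrative):

    import java.io.IOException;

    class FallbackSketch {
      byte[][] load(String name) {
        try {
          return new byte[][] { readResource(name) };
        } catch (RuntimeException e) {
          // not found in Solr's config: silently use the library defaults
          return new byte[][] {};
        } catch (IOException e) {
          // found but unreadable: warn-worthy, still non-fatal
          return new byte[][] {};
        }
      }
      private byte[] readResource(String name) throws IOException { return new byte[0]; }
    }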
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
private List<Document> getDocuments(SolrDocumentList solrDocList, Map<SolrDocument, Integer> docIds, Query query, final SolrQueryRequest sreq) throws IOException { SolrHighlighter highlighter = null; SolrParams solrParams = sreq.getParams(); SolrCore core = sreq.getCore(); String urlField = solrParams.get(CarrotParams.URL_FIELD_NAME, "url"); String titleFieldSpec = solrParams.get(CarrotParams.TITLE_FIELD_NAME, "title"); String snippetFieldSpec = solrParams.get(CarrotParams.SNIPPET_FIELD_NAME, titleFieldSpec); String languageField = solrParams.get(CarrotParams.LANGUAGE_FIELD_NAME, null); // Maps Solr field names to Carrot2 custom field names Map<String, String> customFields = getCustomFieldsMap(solrParams); // Parse language code map string into a map Map<String, String> languageCodeMap = Maps.newHashMap(); if (StringUtils.isNotBlank(languageField)) { for (String pair : solrParams.get(CarrotParams.LANGUAGE_CODE_MAP, "") .split("[, ]")) { final String[] split = pair.split(":"); if (split.length == 2 && StringUtils.isNotBlank(split[0]) && StringUtils.isNotBlank(split[1])) { languageCodeMap.put(split[0], split[1]); } else { log.warn("Unsupported format for " + CarrotParams.LANGUAGE_CODE_MAP + ": '" + pair + "'. Skipping this mapping."); } } } // Get the documents boolean produceSummary = solrParams.getBool(CarrotParams.PRODUCE_SUMMARY, false); SolrQueryRequest req = null; String[] snippetFieldAry = null; if (produceSummary) { highlighter = HighlightComponent.getHighlighter(core); if (highlighter != null){ Map<String, Object> args = Maps.newHashMap(); snippetFieldAry = snippetFieldSpec.split("[, ]"); args.put(HighlightParams.FIELDS, snippetFieldAry); args.put(HighlightParams.HIGHLIGHT, "true"); args.put(HighlightParams.SIMPLE_PRE, ""); //we don't care about actually highlighting the area args.put(HighlightParams.SIMPLE_POST, ""); args.put(HighlightParams.FRAGSIZE, solrParams.getInt(CarrotParams.SUMMARY_FRAGSIZE, solrParams.getInt(HighlightParams.FRAGSIZE, 100))); args.put(HighlightParams.SNIPPETS, solrParams.getInt(CarrotParams.SUMMARY_SNIPPETS, solrParams.getInt(HighlightParams.SNIPPETS, 1))); req = new LocalSolrQueryRequest(core, query.toString(), "", 0, 1, args) { @Override public SolrIndexSearcher getSearcher() { return sreq.getSearcher(); } }; } else { log.warn("No highlighter configured, cannot produce summary"); produceSummary = false; } } Iterator<SolrDocument> docsIter = solrDocList.iterator(); List<Document> result = new ArrayList<Document>(solrDocList.size()); float[] scores = {1.0f}; int[] docsHolder = new int[1]; Query theQuery = query; while (docsIter.hasNext()) { SolrDocument sdoc = docsIter.next(); String snippet = null; // TODO: docIds will be null when running distributed search. // See comment in ClusteringComponent#finishStage(). if (produceSummary && docIds != null) { docsHolder[0] = docIds.get(sdoc).intValue(); DocList docAsList = new DocSlice(0, 1, docsHolder, scores, 1, 1.0f); NamedList<Object> highlights = highlighter.doHighlighting(docAsList, theQuery, req, snippetFieldAry); if (highlights != null && highlights.size() == 1) {//should only be one value given our setup //should only be one document @SuppressWarnings("unchecked") NamedList<String []> tmp = (NamedList<String[]>) highlights.getVal(0); final StringBuilder sb = new StringBuilder(); for (int j = 0; j < snippetFieldAry.length; j++) { // Join fragments with a period, so that Carrot2 does not create // cross-fragment phrases; such phrases rarely make sense. String [] highlt = tmp.get(snippetFieldAry[j]); if (highlt != null && highlt.length > 0) { for (int i = 0; i < highlt.length; i++) { sb.append(highlt[i]); sb.append(" . "); } } } snippet = sb.toString(); } } // If summaries not enabled or summary generation failed, use full content. if (snippet == null) { snippet = getConcatenated(sdoc, snippetFieldSpec); } // Create a Carrot2 document Document carrotDocument = new Document(getConcatenated(sdoc, titleFieldSpec), snippet, ObjectUtils.toString(sdoc.getFieldValue(urlField), "")); // Store Solr id of the document, we need it to map document instances // found in clusters back to identifiers. carrotDocument.setField(SOLR_DOCUMENT_ID, sdoc.getFieldValue(idFieldName)); // Set language if (StringUtils.isNotBlank(languageField)) { Collection<Object> languages = sdoc.getFieldValues(languageField); if (languages != null) { // Use the first Carrot2-supported language for (Object l : languages) { String lang = ObjectUtils.toString(l, ""); if (languageCodeMap.containsKey(lang)) { lang = languageCodeMap.get(lang); } // Language detection Library for Java uses dashes to separate // language variants, such as 'zh-cn', but Carrot2 uses underscores. if (lang.indexOf('-') > 0) { lang = lang.replace('-', '_'); } // If the language is supported by Carrot2, we'll get a non-null value final LanguageCode carrot2Language = LanguageCode.forISOCode(lang); if (carrot2Language != null) { carrotDocument.setLanguage(carrot2Language); break; } } } } // Add custom fields if (customFields != null) { for (Entry<String, String> entry : customFields.entrySet()) { carrotDocument.setField(entry.getValue(), sdoc.getFieldValue(entry.getKey())); } } result.add(carrotDocument); } return result; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/ClusteringComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); if (!params.getBool(COMPONENT_NAME, false)) { return; } }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/ClusteringComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); if (!params.getBool(COMPONENT_NAME, false)) { return; } String name = getClusteringEngineName(rb); boolean useResults = params.getBool(ClusteringParams.USE_SEARCH_RESULTS, false); if (useResults == true) { SearchClusteringEngine engine = getSearchClusteringEngine(rb); if (engine != null) { DocListAndSet results = rb.getResults(); Map<SolrDocument,Integer> docIds = new HashMap<SolrDocument, Integer>(results.docList.size()); SolrDocumentList solrDocList = engine.getSolrDocumentList(results.docList, rb.req, docIds); Object clusters = engine.cluster(rb.getQuery(), solrDocList, docIds, rb.req); rb.rsp.add("clusters", clusters); } else { log.warn("No engine for: " + name); } } boolean useCollection = params.getBool(ClusteringParams.USE_COLLECTION, false); if (useCollection == true) { DocumentClusteringEngine engine = documentClusteringEngines.get(name); if (engine != null) { boolean useDocSet = params.getBool(ClusteringParams.USE_DOC_SET, false); NamedList nl = null; //TODO: This likely needs to be made into a background task that runs in an executor if (useDocSet == true) { nl = engine.cluster(rb.getResults().docSet, params); } else { nl = engine.cluster(params); } rb.rsp.add("clusters", nl); } else { log.warn("No engine for " + name); } } }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/SearchClusteringEngine.java
public SolrDocumentList getSolrDocumentList(DocList docList, SolrQueryRequest sreq, Map<SolrDocument, Integer> docIds) throws IOException{ return SolrPluginUtils.docListToSolrDocumentList( docList, sreq.getSearcher(), getFieldsToLoad(sreq), docIds); }
// in contrib/langid/src/java/org/apache/solr/update/processor/LanguageIdentifierUpdateProcessor.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { if (isEnabled()) { process(cmd.getSolrInputDocument()); } else { log.debug("Processor not enabled, not running"); } super.processAdd(cmd); }
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
public static synchronized void loadData() throws IOException, LangDetectException { if (loaded) { return; } loaded = true; List<String> profileData = new ArrayList<String>(); Charset encoding = Charset.forName("UTF-8"); for (String language : languages) { InputStream stream = LangDetectLanguageIdentifierUpdateProcessor.class.getResourceAsStream("langdetect-profiles/" + language); BufferedReader reader = new BufferedReader(new InputStreamReader(stream, encoding)); profileData.add(new String(IOUtils.toCharArray(reader))); reader.close(); } DetectorFactory.loadProfile(profileData); DetectorFactory.setSeed(0); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), true); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
static String getResourceAsString(InputStream in) throws IOException { ByteArrayOutputStream baos = new ByteArrayOutputStream(1024); byte[] buf = new byte[1024]; int sz = 0; try { while ((sz = in.read(buf)) != -1) { baos.write(buf, 0, sz); } } finally { try { in.close(); } catch (Exception e) { } } return new String(baos.toByteArray(), "UTF-8"); }
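The empty catch in the finally block above is one of the 59 "Empty Catch" occurrences counted in the statistics, and here it is deliberate: a failure while closing must not mask an exception already propagating from the read loop. The idiom extracted:

    import java.io.InputStream;

    class CloseQuietlySketch {
      static void closeQuietly(InputStream in) {
        try {
          if (in != null) in.close();
        } catch (Exception ignored) {
          // best-effort close: swallowing here keeps the primary exception visible
        }
      }
    }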
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void parse(XMLStreamReader parser, Handler handler, Map<String, Object> values, Stack<Set<String>> stack, // lists of values to purge boolean recordStarted ) throws IOException, XMLStreamException { Set<String> valuesAddedinThisFrame = null; if (isRecord) { // This Node is a match for an XPATH from a forEach attribute, // prepare for the clean up that will occur when the record // is emitted after its END_ELEMENT is matched recordStarted = true; valuesAddedinThisFrame = new HashSet<String>(); stack.push(valuesAddedinThisFrame); } else if (recordStarted) { // This node is a child of some parent which matched against forEach // attribute. Continue to add values to an existing record. valuesAddedinThisFrame = stack.peek(); } try { /* The input stream has deposited us at this Node in our tree of * interesting nodes. Depending on how this node is of interest, * process further tokens from the input stream and decide what * we do next */ if (attributes != null) { // we are interested in storing attributes from the input stream for (Node node : attributes) { String value = parser.getAttributeValue(null, node.name); if (value != null || (recordStarted && !isRecord)) { putText(values, value, node.fieldName, node.multiValued); valuesAddedinThisFrame.add(node.fieldName); } } } Set<Node> childrenFound = new HashSet<Node>(); int event = -1; int flattenedStarts=0; // our tag depth when flattening elements StringBuilder text = new StringBuilder(); while (true) { event = parser.next(); if (event == END_ELEMENT) { if (flattenedStarts > 0) flattenedStarts--; else { if (hasText && valuesAddedinThisFrame != null) { valuesAddedinThisFrame.add(fieldName); putText(values, text.toString(), fieldName, multiValued); } if (isRecord) handler.handle(getDeepCopy(values), forEachPath); if (childNodes != null && recordStarted && !isRecord && !childrenFound.containsAll(childNodes)) { // non-record nodes where we have not collected text for ALL // the child nodes. for (Node n : childNodes) { // For the multi-valued child nodes where we could have, but // didn't, collect text. Push a null string into values. if (!childrenFound.contains(n)) n.putNulls(values); } } return; } } else if (hasText && (event==CDATA || event==CHARACTERS || event==SPACE)) { text.append(parser.getText()); } else if (event == START_ELEMENT) { if ( flatten ) flattenedStarts++; else handleStartElement(parser, childrenFound, handler, values, stack, recordStarted); } // END_DOCUMENT is least likely to appear and should be // last in if-then-else skip chain else if (event == END_DOCUMENT) return; } }finally { if ((isRecord || !recordStarted) && !stack.empty()) { Set<String> cleanThis = stack.pop(); if (cleanThis != null) { for (String fld : cleanThis) values.remove(fld); } } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void handleStartElement(XMLStreamReader parser, Set<Node> childrenFound, Handler handler, Map<String, Object> values, Stack<Set<String>> stack, boolean recordStarted) throws IOException, XMLStreamException { Node n = getMatchingNode(parser,childNodes); Map<String, Object> decends=new HashMap<String, Object>(); if (n != null) { childrenFound.add(n); n.parse(parser, handler, values, stack, recordStarted); return; } // The stream has diverged from the tree of interesting elements, but // are there any wildCardNodes ... anywhere in our path from the root? Node dn = this; // checking our Node first! do { if (dn.wildCardNodes != null) { // Check to see if the stream's tag matches one of the "//" all // descendants type expressions for this node. n = getMatchingNode(parser, dn.wildCardNodes); if (n != null) { childrenFound.add(n); n.parse(parser, handler, values, stack, recordStarted); break; } // add the list of this node's wild descendants to the cache for (Node nn : dn.wildCardNodes) decends.put(nn.name, nn); } dn = dn.wildAncestor; // leap back along the tree toward root } while (dn != null) ; if (n == null) { // we have a START_ELEMENT which is not within the tree of // interesting nodes. Skip over the contents of this element // but recursively repeat the above for any START_ELEMENTs // found within this element. int count = 1; // we have had our first START_ELEMENT while (count != 0) { int token = parser.next(); if (token == START_ELEMENT) { Node nn = (Node) decends.get(parser.getLocalName()); if (nn != null) { // We have a //Node which matches the stream's parser.localName childrenFound.add(nn); // Parse the contents of this stream element nn.parse(parser, handler, values, stack, recordStarted); } else count++; } else if (token == END_ELEMENT) count--; } } }
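parse(...) pairs the push of a record frame with a pop in finally, so even when handler code or the stream throws mid-record, field values recorded for the aborted record are removed before the next record starts. The protocol reduced to its essentials (parseChildren() is a placeholder):

    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;
    import java.util.Stack;

    class FrameCleanupSketch {
      private final Stack<Set<String>> stack = new Stack<Set<String>>();
      private final Map<String, Object> values = new HashMap<String, Object>();

      void parseRecord() {
        stack.push(new HashSet<String>()); // fields added while this record is open
        try {
          parseChildren(); // hypothetical; may throw mid-record
        } finally {
          // always unwind the frame so values cannot leak into the next record
          for (String fld : stack.pop()) values.remove(fld);
        }
      }
      private void parseChildren() { /* placeholder */ }
    }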
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
  VelocityEngine engine = getEngine(request); // TODO: have HTTP headers available for configuring engine
  Template template = getTemplate(engine, request);
  VelocityContext context = new VelocityContext();
  context.put("request", request);
  // Turn the SolrQueryResponse into a SolrResponse.
  // QueryResponse has lots of conveniences suitable for a view.
  // Problem is, which SolrResponse class to use?
  // One patch to SOLR-620 solved this by passing in a class name as
  // a parameter and using reflection and Solr's class loader to
  // create a new instance. But for now the implementation simply
  // uses QueryResponse, and if it chokes in a known way, falls back
  // to bare-bones SolrResponseBase.
  // TODO: Can this writer know what the handler class is? With echoHandler=true it can get its string name at least
  SolrResponse rsp = new QueryResponse();
  NamedList<Object> parsedResponse = BinaryResponseWriter.getParsedResponse(request, response);
  try {
    rsp.setResponse(parsedResponse);
    // page only injected if QueryResponse works
    context.put("page", new PageTool(request, response)); // page tool only makes sense for a SearchHandler request... *sigh*
  } catch (ClassCastException e) {
    // known edge case where QueryResponse's extraction assumes "response" is a SolrDocumentList
    // (AnalysisRequestHandler emits a "response")
    e.printStackTrace();
    rsp = new SolrResponseBase();
    rsp.setResponse(parsedResponse);
  }
  context.put("response", rsp);
  // Velocity context tools - TODO: make these pluggable
  context.put("esc", new EscapeTool());
  context.put("date", new ComparisonDateTool());
  context.put("list", new ListTool());
  context.put("math", new MathTool());
  context.put("number", new NumberTool());
  context.put("sort", new SortTool());
  context.put("engine", engine); // for $engine.resourceExists(...)
  String layout_template = request.getParams().get("v.layout");
  String json_wrapper = request.getParams().get("v.json");
  boolean wrap_response = (layout_template != null) || (json_wrapper != null);
  // create output, optionally wrap it into a JSON object
  if (wrap_response) {
    StringWriter stringWriter = new StringWriter();
    template.merge(context, stringWriter);
    if (layout_template != null) {
      context.put("content", stringWriter.toString());
      stringWriter = new StringWriter();
      try {
        engine.getTemplate(layout_template + ".vm").merge(context, stringWriter);
      } catch (Exception e) {
        throw new IOException(e.getMessage());
      }
    }
    if (json_wrapper != null) {
      writer.write(request.getParams().get("v.json") + "(");
      writer.write(getJSONWrap(stringWriter.toString()));
      writer.write(')');
    } else {
      // using a layout, but not JSON wrapping
      writer.write(stringWriter.toString());
    }
  } else {
    template.merge(context, writer);
  }
}
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
private Template getTemplate(VelocityEngine engine, SolrQueryRequest request) throws IOException {
  Template template;
  String template_name = request.getParams().get("v.template");
  String qt = request.getParams().get("qt");
  String path = (String) request.getContext().get("path");
  if (template_name == null && path != null) {
    template_name = path;
  } // TODO: path is never null, so qt won't get picked up; maybe special-case '/select' to use qt, otherwise use path?
  if (template_name == null && qt != null) {
    template_name = qt;
  }
  if (template_name == null) template_name = "index";
  try {
    template = engine.getTemplate(template_name + ".vm");
  } catch (Exception e) {
    throw new IOException(e.getMessage());
  }
  return template;
}
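Both Velocity snippets above rethrow as throw new IOException(e.getMessage()), which discards the original exception and its stack trace. A minimal alternative for the try block just above, assuming nothing beyond the JDK's chained-exception constructors (available since Java 6), keeps the Velocity failure attached as the cause:

  try {
    template = engine.getTemplate(template_name + ".vm");
  } catch (Exception e) {
    // wrap rather than flatten: the underlying failure stays visible as the cause
    throw new IOException("Unable to load template '" + template_name + ".vm'", e);
  }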
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
DocumentAnalysisRequest resolveAnalysisRequest(SolrQueryRequest req) throws IOException, XMLStreamException {
  DocumentAnalysisRequest request = new DocumentAnalysisRequest();
  SolrParams params = req.getParams();
  String query = params.get(AnalysisParams.QUERY, params.get(CommonParams.Q, null));
  request.setQuery(query);
  boolean showMatch = params.getBool(AnalysisParams.SHOW_MATCH, false);
  request.setShowMatch(showMatch);
  ContentStream stream = extractSingleContentStream(req);
  InputStream is = null;
  XMLStreamReader parser = null;
  try {
    is = stream.getStream();
    final String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType());
    parser = (charset == null) ?
        inputFactory.createXMLStreamReader(is) : inputFactory.createXMLStreamReader(is, charset);
    while (true) {
      int event = parser.next();
      switch (event) {
        case XMLStreamConstants.END_DOCUMENT: {
          parser.close();
          return request;
        }
        case XMLStreamConstants.START_ELEMENT: {
          String currTag = parser.getLocalName();
          if ("doc".equals(currTag)) {
            log.trace("Reading doc...");
            SolrInputDocument document = readDocument(parser, req.getSchema());
            request.addDocument(document);
          }
          break;
        }
      }
    }
  } finally {
    if (parser != null) parser.close();
    IOUtils.closeQuietly(is);
  }
}
// in core/src/java/org/apache/solr/handler/DumpRequestHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException {
  // Show params
  rsp.add("params", req.getParams().toNamedList());
  // Write the streams...
  if (req.getContentStreams() != null) {
    ArrayList<NamedList<Object>> streams = new ArrayList<NamedList<Object>>();
    // Cycle through each stream
    for (ContentStream content : req.getContentStreams()) {
      NamedList<Object> stream = new SimpleOrderedMap<Object>();
      stream.add("name", content.getName());
      stream.add("sourceInfo", content.getSourceInfo());
      stream.add("size", content.getSize());
      stream.add("contentType", content.getContentType());
      Reader reader = content.getReader();
      try {
        stream.add("stream", IOUtils.toString(reader));
      } finally {
        reader.close();
      }
      streams.add(stream);
    }
    rsp.add("streams", streams);
  }
  rsp.add("context", req.getContext());
}
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
@Override
public boolean incrementToken() throws IOException {
  if (tokenIterator.hasNext()) {
    clearAttributes();
    AttributeSource next = tokenIterator.next();
    Iterator<Class<? extends Attribute>> atts = next.getAttributeClassesIterator();
    while (atts.hasNext()) // make sure all att impls in the token exist here
      addAttribute(atts.next());
    next.copyTo(this);
    return true;
  } else {
    return false;
  }
}
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
@Override
public void reset() throws IOException {
  super.reset();
  tokenIterator = tokens.iterator();
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
NamedList getCommandResponse(NamedList<String> commands) throws IOException {
  HttpPost post = new HttpPost(masterUrl);
  List<BasicNameValuePair> formparams = new ArrayList<BasicNameValuePair>();
  formparams.add(new BasicNameValuePair("wt", "javabin"));
  for (Map.Entry<String, String> c : commands) {
    formparams.add(new BasicNameValuePair(c.getKey(), c.getValue()));
  }
  UrlEncodedFormEntity entity = new UrlEncodedFormEntity(formparams, "UTF-8");
  post.setEntity(entity);
  return getNamedListResponse(post);
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private NamedList<?> getNamedListResponse(HttpPost method) throws IOException {
  InputStream input = null;
  NamedList<?> result = null;
  try {
    HttpResponse response = myHttpClient.execute(method);
    int status = response.getStatusLine().getStatusCode();
    if (status != HttpStatus.SC_OK) {
      throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE,
          "Request failed for the url " + method);
    }
    input = response.getEntity().getContent();
    result = (NamedList<?>) new JavaBinCodec().unmarshal(input);
  } finally {
    try {
      if (input != null) {
        input.close();
      }
    } catch (Exception e) {
      // deliberately swallowed: a close failure should not mask the result
    }
  }
  return result;
}
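The empty catch in the finally block above is the quiet-close idiom: a failure while closing the stream must not mask the primary result or a primary exception. A hypothetical helper distilling the same idiom, using only java.io:

  import java.io.Closeable;

  // null-safe quiet close, equivalent to the guarded close above
  static void closeQuietly(Closeable c) {
    if (c == null) return;
    try {
      c.close();
    } catch (Exception e) {
      // deliberately ignored: closing is best-effort cleanup
    }
  }

commons-io's IOUtils.closeQuietly, already used in resolveAnalysisRequest above, provides the same behavior.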
// in core/src/java/org/apache/solr/handler/SnapPuller.java
void fetchFileList(long gen) throws IOException {
  HttpPost post = new HttpPost(masterUrl);
  List<BasicNameValuePair> formparams = new ArrayList<BasicNameValuePair>();
  formparams.add(new BasicNameValuePair("wt", "javabin"));
  formparams.add(new BasicNameValuePair(COMMAND, CMD_GET_FILE_LIST));
  formparams.add(new BasicNameValuePair(GENERATION, String.valueOf(gen)));
  UrlEncodedFormEntity entity = new UrlEncodedFormEntity(formparams, "UTF-8");
  post.setEntity(entity);
  @SuppressWarnings("unchecked")
  NamedList<List<Map<String, Object>>> nl =
      (NamedList<List<Map<String, Object>>>) getNamedListResponse(post);
  List<Map<String, Object>> f = nl.get(CMD_GET_FILE_LIST);
  if (f != null) filesToDownload = Collections.synchronizedList(f);
  else {
    filesToDownload = Collections.emptyList();
    LOG.error("No files to download for index generation: " + gen);
  }
  f = nl.get(CONF_FILES);
  if (f != null) confFilesToDownload = Collections.synchronizedList(f);
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
boolean fetchLatestIndex(SolrCore core, boolean force) throws IOException, InterruptedException {
  successfulInstall = false;
  replicationStartTime = System.currentTimeMillis();
  try {
    // get the current 'replicateable' index version on the master
    NamedList response = null;
    try {
      response = getLatestVersion();
    } catch (Exception e) {
      LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage());
      return false;
    }
    long latestVersion = (Long) response.get(CMD_INDEX_VERSION);
    long latestGeneration = (Long) response.get(GENERATION);
    IndexCommit commit;
    RefCounted<SolrIndexSearcher> searcherRefCounted = null;
    try {
      searcherRefCounted = core.getNewestSearcher(false);
      if (searcherRefCounted == null) {
        SolrException.log(LOG, "No open searcher found - fetch aborted");
        return false;
      }
      commit = searcherRefCounted.get().getIndexReader().getIndexCommit();
    } finally {
      if (searcherRefCounted != null) searcherRefCounted.decref();
    }
    if (latestVersion == 0L) {
      if (force && commit.getGeneration() != 0) {
        // since we won't get the files for an empty index,
        // we just clear ours and commit
        core.getUpdateHandler().getSolrCoreState().getIndexWriter(core).deleteAll();
        SolrQueryRequest req = new LocalSolrQueryRequest(core, new ModifiableSolrParams());
        core.getUpdateHandler().commit(new CommitUpdateCommand(req, false));
      }
      // there is nothing to be replicated
      successfulInstall = true;
      return true;
    }
    if (!force && IndexDeletionPolicyWrapper.getCommitTimestamp(commit) == latestVersion) {
      // master and slave are already in sync, just return
      LOG.info("Slave in sync with master.");
      successfulInstall = true;
      return true;
    }
    LOG.info("Master's generation: " + latestGeneration);
    LOG.info("Slave's generation: " + commit.getGeneration());
    LOG.info("Starting replication process");
    // get the list of files first
    fetchFileList(latestGeneration);
    // this can happen if the commit point is deleted before we fetch the file list.
    if (filesToDownload.isEmpty()) return false;
    LOG.info("Number of files in latest index in master: " + filesToDownload.size());
    // Create the sync service
    fsyncService = Executors.newSingleThreadExecutor();
    // use a synchronized list because the list is read by other threads (to show details)
    filesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>());
    // if the generation of the master is older than that of the slave, the two are not
    // compatible for copying; a new index directory is created and all files are copied
    boolean isFullCopyNeeded = IndexDeletionPolicyWrapper.getCommitTimestamp(commit) >= latestVersion || force;
    File tmpIndexDir = createTempindexDir(core);
    if (isIndexStale()) isFullCopyNeeded = true;
    successfulInstall = false;
    boolean deleteTmpIdxDir = true;
    File indexDir = null;
    try {
      indexDir = new File(core.getIndexDir());
      downloadIndexFiles(isFullCopyNeeded, tmpIndexDir, latestGeneration);
      LOG.info("Total time taken for download : " + ((System.currentTimeMillis() - replicationStartTime) / 1000) + " secs");
      Collection<Map<String, Object>> modifiedConfFiles = getModifiedConfFiles(confFilesToDownload);
      if (!modifiedConfFiles.isEmpty()) {
        downloadConfFiles(confFilesToDownload, latestGeneration);
        if (isFullCopyNeeded) {
          successfulInstall = modifyIndexProps(tmpIndexDir.getName());
          deleteTmpIdxDir = false;
        } else {
          successfulInstall = copyIndexFiles(tmpIndexDir, indexDir);
        }
        if (successfulInstall) {
          LOG.info("Configuration files are modified, core will be reloaded");
          logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall); // write to a file the time of replication and conf files
          reloadCore();
        }
      } else {
        terminateAndWaitFsyncService();
        if (isFullCopyNeeded) {
          successfulInstall = modifyIndexProps(tmpIndexDir.getName());
          deleteTmpIdxDir = false;
        } else {
          successfulInstall = copyIndexFiles(tmpIndexDir, indexDir);
        }
        if (successfulInstall) {
          logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall);
          doCommit();
        }
      }
      replicationStartTime = 0;
      return successfulInstall;
    } catch (ReplicationHandlerException e) {
      LOG.error("User aborted Replication");
      return false;
    } catch (SolrException e) {
      throw e;
    } catch (InterruptedException e) {
      throw new InterruptedException("Index fetch interrupted");
    } catch (Exception e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e);
    } finally {
      if (deleteTmpIdxDir) delTree(tmpIndexDir);
      else delTree(indexDir);
    }
  } finally {
    if (!successfulInstall) {
      logReplicationTimeAndConfFiles(null, successfulInstall);
    }
    filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null;
    replicationStartTime = 0;
    fileFetcher = null;
    if (fsyncService != null && !fsyncService.isShutdown()) fsyncService.shutdownNow();
    fsyncService = null;
    stop = false;
    fsyncException = null;
  }
}
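The catch ladder at the end of fetchLatestIndex is this sheet's clearest example of exception classification: an expected abort becomes a false return, an already-classified SolrException passes through untouched, interruption is re-signalled, and anything else is wrapped into a SERVER_ERROR SolrException. A stripped-down sketch of that shape (class and method names below are placeholders, not the real SnapPuller API):

  import org.apache.solr.common.SolrException;

  class FetchGuard {
    boolean fetch() throws java.io.IOException, InterruptedException {
      try {
        return doFetch(); // stand-in for the download/copy work above
      } catch (ReplicationAbort e) {
        return false;                       // expected abort: report failure, no rethrow
      } catch (SolrException e) {
        throw e;                            // already carries an ErrorCode: pass through
      } catch (InterruptedException e) {
        throw new InterruptedException("Index fetch interrupted"); // re-signal
      } catch (Exception e) {
        // everything else becomes the domain runtime exception (HTTP 500)
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e);
      }
    }
    private boolean doFetch() throws Exception { return true; } // stub
    static class ReplicationAbort extends RuntimeException {}   // stands in for ReplicationHandlerException
  }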
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void doCommit() throws IOException {
  SolrQueryRequest req = new LocalSolrQueryRequest(solrCore, new ModifiableSolrParams());
  // reboot the writer on the new index and get a new searcher
  solrCore.getUpdateHandler().newIndexWriter();
  try {
    // first try to open an NRT searcher so that the new
    // IndexWriter is registered with the reader
    Future[] waitSearcher = new Future[1];
    solrCore.getSearcher(true, false, waitSearcher, true);
    if (waitSearcher[0] != null) {
      try {
        waitSearcher[0].get();
      } catch (InterruptedException e) {
        SolrException.log(LOG, e);
      } catch (ExecutionException e) {
        SolrException.log(LOG, e);
      }
    }
    // update our commit point to the right dir
    solrCore.getUpdateHandler().commit(new CommitUpdateCommand(req, false));
  } finally {
    req.close();
  }
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private boolean copyIndexFiles(File tmpIdxDir, File indexDir) throws IOException {
  String segmentsFile = null;
  List<String> copiedfiles = new ArrayList<String>();
  for (Map<String, Object> f : filesDownloaded) {
    String fname = (String) f.get(NAME);
    // The segments file must be copied last; otherwise, if the copy
    // fails partway through, the index ends up corrupted.
    if (fname.startsWith("segments_")) {
      segmentsFile = fname;
      continue;
    }
    if (!copyAFile(tmpIdxDir, indexDir, fname, copiedfiles)) return false;
    copiedfiles.add(fname);
  }
  // copy the segments file last
  if (segmentsFile != null) {
    if (!copyAFile(tmpIdxDir, indexDir, segmentsFile, copiedfiles)) return false;
  }
  return true;
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void copyTmpConfFiles2Conf(File tmpconfDir) throws IOException {
  File confDir = new File(solrCore.getResourceLoader().getConfigDir());
  for (File file : tmpconfDir.listFiles()) {
    File oldFile = new File(confDir, file.getName());
    if (oldFile.exists()) {
      File backupFile = new File(confDir, oldFile.getName() + "." + getDateAsStr(new Date(oldFile.lastModified())));
      boolean status = oldFile.renameTo(backupFile);
      if (!status) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "Unable to rename: " + oldFile + " to: " + backupFile);
      }
    }
    boolean status = file.renameTo(oldFile);
    if (!status) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Unable to rename: " + file + " to: " + oldFile);
    }
  }
}
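copyTmpConfFiles2Conf turns the boolean result of File.renameTo into a thrown SolrException, converting a silent failure mode into a SERVER_ERROR. The same check, extracted into a hypothetical helper:

  import java.io.File;
  import org.apache.solr.common.SolrException;

  // hypothetical: rename, or fail loudly instead of returning false silently
  static void renameOrThrow(File from, File to) {
    if (!from.renameTo(to)) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Unable to rename: " + from + " to: " + to);
    }
  }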
// in core/src/java/org/apache/solr/handler/SnapPuller.java
FastInputStream getStream() throws IOException {
  post = new HttpPost(masterUrl);
  // the method is command=filecontent
  List<BasicNameValuePair> formparams = new ArrayList<BasicNameValuePair>();
  formparams.add(new BasicNameValuePair(COMMAND, CMD_GET_FILE));
  // add the generation to download. This is used to reserve the download
  formparams.add(new BasicNameValuePair(GENERATION, indexGen.toString()));
  if (isConf) {
    // set cf instead of file for a config file
    formparams.add(new BasicNameValuePair(CONF_FILE_SHORT, fileName));
  } else {
    formparams.add(new BasicNameValuePair(FILE, fileName));
  }
  if (useInternal) {
    formparams.add(new BasicNameValuePair(COMPRESSION, "true"));
  }
  if (useExternal) {
    formparams.add(new BasicNameValuePair("Accept-Encoding", "gzip,deflate"));
  }
  // use checksum
  if (this.includeChecksum) formparams.add(new BasicNameValuePair(CHECKSUM, "true"));
  // wt=filestream, this is a custom protocol
  formparams.add(new BasicNameValuePair("wt", FILE_STREAM));
  // This happens when a failure triggered a retry: the offset=<sizedownloaded>
  // ensures that the server resumes from that offset
  if (bytesDownloaded > 0) {
    formparams.add(new BasicNameValuePair(OFFSET, "" + bytesDownloaded));
  }
  UrlEncodedFormEntity entity = new UrlEncodedFormEntity(formparams, "UTF-8");
  post.setEntity(entity);
  HttpResponse response = myHttpClient.execute(post);
  InputStream is = response.getEntity().getContent();
  // wrap it using FastInputStream
  if (useInternal) {
    is = new InflaterInputStream(is);
  } else if (useExternal) {
    is = checkCompressed(post, is);
  }
  return new FastInputStream(is);
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private InputStream checkCompressed(AbstractHttpMessage method, InputStream respBody) throws IOException {
  Header contentEncodingHeader = method.getFirstHeader("Content-Encoding");
  if (contentEncodingHeader != null) {
    String contentEncoding = contentEncodingHeader.getValue();
    if (contentEncoding.contains("gzip")) {
      respBody = new GZIPInputStream(respBody);
    } else if (contentEncoding.contains("deflate")) {
      respBody = new InflaterInputStream(respBody);
    }
  } else {
    Header contentTypeHeader = method.getFirstHeader("Content-Type");
    if (contentTypeHeader != null) {
      String contentType = contentTypeHeader.getValue();
      if (contentType != null) {
        if (contentType.startsWith("application/x-gzip-compressed")) {
          respBody = new GZIPInputStream(respBody);
        } else if (contentType.startsWith("application/x-deflate")) {
          respBody = new InflaterInputStream(respBody);
        }
      }
    }
  }
  return respBody;
}
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
Transformer getTransformer(String xslt, SolrQueryRequest request) throws IOException {
  // not the cleanest way to achieve this
  // no need to synchronize access to context, right?
  // Nothing else happens with it at the same time
  final Map<Object, Object> ctx = request.getContext();
  Transformer result = (Transformer) ctx.get(CONTEXT_TRANSFORMER_KEY);
  if (result == null) {
    SolrConfig solrConfig = request.getCore().getSolrConfig();
    result = TransformerProvider.instance.getTransformer(solrConfig, xslt, xsltCacheLifetimeSeconds);
    result.setErrorListener(xmllog);
    ctx.put(CONTEXT_TRANSFORMER_KEY, result);
  }
  return result;
}
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processUpdate(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser)
        throws XMLStreamException, IOException, FactoryConfigurationError,
               InstantiationException, IllegalAccessException, TransformerConfigurationException {
  AddUpdateCommand addCmd = null;
  SolrParams params = req.getParams();
  while (true) {
    int event = parser.next();
    switch (event) {
      case XMLStreamConstants.END_DOCUMENT:
        parser.close();
        return;

      case XMLStreamConstants.START_ELEMENT:
        String currTag = parser.getLocalName();
        if (currTag.equals(UpdateRequestHandler.ADD)) {
          log.trace("SolrCore.update(add)");
          addCmd = new AddUpdateCommand(req);
          // First look for commitWithin parameter on the request; it will be overwritten for individual <add>'s
          addCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1);
          addCmd.overwrite = params.getBool(UpdateParams.OVERWRITE, true);
          for (int i = 0; i < parser.getAttributeCount(); i++) {
            String attrName = parser.getAttributeLocalName(i);
            String attrVal = parser.getAttributeValue(i);
            if (UpdateRequestHandler.OVERWRITE.equals(attrName)) {
              addCmd.overwrite = StrUtils.parseBoolean(attrVal);
            } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) {
              addCmd.commitWithin = Integer.parseInt(attrVal);
            } else {
              log.warn("Unknown attribute id in add:" + attrName);
            }
          }
        } else if ("doc".equals(currTag)) {
          if (addCmd != null) {
            log.trace("adding doc...");
            addCmd.clear();
            addCmd.solrDoc = readDoc(parser);
            processor.processAdd(addCmd);
          } else {
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                "Unexpected <doc> tag without an <add> tag surrounding it.");
          }
        } else if (UpdateRequestHandler.COMMIT.equals(currTag) || UpdateRequestHandler.OPTIMIZE.equals(currTag)) {
          log.trace("parsing " + currTag);
          CommitUpdateCommand cmd = new CommitUpdateCommand(req, UpdateRequestHandler.OPTIMIZE.equals(currTag));
          ModifiableSolrParams mp = new ModifiableSolrParams();
          for (int i = 0; i < parser.getAttributeCount(); i++) {
            String attrName = parser.getAttributeLocalName(i);
            String attrVal = parser.getAttributeValue(i);
            mp.set(attrName, attrVal);
          }
          RequestHandlerUtils.validateCommitParams(mp);
          SolrParams p = SolrParams.wrapDefaults(mp, req.getParams()); // default to the normal request params for commit options
          RequestHandlerUtils.updateCommit(cmd, p);
          processor.processCommit(cmd);
        } // end commit
        else if (UpdateRequestHandler.ROLLBACK.equals(currTag)) {
          log.trace("parsing " + currTag);
          RollbackUpdateCommand cmd = new RollbackUpdateCommand(req);
          processor.processRollback(cmd);
        } // end rollback
        else if (UpdateRequestHandler.DELETE.equals(currTag)) {
          log.trace("parsing delete");
          processDelete(req, processor, parser);
        } // end delete
        break;
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processDelete(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser)
        throws XMLStreamException, IOException {
  // Parse the command
  DeleteUpdateCommand deleteCmd = new DeleteUpdateCommand(req);
  // First look for commitWithin parameter on the request; it will be overwritten for individual <delete>'s
  SolrParams params = req.getParams();
  deleteCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1);
  for (int i = 0; i < parser.getAttributeCount(); i++) {
    String attrName = parser.getAttributeLocalName(i);
    String attrVal = parser.getAttributeValue(i);
    if ("fromPending".equals(attrName)) {
      // deprecated
    } else if ("fromCommitted".equals(attrName)) {
      // deprecated
    } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) {
      deleteCmd.commitWithin = Integer.parseInt(attrVal);
    } else {
      log.warn("unexpected attribute delete/@" + attrName);
    }
  }
  StringBuilder text = new StringBuilder();
  while (true) {
    int event = parser.next();
    switch (event) {
      case XMLStreamConstants.START_ELEMENT:
        String mode = parser.getLocalName();
        if (!("id".equals(mode) || "query".equals(mode))) {
          log.warn("unexpected XML tag /delete/" + mode);
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + mode);
        }
        text.setLength(0);
        if ("id".equals(mode)) {
          for (int i = 0; i < parser.getAttributeCount(); i++) {
            String attrName = parser.getAttributeLocalName(i);
            String attrVal = parser.getAttributeValue(i);
            if (UpdateRequestHandler.VERSION.equals(attrName)) {
              deleteCmd.setVersion(Long.parseLong(attrVal));
            }
          }
        }
        break;

      case XMLStreamConstants.END_ELEMENT:
        String currTag = parser.getLocalName();
        if ("id".equals(currTag)) {
          deleteCmd.setId(text.toString());
        } else if ("query".equals(currTag)) {
          deleteCmd.setQuery(text.toString());
        } else if ("delete".equals(currTag)) {
          return;
        } else {
          log.warn("unexpected XML tag /delete/" + currTag);
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + currTag);
        }
        processor.processDelete(deleteCmd);
        deleteCmd.clear();
        break;

      // Add everything to the text
      case XMLStreamConstants.SPACE:
      case XMLStreamConstants.CDATA:
      case XMLStreamConstants.CHARACTERS:
        text.append(parser.getText());
        break;
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
private void parseAndLoadDocs(final SolrQueryRequest req, SolrQueryResponse rsp, InputStream stream,
                              final UpdateRequestProcessor processor) throws IOException {
  UpdateRequest update = null;
  JavaBinUpdateRequestCodec.StreamingUpdateHandler handler = new JavaBinUpdateRequestCodec.StreamingUpdateHandler() {
    private AddUpdateCommand addCmd = null;

    @Override
    public void update(SolrInputDocument document, UpdateRequest updateRequest) {
      if (document == null) {
        // Perhaps commit from the parameters
        try {
          RequestHandlerUtils.handleCommit(req, processor, updateRequest.getParams(), false);
          RequestHandlerUtils.handleRollback(req, processor, updateRequest.getParams(), false);
        } catch (IOException e) {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback");
        }
        return;
      }
      if (addCmd == null) {
        addCmd = getAddCommand(req, updateRequest.getParams());
      }
      addCmd.solrDoc = document;
      try {
        processor.processAdd(addCmd);
        addCmd.clear();
      } catch (IOException e) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document);
      }
    }
  };
  FastInputStream in = FastInputStream.wrap(stream);
  for (; ; ) {
    try {
      update = new JavaBinUpdateRequestCodec().unmarshal(in, handler);
    } catch (EOFException e) {
      break; // this is expected
    } catch (Exception e) {
      log.error("Exception while processing update request", e);
      break;
    }
    if (update.getDeleteById() != null || update.getDeleteQuery() != null) {
      delete(req, update, processor);
    }
  }
}
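The anonymous StreamingUpdateHandler above converts checked IOExceptions into unchecked SolrExceptions because the callback's update signature cannot declare them. A minimal sketch of that checked-to-unchecked conversion (class and method names are illustrative only); note the original snippet drops the cause, whereas passing e as a third constructor argument preserves it:

  import java.io.IOException;
  import org.apache.solr.common.SolrException;

  abstract class DocCallback {
    abstract void process(Object doc) throws IOException; // the real, checked work

    final void update(Object doc) { // callback signature: no checked exceptions allowed
      try {
        process(doc);
      } catch (IOException e) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "ERROR adding document " + doc, e); // keep the cause attached
      }
    }
  }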
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
private void delete(SolrQueryRequest req, UpdateRequest update, UpdateRequestProcessor processor) throws IOException {
  SolrParams params = update.getParams();
  DeleteUpdateCommand delcmd = new DeleteUpdateCommand(req);
  if (params != null) {
    delcmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1);
  }
  if (update.getDeleteById() != null) {
    for (String s : update.getDeleteById()) {
      delcmd.id = s;
      processor.processDelete(delcmd);
    }
    delcmd.id = null;
  }
  if (update.getDeleteQuery() != null) {
    for (String s : update.getDeleteQuery()) {
      delcmd.query = s;
      processor.processDelete(delcmd);
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
DeleteUpdateCommand parseDelete() throws IOException {
  assertNextEvent(JSONParser.OBJECT_START);
  DeleteUpdateCommand cmd = new DeleteUpdateCommand(req);
  cmd.commitWithin = commitWithin;
  while (true) {
    int ev = parser.nextEvent();
    if (ev == JSONParser.STRING) {
      String key = parser.getString();
      if (parser.wasKey()) {
        if ("id".equals(key)) {
          cmd.setId(parser.getString());
        } else if ("query".equals(key)) {
          cmd.setQuery(parser.getString());
        } else if ("commitWithin".equals(key)) {
          cmd.commitWithin = Integer.parseInt(parser.getString());
        } else {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Unknown key: " + key + " [" + parser.getPosition() + "]");
        }
      } else {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "invalid string: " + key + " at [" + parser.getPosition() + "]");
      }
    } else if (ev == JSONParser.OBJECT_END) {
      if (cmd.getId() == null && cmd.getQuery() == null) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Missing id or query for delete [" + parser.getPosition() + "]");
      }
      return cmd;
    } else {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Got: " + JSONParser.getEventString(ev) + " at [" + parser.getPosition() + "]");
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
RollbackUpdateCommand parseRollback() throws IOException {
  assertNextEvent(JSONParser.OBJECT_START);
  assertNextEvent(JSONParser.OBJECT_END);
  return new RollbackUpdateCommand(req);
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
void parseCommitOptions(CommitUpdateCommand cmd) throws IOException {
  assertNextEvent(JSONParser.OBJECT_START);
  final Map<String, Object> map = (Map) ObjectBuilder.getVal(parser);

  // SolrParams currently expects string values...
  SolrParams p = new SolrParams() {
    @Override
    public String get(String param) {
      Object o = map.get(param);
      return o == null ? null : o.toString();
    }

    @Override
    public String[] getParams(String param) {
      return new String[]{get(param)};
    }

    @Override
    public Iterator<String> getParameterNamesIterator() {
      return map.keySet().iterator();
    }
  };

  RequestHandlerUtils.validateCommitParams(p);
  p = SolrParams.wrapDefaults(p, req.getParams()); // default to the normal request params for commit options
  RequestHandlerUtils.updateCommit(cmd, p);
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
AddUpdateCommand parseAdd() throws IOException {
  AddUpdateCommand cmd = new AddUpdateCommand(req);
  cmd.commitWithin = commitWithin;
  cmd.overwrite = overwrite;
  float boost = 1.0f;
  while (true) {
    int ev = parser.nextEvent();
    if (ev == JSONParser.STRING) {
      if (parser.wasKey()) {
        String key = parser.getString();
        if ("doc".equals(key)) {
          if (cmd.solrDoc != null) {
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "multiple docs in same add command");
          }
          ev = assertNextEvent(JSONParser.OBJECT_START);
          cmd.solrDoc = parseDoc(ev);
        } else if (UpdateRequestHandler.OVERWRITE.equals(key)) {
          cmd.overwrite = parser.getBoolean(); // reads next boolean
        } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(key)) {
          cmd.commitWithin = (int) parser.getLong();
        } else if ("boost".equals(key)) {
          boost = Float.parseFloat(parser.getNumberChars().toString());
        } else {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Unknown key: " + key + " [" + parser.getPosition() + "]");
        }
      } else {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Should be a key " + " at [" + parser.getPosition() + "]");
      }
    } else if (ev == JSONParser.OBJECT_END) {
      if (cmd.solrDoc == null) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "missing solr document. " + parser.getPosition());
      }
      cmd.solrDoc.setDocumentBoost(boost);
      return cmd;
    } else {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Got: " + JSONParser.getEventString(ev) + " at [" + parser.getPosition() + "]");
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
void handleAdds() throws IOException {
  while (true) {
    AddUpdateCommand cmd = new AddUpdateCommand(req);
    cmd.commitWithin = commitWithin;
    cmd.overwrite = overwrite;
    int ev = parser.nextEvent();
    if (ev == JSONParser.ARRAY_END) break;
    assertEvent(ev, JSONParser.OBJECT_START);
    cmd.solrDoc = parseDoc(ev);
    processor.processAdd(cmd);
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
int assertNextEvent(int expected) throws IOException {
  int got = parser.nextEvent();
  assertEvent(got, expected);
  return got;
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private SolrInputDocument parseDoc(int ev) throws IOException {
  assert ev == JSONParser.OBJECT_START;
  SolrInputDocument sdoc = new SolrInputDocument();
  for (;;) {
    SolrInputField sif = parseField();
    if (sif == null) return sdoc;
    SolrInputField prev = sdoc.put(sif.getName(), sif);
    if (prev != null) {
      // blech - repeated keys
      sif.addValue(prev.getValue(), prev.getBoost());
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private SolrInputField parseField() throws IOException {
  int ev = parser.nextEvent();
  if (ev == JSONParser.OBJECT_END) {
    return null;
  }
  String fieldName = parser.getString();
  SolrInputField sif = new SolrInputField(fieldName);
  parseFieldValue(sif);
  return sif;
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private void parseFieldValue(SolrInputField sif) throws IOException {
  int ev = parser.nextEvent();
  if (ev == JSONParser.OBJECT_START) {
    parseExtendedFieldValue(sif, ev);
  } else {
    Object val = parseNormalFieldValue(ev);
    sif.setValue(val, 1.0f);
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private void parseExtendedFieldValue(SolrInputField sif, int ev) throws IOException {
  assert ev == JSONParser.OBJECT_START;
  float boost = 1.0f;
  Object normalFieldValue = null;
  Map<String, Object> extendedInfo = null;
  for (;;) {
    ev = parser.nextEvent();
    switch (ev) {
      case JSONParser.STRING:
        String label = parser.getString();
        if ("boost".equals(label)) {
          ev = parser.nextEvent();
          if (ev != JSONParser.NUMBER && ev != JSONParser.LONG && ev != JSONParser.BIGNUMBER) {
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                "boost should have number! " + JSONParser.getEventString(ev));
          }
          boost = (float) parser.getDouble();
        } else if ("value".equals(label)) {
          normalFieldValue = parseNormalFieldValue(parser.nextEvent());
        } else {
          // If we encounter other unknown map keys, then use a map
          if (extendedInfo == null) {
            extendedInfo = new HashMap<String, Object>(2);
          }
          // for now, the only extended info will be field values
          // we could either store this as an Object or a SolrInputField
          Object val = parseNormalFieldValue(parser.nextEvent());
          extendedInfo.put(label, val);
        }
        break;
      case JSONParser.OBJECT_END:
        if (extendedInfo != null) {
          if (normalFieldValue != null) {
            extendedInfo.put("value", normalFieldValue);
          }
          sif.setValue(extendedInfo, boost);
        } else {
          sif.setValue(normalFieldValue, boost);
        }
        return;
      default:
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Error parsing JSON extended field value. Unexpected " + JSONParser.getEventString(ev));
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private Object parseNormalFieldValue(int ev) throws IOException {
  if (ev == JSONParser.ARRAY_START) {
    List<Object> val = parseArrayFieldValue(ev);
    return val;
  } else {
    Object val = parseSingleFieldValue(ev);
    return val;
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private Object parseSingleFieldValue(int ev) throws IOException {
  switch (ev) {
    case JSONParser.STRING:
      return parser.getString();
    case JSONParser.LONG:
    case JSONParser.NUMBER:
    case JSONParser.BIGNUMBER:
      return parser.getNumberChars().toString();
    case JSONParser.BOOLEAN:
      return Boolean.toString(parser.getBoolean()); // for legacy reasons, single values are expected to be strings
    case JSONParser.NULL:
      parser.getNull();
      return null;
    case JSONParser.ARRAY_START:
      return parseArrayFieldValue(ev);
    default:
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Error parsing JSON field value. Unexpected " + JSONParser.getEventString(ev));
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private List<Object> parseArrayFieldValue(int ev) throws IOException {
  assert ev == JSONParser.ARRAY_START;
  ArrayList lst = new ArrayList(2);
  for (;;) {
    ev = parser.nextEvent();
    if (ev == JSONParser.ARRAY_END) {
      return lst;
    }
    Object val = parseSingleFieldValue(ev);
    lst.add(val);
  }
}
// in core/src/java/org/apache/solr/handler/loader/CSVLoader.java
@Override
void addDoc(int line, String[] vals) throws IOException {
  templateAdd.clear();
  SolrInputDocument doc = new SolrInputDocument();
  doAdd(line, vals, doc, templateAdd);
}
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
@Override
public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws IOException {
  errHeader = "CSVLoader: input=" + stream.getSourceInfo();
  Reader reader = null;
  try {
    reader = stream.getReader();
    if (skipLines > 0) {
      if (!(reader instanceof BufferedReader)) {
        reader = new BufferedReader(reader);
      }
      BufferedReader r = (BufferedReader) reader;
      for (int i = 0; i < skipLines; i++) {
        r.readLine();
      }
    }
    CSVParser parser = new CSVParser(reader, strategy);
    // parse the fieldnames from the header of the file
    if (fieldnames == null) {
      fieldnames = parser.getLine();
      if (fieldnames == null) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Expected fieldnames in CSV input");
      }
      prepareFields();
    }
    // read the rest of the CSV file
    for (;;) {
      int line = parser.getLineNumber(); // for error reporting in MT mode
      String[] vals = null;
      try {
        vals = parser.getLine();
      } catch (IOException e) {
        // Catch the exception and rethrow it with more line information
        input_err("can't read line: " + line, null, line, e);
      }
      if (vals == null) break;
      if (vals.length != fields.length) {
        input_err("expected " + fields.length + " values but got " + vals.length, vals, line);
      }
      addDoc(line, vals);
    }
  } finally {
    if (reader != null) {
      IOUtils.closeQuietly(reader);
    }
  }
}
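input_err is CSVLoaderBase's catch-and-rethrow helper: the IOException from the parser is caught and rethrown enriched with the failing line number. A hypothetical reduction of the pattern (the helper name and signature below are assumptions, not the real CSVLoaderBase API):

  import org.apache.solr.common.SolrException;

  // hypothetical: rethrow a low-level parse failure with positional context attached
  static void inputError(String msg, int line, Exception cause) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "CSVLoader: line=" + line + ", " + msg, cause);
  }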
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
void doAdd(int line, String[] vals, SolrInputDocument doc, AddUpdateCommand template) throws IOException {
  // the line number is passed simply for error reporting in MT mode.
  // first, create the lucene document
  for (int i = 0; i < vals.length; i++) {
    if (fields[i] == null) continue; // ignore this field
    String val = vals[i];
    adders[i].add(doc, line, i, val);
  }
  // add any literals
  for (SchemaField sf : literals.keySet()) {
    String fn = sf.getName();
    String val = literals.get(sf);
    doc.addField(fn, val);
  }
  template.solrDoc = doc;
  processor.processAdd(template);
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
private void registerFileStreamResponseWriter() {
  core.registerResponseWriter(FILE_STREAM, new BinaryQueryResponseWriter() {
    public void write(OutputStream out, SolrQueryRequest request, SolrQueryResponse resp) throws IOException {
      FileStream stream = (FileStream) resp.getValues().get(FILE_STREAM);
      stream.write(out);
    }

    public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
      throw new RuntimeException("This is a binary writer , Cannot write to a characterstream");
    }

    public String getContentType(SolrQueryRequest request, SolrQueryResponse response) {
      return "application/octet-stream";
    }

    public void init(NamedList args) { /* no op */ }
  });
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
public void write(OutputStream out, SolrQueryRequest request, SolrQueryResponse resp) throws IOException {
  FileStream stream = (FileStream) resp.getValues().get(FILE_STREAM);
  stream.write(out);
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
  throw new RuntimeException("This is a binary writer , Cannot write to a characterstream");
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
public void write(OutputStream out) throws IOException {
  String fileName = params.get(FILE);
  String cfileName = params.get(CONF_FILE_SHORT);
  String sOffset = params.get(OFFSET);
  String sLen = params.get(LEN);
  String compress = params.get(COMPRESSION);
  String sChecksum = params.get(CHECKSUM);
  String sGen = params.get(GENERATION);
  if (sGen != null) indexGen = Long.parseLong(sGen);
  if (Boolean.parseBoolean(compress)) {
    fos = new FastOutputStream(new DeflaterOutputStream(out));
  } else {
    fos = new FastOutputStream(out);
  }
  FileInputStream inputStream = null;
  int packetsWritten = 0;
  try {
    long offset = -1;
    int len = -1;
    // check if checksum is requested
    boolean useChecksum = Boolean.parseBoolean(sChecksum);
    if (sOffset != null) offset = Long.parseLong(sOffset);
    if (sLen != null) len = Integer.parseInt(sLen);
    if (fileName == null && cfileName == null) {
      // no filename, do nothing
      writeNothing();
    }
    File file = null;
    if (cfileName != null) {
      // if it is a conf file, read from the config directory
      file = new File(core.getResourceLoader().getConfigDir(), cfileName);
    } else {
      // else read from the index directory
      file = new File(core.getIndexDir(), fileName);
    }
    if (file.exists() && file.canRead()) {
      inputStream = new FileInputStream(file);
      FileChannel channel = inputStream.getChannel();
      // if an offset is given, move the pointer to that point
      if (offset != -1) channel.position(offset);
      byte[] buf = new byte[(len == -1 || len > PACKET_SZ) ? PACKET_SZ : len];
      Checksum checksum = null;
      if (useChecksum) checksum = new Adler32();
      ByteBuffer bb = ByteBuffer.wrap(buf);
      while (true) {
        bb.clear();
        long bytesRead = channel.read(bb);
        if (bytesRead <= 0) {
          writeNothing();
          fos.close();
          break;
        }
        fos.writeInt((int) bytesRead);
        if (useChecksum) {
          checksum.reset();
          checksum.update(buf, 0, (int) bytesRead);
          fos.writeLong(checksum.getValue());
        }
        fos.write(buf, 0, (int) bytesRead);
        fos.flush();
        if (indexGen != null && (packetsWritten % 5 == 0)) {
          // after every 5 packets, reserve the commit point for some time
          delPolicy.setReserveDuration(indexGen, reserveCommitDuration);
        }
        packetsWritten++;
      }
    } else {
      writeNothing();
    }
  } catch (IOException e) {
    LOG.warn("Exception while writing response for params: " + params, e);
  } finally {
    IOUtils.closeQuietly(inputStream);
  }
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
private void writeNothing() throws IOException {
  fos.writeInt(0);
  fos.flush();
}
// in core/src/java/org/apache/solr/handler/RequestHandlerUtils.java
public static boolean handleCommit(SolrQueryRequest req, UpdateRequestProcessor processor, SolrParams params, boolean force) throws IOException {
  if (params == null) {
    params = new MapSolrParams(new HashMap<String, String>());
  }

  boolean optimize = params.getBool(UpdateParams.OPTIMIZE, false);
  boolean commit = params.getBool(UpdateParams.COMMIT, false);
  boolean softCommit = params.getBool(UpdateParams.SOFT_COMMIT, false);
  boolean prepareCommit = params.getBool(UpdateParams.PREPARE_COMMIT, false);

  if (optimize || commit || softCommit || prepareCommit || force) {
    CommitUpdateCommand cmd = new CommitUpdateCommand(req, optimize);
    updateCommit(cmd, params);
    processor.processCommit(cmd);
    return true;
  }
  return false;
}
// in core/src/java/org/apache/solr/handler/RequestHandlerUtils.java
public static void updateCommit(CommitUpdateCommand cmd, SolrParams params) throws IOException {
  if (params == null) return;

  cmd.openSearcher = params.getBool(UpdateParams.OPEN_SEARCHER, cmd.openSearcher);
  cmd.waitSearcher = params.getBool(UpdateParams.WAIT_SEARCHER, cmd.waitSearcher);
  cmd.softCommit = params.getBool(UpdateParams.SOFT_COMMIT, cmd.softCommit);
  cmd.expungeDeletes = params.getBool(UpdateParams.EXPUNGE_DELETES, cmd.expungeDeletes);
  cmd.maxOptimizeSegments = params.getInt(UpdateParams.MAX_OPTIMIZE_SEGMENTS, cmd.maxOptimizeSegments);
  cmd.prepareCommit = params.getBool(UpdateParams.PREPARE_COMMIT, cmd.prepareCommit);
}
// in core/src/java/org/apache/solr/handler/RequestHandlerUtils.java
public static boolean handleRollback(SolrQueryRequest req, UpdateRequestProcessor processor, SolrParams params, boolean force) throws IOException {
  if (params == null) {
    params = new MapSolrParams(new HashMap<String, String>());
  }

  boolean rollback = params.getBool(UpdateParams.ROLLBACK, false);

  if (rollback || force) {
    RollbackUpdateCommand cmd = new RollbackUpdateCommand(req);
    processor.processRollback(cmd);
    return true;
  }
  return false;
}
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
public DocListAndSet getMoreLikeThis(int id, int start, int rows, List<Query> filters,
                                     List<InterestingTerm> terms, int flags) throws IOException {
  Document doc = reader.document(id);
  rawMLTQuery = mlt.like(id);
  boostedMLTQuery = getBoostedQuery(rawMLTQuery);
  if (terms != null) {
    fillInterestingTermsFromMLTQuery(rawMLTQuery, terms);
  }
  // exclude current document from results
  realMLTQuery = new BooleanQuery();
  realMLTQuery.add(boostedMLTQuery, BooleanClause.Occur.MUST);
  realMLTQuery.add(new TermQuery(new Term(uniqueKeyField.getName(),
      uniqueKeyField.getType().storedToIndexed(doc.getField(uniqueKeyField.getName())))),
      BooleanClause.Occur.MUST_NOT);

  DocListAndSet results = new DocListAndSet();
  if (this.needDocSet) {
    results = searcher.getDocListAndSet(realMLTQuery, filters, null, start, rows, flags);
  } else {
    results.docList = searcher.getDocList(realMLTQuery, filters, null, start, rows, flags);
  }
  return results;
}
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
public DocListAndSet getMoreLikeThis(Reader reader, int start, int rows, List<Query> filters,
                                     List<InterestingTerm> terms, int flags) throws IOException {
  // analyzing with the first field: previous (stupid) behavior
  rawMLTQuery = mlt.like(reader, mlt.getFieldNames()[0]);
  boostedMLTQuery = getBoostedQuery(rawMLTQuery);
  if (terms != null) {
    fillInterestingTermsFromMLTQuery(boostedMLTQuery, terms);
  }
  DocListAndSet results = new DocListAndSet();
  if (this.needDocSet) {
    results = searcher.getDocListAndSet(boostedMLTQuery, filters, null, start, rows, flags);
  } else {
    results.docList = searcher.getDocList(boostedMLTQuery, filters, null, start, rows, flags);
  }
  return results;
}
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
@Deprecated
public NamedList<DocList> getMoreLikeThese(DocList docs, int rows, int flags) throws IOException {
  IndexSchema schema = searcher.getSchema();
  NamedList<DocList> mlt = new SimpleOrderedMap<DocList>();
  DocIterator iterator = docs.iterator();
  while (iterator.hasNext()) {
    int id = iterator.nextDoc();
    DocListAndSet sim = getMoreLikeThis(id, 0, rows, null, null, flags);
    String name = schema.printableUniqueKey(reader.document(id));
    mlt.add(name, sim.docList);
  }
  return mlt;
}
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
@Override
public void prepare(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  if (params.getBool(TermsParams.TERMS, false)) {
    rb.doTerms = true;
  }

  // TODO: temporary... this should go in a different component.
  String shards = params.get(ShardParams.SHARDS);
  if (shards != null) {
    rb.isDistrib = true;
    if (params.get(ShardParams.SHARDS_QT) == null) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No shards.qt parameter specified");
    }
    List<String> lst = StrUtils.splitSmart(shards, ",", true);
    rb.shards = lst.toArray(new String[lst.size()]);
  }
}
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
@Override
public void process(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  if (!params.getBool(TermsParams.TERMS, false)) return;

  String[] fields = params.getParams(TermsParams.TERMS_FIELD);

  NamedList<Object> termsResult = new SimpleOrderedMap<Object>();
  rb.rsp.add("terms", termsResult);

  if (fields == null || fields.length == 0) return;

  int limit = params.getInt(TermsParams.TERMS_LIMIT, 10);
  if (limit < 0) {
    limit = Integer.MAX_VALUE;
  }

  String lowerStr = params.get(TermsParams.TERMS_LOWER);
  String upperStr = params.get(TermsParams.TERMS_UPPER);
  boolean upperIncl = params.getBool(TermsParams.TERMS_UPPER_INCLUSIVE, false);
  boolean lowerIncl = params.getBool(TermsParams.TERMS_LOWER_INCLUSIVE, true);
  boolean sort = !TermsParams.TERMS_SORT_INDEX.equals(
      params.get(TermsParams.TERMS_SORT, TermsParams.TERMS_SORT_COUNT));
  int freqmin = params.getInt(TermsParams.TERMS_MINCOUNT, 1);
  int freqmax = params.getInt(TermsParams.TERMS_MAXCOUNT, UNLIMITED_MAX_COUNT);
  if (freqmax < 0) {
    freqmax = Integer.MAX_VALUE;
  }
  String prefix = params.get(TermsParams.TERMS_PREFIX_STR);
  String regexp = params.get(TermsParams.TERMS_REGEXP_STR);
  Pattern pattern = regexp != null ? Pattern.compile(regexp, resolveRegexpFlags(params)) : null;

  boolean raw = params.getBool(TermsParams.TERMS_RAW, false);

  final AtomicReader indexReader = rb.req.getSearcher().getAtomicReader();
  Fields lfields = indexReader.fields();

  for (String field : fields) {
    NamedList<Integer> fieldTerms = new NamedList<Integer>();
    termsResult.add(field, fieldTerms);

    Terms terms = lfields == null ? null : lfields.terms(field);
    if (terms == null) {
      // no terms for this field
      continue;
    }

    FieldType ft = raw ? null : rb.req.getSchema().getFieldTypeNoEx(field);
    if (ft == null) ft = new StrField();

    // prefix must currently be text
    BytesRef prefixBytes = prefix == null ? null : new BytesRef(prefix);

    BytesRef upperBytes = null;
    if (upperStr != null) {
      upperBytes = new BytesRef();
      ft.readableToIndexed(upperStr, upperBytes);
    }

    BytesRef lowerBytes;
    if (lowerStr == null) {
      // If no lower bound was specified, use the prefix
      lowerBytes = prefixBytes;
    } else {
      lowerBytes = new BytesRef();
      if (raw) {
        // TODO: how to handle binary? perhaps we don't for "raw"... or if the field exists
        // perhaps we detect if the FieldType is non-character and expect hex if so?
        lowerBytes = new BytesRef(lowerStr);
      } else {
        lowerBytes = new BytesRef();
        ft.readableToIndexed(lowerStr, lowerBytes);
      }
    }

    TermsEnum termsEnum = terms.iterator(null);
    BytesRef term = null;

    if (lowerBytes != null) {
      if (termsEnum.seekCeil(lowerBytes, true) == TermsEnum.SeekStatus.END) {
        termsEnum = null;
      } else {
        term = termsEnum.term();
        // Only advance the enum if we are excluding the lower bound and the lower Term actually matches
        if (lowerIncl == false && term.equals(lowerBytes)) {
          term = termsEnum.next();
        }
      }
    } else {
      // position termsEnum on first term
      term = termsEnum.next();
    }

    int i = 0;
    BoundedTreeSet<CountPair<BytesRef, Integer>> queue =
        (sort ? new BoundedTreeSet<CountPair<BytesRef, Integer>>(limit) : null);
    CharsRef external = new CharsRef();
    while (term != null && (i < limit || sort)) {
      boolean externalized = false; // did we fill in "external" yet for this term?

      // stop if the prefix doesn't match
      if (prefixBytes != null && !StringHelper.startsWith(term, prefixBytes)) break;

      if (pattern != null) {
        // indexed text or external text?
        // TODO: support "raw" mode?
        ft.indexedToReadable(term, external);
        externalized = true;
        if (!pattern.matcher(external).matches()) {
          term = termsEnum.next();
          continue;
        }
      }

      if (upperBytes != null) {
        int upperCmp = term.compareTo(upperBytes);
        // if we are past the upper term, or equal to it (when not including upper), then stop.
        if (upperCmp > 0 || (upperCmp == 0 && !upperIncl)) break;
      }

      // This is a good term in the range. Check if mincount/maxcount conditions are satisfied.
      int docFreq = termsEnum.docFreq();
      if (docFreq >= freqmin && docFreq <= freqmax) {
        // add the term to the list
        if (sort) {
          queue.add(new CountPair<BytesRef, Integer>(BytesRef.deepCopyOf(term), docFreq));
        } else {
          // TODO: handle raw somehow
          if (!externalized) {
            ft.indexedToReadable(term, external);
          }
          fieldTerms.add(external.toString(), docFreq);
          i++;
        }
      }
      term = termsEnum.next();
    }

    if (sort) {
      for (CountPair<BytesRef, Integer> item : queue) {
        if (i >= limit) break;
        ft.indexedToReadable(item.key, external);
        fieldTerms.add(external.toString(), item.val);
        i++;
      }
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
@Override
public int distributedProcess(ResponseBuilder rb) throws IOException {
  if (!rb.doTerms) {
    return ResponseBuilder.STAGE_DONE;
  }

  if (rb.stage == ResponseBuilder.STAGE_EXECUTE_QUERY) {
    TermsHelper th = rb._termsHelper;
    if (th == null) {
      th = rb._termsHelper = new TermsHelper();
      th.init(rb.req.getParams());
    }
    ShardRequest sreq = createShardQuery(rb.req.getParams());
    rb.addRequest(this, sreq);
  }

  if (rb.stage < ResponseBuilder.STAGE_EXECUTE_QUERY) {
    return ResponseBuilder.STAGE_EXECUTE_QUERY;
  } else {
    return ResponseBuilder.STAGE_DONE;
  }
}
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
private Map<String, ElevationObj> loadElevationMap(Config cfg) throws IOException {
  XPath xpath = XPathFactory.newInstance().newXPath();
  Map<String, ElevationObj> map = new HashMap<String, ElevationObj>();
  NodeList nodes = (NodeList) cfg.evaluate("elevate/query", XPathConstants.NODESET);
  for (int i = 0; i < nodes.getLength(); i++) {
    Node node = nodes.item(i);
    String qstr = DOMUtil.getAttr(node, "text", "missing query 'text'");

    NodeList children = null;
    try {
      children = (NodeList) xpath.evaluate("doc", node, XPathConstants.NODESET);
    } catch (XPathExpressionException e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "query requires '<doc .../>' child");
    }

    ArrayList<String> include = new ArrayList<String>();
    ArrayList<String> exclude = new ArrayList<String>();
    for (int j = 0; j < children.getLength(); j++) {
      Node child = children.item(j);
      String id = DOMUtil.getAttr(child, "id", "missing 'id'");
      String e = DOMUtil.getAttr(child, EXCLUDE, null);
      if (e != null) {
        if (Boolean.valueOf(e)) {
          exclude.add(id);
          continue;
        }
      }
      include.add(id);
    }

    ElevationObj elev = new ElevationObj(qstr, include, exclude);
    if (map.containsKey(elev.analyzed)) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Boosting query defined twice for query: '" + elev.text + "' ('" + elev.analyzed + "')");
    }
    map.put(elev.analyzed, elev);
  }
  return map;
}
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
void setTopQueryResults(IndexReader reader, String query, String[] ids, String[] ex) throws IOException {
  if (ids == null) {
    ids = new String[0];
  }
  if (ex == null) {
    ex = new String[0];
  }

  Map<String, ElevationObj> elev = elevationCache.get(reader);
  if (elev == null) {
    elev = new HashMap<String, ElevationObj>();
    elevationCache.put(reader, elev);
  }
  ElevationObj obj = new ElevationObj(query, Arrays.asList(ids), Arrays.asList(ex));
  elev.put(obj.analyzed, obj);
}
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
String getAnalyzedQuery(String query) throws IOException {
  if (analyzer == null) {
    return query;
  }
  StringBuilder norm = new StringBuilder();
  TokenStream tokens = analyzer.tokenStream("", new StringReader(query));
  tokens.reset();
  CharTermAttribute termAtt = tokens.addAttribute(CharTermAttribute.class);
  while (tokens.incrementToken()) {
    norm.append(termAtt.buffer(), 0, termAtt.length());
  }
  tokens.end();
  tokens.close();
  return norm.toString();
}
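getAnalyzedQuery only reaches tokens.end() and tokens.close() on the success path; if incrementToken() throws, the stream is never closed. A defensive variant (a sketch, not the actual Solr code) moves the cleanup into a finally block:

  import java.io.IOException;
  import java.io.StringReader;
  import org.apache.lucene.analysis.Analyzer;
  import org.apache.lucene.analysis.TokenStream;
  import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

  static String analyzed(Analyzer analyzer, String query) throws IOException {
    StringBuilder norm = new StringBuilder();
    TokenStream tokens = analyzer.tokenStream("", new StringReader(query));
    try {
      tokens.reset();
      CharTermAttribute termAtt = tokens.addAttribute(CharTermAttribute.class);
      while (tokens.incrementToken()) {
        norm.append(termAtt.buffer(), 0, termAtt.length());
      }
      tokens.end();
    } finally {
      tokens.close(); // runs even when incrementToken() throws
    }
    return norm.toString();
  }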
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override
public void prepare(ResponseBuilder rb) throws IOException {
  SolrQueryRequest req = rb.req;
  SolrParams params = req.getParams();
  // A runtime param can skip
  if (!params.getBool(QueryElevationParams.ENABLE, true)) {
    return;
  }
  boolean exclusive = params.getBool(QueryElevationParams.EXCLUSIVE, false);
  // A runtime parameter can alter the config value for forceElevation
  boolean force = params.getBool(QueryElevationParams.FORCE_ELEVATION, forceElevation);
  boolean markExcludes = params.getBool(QueryElevationParams.MARK_EXCLUDES, false);
  Query query = rb.getQuery();
  String qstr = rb.getQueryString();
  if (query == null || qstr == null) {
    return;
  }

  qstr = getAnalyzedQuery(qstr);
  IndexReader reader = req.getSearcher().getIndexReader();
  ElevationObj booster = null;
  try {
    booster = getElevationMap(reader, req.getCore()).get(qstr);
  } catch (Exception ex) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex);
  }

  if (booster != null) {
    rb.req.getContext().put(BOOSTED, booster.ids);

    // Change the query to insert forced documents
    if (exclusive == true) {
      // we only want these results
      rb.setQuery(booster.include);
    } else {
      BooleanQuery newq = new BooleanQuery(true);
      newq.add(query, BooleanClause.Occur.SHOULD);
      newq.add(booster.include, BooleanClause.Occur.SHOULD);
      if (booster.exclude != null) {
        if (markExcludes == false) {
          for (TermQuery tq : booster.exclude) {
            newq.add(new BooleanClause(tq, BooleanClause.Occur.MUST_NOT));
          }
        } else {
          // we are only going to mark items as excluded, not actually exclude them.
          // This works with the EditorialMarkerFactory
          rb.req.getContext().put(EXCLUDED, booster.excludeIds);
          for (TermQuery tq : booster.exclude) {
            newq.add(new BooleanClause(tq, BooleanClause.Occur.SHOULD));
          }
        }
      }
      rb.setQuery(newq);
    }

    ElevationComparatorSource comparator = new ElevationComparatorSource(booster);
    // if the sort is 'score desc' use a custom sorting method to
    // insert documents in their proper place
    SortSpec sortSpec = rb.getSortSpec();
    if (sortSpec.getSort() == null) {
      sortSpec.setSort(new Sort(new SortField[]{
          new SortField("_elevate_", comparator, true),
          new SortField(null, SortField.Type.SCORE, false)
      }));
    } else {
      // Check if the sort is based on score
      boolean modify = false;
      SortField[] current = sortSpec.getSort().getSort();
      ArrayList<SortField> sorts = new ArrayList<SortField>(current.length + 1);
      // Perhaps force it to always sort by score
      if (force && current[0].getType() != SortField.Type.SCORE) {
        sorts.add(new SortField("_elevate_", comparator, true));
        modify = true;
      }
      for (SortField sf : current) {
        if (sf.getType() == SortField.Type.SCORE) {
          sorts.add(new SortField("_elevate_", comparator, !sf.getReverse()));
          modify = true;
        }
        sorts.add(sf);
      }
      if (modify) {
        sortSpec.setSort(new Sort(sorts.toArray(new SortField[sorts.size()])));
      }
    }
  }

  // Add debugging information
  if (rb.isDebug()) {
    List<String> match = null;
    if (booster != null) {
      // Extract the elevated terms into a list
      match = new ArrayList<String>(booster.priority.size());
      for (Object o : booster.include.clauses()) {
        TermQuery tq = (TermQuery) ((BooleanClause) o).getQuery();
        match.add(tq.getTerm().text());
      }
    }
    SimpleOrderedMap<Object> dbg = new SimpleOrderedMap<Object>();
    dbg.add("q", qstr);
    dbg.add("match", match);
    if (rb.isDebugQuery()) {
      rb.addDebugInfo("queryBoosting", dbg);
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { // Do nothing -- the real work is modifying the input query }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public FieldComparator<Integer> newComparator(String fieldname, final int numHits, int sortPos, boolean reversed) throws IOException { return new FieldComparator<Integer>() { private final int[] values = new int[numHits]; private int bottomVal; private TermsEnum termsEnum; private DocsEnum docsEnum; Set<String> seen = new HashSet<String>(elevations.ids.size()); @Override public int compare(int slot1, int slot2) { return values[slot1] - values[slot2]; // values will be small enough that there is no overflow concern } @Override public void setBottom(int slot) { bottomVal = values[slot]; } private int docVal(int doc) throws IOException { if (ordSet.size() > 0) { int slot = ordSet.find(doc); if (slot >= 0) { BytesRef id = termValues[slot]; Integer prio = elevations.priority.get(id); return prio == null ? 0 : prio.intValue(); } } return 0; } @Override public int compareBottom(int doc) throws IOException { return bottomVal - docVal(doc); } @Override public void copy(int slot, int doc) throws IOException { values[slot] = docVal(doc); } @Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { //convert the ids to Lucene doc ids, the ordSet and termValues need to be the same size as the number of elevation docs we have ordSet.clear(); Fields fields = context.reader().fields(); if (fields == null) return this; Terms terms = fields.terms(idField); if (terms == null) return this; termsEnum = terms.iterator(termsEnum); BytesRef term = new BytesRef(); Bits liveDocs = context.reader().getLiveDocs(); for (String id : elevations.ids) { term.copyChars(id); if (seen.contains(id) == false && termsEnum.seekExact(term, false)) { docsEnum = termsEnum.docs(liveDocs, docsEnum, false); if (docsEnum != null) { int docId = docsEnum.nextDoc(); if (docId == DocIdSetIterator.NO_MORE_DOCS ) continue; // must have been deleted termValues[ordSet.put(docId)] = BytesRef.deepCopyOf(term); seen.add(id); assert docsEnum.nextDoc() == DocIdSetIterator.NO_MORE_DOCS; } } } return this; } @Override public Integer value(int slot) { return values[slot]; } @Override public int compareDocToValue(int doc, Integer valueObj) throws IOException { final int value = valueObj.intValue(); final int docValue = docVal(doc); return docValue - value; // values will be small enough that there is no overflow concern } }; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
private int docVal(int doc) throws IOException { if (ordSet.size() > 0) { int slot = ordSet.find(doc); if (slot >= 0) { BytesRef id = termValues[slot]; Integer prio = elevations.priority.get(id); return prio == null ? 0 : prio.intValue(); } } return 0; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public int compareBottom(int doc) throws IOException { return bottomVal - docVal(doc); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public void copy(int slot, int doc) throws IOException { values[slot] = docVal(doc); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { //convert the ids to Lucene doc ids, the ordSet and termValues need to be the same size as the number of elevation docs we have ordSet.clear(); Fields fields = context.reader().fields(); if (fields == null) return this; Terms terms = fields.terms(idField); if (terms == null) return this; termsEnum = terms.iterator(termsEnum); BytesRef term = new BytesRef(); Bits liveDocs = context.reader().getLiveDocs(); for (String id : elevations.ids) { term.copyChars(id); if (seen.contains(id) == false && termsEnum.seekExact(term, false)) { docsEnum = termsEnum.docs(liveDocs, docsEnum, false); if (docsEnum != null) { int docId = docsEnum.nextDoc(); if (docId == DocIdSetIterator.NO_MORE_DOCS ) continue; // must have been deleted termValues[ordSet.put(docId)] = BytesRef.deepCopyOf(term); seen.add(id); assert docsEnum.nextDoc() == DocIdSetIterator.NO_MORE_DOCS; } } } return this; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public int compareDocToValue(int doc, Integer valueObj) throws IOException { final int value = valueObj.intValue(); final int docValue = docVal(doc); return docValue - value; // values will be small enough that there is no overflow concern }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
private Collection<Token> getTokens(String q, Analyzer analyzer) throws IOException { Collection<Token> result = new ArrayList<Token>(); assert analyzer != null; TokenStream ts = analyzer.tokenStream("", new StringReader(q)); ts.reset(); // TODO: support custom attributes CharTermAttribute termAtt = ts.addAttribute(CharTermAttribute.class); OffsetAttribute offsetAtt = ts.addAttribute(OffsetAttribute.class); TypeAttribute typeAtt = ts.addAttribute(TypeAttribute.class); FlagsAttribute flagsAtt = ts.addAttribute(FlagsAttribute.class); PayloadAttribute payloadAtt = ts.addAttribute(PayloadAttribute.class); PositionIncrementAttribute posIncAtt = ts.addAttribute(PositionIncrementAttribute.class); while (ts.incrementToken()){ Token token = new Token(); token.copyBuffer(termAtt.buffer(), 0, termAtt.length()); token.setOffset(offsetAtt.startOffset(), offsetAtt.endOffset()); token.setType(typeAtt.type()); token.setFlags(flagsAtt.getFlags()); token.setPayload(payloadAtt.getPayload()); token.setPositionIncrement(posIncAtt.getPositionIncrement()); result.add(token); } ts.end(); ts.close(); return result; }
// in core/src/java/org/apache/solr/handler/component/DebugComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { }
// in core/src/java/org/apache/solr/handler/component/PivotFacetHelper.java
public SimpleOrderedMap<List<NamedList<Object>>> process(ResponseBuilder rb, SolrParams params, String[] pivots) throws IOException { if (!rb.doFacets || pivots == null) return null; int minMatch = params.getInt( FacetParams.FACET_PIVOT_MINCOUNT, 1 ); SimpleOrderedMap<List<NamedList<Object>>> pivotResponse = new SimpleOrderedMap<List<NamedList<Object>>>(); for (String pivot : pivots) { String[] fields = pivot.split(","); // only support two levels for now if( fields.length < 2 ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Pivot Facet needs at least two fields: "+pivot ); } DocSet docs = rb.getResults().docSet; String field = fields[0]; String subField = fields[1]; Deque<String> fnames = new LinkedList<String>(); for( int i=fields.length-1; i>1; i-- ) { fnames.push( fields[i] ); } SimpleFacets sf = getFacetImplementation(rb.req, rb.getResults().docSet, rb.req.getParams()); NamedList<Integer> superFacets = sf.getTermCounts(field); pivotResponse.add(pivot, doPivots(superFacets, field, subField, fnames, rb, docs, minMatch)); } return pivotResponse; }
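process above validates facet.pivot up front and throws BAD_REQUEST before any search work starts, so a malformed pivot surfaces as a 400 with a pointed message rather than as a later ArrayIndexOutOfBoundsException. A condensed sketch of the fail-fast check:

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    class PivotValidation {
      static String[] splitPivot(String pivot) {
        String[] fields = pivot.split(",");
        if (fields.length < 2) {
          // reject bad client input early, naming the offending value
          throw new SolrException(ErrorCode.BAD_REQUEST, "Pivot Facet needs at least two fields: " + pivot);
        }
        return fields;
      }
    }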
// in core/src/java/org/apache/solr/handler/component/PivotFacetHelper.java
protected List<NamedList<Object>> doPivots( NamedList<Integer> superFacets, String field, String subField, Deque<String> fnames, ResponseBuilder rb, DocSet docs, int minMatch ) throws IOException { SolrIndexSearcher searcher = rb.req.getSearcher(); // TODO: optimize to avoid converting to an external string and then having to convert back to internal below SchemaField sfield = searcher.getSchema().getField(field); FieldType ftype = sfield.getType(); String nextField = fnames.poll(); List<NamedList<Object>> values = new ArrayList<NamedList<Object>>( superFacets.size() ); for (Map.Entry<String, Integer> kv : superFacets) { // Only sub-facet if parent facet has positive count - still may not be any values for the sub-field though if (kv.getValue() >= minMatch ) { // don't reuse the same BytesRef each time since we will be constructing Term // objects that will most likely be cached. BytesRef termval = new BytesRef(); ftype.readableToIndexed(kv.getKey(), termval); SimpleOrderedMap<Object> pivot = new SimpleOrderedMap<Object>(); pivot.add( "field", field ); pivot.add( "value", ftype.toObject(sfield, termval) ); pivot.add( "count", kv.getValue() ); if( subField == null ) { values.add( pivot ); } else { Query query = new TermQuery(new Term(field, termval)); DocSet subset = searcher.getDocSet(query, docs); SimpleFacets sf = getFacetImplementation(rb.req, subset, rb.req.getParams()); NamedList<Integer> nl = sf.getTermCounts(subField); if (nl.size() >= minMatch ) { pivot.add( "pivot", doPivots( nl, subField, nextField, fnames, rb, subset, minMatch ) ); values.add( pivot ); // only add response if there are some counts } } } } // put the field back on the list fnames.push( nextField ); return values; }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } SolrQueryResponse rsp = rb.rsp; // Set field flags ReturnFields returnFields = new ReturnFields( req ); rsp.setReturnFields( returnFields ); int flags = 0; if (returnFields.wantsScore()) { flags |= SolrIndexSearcher.GET_SCORES; } rb.setFieldFlags( flags ); String defType = params.get(QueryParsing.DEFTYPE,QParserPlugin.DEFAULT_QTYPE); // get it from the response builder to give a different component a chance // to set it. String queryString = rb.getQueryString(); if (queryString == null) { // this is the normal way it's set. queryString = params.get( CommonParams.Q ); rb.setQueryString(queryString); } try { QParser parser = QParser.getParser(rb.getQueryString(), defType, req); Query q = parser.getQuery(); if (q == null) { // normalize a null query to a query that matches nothing q = new BooleanQuery(); } rb.setQuery( q ); rb.setSortSpec( parser.getSort(true) ); rb.setQparser(parser); rb.setScoreDoc(parser.getPaging()); String[] fqs = req.getParams().getParams(CommonParams.FQ); if (fqs!=null && fqs.length!=0) { List<Query> filters = rb.getFilters(); if (filters==null) { filters = new ArrayList<Query>(fqs.length); } for (String fq : fqs) { if (fq != null && fq.trim().length()!=0) { QParser fqp = QParser.getParser(fq, null, req); filters.add(fqp.getQuery()); } } // only set the filters if they are not empty otherwise // fq=&someotherParam= will trigger all docs filter for every request // if filter cache is disabled if (!filters.isEmpty()) { rb.setFilters( filters ); } } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } boolean grouping = params.getBool(GroupParams.GROUP, false); if (!grouping) { return; } SolrIndexSearcher.QueryCommand cmd = rb.getQueryCommand(); SolrIndexSearcher searcher = rb.req.getSearcher(); GroupingSpecification groupingSpec = new GroupingSpecification(); rb.setGroupingSpec(groupingSpec); //TODO: move weighting of sort Sort groupSort = searcher.weightSort(cmd.getSort()); if (groupSort == null) { groupSort = Sort.RELEVANCE; } // groupSort defaults to sort String groupSortStr = params.get(GroupParams.GROUP_SORT); //TODO: move weighting of sort Sort sortWithinGroup = groupSortStr == null ? groupSort : searcher.weightSort(QueryParsing.parseSort(groupSortStr, req)); if (sortWithinGroup == null) { sortWithinGroup = Sort.RELEVANCE; } groupingSpec.setSortWithinGroup(sortWithinGroup); groupingSpec.setGroupSort(groupSort); String formatStr = params.get(GroupParams.GROUP_FORMAT, Grouping.Format.grouped.name()); Grouping.Format responseFormat; try { responseFormat = Grouping.Format.valueOf(formatStr); } catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); } groupingSpec.setResponseFormat(responseFormat); groupingSpec.setFields(params.getParams(GroupParams.GROUP_FIELD)); groupingSpec.setQueries(params.getParams(GroupParams.GROUP_QUERY)); groupingSpec.setFunctions(params.getParams(GroupParams.GROUP_FUNC)); groupingSpec.setGroupOffset(params.getInt(GroupParams.GROUP_OFFSET, 0)); groupingSpec.setGroupLimit(params.getInt(GroupParams.GROUP_LIMIT, 1)); groupingSpec.setOffset(rb.getSortSpec().getOffset()); groupingSpec.setLimit(rb.getSortSpec().getCount()); groupingSpec.setIncludeGroupCount(params.getBool(GroupParams.GROUP_TOTAL_COUNT, false)); groupingSpec.setMain(params.getBool(GroupParams.GROUP_MAIN, false)); groupingSpec.setNeedScore((cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0); groupingSpec.setTruncateGroups(params.getBool(GroupParams.GROUP_TRUNCATE, false)); }
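The grouping setup above also shows a narrower translation: only the IllegalArgumentException that Enum.valueOf throws for an unknown GroupParams.GROUP_FORMAT value is caught, and it is replaced by a 400 whose message names the parameter; the cause is dropped since the message already carries the useful information. A sketch with a hypothetical Format enum standing in for Grouping.Format:

    import org.apache.solr.common.SolrException;

    class GroupFormatParsing {
      enum Format { grouped, simple } // hypothetical stand-in for Grouping.Format

      static Format parseFormat(String formatStr) {
        try {
          return Format.valueOf(formatStr);
        } catch (IllegalArgumentException e) {
          // an unknown value is the client's fault: report 400, not 500
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              String.format("Illegal %s parameter", "group.format"));
        }
      }
    }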
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } SolrIndexSearcher searcher = req.getSearcher(); if (rb.getQueryCommand().getOffset() < 0) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "'start' parameter cannot be negative"); } // -1 as flag if not set. long timeAllowed = (long)params.getInt( CommonParams.TIME_ALLOWED, -1 ); // Optional: This could also be implemented by the top-level searcher sending // a filter that lists the ids... that would be transparent to // the request handler, but would be more expensive (and would preserve score // too if desired). String ids = params.get(ShardParams.IDS); if (ids != null) { SchemaField idField = req.getSchema().getUniqueKeyField(); List<String> idArr = StrUtils.splitSmart(ids, ",", true); int[] luceneIds = new int[idArr.size()]; int docs = 0; for (int i=0; i<idArr.size(); i++) { int id = req.getSearcher().getFirstMatch( new Term(idField.getName(), idField.getType().toInternal(idArr.get(i)))); if (id >= 0) luceneIds[docs++] = id; } DocListAndSet res = new DocListAndSet(); res.docList = new DocSlice(0, docs, luceneIds, null, docs, 0); if (rb.isNeedDocSet()) { // TODO: create a cache for this! List<Query> queries = new ArrayList<Query>(); queries.add(rb.getQuery()); List<Query> filters = rb.getFilters(); if (filters != null) queries.addAll(filters); res.docSet = searcher.getDocSet(queries); } rb.setResults(res); ResultContext ctx = new ResultContext(); ctx.docs = rb.getResults().docList; ctx.query = null; // anything? rsp.add("response", ctx); return; } SolrIndexSearcher.QueryCommand cmd = rb.getQueryCommand(); cmd.setTimeAllowed(timeAllowed); SolrIndexSearcher.QueryResult result = new SolrIndexSearcher.QueryResult(); // // grouping / field collapsing // GroupingSpecification groupingSpec = rb.getGroupingSpec(); if (groupingSpec != null) { try { boolean needScores = (cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0; if (params.getBool(GroupParams.GROUP_DISTRIBUTED_FIRST, false)) { CommandHandler.Builder topsGroupsActionBuilder = new CommandHandler.Builder() .setQueryCommand(cmd) .setNeedDocSet(false) // Order matters here .setIncludeHitCount(true) .setSearcher(searcher); for (String field : groupingSpec.getFields()) { topsGroupsActionBuilder.addCommandField(new SearchGroupsFieldCommand.Builder() .setField(searcher.getSchema().getField(field)) .setGroupSort(groupingSpec.getGroupSort()) .setTopNGroups(cmd.getOffset() + cmd.getLen()) .setIncludeGroupCount(groupingSpec.isIncludeGroupCount()) .build() ); } CommandHandler commandHandler = topsGroupsActionBuilder.build(); commandHandler.execute(); SearchGroupsResultTransformer serializer = new SearchGroupsResultTransformer(searcher); rsp.add("firstPhase", commandHandler.processResult(result, serializer)); rsp.add("totalHitCount", commandHandler.getTotalHitCount()); rb.setResult(result); return; } else if (params.getBool(GroupParams.GROUP_DISTRIBUTED_SECOND, false)) { CommandHandler.Builder secondPhaseBuilder = new CommandHandler.Builder() .setQueryCommand(cmd) .setTruncateGroups(groupingSpec.isTruncateGroups() && groupingSpec.getFields().length > 0) .setSearcher(searcher); for (String field : groupingSpec.getFields()) { String[] topGroupsParam = params.getParams(GroupParams.GROUP_DISTRIBUTED_TOPGROUPS_PREFIX + field); if (topGroupsParam == null) { topGroupsParam = new String[0]; } List<SearchGroup<BytesRef>> topGroups = new ArrayList<SearchGroup<BytesRef>>(topGroupsParam.length); for (String topGroup : topGroupsParam) { SearchGroup<BytesRef> searchGroup = new SearchGroup<BytesRef>(); if (!topGroup.equals(TopGroupsShardRequestFactory.GROUP_NULL_VALUE)) { searchGroup.groupValue = new BytesRef(searcher.getSchema().getField(field).getType().readableToIndexed(topGroup)); } topGroups.add(searchGroup); } secondPhaseBuilder.addCommandField( new TopGroupsFieldCommand.Builder() .setField(searcher.getSchema().getField(field)) .setGroupSort(groupingSpec.getGroupSort()) .setSortWithinGroup(groupingSpec.getSortWithinGroup()) .setFirstPhaseGroups(topGroups) .setMaxDocPerGroup(groupingSpec.getGroupOffset() + groupingSpec.getGroupLimit()) .setNeedScores(needScores) .setNeedMaxScore(needScores) .build() ); } for (String query : groupingSpec.getQueries()) { secondPhaseBuilder.addCommandField(new QueryCommand.Builder() .setDocsToCollect(groupingSpec.getOffset() + groupingSpec.getLimit()) .setSort(groupingSpec.getGroupSort()) .setQuery(query, rb.req) .setDocSet(searcher) .build() ); } CommandHandler commandHandler = secondPhaseBuilder.build(); commandHandler.execute(); TopGroupsResultTransformer serializer = new TopGroupsResultTransformer(rb); rsp.add("secondPhase", commandHandler.processResult(result, serializer)); rb.setResult(result); return; } int maxDocsPercentageToCache = params.getInt(GroupParams.GROUP_CACHE_PERCENTAGE, 0); boolean cacheSecondPassSearch = maxDocsPercentageToCache >= 1 && maxDocsPercentageToCache <= 100; Grouping.TotalCount defaultTotalCount = groupingSpec.isIncludeGroupCount() ? Grouping.TotalCount.grouped : Grouping.TotalCount.ungrouped; int limitDefault = cmd.getLen(); // this is normally from "rows" Grouping grouping = new Grouping(searcher, result, cmd, cacheSecondPassSearch, maxDocsPercentageToCache, groupingSpec.isMain()); grouping.setSort(groupingSpec.getGroupSort()) .setGroupSort(groupingSpec.getSortWithinGroup()) .setDefaultFormat(groupingSpec.getResponseFormat()) .setLimitDefault(limitDefault) .setDefaultTotalCount(defaultTotalCount) .setDocsPerGroupDefault(groupingSpec.getGroupLimit()) .setGroupOffsetDefault(groupingSpec.getGroupOffset()) .setGetGroupedDocSet(groupingSpec.isTruncateGroups()); if (groupingSpec.getFields() != null) { for (String field : groupingSpec.getFields()) { grouping.addFieldCommand(field, rb.req); } } if (groupingSpec.getFunctions() != null) { for (String groupByStr : groupingSpec.getFunctions()) { grouping.addFunctionCommand(groupByStr, rb.req); } } if (groupingSpec.getQueries() != null) { for (String groupByStr : groupingSpec.getQueries()) { grouping.addQueryCommand(groupByStr, rb.req); } } if (rb.doHighlights || rb.isDebug() || params.getBool(MoreLikeThisParams.MLT, false)) { // we need a single list of the returned docs cmd.setFlags(SolrIndexSearcher.GET_DOCLIST); } grouping.execute(); if (grouping.isSignalCacheWarning()) { rsp.add( "cacheWarning", String.format("Cache limit of %d percent relative to maxdoc has exceeded. Please increase cache size or disable caching.", maxDocsPercentageToCache) ); } rb.setResult(result); if (grouping.mainResult != null) { ResultContext ctx = new ResultContext(); ctx.docs = grouping.mainResult; ctx.query = null; // TODO? add the query? rsp.add("response", ctx); rsp.getToLog().add("hits", grouping.mainResult.matches()); } else if (!grouping.getCommands().isEmpty()) { // Can never be empty since grouping.execute() checks for this. rsp.add("grouped", result.groupedResults); rsp.getToLog().add("hits", grouping.getCommands().get(0).getMatches()); } return; } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } // normal search result searcher.search(result,cmd); rb.setResult( result ); ResultContext ctx = new ResultContext(); ctx.docs = rb.getResults().docList; ctx.query = rb.getQuery(); rsp.add("response", ctx); rsp.getToLog().add("hits", rb.getResults().docList.matches()); doFieldSortValues(rb, searcher); doPrefetch(rb); }
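Both prepare() and process() above are catch-rethrow sites in the totals: the parser's checked ParseException never escapes the component; it is translated at the request boundary into the unchecked SolrException with a 400 code. A self-contained sketch of that boundary translation, using a stand-in checked exception rather than the real parser type:

    import org.apache.solr.common.SolrException;

    class QueryBoundary {
      static class ParseException extends Exception { // stand-in for the parser's checked type
        ParseException(String msg) { super(msg); }
      }

      static String parse(String qstr) {
        try {
          return doParse(qstr);
        } catch (ParseException e) {
          // checked parser failure becomes an unchecked 400 for everything above this frame
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
        }
      }

      private static String doParse(String qstr) throws ParseException {
        if (qstr == null) throw new ParseException("missing query");
        return qstr;
      }
    }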
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
protected void doFieldSortValues(ResponseBuilder rb, SolrIndexSearcher searcher) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; final CharsRef spare = new CharsRef(); // The query cache doesn't currently store sort field values, and SolrIndexSearcher doesn't // currently have an option to return sort field values. Because of this, we // take the documents given and re-derive the sort values. boolean fsv = req.getParams().getBool(ResponseBuilder.FIELD_SORT_VALUES,false); if(fsv){ Sort sort = searcher.weightSort(rb.getSortSpec().getSort()); SortField[] sortFields = sort==null ? new SortField[]{SortField.FIELD_SCORE} : sort.getSort(); NamedList<Object[]> sortVals = new NamedList<Object[]>(); // order is important for the sort fields Field field = new StringField("dummy", ""); // a dummy Field IndexReaderContext topReaderContext = searcher.getTopReaderContext(); AtomicReaderContext[] leaves = topReaderContext.leaves(); AtomicReaderContext currentLeaf = null; if (leaves.length==1) { // if there is a single segment, use that subReader and avoid looking up each time currentLeaf = leaves[0]; leaves=null; } DocList docList = rb.getResults().docList; // sort ids from lowest to highest so we can access them in order int nDocs = docList.size(); long[] sortedIds = new long[nDocs]; DocIterator it = rb.getResults().docList.iterator(); for (int i=0; i<nDocs; i++) { sortedIds[i] = (((long)it.nextDoc()) << 32) | i; } Arrays.sort(sortedIds); for (SortField sortField: sortFields) { SortField.Type type = sortField.getType(); if (type==SortField.Type.SCORE || type==SortField.Type.DOC) continue; FieldComparator comparator = null; String fieldname = sortField.getField(); FieldType ft = fieldname==null ? null : req.getSchema().getFieldTypeNoEx(fieldname); Object[] vals = new Object[nDocs]; int lastIdx = -1; int idx = 0; for (long idAndPos : sortedIds) { int doc = (int)(idAndPos >>> 32); int position = (int)idAndPos; if (leaves != null) { idx = ReaderUtil.subIndex(doc, leaves); currentLeaf = leaves[idx]; if (idx != lastIdx) { // we switched segments. invalidate comparator. comparator = null; } } if (comparator == null) { comparator = sortField.getComparator(1,0); comparator = comparator.setNextReader(currentLeaf); } doc -= currentLeaf.docBase; // adjust for what segment this is in comparator.copy(0, doc); Object val = comparator.value(0); // Sortable float, double, int, long types all just use a string // comparator. For these, we need to put the type into a readable // format. One reason for this is that XML can't represent all // string values (or even all unicode code points). // indexedToReadable() should be a no-op and should // thus be harmless anyway (for all current ways anyway) if (val instanceof String) { field.setStringValue((String)val); val = ft.toObject(field); } // Must do the same conversion when sorting by a // String field in Lucene, which returns the terms // data as BytesRef: if (val instanceof BytesRef) { UnicodeUtil.UTF8toUTF16((BytesRef)val, spare); field.setStringValue(spare.toString()); val = ft.toObject(field); } vals[position] = val; } sortVals.add(fieldname, vals); } rsp.add("sort_values", sortVals); } }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
protected void doPrefetch(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; //pre-fetch returned documents if (!req.getParams().getBool(ShardParams.IS_SHARD,false) && rb.getResults().docList != null && rb.getResults().docList.size()<=50) { SolrPluginUtils.optimizePreFetchDocs(rb, rb.getResults().docList, rb.getQuery(), req, rsp); } }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { if (rb.grouping()) { return groupedDistributedProcess(rb); } else { return regularDistributedProcess(rb); } }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { if (rb.req.getParams().getBool(FacetParams.FACET,false)) { rb.setNeedDocSet( true ); rb.doFacets = true; } }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { if (rb.doFacets) { SolrParams params = rb.req.getParams(); SimpleFacets f = new SimpleFacets(rb.req, rb.getResults().docSet, params, rb ); NamedList<Object> counts = f.getFacetCounts(); String[] pivots = params.getParams( FacetParams.FACET_PIVOT ); if( pivots != null && pivots.length > 0 ) { NamedList v = pivotHelper.process(rb, params, pivots); if( v != null ) { counts.add( PIVOT_KEY, v ); } } // TODO ???? add this directly to the response, or to the builder? rb.rsp.add( "facet_counts", counts ); } }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { if (!rb.doFacets) { return ResponseBuilder.STAGE_DONE; } if (rb.stage == ResponseBuilder.STAGE_GET_FIELDS) { // overlap facet refinement requests (those shards that we need a count for // particular facet values from), where possible, with // the requests to get fields (because we know that is the // only other required phase). // We do this in distributedProcess so we can look at all of the // requests in the outgoing queue at once. for (int shardNum=0; shardNum<rb.shards.length; shardNum++) { List<String> refinements = null; for (DistribFieldFacet dff : rb._facetInfo.facets.values()) { if (!dff.needRefinements) continue; List<String> refList = dff._toRefine[shardNum]; if (refList == null || refList.size()==0) continue; String key = dff.getKey(); // reuse the same key that was used for the main facet String termsKey = key + "__terms"; String termsVal = StrUtils.join(refList, ','); String facetCommand; // add terms into the original facet.field command // do it via parameter reference to avoid another layer of encoding. String termsKeyEncoded = QueryParsing.encodeLocalParamVal(termsKey); if (dff.localParams != null) { facetCommand = commandPrefix+termsKeyEncoded + " " + dff.facetStr.substring(2); } else { facetCommand = commandPrefix+termsKeyEncoded+'}'+dff.field; } if (refinements == null) { refinements = new ArrayList<String>(); } refinements.add(facetCommand); refinements.add(termsKey); refinements.add(termsVal); } if (refinements == null) continue; String shard = rb.shards[shardNum]; ShardRequest refine = null; boolean newRequest = false; // try to find a request that is already going out to that shard. // If nshards becomes too great, we may want to move to hashing for better // scalability. for (ShardRequest sreq : rb.outgoing) { if ((sreq.purpose & ShardRequest.PURPOSE_GET_FIELDS)!=0 && sreq.shards != null && sreq.shards.length==1 && sreq.shards[0].equals(shard)) { refine = sreq; break; } } if (refine == null) { // we didn't find any other suitable requests going out to that shard, so // create one ourselves. newRequest = true; refine = new ShardRequest(); refine.shards = new String[]{rb.shards[shardNum]}; refine.params = new ModifiableSolrParams(rb.req.getParams()); // don't request any documents refine.params.remove(CommonParams.START); refine.params.set(CommonParams.ROWS,"0"); } refine.purpose |= ShardRequest.PURPOSE_REFINE_FACETS; refine.params.set(FacetParams.FACET, "true"); refine.params.remove(FacetParams.FACET_FIELD); refine.params.remove(FacetParams.FACET_QUERY); for (int i=0; i<refinements.size();) { String facetCommand=refinements.get(i++); String termsKey=refinements.get(i++); String termsVal=refinements.get(i++); refine.params.add(FacetParams.FACET_FIELD, facetCommand); refine.params.set(termsKey, termsVal); } if (newRequest) { rb.addRequest(this, refine); } } } return ResponseBuilder.STAGE_DONE; }
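distributedProcess above piggybacks facet refinement onto a GET_FIELDS request that is already headed to the same shard, creating a fresh request only as a fallback; that keeps the number of shard round-trips down. A reduced sketch of the find-or-create step, with ShardReq as a hypothetical stand-in for ShardRequest:

    import java.util.List;

    class RefineRouting {
      static final int PURPOSE_GET_FIELDS = 0x2; // illustrative flag value

      static class ShardReq { // hypothetical stand-in for ShardRequest
        String[] shards;
        int purpose;
      }

      static ShardReq findOrCreate(List<ShardReq> outgoing, String shard) {
        for (ShardReq sreq : outgoing) {
          if ((sreq.purpose & PURPOSE_GET_FIELDS) != 0
              && sreq.shards != null && sreq.shards.length == 1 && sreq.shards[0].equals(shard)) {
            return sreq; // reuse the request already going to that shard
          }
        }
        ShardReq fresh = new ShardReq(); // nothing suitable: create one ourselves
        fresh.shards = new String[]{shard};
        outgoing.add(fresh);
        return fresh;
      }
    }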
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); rb.doHighlights = highlighter.isHighlightingEnabled(params); if(rb.doHighlights){ String hlq = params.get(HighlightParams.Q); if(hlq != null){ try { QParser parser = QParser.getParser(hlq, null, rb.req); rb.setHighlightQuery(parser.getHighlightQuery()); } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } } }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { if (rb.doHighlights) { SolrQueryRequest req = rb.req; SolrParams params = req.getParams(); String[] defaultHighlightFields; //TODO: get from builder by default? if (rb.getQparser() != null) { defaultHighlightFields = rb.getQparser().getDefaultHighlightFields(); } else { defaultHighlightFields = params.getParams(CommonParams.DF); } Query highlightQuery = rb.getHighlightQuery(); if(highlightQuery==null) { if (rb.getQparser() != null) { try { highlightQuery = rb.getQparser().getHighlightQuery(); rb.setHighlightQuery( highlightQuery ); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } else { highlightQuery = rb.getQuery(); rb.setHighlightQuery( highlightQuery ); } } if(highlightQuery != null) { boolean rewrite = !(Boolean.valueOf(params.get(HighlightParams.USE_PHRASE_HIGHLIGHTER, "true")) && Boolean.valueOf(params.get(HighlightParams.HIGHLIGHT_MULTI_TERM, "true"))); highlightQuery = rewrite ? highlightQuery.rewrite(req.getSearcher().getIndexReader()) : highlightQuery; } // No highlighting if there is no query -- consider q.alt="*:*" if( highlightQuery != null ) { NamedList sumData = highlighter.doHighlighting( rb.getResults().docList, highlightQuery, req, defaultHighlightFields ); if(sumData != null) { // TODO ???? add this directly to the response? rb.rsp.add("highlighting", sumData); } } } }
// in core/src/java/org/apache/solr/handler/component/SearchComponent.java
public int distributedProcess(ResponseBuilder rb) throws IOException { return ResponseBuilder.STAGE_DONE; }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { if (rb.req.getParams().getBool(StatsParams.STATS,false)) { rb.setNeedDocSet( true ); rb.doStats = true; } }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { if (rb.doStats) { SolrParams params = rb.req.getParams(); SimpleStats s = new SimpleStats(rb.req, rb.getResults().docSet, params ); // TODO ???? add this directly to the response, or to the builder? rb.rsp.add( "stats", s.getStatsCounts() ); } }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { return ResponseBuilder.STAGE_DONE; }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
public NamedList<Object> getStatsCounts() throws IOException { NamedList<Object> res = new SimpleOrderedMap<Object>(); res.add("stats_fields", getStatsFields()); return res; }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
public NamedList<Object> getStatsFields() throws IOException { NamedList<Object> res = new SimpleOrderedMap<Object>(); String[] statsFs = params.getParams(StatsParams.STATS_FIELD); boolean isShard = params.getBool(ShardParams.IS_SHARD, false); if (null != statsFs) { for (String f : statsFs) { String[] facets = params.getFieldParams(f, StatsParams.STATS_FACET); if (facets == null) { facets = new String[0]; // make sure it is something... } SchemaField sf = searcher.getSchema().getField(f); FieldType ft = sf.getType(); NamedList<?> stv; // Currently, only UnInvertedField can deal with multi-part trie fields String prefix = TrieField.getMainValuePrefix(ft); if (sf.multiValued() || ft.multiValuedFieldCache() || prefix!=null) { //use UnInvertedField for multivalued fields UnInvertedField uif = UnInvertedField.getUnInvertedField(f, searcher); stv = uif.getStats(searcher, docs, facets).getStatsValues(); } else { stv = getFieldCacheStats(f, facets); } if (isShard == true || (Long) stv.get("count") > 0) { res.add(f, stv); } else { res.add(f, null); } } } return res; }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { // Set field flags ReturnFields returnFields = new ReturnFields( rb.req ); rb.rsp.setReturnFields( returnFields ); }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } String val = params.get("getVersions"); if (val != null) { processGetVersions(rb); return; } val = params.get("getUpdates"); if (val != null) { processGetUpdates(rb); return; } String id[] = params.getParams("id"); String ids[] = params.getParams("ids"); if (id == null && ids == null) { return; } String[] allIds = id==null ? new String[0] : id; if (ids != null) { List<String> lst = new ArrayList<String>(); for (String s : allIds) { lst.add(s); } for (String idList : ids) { lst.addAll( StrUtils.splitSmart(idList, ",", true) ); } allIds = lst.toArray(new String[lst.size()]); } SchemaField idField = req.getSchema().getUniqueKeyField(); FieldType fieldType = idField.getType(); SolrDocumentList docList = new SolrDocumentList(); UpdateLog ulog = req.getCore().getUpdateHandler().getUpdateLog(); RefCounted<SolrIndexSearcher> searcherHolder = null; DocTransformer transformer = rsp.getReturnFields().getTransformer(); if (transformer != null) { TransformContext context = new TransformContext(); context.req = req; transformer.setContext(context); } try { SolrIndexSearcher searcher = null; BytesRef idBytes = new BytesRef(); for (String idStr : allIds) { fieldType.readableToIndexed(idStr, idBytes); if (ulog != null) { Object o = ulog.lookup(idBytes); if (o != null) { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; assert entry.size() >= 3; int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK; switch (oper) { case UpdateLog.ADD: SolrDocument doc = toSolrDoc((SolrInputDocument)entry.get(entry.size()-1), req.getSchema()); if(transformer!=null) { transformer.transform(doc, -1); // unknown docID } docList.add(doc); break; case UpdateLog.DELETE: break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } continue; } } // didn't find it in the update log, so it should be in the newest searcher opened if (searcher == null) { searcherHolder = req.getCore().getRealtimeSearcher(); searcher = searcherHolder.get(); } // SolrCore.verbose("RealTimeGet using searcher ", searcher); int docid = searcher.getFirstMatch(new Term(idField.getName(), idBytes)); if (docid < 0) continue; Document luceneDocument = searcher.doc(docid); SolrDocument doc = toSolrDoc(luceneDocument, req.getSchema()); if( transformer != null ) { transformer.transform(doc, docid); } docList.add(doc); } } finally { if (searcherHolder != null) { searcherHolder.decref(); } } // if the client specified a single id=foo, then use "doc":{ // otherwise use a standard doclist if (ids == null && allIds.length <= 1) { // if the doc was not found, then use a value of null. rsp.add("doc", docList.size() > 0 ? docList.get(0) : null); } else { docList.setNumFound(docList.size()); rsp.add("response", docList); } }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public static SolrInputDocument getInputDocument(SolrCore core, BytesRef idBytes) throws IOException { SolrInputDocument sid = null; RefCounted<SolrIndexSearcher> searcherHolder = null; try { SolrIndexSearcher searcher = null; UpdateLog ulog = core.getUpdateHandler().getUpdateLog(); if (ulog != null) { Object o = ulog.lookup(idBytes); if (o != null) { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; assert entry.size() >= 3; int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK; switch (oper) { case UpdateLog.ADD: sid = (SolrInputDocument)entry.get(entry.size()-1); break; case UpdateLog.DELETE: return null; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } } } if (sid == null) { // didn't find it in the update log, so it should be in the newest searcher opened if (searcher == null) { searcherHolder = core.getRealtimeSearcher(); searcher = searcherHolder.get(); } // SolrCore.verbose("RealTimeGet using searcher ", searcher); SchemaField idField = core.getSchema().getUniqueKeyField(); int docid = searcher.getFirstMatch(new Term(idField.getName(), idBytes)); if (docid < 0) return null; Document luceneDocument = searcher.doc(docid); sid = toSolrInputDocument(luceneDocument, core.getSchema()); } } finally { if (searcherHolder != null) { searcherHolder.decref(); } } return sid; }
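Both real-time get paths above acquire the searcher through a reference-counted holder and release it in finally, so the decref runs on every exit, including the SERVER_ERROR throw for an unknown update-log operation. A sketch of the pattern against a hypothetical minimal RefCounted shape:

    class RefCountedUse {
      interface RefCounted<T> { // hypothetical minimal shape of Solr's RefCounted holder
        T get();
        void decref();
      }

      static <T> T peek(RefCounted<T> holder) {
        RefCounted<T> acquired = null;
        try {
          acquired = holder; // acquired lazily, as with getRealtimeSearcher() in the real code
          return acquired.get(); // any throw from here on still releases the reference
        } finally {
          if (acquired != null) {
            acquired.decref();
          }
        }
      }
    }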
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { if (rb.stage < ResponseBuilder.STAGE_GET_FIELDS) return ResponseBuilder.STAGE_GET_FIELDS; if (rb.stage == ResponseBuilder.STAGE_GET_FIELDS) { return createSubRequests(rb); } return ResponseBuilder.STAGE_DONE; }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public int createSubRequests(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); String id1[] = params.getParams("id"); String ids[] = params.getParams("ids"); if (id1 == null && ids == null) { return ResponseBuilder.STAGE_DONE; } List<String> allIds = new ArrayList<String>(); if (id1 != null) { for (String s : id1) { allIds.add(s); } } if (ids != null) { for (String s : ids) { allIds.addAll( StrUtils.splitSmart(s, ",", true) ); } } // TODO: handle collection=...? ZkController zkController = rb.req.getCore().getCoreDescriptor().getCoreContainer().getZkController(); // if shards=... then use that if (zkController != null && params.get("shards") == null) { SchemaField sf = rb.req.getSchema().getUniqueKeyField(); CloudDescriptor cloudDescriptor = rb.req.getCore().getCoreDescriptor().getCloudDescriptor(); String collection = cloudDescriptor.getCollectionName(); CloudState cloudState = zkController.getCloudState(); Map<String, List<String>> shardToId = new HashMap<String, List<String>>(); for (String id : allIds) { BytesRef br = new BytesRef(); sf.getType().readableToIndexed(id, br); int hash = Hash.murmurhash3_x86_32(br.bytes, br.offset, br.length, 0); String shard = cloudState.getShard(hash, collection); List<String> idsForShard = shardToId.get(shard); if (idsForShard == null) { idsForShard = new ArrayList<String>(2); shardToId.put(shard, idsForShard); } idsForShard.add(id); } for (Map.Entry<String,List<String>> entry : shardToId.entrySet()) { String shard = entry.getKey(); String shardIdList = StrUtils.join(entry.getValue(), ','); ShardRequest sreq = new ShardRequest(); sreq.purpose = 1; // sreq.shards = new String[]{shard}; // TODO: would be nice if this would work... sreq.shards = sliceToShards(rb, collection, shard); sreq.actualShards = sreq.shards; sreq.params = new ModifiableSolrParams(); sreq.params.set(ShardParams.SHARDS_QT,"/get"); // TODO: how to avoid hardcoding this and hit the same handler? sreq.params.set("distrib",false); sreq.params.set("ids", shardIdList); rb.addRequest(this, sreq); } } else { String shardIdList = StrUtils.join(allIds, ','); ShardRequest sreq = new ShardRequest(); sreq.purpose = 1; sreq.shards = null; // ALL sreq.actualShards = sreq.shards; sreq.params = new ModifiableSolrParams(); sreq.params.set(ShardParams.SHARDS_QT,"/get"); // TODO: how to avoid hardcoding this and hit the same handler? sreq.params.set("distrib",false); sreq.params.set("ids", shardIdList); rb.addRequest(this, sreq); } return ResponseBuilder.STAGE_DONE; }
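createSubRequests above buckets the requested ids by target shard before fanning out, so each shard receives one /get sub-request with a comma-joined id list. A sketch of the bucketing step; shardOf is a hypothetical stand-in for the murmurhash-plus-cloud-state routing shown above:

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    class IdBucketing {
      static Map<String, List<String>> bucketByShard(List<String> allIds) {
        Map<String, List<String>> shardToId = new HashMap<String, List<String>>();
        for (String id : allIds) {
          String shard = shardOf(id); // hypothetical hash-based router
          List<String> idsForShard = shardToId.get(shard);
          if (idsForShard == null) {
            idsForShard = new ArrayList<String>(2);
            shardToId.put(shard, idsForShard);
          }
          idsForShard.add(id);
        }
        return shardToId;
      }

      private static String shardOf(String id) {
        return "shard" + ((id.hashCode() & 0x7fffffff) % 2); // placeholder for murmurhash3 + CloudState
      }
    }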
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public void processGetVersions(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } int nVersions = params.getInt("getVersions", -1); if (nVersions == -1) return; String sync = params.get("sync"); if (sync != null) { processSync(rb, nVersions, sync); return; } UpdateLog ulog = req.getCore().getUpdateHandler().getUpdateLog(); if (ulog == null) return; UpdateLog.RecentUpdates recentUpdates = ulog.getRecentUpdates(); try { rb.rsp.add("versions", recentUpdates.getVersions(nVersions)); } finally { recentUpdates.close(); // cache this somehow? } }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public void processGetUpdates(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } String versionsStr = params.get("getUpdates"); if (versionsStr == null) return; UpdateLog ulog = req.getCore().getUpdateHandler().getUpdateLog(); if (ulog == null) return; List<String> versions = StrUtils.splitSmart(versionsStr, ",", true); // TODO: get this from cache instead of rebuilding? UpdateLog.RecentUpdates recentUpdates = ulog.getRecentUpdates(); List<Object> updates = new ArrayList<Object>(versions.size()); long minVersion = Long.MAX_VALUE; try { for (String versionStr : versions) { long version = Long.parseLong(versionStr); try { Object o = recentUpdates.lookup(version); if (o == null) continue; if (version > 0) { minVersion = Math.min(minVersion, version); } // TODO: do any kind of validation here? updates.add(o); } catch (SolrException e) { log.warn("Exception reading log for updates", e); } catch (ClassCastException e) { log.warn("Exception reading log for updates", e); } } // Must return all delete-by-query commands that occur after the first add requested // since they may apply. updates.addAll( recentUpdates.getDeleteByQuery(minVersion)); rb.rsp.add("updates", updates); } finally { recentUpdates.close(); // cache this somehow? } }
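processGetUpdates above is one of the rare log-and-continue sites: a bad log entry is logged at WARN and skipped so one corrupt version cannot fail the whole batch (the original lists SolrException and ClassCastException in separate catch blocks because it predates multi-catch). A sketch of the per-item isolation, with a hypothetical lookup and an slf4j logger assumed:

    import java.util.ArrayList;
    import java.util.List;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    class UpdateBatch {
      private static final Logger log = LoggerFactory.getLogger(UpdateBatch.class);

      static List<Object> collect(List<Long> versions) {
        List<Object> updates = new ArrayList<Object>(versions.size());
        for (Long version : versions) {
          try {
            Object o = lookup(version); // hypothetical per-entry lookup
            if (o != null) {
              updates.add(o);
            }
          } catch (RuntimeException e) {
            // isolate the failure: warn and move on to the next version
            log.warn("Exception reading log for updates", e);
          }
        }
        return updates;
      }

      private static Object lookup(Long version) { return version; }
    }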
// in core/src/java/org/apache/solr/handler/component/MoreLikeThisComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { }
// in core/src/java/org/apache/solr/handler/component/MoreLikeThisComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { SolrParams p = rb.req.getParams(); if( p.getBool( MoreLikeThisParams.MLT, false ) ) { SolrIndexSearcher searcher = rb.req.getSearcher(); NamedList<DocList> sim = getMoreLikeThese( rb, searcher, rb.getResults().docList, rb.getFieldFlags() ); // TODO ???? add this directly to the response? rb.rsp.add( "moreLikeThis", sim ); } }
// in core/src/java/org/apache/solr/handler/component/MoreLikeThisComponent.java
NamedList<DocList> getMoreLikeThese( ResponseBuilder rb, SolrIndexSearcher searcher, DocList docs, int flags ) throws IOException { SolrParams p = rb.req.getParams(); IndexSchema schema = searcher.getSchema(); MoreLikeThisHandler.MoreLikeThisHelper mltHelper = new MoreLikeThisHandler.MoreLikeThisHelper( p, searcher ); NamedList<DocList> mlt = new SimpleOrderedMap<DocList>(); DocIterator iterator = docs.iterator(); SimpleOrderedMap<Object> dbg = null; if( rb.isDebug() ){ dbg = new SimpleOrderedMap<Object>(); } while( iterator.hasNext() ) { int id = iterator.nextDoc(); int rows = p.getInt( MoreLikeThisParams.DOC_COUNT, 5 ); DocListAndSet sim = mltHelper.getMoreLikeThis( id, 0, rows, null, null, flags ); String name = schema.printableUniqueKey( searcher.doc( id ) ); mlt.add(name, sim.docList); if( dbg != null ){ SimpleOrderedMap<Object> docDbg = new SimpleOrderedMap<Object>(); docDbg.add( "rawMLTQuery", mltHelper.getRawMLTQuery().toString() ); docDbg.add( "boostedMLTQuery", mltHelper.getBoostedMLTQuery().toString() ); docDbg.add( "realMLTQuery", mltHelper.getRealMLTQuery().toString() ); SimpleOrderedMap<Object> explains = new SimpleOrderedMap<Object>(); DocIterator mltIte = sim.docList.iterator(); while( mltIte.hasNext() ){ int mltid = mltIte.nextDoc(); String key = schema.printableUniqueKey( searcher.doc( mltid ) ); explains.add( key, searcher.explain( mltHelper.getRealMLTQuery(), mltid ) ); } docDbg.add( "explain", explains ); dbg.add( name, docDbg ); } } // add debug information if( dbg != null ){ rb.addDebugInfo( "moreLikeThis", dbg ); } return mlt; }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); if (!params.getBool(COMPONENT_NAME, false)) { return; } NamedList<Object> termVectors = new NamedList<Object>(); rb.rsp.add(TERM_VECTORS, termVectors); FieldOptions allFields = new FieldOptions(); //figure out what options we have, and try to get the appropriate vector allFields.termFreq = params.getBool(TermVectorParams.TF, false); allFields.positions = params.getBool(TermVectorParams.POSITIONS, false); allFields.offsets = params.getBool(TermVectorParams.OFFSETS, false); allFields.docFreq = params.getBool(TermVectorParams.DF, false); allFields.tfIdf = params.getBool(TermVectorParams.TF_IDF, false); //boolean cacheIdf = params.getBool(TermVectorParams.IDF, false); //short cut to all values. if (params.getBool(TermVectorParams.ALL, false)) { allFields.termFreq = true; allFields.positions = true; allFields.offsets = true; allFields.docFreq = true; allFields.tfIdf = true; } String fldLst = params.get(TermVectorParams.FIELDS); if (fldLst == null) { fldLst = params.get(CommonParams.FL); } //use this to validate our fields IndexSchema schema = rb.req.getSchema(); //Build up our per field mapping Map<String, FieldOptions> fieldOptions = new HashMap<String, FieldOptions>(); NamedList<List<String>> warnings = new NamedList<List<String>>(); List<String> noTV = new ArrayList<String>(); List<String> noPos = new ArrayList<String>(); List<String> noOff = new ArrayList<String>(); //we have specific fields to retrieve if (fldLst != null) { String [] fields = SolrPluginUtils.split(fldLst); for (String field : fields) { SchemaField sf = schema.getFieldOrNull(field); if (sf != null) { if (sf.storeTermVector()) { FieldOptions option = fieldOptions.get(field); if (option == null) { option = new FieldOptions(); option.fieldName = field; fieldOptions.put(field, option); } //get the per field mappings option.termFreq = params.getFieldBool(field, TermVectorParams.TF, allFields.termFreq); option.docFreq = params.getFieldBool(field, TermVectorParams.DF, allFields.docFreq); option.tfIdf = params.getFieldBool(field, TermVectorParams.TF_IDF, allFields.tfIdf); //Validate these are even an option option.positions = params.getFieldBool(field, TermVectorParams.POSITIONS, allFields.positions); if (option.positions && !sf.storeTermPositions()){ noPos.add(field); } option.offsets = params.getFieldBool(field, TermVectorParams.OFFSETS, allFields.offsets); if (option.offsets && !sf.storeTermOffsets()){ noOff.add(field); } } else {//field doesn't have term vectors noTV.add(field); } } else { //field doesn't exist throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "undefined field: " + field); } } } //else, deal with all fields boolean hasWarnings = false; if (!noTV.isEmpty()) { warnings.add("noTermVectors", noTV); hasWarnings = true; } if (!noPos.isEmpty()) { warnings.add("noPositions", noPos); hasWarnings = true; } if (!noOff.isEmpty()) { warnings.add("noOffsets", noOff); hasWarnings = true; } if (hasWarnings) { termVectors.add("warnings", warnings); } DocListAndSet listAndSet = rb.getResults(); List<Integer> docIds = getInts(params.getParams(TermVectorParams.DOC_IDS)); Iterator<Integer> iter; if (docIds != null && !docIds.isEmpty()) { iter = docIds.iterator(); } else { DocList list = listAndSet.docList; iter = list.iterator(); } SolrIndexSearcher searcher = rb.req.getSearcher(); IndexReader reader = searcher.getIndexReader(); //the TVMapper is a TermVectorMapper which can be used to optimize loading of Term Vectors SchemaField keyField = schema.getUniqueKeyField(); String uniqFieldName = null; if (keyField != null) { uniqFieldName = keyField.getName(); } //Only load the id field to get the uniqueKey of that //field final String finalUniqFieldName = uniqFieldName; final List<String> uniqValues = new ArrayList<String>(); // TODO: is this required to be single-valued? if so, we should STOP // once we find it... final StoredFieldVisitor getUniqValue = new StoredFieldVisitor() { @Override public void stringField(FieldInfo fieldInfo, String value) throws IOException { uniqValues.add(value); } @Override public void intField(FieldInfo fieldInfo, int value) throws IOException { uniqValues.add(Integer.toString(value)); } @Override public void longField(FieldInfo fieldInfo, long value) throws IOException { uniqValues.add(Long.toString(value)); } @Override public Status needsField(FieldInfo fieldInfo) throws IOException { return (fieldInfo.name.equals(finalUniqFieldName)) ? Status.YES : Status.NO; } }; TermsEnum termsEnum = null; while (iter.hasNext()) { Integer docId = iter.next(); NamedList<Object> docNL = new NamedList<Object>(); termVectors.add("doc-" + docId, docNL); if (keyField != null) { reader.document(docId, getUniqValue); String uniqVal = null; if (uniqValues.size() != 0) { uniqVal = uniqValues.get(0); uniqValues.clear(); docNL.add("uniqueKey", uniqVal); termVectors.add("uniqueKeyFieldName", uniqFieldName); } } if (!fieldOptions.isEmpty()) { for (Map.Entry<String, FieldOptions> entry : fieldOptions.entrySet()) { final String field = entry.getKey(); final Terms vector = reader.getTermVector(docId, field); if (vector != null) { termsEnum = vector.iterator(termsEnum); mapOneVector(docNL, entry.getValue(), reader, docId, vector.iterator(termsEnum), field); } } } else { // extract all fields final Fields vectors = reader.getTermVectors(docId); final FieldsEnum fieldsEnum = vectors.iterator(); String field; while((field = fieldsEnum.next()) != null) { Terms terms = fieldsEnum.terms(); if (terms != null) { termsEnum = terms.iterator(termsEnum); mapOneVector(docNL, allFields, reader, docId, termsEnum, field); } } } } }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void stringField(FieldInfo fieldInfo, String value) throws IOException { uniqValues.add(value); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void intField(FieldInfo fieldInfo, int value) throws IOException { uniqValues.add(Integer.toString(value)); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void longField(FieldInfo fieldInfo, long value) throws IOException { uniqValues.add(Long.toString(value)); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public Status needsField(FieldInfo fieldInfo) throws IOException { return (fieldInfo.name.equals(finalUniqFieldName)) ? Status.YES : Status.NO; }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
private void mapOneVector(NamedList<Object> docNL, FieldOptions fieldOptions, IndexReader reader, int docID, TermsEnum termsEnum, String field) throws IOException { NamedList<Object> fieldNL = new NamedList<Object>(); docNL.add(field, fieldNL); BytesRef text; DocsAndPositionsEnum dpEnum = null; while((text = termsEnum.next()) != null) { String term = text.utf8ToString(); NamedList<Object> termInfo = new NamedList<Object>(); fieldNL.add(term, termInfo); final int freq = (int) termsEnum.totalTermFreq(); if (fieldOptions.termFreq == true) { termInfo.add("tf", freq); } dpEnum = termsEnum.docsAndPositions(null, dpEnum, fieldOptions.offsets); boolean useOffsets = fieldOptions.offsets; if (dpEnum == null) { useOffsets = false; dpEnum = termsEnum.docsAndPositions(null, dpEnum, false); } boolean usePositions = false; if (dpEnum != null) { dpEnum.nextDoc(); usePositions = fieldOptions.positions; } NamedList<Number> theOffsets = null; if (useOffsets) { theOffsets = new NamedList<Number>(); termInfo.add("offsets", theOffsets); } NamedList<Integer> positionsNL = null; if (usePositions || theOffsets != null) { for (int i = 0; i < freq; i++) { final int pos = dpEnum.nextPosition(); if (usePositions && pos >= 0) { if (positionsNL == null) { positionsNL = new NamedList<Integer>(); termInfo.add("positions", positionsNL); } positionsNL.add("position", pos); } if (theOffsets != null) { theOffsets.add("start", dpEnum.startOffset()); theOffsets.add("end", dpEnum.endOffset()); } } } if (fieldOptions.docFreq) { termInfo.add("df", getDocFreq(reader, field, text)); } if (fieldOptions.tfIdf) { double tfIdfVal = ((double) freq) / getDocFreq(reader, field, text); termInfo.add("tf-idf", tfIdfVal); } } }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { int result = ResponseBuilder.STAGE_DONE; if (rb.stage == ResponseBuilder.STAGE_GET_FIELDS) { //Go ask each shard for its vectors // for each shard, collect the documents for that shard. HashMap<String, Collection<ShardDoc>> shardMap = new HashMap<String, Collection<ShardDoc>>(); for (ShardDoc sdoc : rb.resultIds.values()) { Collection<ShardDoc> shardDocs = shardMap.get(sdoc.shard); if (shardDocs == null) { shardDocs = new ArrayList<ShardDoc>(); shardMap.put(sdoc.shard, shardDocs); } shardDocs.add(sdoc); } // Now create a request for each shard to retrieve the stored fields for (Collection<ShardDoc> shardDocs : shardMap.values()) { ShardRequest sreq = new ShardRequest(); sreq.purpose = ShardRequest.PURPOSE_GET_FIELDS; sreq.shards = new String[]{shardDocs.iterator().next().shard}; sreq.params = new ModifiableSolrParams(); // add original params sreq.params.add(rb.req.getParams()); sreq.params.remove(CommonParams.Q);//remove the query ArrayList<String> ids = new ArrayList<String>(shardDocs.size()); for (ShardDoc shardDoc : shardDocs) { ids.add(shardDoc.id.toString()); } sreq.params.add(TermVectorParams.DOC_IDS, StrUtils.join(ids, ',')); rb.addRequest(this, sreq); } result = ResponseBuilder.STAGE_DONE; } return result; }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, KeeperException, InterruptedException { CoreContainer coreContainer = req.getCore().getCoreDescriptor().getCoreContainer(); if (coreContainer.isZooKeeperAware()) { showFromZooKeeper(req, rsp, coreContainer); } else { showFromFileSystem(req, rsp); } }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromFileSystem(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { File adminFile = null; final SolrResourceLoader loader = req.getCore().getResourceLoader(); File configdir = new File( loader.getConfigDir() ); if (!configdir.exists()) { // TODO: maybe we should just open it this way to start with? try { configdir = new File( loader.getClassLoader().getResource(loader.getConfigDir()).toURI() ); } catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); } } String fname = req.getParams().get("file", null); if( fname == null ) { adminFile = configdir; } else { fname = fname.replace( '\\', '/' ); // normalize slashes if( hiddenFiles.contains( fname.toUpperCase(Locale.ENGLISH) ) ) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access: "+fname ); } if( fname.indexOf( ".." ) >= 0 ) { throw new SolrException( ErrorCode.FORBIDDEN, "Invalid path: "+fname ); } adminFile = new File( configdir, fname ); } // Make sure the file exists, is readable and is not a hidden file if( !adminFile.exists() ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Can not find: "+adminFile.getName() + " ["+adminFile.getAbsolutePath()+"]" ); } if( !adminFile.canRead() || adminFile.isHidden() ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Can not show: "+adminFile.getName() + " ["+adminFile.getAbsolutePath()+"]" ); } // Show a directory listing if( adminFile.isDirectory() ) { int basePath = configdir.getAbsolutePath().length() + 1; NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>(); for( File f : adminFile.listFiles() ) { String path = f.getAbsolutePath().substring( basePath ); path = path.replace( '\\', '/' ); // normalize slashes if( hiddenFiles.contains( path.toUpperCase(Locale.ENGLISH) ) ) { continue; // don't show 'hidden' files } if( f.isHidden() || f.getName().startsWith( "." ) ) { continue; // skip hidden system files... } SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>(); files.add( path, fileInfo ); if( f.isDirectory() ) { fileInfo.add( "directory", true ); } else { // TODO? content type fileInfo.add( "size", f.length() ); } fileInfo.add( "modified", new Date( f.lastModified() ) ); } rsp.add( "files", files ); } else { // Include the file contents //The file logic depends on RawResponseWriter, so force its use. ModifiableSolrParams params = new ModifiableSolrParams( req.getParams() ); params.set( CommonParams.WT, "raw" ); req.setParams(params); ContentStreamBase content = new ContentStreamBase.FileStream( adminFile ); content.setContentType( req.getParams().get( USE_CONTENT_TYPE ) ); rsp.add(RawResponseWriter.CONTENT, content); } rsp.setHttpCaching(false); }
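showFromFileSystem above layers its checks so that access-control violations surface as 403 and missing or unreadable files as 400, each thrown before any file content is touched. A condensed sketch of the validation ladder (hiddenFiles is assumed to be an upper-cased set, as in the handler):

    import java.io.File;
    import java.util.Locale;
    import java.util.Set;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    class AdminFileGuard {
      static File resolve(File configdir, String fname, Set<String> hiddenFiles) {
        fname = fname.replace('\\', '/'); // normalize slashes
        if (hiddenFiles.contains(fname.toUpperCase(Locale.ENGLISH))) {
          throw new SolrException(ErrorCode.FORBIDDEN, "Can not access: " + fname);
        }
        if (fname.indexOf("..") >= 0) { // block path traversal out of the config dir
          throw new SolrException(ErrorCode.FORBIDDEN, "Invalid path: " + fname);
        }
        File f = new File(configdir, fname);
        if (!f.exists()) {
          throw new SolrException(ErrorCode.BAD_REQUEST, "Can not find: " + f.getName());
        }
        if (!f.canRead() || f.isHidden()) {
          throw new SolrException(ErrorCode.BAD_REQUEST, "Can not show: " + f.getName());
        }
        return f;
      }
    }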
// in core/src/java/org/apache/solr/handler/admin/PropertiesRequestHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { Object props = null; String name = req.getParams().get( "name" ); if( name != null ) { NamedList<String> p = new SimpleOrderedMap<String>(); p.add( name, System.getProperty(name) ); props = p; } else { props = System.getProperties(); } rsp.add( "system.properties", props ); rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
private static SimpleOrderedMap<Object> getDocumentFieldsInfo( Document doc, int docId, IndexReader reader, IndexSchema schema ) throws IOException { final CharsRef spare = new CharsRef(); SimpleOrderedMap<Object> finfo = new SimpleOrderedMap<Object>(); for( Object o : doc.getFields() ) { Field field = (Field)o; SimpleOrderedMap<Object> f = new SimpleOrderedMap<Object>(); SchemaField sfield = schema.getFieldOrNull( field.name() ); FieldType ftype = (sfield==null)?null:sfield.getType(); f.add( "type", (ftype==null)?null:ftype.getTypeName() ); f.add( "schema", getFieldFlags( sfield ) ); f.add( "flags", getFieldFlags( field ) ); Term t = new Term(field.name(), ftype!=null ? ftype.storedToIndexed(field) : field.stringValue()); f.add( "value", (ftype==null)?null:ftype.toExternal( field ) ); // TODO: this really should be "stored" f.add( "internal", field.stringValue() ); // may be a binary number BytesRef bytes = field.binaryValue(); if (bytes != null) { f.add( "binary", Base64.byteArrayToBase64(bytes.bytes, bytes.offset, bytes.length)); } f.add( "boost", field.boost() ); f.add( "docFreq", t.text()==null ? 0 : reader.docFreq( t ) ); // this can be 0 for non-indexed fields // If we have a term vector, return that if( field.fieldType().storeTermVectors() ) { try { Terms v = reader.getTermVector( docId, field.name() ); if( v != null ) { SimpleOrderedMap<Integer> tfv = new SimpleOrderedMap<Integer>(); final TermsEnum termsEnum = v.iterator(null); BytesRef text; while((text = termsEnum.next()) != null) { final int freq = (int) termsEnum.totalTermFreq(); UnicodeUtil.UTF8toUTF16(text, spare); tfv.add(spare.toString(), freq); } f.add( "termVector", tfv ); } } catch( Exception ex ) { log.warn( "error writing term vector", ex ); } } finfo.add( field.name(), f ); } return finfo; }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
private static Document getFirstLiveDoc(AtomicReader reader, String fieldName, Terms terms) throws IOException { DocsEnum docsEnum = null; TermsEnum termsEnum = terms.iterator(null); BytesRef text; // Deal with the chance that the first bunch of terms are in deleted documents. Is there a better way? for (int idx = 0; idx < 1000 && docsEnum == null; ++idx) { text = termsEnum.next(); if (text == null) { // Ran off the end of the terms enum without finding any live docs with that field in them. return null; } Term term = new Term(fieldName, text); docsEnum = reader.termDocsEnum(reader.getLiveDocs(), term.field(), new BytesRef(term.text()), false); if (docsEnum != null) { int docId; if ((docId = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { return reader.document(docId); } } } return null; }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
public static SimpleOrderedMap<Object> getIndexInfo(DirectoryReader reader, boolean detail) throws IOException { return getIndexInfo(reader); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
public static SimpleOrderedMap<Object> getIndexInfo(DirectoryReader reader) throws IOException { Directory dir = reader.directory(); SimpleOrderedMap<Object> indexInfo = new SimpleOrderedMap<Object>(); indexInfo.add("numDocs", reader.numDocs()); indexInfo.add("maxDoc", reader.maxDoc()); indexInfo.add("version", reader.getVersion()); // TODO? Is this different than: IndexReader.getCurrentVersion( dir )? indexInfo.add("segmentCount", reader.getSequentialSubReaders().length); indexInfo.add("current", reader.isCurrent() ); indexInfo.add("hasDeletions", reader.hasDeletions() ); indexInfo.add("directory", dir ); indexInfo.add("userData", reader.getIndexCommit().getUserData()); String s = reader.getIndexCommit().getUserData().get(SolrIndexWriter.COMMIT_TIME_MSEC_KEY); if (s != null) { indexInfo.add("lastModified", new Date(Long.parseLong(s))); } return indexInfo; }
// in core/src/java/org/apache/solr/handler/admin/ThreadDumpHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { SimpleOrderedMap<Object> system = new SimpleOrderedMap<Object>(); rsp.add( "system", system ); ThreadMXBean tmbean = ManagementFactory.getThreadMXBean(); // Thread Count SimpleOrderedMap<Object> nl = new SimpleOrderedMap<Object>(); nl.add( "current",tmbean.getThreadCount() ); nl.add( "peak", tmbean.getPeakThreadCount() ); nl.add( "daemon", tmbean.getDaemonThreadCount() ); system.add( "threadCount", nl ); // Deadlocks ThreadInfo[] tinfos; long[] tids = tmbean.findMonitorDeadlockedThreads(); if (tids != null) { tinfos = tmbean.getThreadInfo(tids, Integer.MAX_VALUE); NamedList<SimpleOrderedMap<Object>> lst = new NamedList<SimpleOrderedMap<Object>>(); for (ThreadInfo ti : tinfos) { if (ti != null) { lst.add( "thread", getThreadInfo( ti, tmbean ) ); } } system.add( "deadlocks", lst ); } // Now show all the threads.... tids = tmbean.getAllThreadIds(); tinfos = tmbean.getThreadInfo(tids, Integer.MAX_VALUE); NamedList<SimpleOrderedMap<Object>> lst = new NamedList<SimpleOrderedMap<Object>>(); for (ThreadInfo ti : tinfos) { if (ti != null) { lst.add( "thread", getThreadInfo( ti, tmbean ) ); } } system.add( "threadDump", lst ); rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/ThreadDumpHandler.java
private static SimpleOrderedMap<Object> getThreadInfo( ThreadInfo ti, ThreadMXBean tmbean ) throws IOException { SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>(); long tid = ti.getThreadId(); info.add( "id", tid ); info.add( "name", ti.getThreadName() ); info.add( "state", ti.getThreadState().toString() ); if (ti.getLockName() != null) { info.add( "lock", ti.getLockName() ); } if (ti.isSuspended()) { info.add( "suspended", true ); } if (ti.isInNative()) { info.add( "native", true ); } if (tmbean.isThreadCpuTimeSupported()) { info.add( "cpuTime", formatNanos(tmbean.getThreadCpuTime(tid)) ); info.add( "userTime", formatNanos(tmbean.getThreadUserTime(tid)) ); } if (ti.getLockOwnerName() != null) { SimpleOrderedMap<Object> owner = new SimpleOrderedMap<Object>(); owner.add( "name", ti.getLockOwnerName() ); owner.add( "id", ti.getLockOwnerId() ); } // Add the stack trace int i=0; String[] trace = new String[ti.getStackTrace().length]; for( StackTraceElement ste : ti.getStackTrace()) { trace[i++] = ste.toString(); } info.add( "stackTrace", trace ); return info; }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleMergeAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { SolrParams params = req.getParams(); String cname = params.required().get(CoreAdminParams.CORE); SolrCore core = coreContainer.getCore(cname); SolrQueryRequest wrappedReq = null; SolrCore[] sourceCores = null; RefCounted<SolrIndexSearcher>[] searchers = null; // stores readers created from indexDir param values DirectoryReader[] readersToBeClosed = null; Directory[] dirsToBeReleased = null; if (core != null) { try { String[] dirNames = params.getParams(CoreAdminParams.INDEX_DIR); if (dirNames == null || dirNames.length == 0) { String[] sources = params.getParams("srcCore"); if (sources == null || sources.length == 0) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "At least one indexDir or srcCore must be specified"); sourceCores = new SolrCore[sources.length]; for (int i = 0; i < sources.length; i++) { String source = sources[i]; SolrCore srcCore = coreContainer.getCore(source); if (srcCore == null) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core: " + source + " does not exist"); sourceCores[i] = srcCore; } } else { readersToBeClosed = new DirectoryReader[dirNames.length]; dirsToBeReleased = new Directory[dirNames.length]; DirectoryFactory dirFactory = core.getDirectoryFactory(); for (int i = 0; i < dirNames.length; i++) { Directory dir = dirFactory.get(dirNames[i], core.getSolrConfig().indexConfig.lockType); dirsToBeReleased[i] = dir; // TODO: why doesn't this use the IR factory? what is going on here? readersToBeClosed[i] = DirectoryReader.open(dir); } } DirectoryReader[] readers = null; if (readersToBeClosed != null) { readers = readersToBeClosed; } else { readers = new DirectoryReader[sourceCores.length]; searchers = new RefCounted[sourceCores.length]; for (int i = 0; i < sourceCores.length; i++) { SolrCore solrCore = sourceCores[i]; // record the searchers so that we can decref searchers[i] = solrCore.getSearcher(); readers[i] = searchers[i].get().getIndexReader(); } } UpdateRequestProcessorChain processorChain = core.getUpdateProcessingChain(params.get(UpdateParams.UPDATE_CHAIN)); wrappedReq = new LocalSolrQueryRequest(core, req.getParams()); UpdateRequestProcessor processor = processorChain.createProcessor(wrappedReq, rsp); processor.processMergeIndexes(new MergeIndexesCommand(readers, req)); } finally { if (searchers != null) { for (RefCounted<SolrIndexSearcher> searcher : searchers) { if (searcher != null) searcher.decref(); } } if (sourceCores != null) { for (SolrCore solrCore : sourceCores) { if (solrCore != null) solrCore.close(); } } if (readersToBeClosed != null) IOUtils.closeWhileHandlingException(readersToBeClosed); if (dirsToBeReleased != null) { for (Directory dir : dirsToBeReleased) { DirectoryFactory dirFactory = core.getDirectoryFactory(); dirFactory.release(dir); } } if (wrappedReq != null) wrappedReq.close(); core.close(); } } return coreContainer.isPersistent(); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleRequestRecoveryAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { final SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); if (cname == null) { cname = ""; } SolrCore core = null; try { core = coreContainer.getCore(cname); if (core != null) { core.getUpdateHandler().getSolrCoreState().doRecovery(coreContainer, cname); } else { SolrException.log(log, "Could not find core to call recovery:" + cname); } } finally { // no recoveryStrat close for now if (core != null) { core.close(); } } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleWaitForStateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException { final SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); if (cname == null) { cname = ""; } String nodeName = params.get("nodeName"); String coreNodeName = params.get("coreNodeName"); String waitForState = params.get("state"); Boolean checkLive = params.getBool("checkLive"); int pauseFor = params.getInt("pauseFor", 0); String state = null; boolean live = false; int retry = 0; while (true) { SolrCore core = null; try { core = coreContainer.getCore(cname); if (core == null && retry == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "core not found:" + cname); } if (core != null) { // wait until we are sure the recovering node is ready // to accept updates CloudDescriptor cloudDescriptor = core.getCoreDescriptor() .getCloudDescriptor(); CloudState cloudState = coreContainer.getZkController() .getCloudState(); String collection = cloudDescriptor.getCollectionName(); Slice slice = cloudState.getSlice(collection, cloudDescriptor.getShardId()); if (slice != null) { ZkNodeProps nodeProps = slice.getShards().get(coreNodeName); if (nodeProps != null) { state = nodeProps.get(ZkStateReader.STATE_PROP); live = cloudState.liveNodesContain(nodeName); if (nodeProps != null && state.equals(waitForState)) { if (checkLive == null) { break; } else if (checkLive && live) { break; } else if (!checkLive && !live) { break; } } } } } if (retry++ == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "I was asked to wait on state " + waitForState + " for " + nodeName + " but I still do not see the request state. I see state: " + state + " live:" + live); } } finally { if (core != null) { core.close(); } } Thread.sleep(1000); } // small safety net for any updates that started with state that // kept it from sending the update to be buffered - // pause for a while to let any outstanding updates finish // System.out.println("I saw state:" + state + " sleep for " + pauseFor + // " live:" + live); Thread.sleep(pauseFor); // solrcloud_debug // try {; // LocalSolrQueryRequest r = new LocalSolrQueryRequest(core, new // ModifiableSolrParams()); // CommitUpdateCommand commitCmd = new CommitUpdateCommand(r, false); // commitCmd.softCommit = true; // core.getUpdateHandler().commit(commitCmd); // RefCounted<SolrIndexSearcher> searchHolder = // core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() // + " to replicate " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " gen:" + // core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + // core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleDistribUrlAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException, SolrServerException { // TODO: finish this and tests SolrParams params = req.getParams(); final ModifiableSolrParams newParams = new ModifiableSolrParams(params); newParams.remove("action"); SolrParams required = params.required(); final String subAction = required.get("subAction"); String collection = required.get("collection"); newParams.set(CoreAdminParams.ACTION, subAction); SolrCore core = req.getCore(); ZkController zkController = core.getCoreDescriptor().getCoreContainer() .getZkController(); CloudState cloudState = zkController.getCloudState(); Map<String,Slice> slices = cloudState.getCollectionStates().get(collection); for (Map.Entry<String,Slice> entry : slices.entrySet()) { Slice slice = entry.getValue(); Map<String,ZkNodeProps> shards = slice.getShards(); Set<Map.Entry<String,ZkNodeProps>> shardEntries = shards.entrySet(); for (Map.Entry<String,ZkNodeProps> shardEntry : shardEntries) { final ZkNodeProps node = shardEntry.getValue(); if (cloudState.liveNodesContain(node.get(ZkStateReader.NODE_NAME_PROP))) { newParams.set(CoreAdminParams.CORE, node.get(ZkStateReader.CORE_NAME_PROP)); String replica = node.get(ZkStateReader.BASE_URL_PROP); ShardRequest sreq = new ShardRequest(); newParams.set("qt", "/admin/cores"); sreq.purpose = 1; // TODO: this sucks if (replica.startsWith("http://")) replica = replica.substring(7); sreq.shards = new String[]{replica}; sreq.actualShards = sreq.shards; sreq.params = newParams; shardHandler.submit(sreq, replica, sreq.params); } } } ShardResponse srsp; do { srsp = shardHandler.takeCompletedOrError(); if (srsp != null) { Throwable e = srsp.getException(); if (e != null) { log.error("Error talking to shard: " + srsp.getShard(), e); } } } while(srsp != null); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected NamedList<Object> getCoreStatus(CoreContainer cores, String cname) throws IOException { NamedList<Object> info = new SimpleOrderedMap<Object>(); SolrCore core = cores.getCore(cname); if (core != null) { try { info.add("name", core.getName()); info.add("isDefaultCore", core.getName().equals(cores.getDefaultCoreName())); info.add("instanceDir", normalizePath(core.getResourceLoader().getInstanceDir())); info.add("dataDir", normalizePath(core.getDataDir())); info.add("config", core.getConfigResource()); info.add("schema", core.getSchemaResource()); info.add("startTime", new Date(core.getStartTime())); info.add("uptime", System.currentTimeMillis() - core.getStartTime()); RefCounted<SolrIndexSearcher> searcher = core.getSearcher(); try { SimpleOrderedMap<Object> indexInfo = LukeRequestHandler.getIndexInfo(searcher.get().getIndexReader()); long size = getIndexSize(core); indexInfo.add("sizeInBytes", size); indexInfo.add("size", NumberUtils.readableSize(size)); info.add("index", indexInfo); } finally { searcher.decref(); } } finally { core.close(); } } return info; }
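Note the nested try/finally blocks above: the inner finally guarantees searcher.decref() runs even if reading index statistics fails, and the outer one guarantees core.close() runs even if the decref itself throws. The discipline generalizes to any pair of ref-counted resources; a sketch with hypothetical acquire/release names:

    // Each resource gets its own finally, so a failure while using or
    // releasing the inner resource can never leak the outer one.
    Resource outer = acquireOuter();
    try {
      Resource inner = acquireInner(outer);
      try {
        use(outer, inner);
      } finally {
        inner.release();
      }
    } finally {
      outer.release();
    }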
// in core/src/java/org/apache/solr/handler/SnapShooter.java
public void copyFiles(Collection<String> files, File destDir) throws IOException { for (String indexFile : files) { File source = new File(solrCore.getIndexDir(), indexFile); copyFile(source, new File(destDir, source.getName()), true); } }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
public void copyFile(File source, File destination, boolean preserveFileDate) throws IOException { // check source exists if (!source.exists()) { String message = "File " + source + " does not exist"; throw new FileNotFoundException(message); } // does the destination's directory exist? if (destination.getParentFile() != null && !destination.getParentFile().exists()) { destination.getParentFile().mkdirs(); } // make sure we can write to destination if (destination.exists() && !destination.canWrite()) { String message = "Unable to open file " + destination + " for writing."; throw new IOException(message); } FileInputStream input = null; FileOutputStream output = null; try { input = new FileInputStream(source); output = new FileOutputStream(destination); int count = 0; int n = 0; int rcnt = 0; while (-1 != (n = input.read(buffer))) { output.write(buffer, 0, n); count += n; rcnt++; /*** // reserve every 4.6875 MB if (rcnt == 150) { rcnt = 0; delPolicy.setReserveDuration(indexCommit.getVersion(), reserveTime); } ***/ } } finally { try { IOUtils.closeQuietly(input); } finally { IOUtils.closeQuietly(output); } } if (source.length() != destination.length()) { String message = "Failed to copy full contents from " + source + " to " + destination; throw new IOException(message); } if (preserveFileDate) { // file copy should preserve file date destination.setLastModified(source.lastModified()); } }
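The IOUtils.closeQuietly calls in the finally block are the sanctioned cousin of the empty catch blocks counted in the statistics: a failure while closing is deliberately discarded so that it cannot mask an exception already propagating from the copy loop. Roughly (a sketch of the idiom, not the library's exact implementation, which may also log the failure):

    public static void closeQuietly(java.io.Closeable c) {
      try {
        if (c != null) c.close();
      } catch (java.io.IOException ignored) {
        // intentionally empty: the interesting exception, if any,
        // is the one already in flight from the try block
      }
    }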
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadSolrSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); SolrSynonymParser parser = new SolrSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadWordnetSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); WordnetSynonymParser parser = new WordnetSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
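In both loaders the UTF-8 decoder is configured with CodingErrorAction.REPORT, so a malformed byte sequence in a synonyms file surfaces as a CharacterCodingException (a subclass of IOException) from the reader instead of being silently replaced with U+FFFD. A standalone demonstration of the difference:

    import java.io.*;
    import java.nio.charset.*;

    byte[] bad = { (byte) 0xC3 };  // a truncated UTF-8 sequence
    CharsetDecoder strict = Charset.forName("UTF-8").newDecoder()
        .onMalformedInput(CodingErrorAction.REPORT)
        .onUnmappableCharacter(CodingErrorAction.REPORT);
    Reader r = new InputStreamReader(new ByteArrayInputStream(bad), strict);
    r.read();  // throws MalformedInputException rather than returning U+FFFD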
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void reset(Reader input) throws IOException { try { super.reset(input); input = super.input; char[] buf = new char[32]; int len = input.read(buf); this.startOfs = correctOffset(0); this.endOfs = correctOffset(len); String v = new String(buf, 0, len); try { switch (type) { case INTEGER: ts.setIntValue(Integer.parseInt(v)); break; case FLOAT: ts.setFloatValue(Float.parseFloat(v)); break; case LONG: ts.setLongValue(Long.parseLong(v)); break; case DOUBLE: ts.setDoubleValue(Double.parseDouble(v)); break; case DATE: ts.setLongValue(dateField.parseMath(null, v).getTime()); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field"); } } catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); } }
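This reset method is a compact example of the catch-with-throw pattern behind P1: low-level exceptions are caught and re-thrown as SolrException, with the ErrorCode assigning blame. Malformed client input becomes BAD_REQUEST, while an internal I/O failure becomes SERVER_ERROR with the cause preserved. Reduced to its skeleton (readToken and in are hypothetical stand-ins):

    String raw = null;
    try {
      raw = readToken(in);               // may throw IOException
      int value = Integer.parseInt(raw); // may throw NumberFormatException
      // ... use value ...
    } catch (NumberFormatException nfe) {
      // the client sent garbage: report a 400
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + raw);
    } catch (IOException e) {
      // our own plumbing failed: report a 500, keeping the cause
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to read token", e);
    }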
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void close() throws IOException { super.close(); ts.close(); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void reset() throws IOException { super.reset(); ts.reset(); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public boolean incrementToken() throws IOException { if (ts.incrementToken()) { ofsAtt.setOffset(startOfs, endOfs); return true; } return false; }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void end() throws IOException { ts.end(); ofsAtt.setOffset(endOfs, endOfs); }
// in core/src/java/org/apache/solr/analysis/ReversedWildcardFilter.java
@Override public boolean incrementToken() throws IOException { if( save != null ) { // clearAttributes(); // not currently necessary restoreState(save); save = null; return true; } if (!input.incrementToken()) return false; // pass through zero-length terms int oldLen = termAtt.length(); if (oldLen ==0) return true; int origOffset = posAtt.getPositionIncrement(); if (withOriginal == true){ posAtt.setPositionIncrement(0); save = captureState(); } char [] buffer = termAtt.resizeBuffer(oldLen + 1); buffer[oldLen] = markerChar; reverse(buffer, 0, oldLen + 1); posAtt.setPositionIncrement(origOffset); termAtt.copyBuffer(buffer, 0, oldLen +1); return true; }
// in core/src/java/org/apache/solr/analysis/TokenizerChain.java
@Override protected void reset(Reader reader) throws IOException { // the tokenizers are currently reset by the indexing process, so only // the tokenizer needs to be reset. Reader r = initReader(reader); super.reset(r); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
public static void main(String[] args) throws IOException { Reader in = new LegacyHTMLStripCharFilter( CharReader.get(new InputStreamReader(System.in))); int ch; while ( (ch=in.read()) != -1 ) System.out.print((char)ch); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int next() throws IOException { int len = pushed.length(); if (len>0) { int ch = pushed.charAt(len-1); pushed.setLength(len-1); return ch; } numRead++; return input.read(); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int nextSkipWS() throws IOException { int ch=next(); while(isSpace(ch)) ch=next(); return ch; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int peek() throws IOException { int len = pushed.length(); if (len>0) { return pushed.charAt(len-1); } numRead++; int ch = input.read(); push(ch); return ch; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private void saveState() throws IOException { lastMark = numRead; input.mark(readAheadLimit); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private void restoreState() throws IOException { input.reset(); pushed.setLength(0); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readNumericEntity() throws IOException { // "&#" has already been read at this point int eaten = 2; // is this decimal, hex, or nothing at all. int ch = next(); int base=10; boolean invalid=false; sb.setLength(0); if (isDigit(ch)) { // decimal character entity sb.append((char)ch); for (int i=0; i<10; i++) { ch = next(); if (isDigit(ch)) { sb.append((char)ch); } else { break; } } } else if (ch=='x') { eaten++; // hex character entity base=16; sb.setLength(0); for (int i=0; i<10; i++) { ch = next(); if (isHex(ch)) { sb.append((char)ch); } else { break; } } } else { return MISMATCH; } // In older HTML, an entity may not have always been terminated // with a semicolon. We'll also treat EOF or whitespace as terminating // the entity. try { if (ch==';' || ch==-1) { // do not account for the eaten ";" due to the fact that we do output a char numWhitespace = sb.length() + eaten; return Integer.parseInt(sb.toString(), base); } // if whitespace terminated the entity, we need to return // that whitespace on the next call to read(). if (isSpace(ch)) { push(ch); numWhitespace = sb.length() + eaten; return Integer.parseInt(sb.toString(), base); } } catch (NumberFormatException e) { return MISMATCH; } // Not an entity... return MISMATCH; }
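Unlike the wrap-and-rethrow style seen in TrieTokenizerFactory above, here NumberFormatException is caught and converted into the MISMATCH sentinel rather than re-thrown: inside a lenient HTML scanner a malformed entity is an expected outcome, not an error, and the text simply passes through unchanged. The shape of the conversion (a sketch; digits and base are hypothetical parameters):

    int parseEntityValue(String digits, int base) {
      try {
        return Integer.parseInt(digits, base);
      } catch (NumberFormatException e) {
        return MISMATCH;  // the scanner's ordinary 'no match' result
      }
    }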
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readEntity() throws IOException { int ch = next(); if (ch=='#') return readNumericEntity(); //read an entity reference // for an entity reference, require the ';' for safety. // otherwise we may try and convert part of some company // names to an entity. "Alpha&Beta Corp" for instance. // // TODO: perhaps I should special case some of the // more common ones like &amp to make the ';' optional... sb.setLength(0); sb.append((char)ch); for (int i=0; i< safeReadAheadLimit; i++) { ch=next(); if (Character.isLetter(ch)) { sb.append((char)ch); } else { break; } } if (ch==';') { String entity=sb.toString(); Character entityChar = entityTable.get(entity); if (entityChar!=null) { numWhitespace = entity.length() + 1 ; return entityChar.charValue(); } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readBang(boolean inScript) throws IOException { // at this point, "<!" has been read int ret = readComment(inScript); if (ret==MATCH) return MATCH; if ((numRead - lastMark) < safeReadAheadLimit || peek() == '>' ) { int ch = next(); if (ch=='>') return MATCH; // if it starts with <! and isn't a comment, // simply read until ">" //since we did readComment already, it may be the case that we are already deep into the read ahead buffer //so, we may need to abort sooner while ((numRead - lastMark) < safeReadAheadLimit) { ch = next(); if (ch=='>') { return MATCH; } else if (ch<0) { return MISMATCH; } } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readComment(boolean inScript) throws IOException { // at this point "<!" has been read int ch = next(); if (ch!='-') { // not a comment push(ch); return MISMATCH; } ch = next(); if (ch!='-') { // not a comment push(ch); push('-'); return MISMATCH; } /*two extra calls to next() here, so make sure we don't read past our mark*/ while ((numRead - lastMark) < safeReadAheadLimit -3 ) { ch = next(); if (ch<0) return MISMATCH; if (ch=='-') { ch = next(); if (ch<0) return MISMATCH; if (ch!='-') { push(ch); continue; } ch = next(); if (ch<0) return MISMATCH; if (ch!='>') { push(ch); push('-'); continue; } return MATCH; } else if ((ch=='\'' || ch=='"') && inScript) { push(ch); int ret=readScriptString(); // if this wasn't a string, there's not much we can do // at this point without having a stack of stream states in // order to "undo" just the latest. } else if (ch=='<') { eatSSI(); } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readTag() throws IOException { // at this point '<' has already been read int ch = next(); if (!isAlpha(ch)) { push(ch); return MISMATCH; } sb.setLength(0); sb.append((char)ch); while((numRead - lastMark) < safeReadAheadLimit) { ch = next(); if (isIdChar(ch)) { sb.append((char)ch); } else if (ch=='/') { // Hmmm, a tag can close with "/>" as well as "/ >" // read end tag '/>' or '/ >', etc return nextSkipWS()=='>' ? MATCH : MISMATCH; } else { break; } } if (escapedTags!=null && escapedTags.contains(sb.toString())){ //if this is a reservedTag, then keep it return MISMATCH; } // After the tag id, there needs to be either whitespace or // '>' if ( !(ch=='>' || isSpace(ch)) ) { return MISMATCH; } if (ch!='>') { // process attributes while ((numRead - lastMark) < safeReadAheadLimit) { ch=next(); if (isSpace(ch)) { continue; } else if (isFirstIdChar(ch)) { push(ch); int ret = readAttr2(); if (ret==MISMATCH) return ret; } else if (ch=='/') { // read end tag '/>' or '/ >', etc return nextSkipWS()=='>' ? MATCH : MISMATCH; } else if (ch=='>') { break; } else { return MISMATCH; } } if ((numRead - lastMark) >= safeReadAheadLimit){ return MISMATCH;//exit out if we exceeded the buffer } } // We only get to this point after we have read the // entire tag. Now let's see if it's a special tag. String name=sb.toString(); if (name.equalsIgnoreCase("script") || name.equalsIgnoreCase("style")) { // The content of script and style elements is // CDATA in HTML 4 but PCDATA in XHTML. /* From HTML4: Although the STYLE and SCRIPT elements use CDATA for their data model, for these elements, CDATA must be handled differently by user agents. Markup and entities must be treated as raw text and passed to the application as is. The first occurrence of the character sequence "</" (end-tag open delimiter) is treated as terminating the end of the element's content. In valid documents, this would be the end tag for the element. */ // discard everything until endtag is hit (except // if it occurs in a comment. // reset the stream mark to here, since we know that we successfully matched // a tag, and if we can't find the end tag, this is where we will want // to roll back to. saveState(); pushed.setLength(0); return findEndTag(); } return MATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
int findEndTag() throws IOException { while ((numRead - lastMark) < safeReadAheadLimit) { int ch = next(); if (ch=='<') { ch = next(); // skip looking for end-tag in comments if (ch=='!') { int ret = readBang(true); if (ret==MATCH) continue; // yikes... what now? It wasn't a comment, but I can't get // back to the state I was at. Just continue from where I // am I guess... continue; } // did we match "</" if (ch!='/') { push(ch); continue; } int ret = readName(false); if (ret==MISMATCH) return MISMATCH; ch=nextSkipWS(); if (ch!='>') return MISMATCH; return MATCH; } else if (ch=='\'' || ch=='"') { // read javascript string to avoid a false match. push(ch); int ret = readScriptString(); // what to do about a non-match (non-terminated string?) // play it safe and index the rest of the data I guess... if (ret==MISMATCH) return MISMATCH; } else if (ch<0) { return MISMATCH; } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readScriptString() throws IOException { int quoteChar = next(); if (quoteChar!='\'' && quoteChar!='"') return MISMATCH; while((numRead - lastMark) < safeReadAheadLimit) { int ch = next(); if (ch==quoteChar) return MATCH; else if (ch=='\\') { ch=next(); } else if (ch<0) { return MISMATCH; } else if (ch=='<') { eatSSI(); } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readName(boolean checkEscaped) throws IOException { StringBuilder builder = (checkEscaped && escapedTags!=null) ? new StringBuilder() : null; int ch = next(); if (builder!=null) builder.append((char)ch); if (!isFirstIdChar(ch)) return MISMATCH; ch = next(); if (builder!=null) builder.append((char)ch); while(isIdChar(ch)) { ch=next(); if (builder!=null) builder.append((char)ch); } if (ch!=-1) { push(ch); } //strip off the trailing > if (builder!=null && escapedTags.contains(builder.substring(0, builder.length() - 1))){ return MISMATCH; } return MATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readAttr2() throws IOException { if ((numRead - lastMark < safeReadAheadLimit)) { int ch = next(); if (!isFirstIdChar(ch)) return MISMATCH; ch = next(); while(isIdChar(ch) && ((numRead - lastMark) < safeReadAheadLimit)){ ch=next(); } if (isSpace(ch)) ch = nextSkipWS(); // attributes may not have a value at all! // if (ch != '=') return MISMATCH; if (ch != '=') { push(ch); return MATCH; } int quoteChar = nextSkipWS(); if (quoteChar=='"' || quoteChar=='\'') { while ((numRead - lastMark) < safeReadAheadLimit) { ch = next(); if (ch<0) return MISMATCH; else if (ch=='<') { eatSSI(); } else if (ch==quoteChar) { return MATCH; //} else if (ch=='<') { // return MISMATCH; } } } else { // unquoted attribute while ((numRead - lastMark) < safeReadAheadLimit) { ch = next(); if (ch<0) return MISMATCH; else if (isSpace(ch)) { push(ch); return MATCH; } else if (ch=='>') { push(ch); return MATCH; } else if (ch=='<') { eatSSI(); } } } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int eatSSI() throws IOException { // at this point, only a "<" was read. // on a mismatch, push back the last char so that if it was // a quote that closes the attribute, it will be re-read and matched. int ch = next(); if (ch!='!') { push(ch); return MISMATCH; } ch=next(); if (ch!='-') { push(ch); return MISMATCH; } ch=next(); if (ch!='-') { push(ch); return MISMATCH; } ch=next(); if (ch!='#') { push(ch); return MISMATCH; } push('#'); push('-'); push('-'); return readComment(false); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readProcessingInstruction() throws IOException { // "<?" has already been read while ((numRead - lastMark) < safeReadAheadLimit) { int ch = next(); if (ch=='?' && peek()=='>') { next(); return MATCH; } else if (ch==-1) { return MISMATCH; } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
@Override public int read() throws IOException { // TODO: Do we ever want to preserve CDATA sections? // where do we have to worry about them? // <![ CDATA [ unescaped markup ]]> if (numWhitespace > 0){ numEaten += numWhitespace; addOffCorrectMap(numReturned, numEaten); numWhitespace = 0; } numReturned++; //do not limit this one by the READAHEAD while(true) { int lastNumRead = numRead; int ch = next(); switch (ch) { case '&': saveState(); ch = readEntity(); if (ch>=0) return ch; if (ch==MISMATCH) { restoreState(); return '&'; } break; case '<': saveState(); ch = next(); int ret = MISMATCH; if (ch=='!') { ret = readBang(false); } else if (ch=='/') { ret = readName(true); if (ret==MATCH) { ch=nextSkipWS(); ret= ch=='>' ? MATCH : MISMATCH; } } else if (isAlpha(ch)) { push(ch); ret = readTag(); } else if (ch=='?') { ret = readProcessingInstruction(); } // matched something to be discarded, so break // from this case and continue in the loop if (ret==MATCH) { //break;//was //return whitespace from numWhitespace = (numRead - lastNumRead) - 1;//tack on the -1 since we are returning a space right now return ' '; } // didn't match any HTML constructs, so roll back // the stream state and just return '<' restoreState(); return '<'; default: return ch; } } }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
@Override public int read(char cbuf[], int off, int len) throws IOException { int i=0; for (i=0; i<len; i++) { int ch = read(); if (ch==-1) break; cbuf[off++] = (char)ch; } if (i==0) { if (len==0) return 0; return -1; } return i; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
@Override public void close() throws IOException { input.close(); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { String path = request.getPath(); if( path == null || !path.startsWith( "/" ) ) { path = "/select"; } // Check for cores action SolrCore core = coreContainer.getCore( coreName ); if( core == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName ); } SolrParams params = request.getParams(); if( params == null ) { params = new ModifiableSolrParams(); } // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } // Perhaps the path is to manage the cores if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) { handler = coreContainer.getMultiCoreHandler(); } } if( handler == null ) { core.close(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } SolrQueryRequest req = null; try { req = _parser.buildRequestFrom( core, params, request.getContentStreams() ); req.getContext().put( "path", path ); SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); core.execute( handler, req, rsp ); if( rsp.getException() != null ) { if(rsp.getException() instanceof SolrException) { throw rsp.getException(); } throw new SolrServerException( rsp.getException() ); } // Check if this should stream results if( request.getStreamingResponseCallback() != null ) { try { final StreamingResponseCallback callback = request.getStreamingResponseCallback(); BinaryResponseWriter.Resolver resolver = new BinaryResponseWriter.Resolver( req, rsp.getReturnFields()) { @Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { // write an empty list...
SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); // This will transform writeResultsBody( ctx, codec ); } }; ByteArrayOutputStream out = new ByteArrayOutputStream(); new JavaBinCodec(resolver) { @Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); } @Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); } }.marshal(rsp.getValues(), out); InputStream in = new ByteArrayInputStream(out.toByteArray()); return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in); } catch (Exception ex) { throw new RuntimeException(ex); } } // Now write it out NamedList<Object> normalized = getParsedResponse(req, rsp); return normalized; } catch( IOException iox ) { throw iox; } catch( SolrException sx ) { throw sx; } catch( Exception ex ) { throw new SolrServerException( ex ); } finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); } }
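The tail of request() above is where most of this class's catch-rethrow entries come from: IOException and SolrException are caught only to be re-thrown unchanged, preserving their precise types for the caller, while any other failure is wrapped in the checked SolrServerException. The ladder in isolation (execute() is a hypothetical stand-in declared to throw Exception):

    NamedList<Object> request() throws SolrServerException, IOException {
      try {
        return execute();
      } catch (IOException iox) {
        throw iox;                          // declared: pass through as-is
      } catch (SolrException sx) {
        throw sx;                           // unchecked domain error: pass through as-is
      } catch (Exception ex) {
        throw new SolrServerException(ex);  // anything else: wrap
      } finally {
        // release the request and the core here
      }
    }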
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { // write an empty list... SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); // This will transform writeResultsBody( ctx, codec ); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
@Override public void service(HttpServletRequest req, HttpServletResponse res) throws IOException { res.sendError(404, "Can not find: " + req.getRequestURI()); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void close() throws IOException { writer.flushBuffer(); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void indent() throws IOException { if (doIndent) indent(level); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void indent(int lev) throws IOException { writer.write(indentChars, 0, Math.min((lev<<1)+1, indentChars.length)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public final void writeVal(String name, Object val) throws IOException { // if there get to be enough types, perhaps hashing on the type // to get a handler might be faster (but types must be exact to do that...) // go in order of most common to least common if (val==null) { writeNull(name); } else if (val instanceof String) { writeStr(name, val.toString(), true); // micro-optimization... using toString() avoids a cast first } else if (val instanceof IndexableField) { IndexableField f = (IndexableField)val; SchemaField sf = schema.getFieldOrNull( f.name() ); if( sf != null ) { sf.getType().write(this, name, f); } else { writeStr(name, f.stringValue(), true); } } else if (val instanceof Number) { if (val instanceof Integer) { writeInt(name, val.toString()); } else if (val instanceof Long) { writeLong(name, val.toString()); } else if (val instanceof Float) { // we pass the float instead of using toString() because // it may need special formatting. same for double. writeFloat(name, ((Float)val).floatValue()); } else if (val instanceof Double) { writeDouble(name, ((Double)val).doubleValue()); } else if (val instanceof Short) { writeInt(name, val.toString()); } else if (val instanceof Byte) { writeInt(name, val.toString()); } else { // default... for debugging only writeStr(name, val.getClass().getName() + ':' + val.toString(), true); } } else if (val instanceof Boolean) { writeBool(name, val.toString()); } else if (val instanceof Date) { writeDate(name,(Date)val); } else if (val instanceof Document) { SolrDocument doc = toSolrDocument( (Document)val ); DocTransformer transformer = returnFields.getTransformer(); if( transformer != null ) { TransformContext context = new TransformContext(); context.req = req; transformer.setContext(context); transformer.transform(doc, -1); } writeSolrDocument(name, doc, returnFields, 0 ); } else if (val instanceof SolrDocument) { writeSolrDocument(name, (SolrDocument)val, returnFields, 0); } else if (val instanceof ResultContext) { // requires access to IndexReader writeDocuments(name, (ResultContext)val, returnFields); } else if (val instanceof DocList) { // Should not happen normally ResultContext ctx = new ResultContext(); ctx.docs = (DocList)val; writeDocuments(name, ctx, returnFields); // } // else if (val instanceof DocSet) { // how do we know what fields to read? // todo: have a DocList/DocSet wrapper that // restricts the fields to write...? } else if (val instanceof SolrDocumentList) { writeSolrDocumentList(name, (SolrDocumentList)val, returnFields); } else if (val instanceof Map) { writeMap(name, (Map)val, false, true); } else if (val instanceof NamedList) { writeNamedList(name, (NamedList)val); } else if (val instanceof Iterable) { writeArray(name,((Iterable)val).iterator()); } else if (val instanceof Object[]) { writeArray(name,(Object[])val); } else if (val instanceof Iterator) { writeArray(name,(Iterator)val); } else if (val instanceof byte[]) { byte[] arr = (byte[])val; writeByteArr(name, arr, 0, arr.length); } else if (val instanceof BytesRef) { BytesRef arr = (BytesRef)val; writeByteArr(name, arr.bytes, arr.offset, arr.length); } else { // default... for debugging only writeStr(name, val.getClass().getName() + ':' + val.toString(), true); } }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public final void writeSolrDocumentList(String name, SolrDocumentList docs, ReturnFields returnFields) throws IOException { writeStartDocumentList(name, docs.getStart(), docs.size(), docs.getNumFound(), docs.getMaxScore() ); for( int i=0; i<docs.size(); i++ ) { writeSolrDocument( null, docs.get(i), returnFields, i ); } writeEndDocumentList(); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public final void writeDocuments(String name, ResultContext res, ReturnFields fields ) throws IOException { DocList ids = res.docs; TransformContext context = new TransformContext(); context.query = res.query; context.wantsScores = fields.wantsScore() && ids.hasScores(); context.req = req; writeStartDocumentList(name, ids.offset(), ids.size(), ids.matches(), context.wantsScores ? new Float(ids.maxScore()) : null ); DocTransformer transformer = fields.getTransformer(); context.searcher = req.getSearcher(); context.iterator = ids.iterator(); if( transformer != null ) { transformer.setContext( context ); } int sz = ids.size(); Set<String> fnames = fields.getLuceneFieldNames(); for (int i=0; i<sz; i++) { int id = context.iterator.nextDoc(); Document doc = context.searcher.doc(id, fnames); SolrDocument sdoc = toSolrDocument( doc ); if( transformer != null ) { transformer.transform( sdoc, id); } writeSolrDocument( null, sdoc, returnFields, i ); } if( transformer != null ) { transformer.setContext( null ); } writeEndDocumentList(); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeArray(String name, Object[] val) throws IOException { writeArray(name, Arrays.asList(val).iterator()); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeInt(String name, int val) throws IOException { writeInt(name,Integer.toString(val)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeLong(String name, long val) throws IOException { writeLong(name,Long.toString(val)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeBool(String name, boolean val) throws IOException { writeBool(name,Boolean.toString(val)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeFloat(String name, float val) throws IOException { String s = Float.toString(val); // If it's not a normal number, write the value as a string instead. // The following test also handles NaN since comparisons are always false. if (val > Float.NEGATIVE_INFINITY && val < Float.POSITIVE_INFINITY) { writeFloat(name,s); } else { writeStr(name,s,false); } }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeDouble(String name, double val) throws IOException { String s = Double.toString(val); // If it's not a normal number, write the value as a string instead. // The following test also handles NaN since comparisons are always false. if (val > Double.NEGATIVE_INFINITY && val < Double.POSITIVE_INFINITY) { writeDouble(name,s); } else { writeStr(name,s,false); } }
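The range test in writeFloat and writeDouble leans on a floating-point subtlety: every comparison involving NaN is false, so val > NEGATIVE_INFINITY && val < POSITIVE_INFINITY holds exactly for finite values and fails for NaN and both infinities, with no explicit isNaN check. A quick standalone check:

    double nan = Double.NaN;
    System.out.println(nan > Double.NEGATIVE_INFINITY); // false
    System.out.println(nan < Double.POSITIVE_INFINITY); // false
    System.out.println(Double.isNaN(nan));              // true (isNaN is defined as x != x)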
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeDate(String name, Date val) throws IOException { writeDate(name, DateField.formatExternal(val)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeByteArr(String name, byte[] buf, int offset, int len) throws IOException { writeStr(name, Base64.byteArrayToBase64(buf, offset, len), false); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { JSONWriter w = new JSONWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeResponse() throws IOException { if(wrapperFunction!=null) { writer.write(wrapperFunction + "("); } Boolean omitHeader = req.getParams().getBool(CommonParams.OMIT_HEADER); if(omitHeader != null && omitHeader) rsp.getValues().remove("responseHeader"); writeNamedList(null, rsp.getValues()); if(wrapperFunction!=null) { writer.write(')'); } if (doIndent) writer.write('\n'); // ending with a newline looks much better from the command line }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeKey(String fname, boolean needsEscaping) throws IOException { writeStr(null, fname, needsEscaping); writer.write(':'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsMapMangled(String name, NamedList val) throws IOException { int sz = val.size(); writeMapOpener(sz); incLevel(); // In JSON objects (maps) we can't have null keys or duplicates... // map null to "" and append a qualifier to duplicates. // // a=123,a=456 will be mapped to {a=1,a__1=456} // Disad: this is ambiguous since a real key could be called a__1 // // Another possible mapping could aggregate multiple keys to an array: // a=123,a=456 maps to a=[123,456] // Disad: this is ambiguous with a real single value that happens to be an array // // Both of these mappings have ambiguities. HashMap<String,Integer> repeats = new HashMap<String,Integer>(4); boolean first=true; for (int i=0; i<sz; i++) { String key = val.getName(i); if (key==null) key=""; if (first) { first=false; repeats.put(key,0); } else { writeMapSeparator(); Integer repeatCount = repeats.get(key); if (repeatCount==null) { repeats.put(key,0); } else { String newKey = key; int newCount = repeatCount; do { // avoid generated key clashing with a real key newKey = key + ' ' + (++newCount); repeatCount = repeats.get(newKey); } while (repeatCount != null); repeats.put(key,newCount); key = newKey; } } indent(); writeKey(key, true); writeVal(key,val.getVal(i)); } decLevel(); writeMapCloser(); }
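A concrete case makes the mangling easier to follow. For the repeated keys a=123,a=456, the first occurrence keeps its key and the second gets a generated one built from the key, a space, and a counter; note that the code as written produces "a 1", while the a__1 in the comment only sketches the idea:

    // input NamedList: a=123, a=456
    // output JSON:     {"a":123,"a 1":456}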
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsMapWithDups(String name, NamedList val) throws IOException { int sz = val.size(); writeMapOpener(sz); incLevel(); for (int i=0; i<sz; i++) { if (i!=0) { writeMapSeparator(); } String key = val.getName(i); if (key==null) key=""; indent(); writeKey(key, true); writeVal(key,val.getVal(i)); } decLevel(); writeMapCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsArrMap(String name, NamedList val) throws IOException { int sz = val.size(); indent(); writeArrayOpener(sz); incLevel(); boolean first=true; for (int i=0; i<sz; i++) { String key = val.getName(i); if (first) { first=false; } else { writeArraySeparator(); } indent(); if (key==null) { writeVal(null,val.getVal(i)); } else { writeMapOpener(1); writeKey(key, true); writeVal(key,val.getVal(i)); writeMapCloser(); } } decLevel(); writeArrayCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsArrArr(String name, NamedList val) throws IOException { int sz = val.size(); indent(); writeArrayOpener(sz); incLevel(); boolean first=true; for (int i=0; i<sz; i++) { String key = val.getName(i); if (first) { first=false; } else { writeArraySeparator(); } indent(); /*** if key is null, just write value??? if (key==null) { writeVal(null,val.getVal(i)); } else { ***/ writeArrayOpener(1); incLevel(); if (key==null) { writeNull(null); } else { writeStr(null, key, true); } writeArraySeparator(); writeVal(key,val.getVal(i)); decLevel(); writeArrayCloser(); } decLevel(); writeArrayCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsFlat(String name, NamedList val) throws IOException { int sz = val.size(); writeArrayOpener(sz); incLevel(); for (int i=0; i<sz; i++) { if (i!=0) { writeArraySeparator(); } String key = val.getName(i); indent(); if (key==null) { writeNull(null); } else { writeStr(null, key, true); } writeArraySeparator(); writeVal(key, val.getVal(i)); } decLevel(); writeArrayCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeNamedList(String name, NamedList val) throws IOException { if (val instanceof SimpleOrderedMap) { writeNamedListAsMapWithDups(name,val); } else if (namedListStyle==JSON_NL_FLAT) { writeNamedListAsFlat(name,val); } else if (namedListStyle==JSON_NL_MAP){ writeNamedListAsMapWithDups(name,val); } else if (namedListStyle==JSON_NL_ARROFARR) { writeNamedListAsArrArr(name,val); } else if (namedListStyle==JSON_NL_ARROFMAP) { writeNamedListAsArrMap(name,val); } }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeSolrDocument(String name, SolrDocument doc, ReturnFields returnFields, int idx) throws IOException { if( idx > 0 ) { writeArraySeparator(); } indent(); writeMapOpener(doc.size()); incLevel(); boolean first=true; for (String fname : doc.getFieldNames()) { if (!returnFields.wantsField(fname)) { continue; } if (first) { first=false; } else { writeMapSeparator(); } indent(); writeKey(fname, true); Object val = doc.getFieldValue(fname); if (val instanceof Collection) { writeVal(fname, val); } else { // if multivalued field, write single value as an array SchemaField sf = schema.getFieldOrNull(fname); if (sf != null && sf.multiValued()) { writeArrayOpener(-1); // no trivial way to determine array size writeVal(fname, val); writeArrayCloser(); } else { writeVal(fname, val); } } } decLevel(); writeMapCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeStartDocumentList(String name, long start, int size, long numFound, Float maxScore) throws IOException { writeMapOpener((maxScore==null) ? 3 : 4); incLevel(); writeKey("numFound",false); writeLong(null,numFound); writeMapSeparator(); writeKey("start",false); writeLong(null,start); if (maxScore!=null) { writeMapSeparator(); writeKey("maxScore",false); writeFloat(null,maxScore); } writeMapSeparator(); // indent(); writeKey("docs",false); writeArrayOpener(size); incLevel(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeEndDocumentList() throws IOException { decLevel(); writeArrayCloser(); decLevel(); indent(); writeMapCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeMapOpener(int size) throws IOException, IllegalArgumentException { writer.write('{'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeMapSeparator() throws IOException { writer.write(','); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeMapCloser() throws IOException { writer.write('}'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeArrayOpener(int size) throws IOException, IllegalArgumentException { writer.write('['); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeArraySeparator() throws IOException { writer.write(','); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeArrayCloser() throws IOException { writer.write(']'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { // it might be more efficient to use a stringbuilder or write substrings // if writing chars to the stream is slow. if (needsEscaping) { /* http://www.ietf.org/internet-drafts/draft-crockford-jsonorg-json-04.txt All Unicode characters may be placed within the quotation marks except for the characters which must be escaped: quotation mark, reverse solidus, and the control characters (U+0000 through U+001F). */ writer.write('"'); for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); if ((ch > '#' && ch != '\\' && ch < '\u2028') || ch == ' ') { // fast path writer.write(ch); continue; } switch(ch) { case '"': case '\\': writer.write('\\'); writer.write(ch); break; case '\r': writer.write('\\'); writer.write('r'); break; case '\n': writer.write('\\'); writer.write('n'); break; case '\t': writer.write('\\'); writer.write('t'); break; case '\b': writer.write('\\'); writer.write('b'); break; case '\f': writer.write('\\'); writer.write('f'); break; case '\u2028': // fallthrough case '\u2029': unicodeEscape(writer,ch); break; // case '/': default: { if (ch <= 0x1F) { unicodeEscape(writer,ch); } else { writer.write(ch); } } } } writer.write('"'); } else { writer.write('"'); writer.write(val); writer.write('"'); } }
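Applied to a small input, the escaping rules above read as follows (derived from the code; the fast path passes most printable characters straight through):

    // writeStr(name, "a\"b\n\u2028", true) emits:  "a\"b\n\u2028"
    // i.e. quote and newline get short escapes, while U+2028/U+2029 are
    // unicode-escaped even though raw JSON would allow them, since raw
    // line separators break JavaScript string literals.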
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeMap(String name, Map val, boolean excludeOuter, boolean isFirstVal) throws IOException { if (!excludeOuter) { writeMapOpener(val.size()); incLevel(); isFirstVal=true; } boolean doIndent = excludeOuter || val.size() > 1; for (Map.Entry entry : (Set<Map.Entry>)val.entrySet()) { Object e = entry.getKey(); String k = e==null ? "" : e.toString(); Object v = entry.getValue(); if (isFirstVal) { isFirstVal=false; } else { writeMapSeparator(); } if (doIndent) indent(); writeKey(k,true); writeVal(k,v); } if (!excludeOuter) { decLevel(); writeMapCloser(); } }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeArray(String name, Iterator val) throws IOException { writeArrayOpener(-1); // no trivial way to determine array size incLevel(); boolean first=true; while( val.hasNext() ) { if( !first ) indent(); writeVal(null, val.next()); if( val.hasNext() ) { writeArraySeparator(); } first=false; } decLevel(); writeArrayCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeNull(String name) throws IOException { writer.write("null"); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeInt(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeLong(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeBool(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeFloat(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeDouble(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeDate(String name, String val) throws IOException { writeStr(name, val, false); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected static void unicodeEscape(Appendable out, int ch) throws IOException { out.append('\\'); out.append('u'); out.append(hexdigits[(ch>>>12) ]); out.append(hexdigits[(ch>>>8) & 0xf]); out.append(hexdigits[(ch>>>4) & 0xf]); out.append(hexdigits[(ch) & 0xf]); }
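Taken together, writeStr and unicodeEscape above implement the JSON escaping rules referenced in the comment: quote, backslash and the control characters U+0000 through U+001F must be escaped, and U+2028/U+2029 are additionally \u-escaped because some JavaScript interpreters reject them raw. The following standalone sketch restates the same policy outside of Solr (JsonEscapeDemo and its hexdigits table are our own names; the \b and \f cases are omitted for brevity):

    import java.io.IOException;
    import java.io.StringWriter;
    import java.io.Writer;

    public class JsonEscapeDemo {
      static final char[] hexdigits = "0123456789abcdef".toCharArray();

      // same digit-extraction arithmetic as unicodeEscape above
      static void unicodeEscape(Appendable out, int ch) throws IOException {
        out.append('\\').append('u');
        out.append(hexdigits[(ch >>> 12)]);
        out.append(hexdigits[(ch >>> 8) & 0xf]);
        out.append(hexdigits[(ch >>> 4) & 0xf]);
        out.append(hexdigits[ch & 0xf]);
      }

      static void writeJsonString(Writer w, String val) throws IOException {
        w.write('"');
        for (int i = 0; i < val.length(); i++) {
          char ch = val.charAt(i);
          switch (ch) {
            case '"': case '\\': w.write('\\'); w.write(ch); break;
            case '\n': w.write("\\n"); break;
            case '\r': w.write("\\r"); break;
            case '\t': w.write("\\t"); break;
            case '\u2028': case '\u2029': unicodeEscape(w, ch); break;
            default:
              if (ch <= 0x1F) unicodeEscape(w, ch); else w.write(ch);
          }
        }
        w.write('"');
      }

      public static void main(String[] args) throws IOException {
        StringWriter sw = new StringWriter();
        writeJsonString(sw, "a\tb \u2028 \"q\"");
        System.out.println(sw);  // prints: "a\tb \u2028 \"q\""
      }
    }

For example, unicodeEscape with ch = 0x2028 extracts the nibbles 2, 0, 2, 8 and emits the six characters \u2028.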
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeFloat(String name, float val) throws IOException { if (Float.isNaN(val)) { writer.write(getNaN()); } else if (Float.isInfinite(val)) { if (val < 0.0f) writer.write('-'); writer.write(getInf()); } else { writeFloat(name, Float.toString(val)); } }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
Override public void writeDouble(String name, double val) throws IOException { if (Double.isNaN(val)) { writer.write(getNaN()); } else if (Double.isInfinite(val)) { if (val < 0.0) writer.write('-'); writer.write(getInf()); } else { writeDouble(name, Double.toString(val)); } }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { PHPWriter w = new PHPWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
Override public void writeNamedList(String name, NamedList val) throws IOException { writeNamedListAsMapMangled(name,val); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
Override public void writeMapOpener(int size) throws IOException { writer.write("array("); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
Override public void writeMapCloser() throws IOException { writer.write(')'); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
Override public void writeArrayOpener(int size) throws IOException { writer.write("array("); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
Override public void writeArrayCloser() throws IOException { writer.write(')'); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
Override public void writeNull(String name) throws IOException { writer.write("null"); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
Override protected void writeKey(String fname, boolean needsEscaping) throws IOException { writeStr(null, fname, needsEscaping); writer.write('='); writer.write('>'); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { if (needsEscaping) { writer.write('\''); for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); switch (ch) { case '\'': case '\\': writer.write('\\'); writer.write(ch); break; default: writer.write(ch); } } writer.write('\''); } else { writer.write('\''); writer.write(val); writer.write('\''); } }
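The PHP writer can get away with a much smaller escape set than the JSON writer because it emits single-quoted PHP strings, inside which only the quote itself and the backslash are special. A minimal restatement of the rule (phpQuote is our own name, not Solr's):

    // In a PHP single-quoted string only ' and \ need a preceding backslash;
    // newlines, tabs and non-ASCII characters pass through untouched.
    static String phpQuote(String val) {
      StringBuilder sb = new StringBuilder("'");
      for (int i = 0; i < val.length(); i++) {
        char ch = val.charAt(i);
        if (ch == '\'' || ch == '\\') sb.append('\\');
        sb.append(ch);
      }
      return sb.append('\'').toString();
    }
    // e.g. the input It's 50\50 becomes 'It\'s 50\\50'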
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { CSVWriter w = new CSVWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void freeze() throws IOException { if (cw.size() > 0) { flush(); result = cw.getInternalBuf(); resultLen = cw.size(); } else { result = buf; resultLen = pos; } }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void writeResponse() throws IOException { SolrParams params = req.getParams(); strategy = new CSVStrategy(',', '"', CSVStrategy.COMMENTS_DISABLED, CSVStrategy.ESCAPE_DISABLED, false, false, false, true); CSVStrategy strat = strategy; String sep = params.get(CSV_SEPARATOR); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } String nl = params.get(CSV_NEWLINE); if (nl!=null) { if (nl.length()==0) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid newline:'"+nl+"'"); strat.setPrinterNewline(nl); } String encapsulator = params.get(CSV_ENCAPSULATOR); String escape = params.get(CSV_ESCAPE); if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); } if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); if (encapsulator == null) { strat.setEncapsulator( CSVStrategy.ENCAPSULATOR_DISABLED); } } if (strat.getEscape() == '\\') { /* If the escape is the standard backslash, then also enable unicode escapes (it's harmless since 'u' would not otherwise be escaped). */ strat.setUnicodeEscapeInterpretation(true); } printer = new CSVPrinter(writer, strategy); CSVStrategy mvStrategy = new CSVStrategy(strategy.getDelimiter(), CSVStrategy.ENCAPSULATOR_DISABLED, CSVStrategy.COMMENTS_DISABLED, '\\', false, false, false, false); strat = mvStrategy; sep = params.get(MV_SEPARATOR); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } encapsulator = params.get(MV_ENCAPSULATOR); escape = params.get(MV_ESCAPE); if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); if (escape == null) { strat.setEscape(CSVStrategy.ESCAPE_DISABLED); } } escape = params.get(MV_ESCAPE); if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); /* encapsulator will already be disabled if it wasn't specified */ } Collection<String> fields = returnFields.getLuceneFieldNames(); Object responseObj = rsp.getValues().get("response"); boolean returnOnlyStored = false; if (fields==null) { if (responseObj instanceof SolrDocumentList) { /* get the list of fields from the SolrDocumentList */ fields = new LinkedHashSet<String>(); for (SolrDocument sdoc: (SolrDocumentList)responseObj) { fields.addAll(sdoc.getFieldNames()); } } else { /* get the list of fields from the index */ fields = req.getSearcher().getFieldNames(); } if (returnFields.wantsScore()) { fields.add("score"); } else { fields.remove("score"); } returnOnlyStored = true; } CSVSharedBufPrinter csvPrinterMV = new CSVSharedBufPrinter(mvWriter, mvStrategy); for (String field : fields) { if (!returnFields.wantsField(field)) { continue; } if (field.equals("score")) { CSVField csvField = new CSVField(); csvField.name = "score"; csvFields.put("score", csvField); continue; } SchemaField sf = schema.getFieldOrNull(field); if (sf == null) { FieldType ft = new StrField(); sf = new SchemaField(field, ft); } /* Return only stored fields, unless an explicit field list is specified */ if (returnOnlyStored && sf != null && !sf.stored()) { continue; } /* check for per-field overrides */ sep = params.get("f." + field + '.' + CSV_SEPARATOR); encapsulator = params.get("f." + field + '.' + CSV_ENCAPSULATOR); escape = params.get("f." + field + '.' + CSV_ESCAPE); CSVSharedBufPrinter csvPrinter = csvPrinterMV; if (sep != null || encapsulator != null || escape != null) { /* create a new strategy + printer if there were any per-field overrides */ strat = (CSVStrategy)mvStrategy.clone(); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); if (escape == null) { strat.setEscape(CSVStrategy.ESCAPE_DISABLED); } } if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); if (encapsulator == null) { strat.setEncapsulator(CSVStrategy.ENCAPSULATOR_DISABLED); } } csvPrinter = new CSVSharedBufPrinter(mvWriter, strat); } CSVField csvField = new CSVField(); csvField.name = field; csvField.sf = sf; csvField.mvPrinter = csvPrinter; csvFields.put(field, csvField); } NullValue = params.get(CSV_NULL, ""); if (params.getBool(CSV_HEADER, true)) { for (CSVField csvField : csvFields.values()) { printer.print(csvField.name); } printer.println(); } if (responseObj instanceof ResultContext ) { writeDocuments(null, (ResultContext)responseObj, returnFields ); } else if (responseObj instanceof DocList) { ResultContext ctx = new ResultContext(); ctx.docs = (DocList)responseObj; writeDocuments(null, ctx, returnFields ); } else if (responseObj instanceof SolrDocumentList) { writeSolrDocumentList(null, (SolrDocumentList)responseObj, returnFields ); } }
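Every optional CSV control character above is validated with the same fail-fast pattern: a value that is not exactly one character is rejected immediately with a SolrException carrying ErrorCode.BAD_REQUEST, which surfaces to the client as an HTTP 400 before any output has been written. The repeated check could be captured by a helper along these lines (requireSingleChar is hypothetical, not part of Solr):

    // Hypothetical helper for the repeated single-character validation above.
    static char requireSingleChar(String paramName, String value) {
      if (value.length() != 1) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Invalid " + paramName + ":'" + value + "'");
      }
      return value.charAt(0);
    }

From the client side, a request such as q=*:*&wt=csv&csv.separator=%09 selects a tab delimiter, while csv.separator=ab would come back as a 400 Bad Request.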
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void close() throws IOException { if (printer != null) printer.flush(); super.close(); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeNamedList(String name, NamedList val) throws IOException { }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void writeStartDocumentList(String name, long start, int size, long numFound, Float maxScore) throws IOException { // nothing }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void writeEndDocumentList() throws IOException { // nothing }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeSolrDocument(String name, SolrDocument doc, ReturnFields returnFields, int idx ) throws IOException { if (tmpList == null) { tmpList = new ArrayList(1); tmpList.add(null); } for (CSVField csvField : csvFields.values()) { Object val = doc.getFieldValue(csvField.name); int nVals = val instanceof Collection ? ((Collection)val).size() : (val==null ? 0 : 1); if (nVals == 0) { writeNull(csvField.name); continue; } if ((csvField.sf != null && csvField.sf.multiValued()) || nVals > 1) { Collection values; // normalize to a collection if (val instanceof Collection) { values = (Collection)val; } else { tmpList.set(0, val); values = tmpList; } mvWriter.reset(); csvField.mvPrinter.reset(); // switch the printer to use the multi-valued one CSVPrinter tmp = printer; printer = csvField.mvPrinter; for (Object fval : values) { writeVal(csvField.name, fval); } printer = tmp; // restore the original printer mvWriter.freeze(); printer.print(mvWriter.getFrozenBuf(), 0, mvWriter.getFrozenSize(), true); } else { // normalize to first value if (val instanceof Collection) { Collection values = (Collection)val; val = values.iterator().next(); } writeVal(csvField.name, val); } } printer.println(); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { printer.print(val, needsEscaping); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeMap(String name, Map val, boolean excludeOuter, boolean isFirstVal) throws IOException { }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeArray(String name, Iterator val) throws IOException { }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeNull(String name) throws IOException { printer.print(NullValue); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeInt(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeLong(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeBool(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeFloat(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeDouble(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeDate(String name, Date val) throws IOException { StringBuilder sb = new StringBuilder(25); cal = DateUtil.formatDate(val, cal, sb); writeDate(name, sb.toString()); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
Override public void writeDate(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { PythonWriter w = new PythonWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
Override public void writeNull(String name) throws IOException { writer.write("None"); }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
Override public void writeBool(String name, boolean val) throws IOException { writer.write(val ? "True" : "False"); }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
Override public void writeBool(String name, String val) throws IOException { writeBool(name,val.charAt(0)=='t'); }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { if (!needsEscaping) { writer.write('\''); writer.write(val); writer.write('\''); return; } // use python unicode strings... // python doesn't tolerate newlines in strings in it's eval(), so we must escape them. StringBuilder sb = new StringBuilder(val.length()); boolean needUnicode=false; for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); switch(ch) { case '\'': case '\\': sb.append('\\'); sb.append(ch); break; case '\r': sb.append("\\r"); break; case '\n': sb.append("\\n"); break; case '\t': sb.append("\\t"); break; default: // we don't strictly have to escape these chars, but it will probably increase // portability to stick to visible ascii if (ch<' ' || ch>127) { unicodeEscape(sb, ch); needUnicode=true; } else { sb.append(ch); } } } if (needUnicode) { writer.write('u'); } writer.write('\''); writer.append(sb); writer.write('\''); }
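For example, writeStr(null, "café\n", true) would emit u'caf\u00e9\n': the newline is escaped because Python's eval() rejects raw newlines inside a string literal, and the u prefix is added only because the non-ASCII é forced a \u escape and set needUnicode.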
// in core/src/java/org/apache/solr/response/RubyResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { RubyWriter w = new RubyWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/RubyResponseWriter.java
Override public void writeNull(String name) throws IOException { writer.write("nil"); }
// in core/src/java/org/apache/solr/response/RubyResponseWriter.java
Override protected void writeKey(String fname, boolean needsEscaping) throws IOException { writeStr(null, fname, needsEscaping); writer.write('='); writer.write('>'); }
// in core/src/java/org/apache/solr/response/RubyResponseWriter.java
Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { // Ruby doesn't do unicode escapes... so let the servlet container write raw UTF-8 // bytes into the string. // // Use single quoted strings for safety since no evaluation is done within them. // Also, there are very few escapes recognized in a single quoted string, so // only escape the backslash and single quote. writer.write('\''); if (needsEscaping) { for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); if (ch=='\'' || ch=='\\') { writer.write('\\'); } writer.write(ch); } } else { writer.write(val); } writer.write('\''); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { final Transformer t = getTransformer(request); // capture the output of the XMLWriter final CharArrayWriter w = new CharArrayWriter(); XMLWriter.writeResponse(w,request,response); // and write transformed result to our writer final Reader r = new BufferedReader(new CharArrayReader(w.toCharArray())); final StreamSource source = new StreamSource(r); final StreamResult result = new StreamResult(writer); try { t.transform(source, result); } catch(TransformerException te) { final IOException ioe = new IOException("XSLT transformation error"); ioe.initCause(te); throw ioe; } }
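This catch-and-wrap site converts a checked library exception into a different checked type: IOException only gained a (String, Throwable) constructor in Java 6, so the cause is attached with initCause(te) before rethrowing. On Java 6 and later the equivalent would simply be throw new IOException("XSLT transformation error", te); either way the TransformerException stays on the cause chain, so no stack trace is lost.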
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
protected Transformer getTransformer(SolrQueryRequest request) throws IOException { final String xslt = request.getParams().get(CommonParams.TR,null); if(xslt==null) { throw new IOException("'" + CommonParams.TR + "' request parameter is required to use the XSLTResponseWriter"); } // not the cleanest way to achieve this SolrConfig solrConfig = request.getCore().getSolrConfig(); // no need to synchronize access to context, right? // Nothing else happens with it at the same time final Map<Object,Object> ctx = request.getContext(); Transformer result = (Transformer)ctx.get(CONTEXT_TRANSFORMER_KEY); if(result==null) { result = TransformerProvider.instance.getTransformer(solrConfig, xslt,xsltCacheLifetimeSeconds.intValue()); result.setErrorListener(xmllog); ctx.put(CONTEXT_TRANSFORMER_KEY,result); } return result; }
// in core/src/java/org/apache/solr/response/XMLWriter.java
public static void writeResponse(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { XMLWriter xmlWriter = null; try { xmlWriter = new XMLWriter(writer, req, rsp); xmlWriter.writeResponse(); } finally { xmlWriter.close(); } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
public void writeResponse() throws IOException { writer.write(XML_START1); String stylesheet = req.getParams().get("stylesheet"); if (stylesheet != null && stylesheet.length() > 0) { writer.write(XML_STYLESHEET); XML.escapeAttributeValue(stylesheet, writer); writer.write(XML_STYLESHEET_END); } /*** String noSchema = req.getParams().get("noSchema"); // todo - change when schema becomes available? if (false && noSchema == null) writer.write(XML_START2_SCHEMA); else writer.write(XML_START2_NOSCHEMA); ***/ writer.write(XML_START2_NOSCHEMA); // dump response values NamedList<?> lst = rsp.getValues(); Boolean omitHeader = req.getParams().getBool(CommonParams.OMIT_HEADER); if(omitHeader != null && omitHeader) lst.remove("responseHeader"); int sz = lst.size(); int start=0; for (int i=start; i<sz; i++) { writeVal(lst.getName(i),lst.getVal(i)); } writer.write("\n</response>\n"); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
private void writeAttr(String name, String val) throws IOException { writeAttr(name, val, true); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
public void writeAttr(String name, String val, boolean escape) throws IOException{ if (val != null) { writer.write(' '); writer.write(name); writer.write("=\""); if(escape){ XML.escapeAttributeValue(val, writer); } else { writer.write(val); } writer.write('"'); } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
void startTag(String tag, String name, boolean closeTag) throws IOException { if (doIndent) indent(); writer.write('<'); writer.write(tag); if (name!=null) { writeAttr("name", name); if (closeTag) { writer.write("/>"); } else { writer.write(">"); } } else { if (closeTag) { writer.write("/>"); } else { writer.write('>'); } } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeStartDocumentList(String name, long start, int size, long numFound, Float maxScore) throws IOException { if (doIndent) indent(); writer.write("<result"); writeAttr("name",name); writeAttr("numFound",Long.toString(numFound)); writeAttr("start",Long.toString(start)); if(maxScore!=null) { writeAttr("maxScore",Float.toString(maxScore)); } writer.write(">"); incLevel(); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeSolrDocument(String name, SolrDocument doc, ReturnFields returnFields, int idx ) throws IOException { startTag("doc", name, false); incLevel(); for (String fname : doc.getFieldNames()) { if (!returnFields.wantsField(fname)) { continue; } Object val = doc.getFieldValue(fname); if( "_explain_".equals( fname ) ) { System.out.println( val ); } writeVal(fname, val); } decLevel(); writer.write("</doc>"); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeEndDocumentList() throws IOException { decLevel(); if (doIndent) indent(); writer.write("</result>"); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeNamedList(String name, NamedList val) throws IOException { int sz = val.size(); startTag("lst", name, sz<=0); incLevel(); for (int i=0; i<sz; i++) { writeVal(val.getName(i),val.getVal(i)); } decLevel(); if (sz > 0) { if (doIndent) indent(); writer.write("</lst>"); } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeMap(String name, Map map, boolean excludeOuter, boolean isFirstVal) throws IOException { int sz = map.size(); if (!excludeOuter) { startTag("lst", name, sz<=0); incLevel(); } for (Map.Entry entry : (Set<Map.Entry>)map.entrySet()) { Object k = entry.getKey(); Object v = entry.getValue(); // if (sz<indentThreshold) indent(); writeVal( null == k ? null : k.toString(), v); } if (!excludeOuter) { decLevel(); if (sz > 0) { if (doIndent) indent(); writer.write("</lst>"); } } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeArray(String name, Object[] val) throws IOException { writeArray(name, Arrays.asList(val).iterator()); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeArray(String name, Iterator iter) throws IOException { if( iter.hasNext() ) { startTag("arr", name, false ); incLevel(); while( iter.hasNext() ) { writeVal(null, iter.next()); } decLevel(); if (doIndent) indent(); writer.write("</arr>"); } else { startTag("arr", name, true ); } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeNull(String name) throws IOException { writePrim("null",name,"",false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeStr(String name, String val, boolean escape) throws IOException { writePrim("str",name,val,escape); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeInt(String name, String val) throws IOException { writePrim("int",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeLong(String name, String val) throws IOException { writePrim("long",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeBool(String name, String val) throws IOException { writePrim("bool",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeFloat(String name, String val) throws IOException { writePrim("float",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeFloat(String name, float val) throws IOException { writeFloat(name,Float.toString(val)); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeDouble(String name, String val) throws IOException { writePrim("double",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeDouble(String name, double val) throws IOException { writeDouble(name,Double.toString(val)); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
Override public void writeDate(String name, String val) throws IOException { writePrim("date",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
private void writePrim(String tag, String name, String val, boolean escape) throws IOException { int contentLen = val==null ? 0 : val.length(); startTag(tag, name, contentLen==0); if (contentLen==0) return; if (escape) { XML.escapeCharData(val,writer); } else { writer.write(val,0,contentLen); } writer.write('<'); writer.write('/'); writer.write(tag); writer.write('>'); }
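As an illustration of writePrim's output (values assumed): writePrim("int", "popularity", "10", false) emits <int name="popularity">10</int>, while an empty or null value takes the contentLen==0 short-circuit and produces the self-closing form <int name="popularity"/>.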
// in core/src/java/org/apache/solr/response/XMLResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { XMLWriter w = new XMLWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/transform/DocTransformers.java
Override public void transform(SolrDocument doc, int docid) throws IOException { for( DocTransformer a : children ) { a.transform( doc, docid); } }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public void write(OutputStream out, SolrQueryRequest req, SolrQueryResponse response) throws IOException { Resolver resolver = new Resolver(req, response.getReturnFields()); Boolean omitHeader = req.getParams().getBool(CommonParams.OMIT_HEADER); if (omitHeader != null && omitHeader) response.getValues().remove("responseHeader"); JavaBinCodec codec = new JavaBinCodec(resolver); codec.marshal(response.getValues(), out); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { throw new RuntimeException("This is a binary writer , Cannot write to a characterstream"); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public Object resolve(Object o, JavaBinCodec codec) throws IOException { if (o instanceof ResultContext) { writeResults((ResultContext) o, codec); return null; // null means we completely handled it } if (o instanceof DocList) { ResultContext ctx = new ResultContext(); ctx.docs = (DocList) o; writeResults(ctx, codec); return null; // null means we completely handled it } if( o instanceof IndexableField ) { if(schema == null) schema = solrQueryRequest.getSchema(); IndexableField f = (IndexableField)o; SchemaField sf = schema.getFieldOrNull(f.name()); try { o = getValue(sf, f); } catch (Exception e) { LOG.warn("Error reading a field : " + o, e); } } if (o instanceof SolrDocument) { // Remove any fields that were not requested. // This typically happens when distributed search adds // extra fields to an internal request SolrDocument doc = (SolrDocument)o; Iterator<Map.Entry<String, Object>> i = doc.iterator(); while ( i.hasNext() ) { String fname = i.next().getKey(); if ( !returnFields.wantsField( fname ) ) { i.remove(); } } return doc; } return o; }
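Note the recovery policy in resolve: a failure while reading a single stored field is caught as a broad Exception, logged at WARN, and the raw object is passed through instead of aborting the whole response. This is one of the few catch blocks in the response writers that neither rethrows nor wraps.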
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
protected void writeResultsBody( ResultContext res, JavaBinCodec codec ) throws IOException { DocList ids = res.docs; int sz = ids.size(); codec.writeTag(JavaBinCodec.ARR, sz); if(searcher == null) searcher = solrQueryRequest.getSearcher(); if(schema == null) schema = solrQueryRequest.getSchema(); DocTransformer transformer = returnFields.getTransformer(); TransformContext context = new TransformContext(); context.query = res.query; context.wantsScores = returnFields.wantsScore() && ids.hasScores(); context.req = solrQueryRequest; context.searcher = searcher; if( transformer != null ) { transformer.setContext( context ); } Set<String> fnames = returnFields.getLuceneFieldNames(); context.iterator = ids.iterator(); for (int i = 0; i < sz; i++) { int id = context.iterator.nextDoc(); Document doc = searcher.doc(id, fnames); SolrDocument sdoc = getDoc(doc); if( transformer != null ) { transformer.transform(sdoc, id); } codec.writeSolrDocument(sdoc); } if( transformer != null ) { transformer.setContext( null ); } }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { codec.writeTag(JavaBinCodec.SOLRDOCLST); boolean wantsScores = returnFields.wantsScore() && ctx.docs.hasScores(); List l = new ArrayList(3); l.add((long) ctx.docs.matches()); l.add((long) ctx.docs.offset()); Float maxScore = null; if (wantsScores) { maxScore = ctx.docs.maxScore(); } l.add(maxScore); codec.writeArray(l); /* this is a separate function so that streaming responses can use just that part */ writeResultsBody( ctx, codec ); }
// in core/src/java/org/apache/solr/response/RawResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { Object obj = response.getValues().get( CONTENT ); if( obj != null && (obj instanceof ContentStream ) ) { // copy the contents to the writer... ContentStream content = (ContentStream)obj; Reader reader = content.getReader(); try { IOUtils.copy( reader, writer ); } finally { reader.close(); } } else { getBaseWriter( request ).write( writer, request, response ); } }
// in core/src/java/org/apache/solr/response/RawResponseWriter.java
public void write(OutputStream out, SolrQueryRequest request, SolrQueryResponse response) throws IOException { Object obj = response.getValues().get( CONTENT ); if( obj != null && (obj instanceof ContentStream ) ) { // copy the contents to the writer... ContentStream content = (ContentStream)obj; java.io.InputStream in = content.getStream(); try { IOUtils.copy( in, out ); } finally { in.close(); } } else { //getBaseWriter( request ).write( writer, request, response ); throw new IOException("did not find a CONTENT object"); } }
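Note the asymmetry between the two write variants: the Writer-based overload above falls back to the base response writer when no CONTENT stream is present, while this OutputStream-based one throws an IOException("did not find a CONTENT object"), because (as the commented-out line hints) the character-oriented fallback cannot be applied to a raw byte stream.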
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { PHPSerializedWriter w = new PHPSerializedWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeResponse() throws IOException { Boolean omitHeader = req.getParams().getBool(CommonParams.OMIT_HEADER); if(omitHeader != null && omitHeader) rsp.getValues().remove("responseHeader"); writeNamedList(null, rsp.getValues()); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeNamedList(String name, NamedList val) throws IOException { writeNamedListAsMapMangled(name,val); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
public void writeStartDocumentList(String name, long start, int size, long numFound, Float maxScore) throws IOException { writeMapOpener((maxScore==null) ? 3 : 4); writeKey("numFound",false); writeLong(null,numFound); writeKey("start",false); writeLong(null,start); if (maxScore!=null) { writeKey("maxScore",false); writeFloat(null,maxScore); } writeKey("docs",false); writeArrayOpener(size); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
public void writeEndDocumentList() throws IOException { writeArrayCloser(); // doc list writeMapCloser(); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeSolrDocument(String name, SolrDocument doc, ReturnFields returnFields, int idx) throws IOException { writeKey(idx, false); LinkedHashMap <String,Object> single = new LinkedHashMap<String, Object>(); LinkedHashMap <String,Object> multi = new LinkedHashMap<String, Object>(); for (String fname : doc.getFieldNames()) { if(!returnFields.wantsField(fname)){ continue; } Object val = doc.getFieldValue(fname); if (val instanceof Collection) { multi.put(fname, val); }else{ single.put(fname, val); } } writeMapOpener(single.size() + multi.size()); for(String fname: single.keySet()){ Object val = single.get(fname); writeKey(fname, true); writeVal(fname, val); } for(String fname: multi.keySet()){ writeKey(fname, true); Object val = multi.get(fname); if (!(val instanceof Collection)) { // should never be reached if multivalued fields are stored as a Collection // so I'm assuming a size of 1 just to wrap the single value writeArrayOpener(1); writeVal(fname, val); writeArrayCloser(); }else{ writeVal(fname, val); } } writeMapCloser(); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeArray(String name, Object[] val) throws IOException { writeMapOpener(val.length); for(int i=0; i < val.length; i++) { writeKey(i, false); writeVal(String.valueOf(i), val[i]); } writeMapCloser(); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeArray(String name, Iterator val) throws IOException { ArrayList vals = new ArrayList(); while( val.hasNext() ) { vals.add(val.next()); } writeArray(name, vals.toArray()); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeMapOpener(int size) throws IOException, IllegalArgumentException { // negative size value indicates that something has gone wrong if (size < 0) { throw new IllegalArgumentException("Map size must not be negative"); } writer.write("a:"+size+":{"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeMapSeparator() throws IOException { /* NOOP */ }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeMapCloser() throws IOException { writer.write('}'); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeArrayOpener(int size) throws IOException, IllegalArgumentException { // negative size value indicates that something has gone wrong if (size < 0) { throw new IllegalArgumentException("Array size must not be negative"); } writer.write("a:"+size+":{"); }
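Unlike the JSON writer, which simply ignores the size hint, PHP's serialize() format must declare the exact element count up front (a:<size>:{...}), so a negative "unknown size" value cannot be rendered at all; both openers therefore fail fast with an IllegalArgumentException rather than emit output that PHP's unserialize() would reject. For example, an array of three elements serializes as a:3:{...}.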
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeArraySeparator() throws IOException { /* NOOP */ }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeArrayCloser() throws IOException { writer.write('}'); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeNull(String name) throws IOException { writer.write("N;"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override protected void writeKey(String fname, boolean needsEscaping) throws IOException { writeStr(null, fname, needsEscaping); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
void writeKey(int val, boolean needsEscaping) throws IOException { writeInt(null, String.valueOf(val)); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeBool(String name, boolean val) throws IOException { writer.write(val ? "b:1;" : "b:0;"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeBool(String name, String val) throws IOException { writeBool(name, val.charAt(0) == 't'); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeInt(String name, String val) throws IOException { writer.write("i:"+val+";"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeLong(String name, String val) throws IOException { writeInt(name,val); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeFloat(String name, String val) throws IOException { writeDouble(name,val); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeDouble(String name, String val) throws IOException { writer.write("d:"+val+";"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { // serialized PHP strings don't need to be escaped at all, however the // string size reported needs be the number of bytes rather than chars. UnicodeUtil.UTF16toUTF8(val, 0, val.length(), utf8); int nBytes = utf8.length; writer.write("s:"); writer.write(Integer.toString(nBytes)); writer.write(":\""); writer.write(val); writer.write("\";"); }
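The byte-versus-char distinction matters here because PHP's serialize() format records string lengths in bytes of the encoded output, not in characters. A standalone sketch of the same rule (phpSerialize is our own name; it assumes the response stream is UTF-8):

    import java.nio.charset.StandardCharsets;

    // PHP's unserialize() reads exactly the declared number of bytes,
    // so the length must be the UTF-8 byte count even though the raw
    // chars are what get written to the response.
    static String phpSerialize(String val) {
      int nBytes = val.getBytes(StandardCharsets.UTF_8).length;
      return "s:" + nBytes + ":\"" + val + "\";";
    }
    // phpSerialize("héllo") -> s:6:"héllo";  (é occupies two bytes in UTF-8)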
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
NamedList<Integer> getFacetCounts(Executor executor) throws IOException { CompletionService<SegFacet> completionService = new ExecutorCompletionService<SegFacet>(executor); /* reuse the translation logic to go from top level set to per-segment set */ baseSet = docs.getTopFilter(); final AtomicReaderContext[] leaves = searcher.getTopReaderContext().leaves(); /* The list of pending tasks that aren't immediately submitted. TODO: Is there a completion service, or a delegating executor that can limit the number of concurrent tasks submitted to a bigger executor? */ LinkedList<Callable<SegFacet>> pending = new LinkedList<Callable<SegFacet>>(); int threads = nThreads <= 0 ? Integer.MAX_VALUE : nThreads; for (int i=0; i<leaves.length; i++) { final SegFacet segFacet = new SegFacet(leaves[i]); Callable<SegFacet> task = new Callable<SegFacet>() { public SegFacet call() throws Exception { segFacet.countTerms(); return segFacet; } }; /* TODO: if limiting threads, submit by largest segment first? */ if (--threads >= 0) { completionService.submit(task); } else { pending.add(task); } } /* now merge the per-segment results */ PriorityQueue<SegFacet> queue = new PriorityQueue<SegFacet>(leaves.length) { @Override protected boolean lessThan(SegFacet a, SegFacet b) { return a.tempBR.compareTo(b.tempBR) < 0; } }; boolean hasMissingCount=false; int missingCount=0; for (int i=0; i<leaves.length; i++) { SegFacet seg = null; try { Future<SegFacet> future = completionService.take(); seg = future.get(); if (!pending.isEmpty()) { completionService.submit(pending.removeFirst()); } } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } } if (seg.startTermIndex < seg.endTermIndex) { if (seg.startTermIndex==0) { hasMissingCount=true; missingCount += seg.counts[0]; seg.pos = 1; } else { seg.pos = seg.startTermIndex; } if (seg.pos < seg.endTermIndex) { seg.tenum = seg.si.getTermsEnum(); seg.tenum.seekExact(seg.pos); seg.tempBR = seg.tenum.term(); queue.add(seg); } } } FacetCollector collector; if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) { collector = new CountSortedFacetCollector(offset, limit, mincount); } else { collector = new IndexSortedFacetCollector(offset, limit, mincount); } BytesRef val = new BytesRef(); while (queue.size() > 0) { SegFacet seg = queue.top(); /* make a shallow copy */ val.bytes = seg.tempBR.bytes; val.offset = seg.tempBR.offset; val.length = seg.tempBR.length; int count = 0; do { count += seg.counts[seg.pos - seg.startTermIndex]; /* TODO: OPTIMIZATION... if mincount>0 then seg.pos++ can skip ahead to the next non-zero entry. */ seg.pos++; if (seg.pos >= seg.endTermIndex) { queue.pop(); seg = queue.top(); } else { seg.tempBR = seg.tenum.next(); seg = queue.updateTop(); } } while (seg != null && val.compareTo(seg.tempBR) == 0); boolean stop = collector.collect(val, count); if (stop) break; } NamedList<Integer> res = collector.getFacetCounts(); /* convert labels to readable form */ FieldType ft = searcher.getSchema().getFieldType(fieldName); int sz = res.size(); for (int i=0; i<sz; i++) { res.setName(i, ft.indexedToReadable(res.getName(i))); } if (missing) { if (!hasMissingCount) { missingCount = SimpleFacets.getFieldMissingCount(searcher,docs,fieldName); } res.add(null, missingCount); } return res; }
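The merge loop above is a classic k-way merge: each segment delivers its terms in sorted order, the priority queue always exposes the segment with the smallest current term, and counts for a term that occurs in several segments are summed before the collector sees it. A simplified standalone sketch of the same idea over plain sorted maps (all names are illustrative, not Solr's):

    import java.util.*;

    class SegmentMergeDemo {
      static class Head {
        final String term; final int count;
        final Iterator<Map.Entry<String, Integer>> rest;
        Head(String t, int c, Iterator<Map.Entry<String, Integer>> r) {
          term = t; count = c; rest = r;
        }
      }

      static Map<String, Integer> merge(List<SortedMap<String, Integer>> segments) {
        PriorityQueue<Head> pq =
            new PriorityQueue<>(Comparator.comparing((Head h) -> h.term));
        for (SortedMap<String, Integer> seg : segments) {
          Iterator<Map.Entry<String, Integer>> it = seg.entrySet().iterator();
          if (it.hasNext()) {
            Map.Entry<String, Integer> e = it.next();
            pq.add(new Head(e.getKey(), e.getValue(), it));
          }
        }
        Map<String, Integer> merged = new LinkedHashMap<>();
        while (!pq.isEmpty()) {
          Head h = pq.poll();                          // smallest remaining term
          merged.merge(h.term, h.count, Integer::sum); // sum counts across segments
          if (h.rest.hasNext()) {                      // advance that segment's stream
            Map.Entry<String, Integer> e = h.rest.next();
            pq.add(new Head(e.getKey(), e.getValue(), h.rest));
          }
        }
        return merged;                                 // terms come out in index order
      }
    }

Note also the two catch blocks in the loop: InterruptedException restores the interrupt flag and becomes a SolrException with SERVER_ERROR, while ExecutionException is unwrapped so that an underlying RuntimeException propagates as itself.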
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
void countTerms() throws IOException { si = FieldCache.DEFAULT.getTermsIndex(context.reader(), fieldName); /* SolrCore.log.info("reader= " + reader + " FC=" + System.identityHashCode(si)); */ if (prefix!=null) { BytesRef prefixRef = new BytesRef(prefix); startTermIndex = si.binarySearchLookup(prefixRef, tempBR); if (startTermIndex<0) startTermIndex=-startTermIndex-1; prefixRef.append(UnicodeUtil.BIG_TERM); /* TODO: we could constrain the lower endpoint if we had a binarySearch method that allowed passing start/end */ endTermIndex = si.binarySearchLookup(prefixRef, tempBR); assert endTermIndex < 0; endTermIndex = -endTermIndex-1; } else { startTermIndex=0; endTermIndex=si.numOrd(); } final int nTerms=endTermIndex-startTermIndex; if (nTerms>0) { /* count collection array only needs to be as big as the number of terms we are going to collect counts for. */ final int[] counts = this.counts = new int[nTerms]; DocIdSet idSet = baseSet.getDocIdSet(context, null); /* this set only includes live docs */ DocIdSetIterator iter = idSet.iterator(); PackedInts.Reader ordReader = si.getDocToOrd(); int doc; final Object arr; if (ordReader.hasArray()) { arr = ordReader.getArray(); } else { arr = null; } if (arr instanceof int[]) { int[] ords = (int[]) arr; if (prefix==null) { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { counts[ords[doc]]++; } } else { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { int term = ords[doc]; int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } else if (arr instanceof short[]) { short[] ords = (short[]) arr; if (prefix==null) { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { counts[ords[doc] & 0xffff]++; } } else { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { int term = ords[doc] & 0xffff; int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } else if (arr instanceof byte[]) { byte[] ords = (byte[]) arr; if (prefix==null) { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { counts[ords[doc] & 0xff]++; } } else { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { int term = ords[doc] & 0xff; int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } else { if (prefix==null) { /* specialized version when collecting counts for all terms */ while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { counts[si.getOrd(doc)]++; } } else { /* version that adjusts term numbers because we aren't collecting the full range */ while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { int term = si.getOrd(doc); int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } } }
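The three specialized loops exist because the FieldCache may hold the per-document ordinals packed as byte[], short[] or int[]; the 0xff and 0xffff masks widen the smaller types as unsigned values, and working on the raw array avoids a virtual getOrd call per document. Only when no backing array is exposed does the code fall back to si.getOrd(doc).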
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void parseParams(String type, String param) throws ParseException, IOException { localParams = QueryParsing.getLocalParams(param, req.getParams()); base = docs; facetValue = param; key = param; threads = -1; if (localParams == null) return; // remove local params unless it's a query if (type != FacetParams.FACET_QUERY) { // TODO Cut over to an Enum here facetValue = localParams.get(CommonParams.VALUE); } // reset set the default key now that localParams have been removed key = facetValue; // allow explicit set of the key key = localParams.get(CommonParams.OUTPUT_KEY, key); String threadStr = localParams.get(CommonParams.THREADS); if (threadStr != null) { threads = Integer.parseInt(threadStr); } // figure out if we need a new base DocSet String excludeStr = localParams.get(CommonParams.EXCLUDE); if (excludeStr == null) return; Map<?,?> tagMap = (Map<?,?>)req.getContext().get("tags"); if (tagMap != null && rb != null) { List<String> excludeTagList = StrUtils.splitSmart(excludeStr,','); IdentityHashMap<Query,Boolean> excludeSet = new IdentityHashMap<Query,Boolean>(); for (String excludeTag : excludeTagList) { Object olst = tagMap.get(excludeTag); // tagMap has entries of List<String,List<QParser>>, but subject to change in the future if (!(olst instanceof Collection)) continue; for (Object o : (Collection<?>)olst) { if (!(o instanceof QParser)) continue; QParser qp = (QParser)o; excludeSet.put(qp.getQuery(), Boolean.TRUE); } } if (excludeSet.size() == 0) return; List<Query> qlist = new ArrayList<Query>(); // add the base query if (!excludeSet.containsKey(rb.getQuery())) { qlist.add(rb.getQuery()); } // add the filters if (rb.getFilters() != null) { for (Query q : rb.getFilters()) { if (!excludeSet.containsKey(q)) { qlist.add(q); } } } // get the new base docset for this facet DocSet base = searcher.getDocSet(qlist); if (rb.grouping() && rb.getGroupingSpec().isTruncateGroups()) { Grouping grouping = new Grouping(searcher, null, rb.getQueryCommand(), false, 0, false); if (rb.getGroupingSpec().getFields().length > 0) { grouping.addFieldCommand(rb.getGroupingSpec().getFields()[0], req); } else if (rb.getGroupingSpec().getFunctions().length > 0) { grouping.addFunctionCommand(rb.getGroupingSpec().getFunctions()[0], req); } else { this.base = base; return; } AbstractAllGroupHeadsCollector allGroupHeadsCollector = grouping.getCommands().get(0).createAllGroupCollector(); searcher.search(new MatchAllDocsQuery(), base.getTopFilter(), allGroupHeadsCollector); int maxDoc = searcher.maxDoc(); FixedBitSet fixedBitSet = allGroupHeadsCollector.retrieveGroupHeads(maxDoc); long[] bits = fixedBitSet.getBits(); this.base = new BitDocSet(new OpenBitSet(bits, bits.length)); } else { this.base = base; } } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getFacetQueryCounts() throws IOException,ParseException { NamedList<Integer> res = new SimpleOrderedMap<Integer>(); /* Ignore CommonParams.DF - could have init param facet.query assuming * the schema default with query param DF intented to only affect Q. * If user doesn't want schema default for facet.query, they should be * explicit. */ // SolrQueryParser qp = searcher.getSchema().getSolrQueryParser(null); String[] facetQs = params.getParams(FacetParams.FACET_QUERY); if (null != facetQs && 0 != facetQs.length) { for (String q : facetQs) { parseParams(FacetParams.FACET_QUERY, q); // TODO: slight optimization would prevent double-parsing of any localParams Query qobj = QParser.getParser(q, null, req).getQuery(); res.add(key, searcher.numDocs(qobj, base)); } } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getTermCounts(String field) throws IOException { int offset = params.getFieldInt(field, FacetParams.FACET_OFFSET, 0); int limit = params.getFieldInt(field, FacetParams.FACET_LIMIT, 100); if (limit == 0) return new NamedList<Integer>(); Integer mincount = params.getFieldInt(field, FacetParams.FACET_MINCOUNT); if (mincount==null) { Boolean zeros = params.getFieldBool(field, FacetParams.FACET_ZEROS); // mincount = (zeros!=null && zeros) ? 0 : 1; mincount = (zeros!=null && !zeros) ? 1 : 0; // current default is to include zeros. } boolean missing = params.getFieldBool(field, FacetParams.FACET_MISSING, false); // default to sorting if there is a limit. String sort = params.getFieldParam(field, FacetParams.FACET_SORT, limit>0 ? FacetParams.FACET_SORT_COUNT : FacetParams.FACET_SORT_INDEX); String prefix = params.getFieldParam(field,FacetParams.FACET_PREFIX); NamedList<Integer> counts; SchemaField sf = searcher.getSchema().getField(field); FieldType ft = sf.getType(); // determine what type of faceting method to use String method = params.getFieldParam(field, FacetParams.FACET_METHOD); boolean enumMethod = FacetParams.FACET_METHOD_enum.equals(method); // TODO: default to per-segment or not? boolean per_segment = FacetParams.FACET_METHOD_fcs.equals(method); if (method == null && ft instanceof BoolField) { // Always use filters for booleans... we know the number of values is very small. enumMethod = true; } boolean multiToken = sf.multiValued() || ft.multiValuedFieldCache(); if (TrieField.getMainValuePrefix(ft) != null) { // A TrieField with multiple parts indexed per value... currently only // UnInvertedField can handle this case, so force it's use. enumMethod = false; multiToken = true; } if (params.getFieldBool(field, GroupParams.GROUP_FACET, false)) { counts = getGroupedCounts(searcher, base, field, multiToken, offset,limit, mincount, missing, sort, prefix); } else { // unless the enum method is explicitly specified, use a counting method. if (enumMethod) { counts = getFacetTermEnumCounts(searcher, base, field, offset, limit, mincount,missing,sort,prefix); } else { if (multiToken) { UnInvertedField uif = UnInvertedField.getUnInvertedField(field, searcher); counts = uif.getCounts(searcher, base, offset, limit, mincount,missing,sort,prefix); } else { // TODO: future logic could use filters instead of the fieldcache if // the number of terms in the field is small enough. if (per_segment) { PerSegmentSingleValuedFaceting ps = new PerSegmentSingleValuedFaceting(searcher, base, field, offset,limit, mincount, missing, sort, prefix); Executor executor = threads == 0 ? directExecutor : facetExecutor; ps.setNumThreads(threads); counts = ps.getFacetCounts(executor); } else { counts = getFieldCacheCounts(searcher, base, field, offset,limit, mincount, missing, sort, prefix); } } } } return counts; }
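In short, getTermCounts picks a strategy per field: facet.method=enum (chosen by default for Boolean fields) walks terms with filters via getFacetTermEnumCounts; multi-token fields go through UnInvertedField; facet.method=fcs sends single-valued fields to PerSegmentSingleValuedFaceting, optionally parallelized through the threads local parameter parsed in parseParams; everything else lands in the getFieldCacheCounts path below.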
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getGroupedCounts(SolrIndexSearcher searcher, DocSet base, String field, boolean multiToken, int offset, int limit, int mincount, boolean missing, String sort, String prefix) throws IOException { GroupingSpecification groupingSpecification = rb.getGroupingSpec(); String groupField = groupingSpecification != null ? groupingSpecification.getFields()[0] : null; if (groupField == null) { throw new SolrException ( SolrException.ErrorCode.BAD_REQUEST, "Specify the group.field as parameter or local parameter" ); } BytesRef prefixBR = prefix != null ? new BytesRef(prefix) : null; TermGroupFacetCollector collector = TermGroupFacetCollector.createTermGroupFacetCollector(groupField, field, multiToken, prefixBR, 128); searcher.search(new MatchAllDocsQuery(), base.getTopFilter(), collector); boolean orderByCount = sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY); TermGroupFacetCollector.GroupedFacetResult result = collector.mergeSegmentResults(offset + limit, mincount, orderByCount); CharsRef charsRef = new CharsRef(); FieldType facetFieldType = searcher.getSchema().getFieldType(field); NamedList<Integer> facetCounts = new NamedList<Integer>(); List<TermGroupFacetCollector.FacetEntry> scopedEntries = result.getFacetEntries(offset, limit); for (TermGroupFacetCollector.FacetEntry facetEntry : scopedEntries) { facetFieldType.indexedToReadable(facetEntry.getValue(), charsRef); facetCounts.add(charsRef.toString(), facetEntry.getCount()); } if (missing) { facetCounts.add(null, result.getTotalMissingCount()); } return facetCounts; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetFieldCounts() throws IOException, ParseException { NamedList<Object> res = new SimpleOrderedMap<Object>(); String[] facetFs = params.getParams(FacetParams.FACET_FIELD); if (null != facetFs) { for (String f : facetFs) { parseParams(FacetParams.FACET_FIELD, f); String termList = localParams == null ? null : localParams.get(CommonParams.TERMS); if (termList != null) { res.add(key, getListedTermCounts(facetValue, termList)); } else { res.add(key, getTermCounts(facetValue)); } } } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
private NamedList<Integer> getListedTermCounts(String field, String termList) throws IOException { FieldType ft = searcher.getSchema().getFieldType(field); List<String> terms = StrUtils.splitSmart(termList, ",", true); NamedList<Integer> res = new NamedList<Integer>(); for (String term : terms) { String internal = ft.toInternal(term); int count = searcher.numDocs(new TermQuery(new Term(field, internal)), base); res.add(term, count); } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public static int getFieldMissingCount(SolrIndexSearcher searcher, DocSet docs, String fieldName) throws IOException { DocSet hasVal = searcher.getDocSet (new TermRangeQuery(fieldName, null, null, false, false)); return docs.andNotSize(hasVal); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public static NamedList<Integer> getFieldCacheCounts(SolrIndexSearcher searcher, DocSet docs, String fieldName, int offset, int limit, int mincount, boolean missing, String sort, String prefix) throws IOException { // TODO: If the number of terms is high compared to docs.size(), and zeros==false, // we should use an alternate strategy to avoid // 1) creating another huge int[] for the counts // 2) looping over that huge int[] looking for the rare non-zeros. // // Yet another variation: if docs.size() is small and termvectors are stored, // then use them instead of the FieldCache. // // TODO: this function is too big and could use some refactoring, but // we also need a facet cache, and refactoring of SimpleFacets instead of // trying to pass all the various params around. FieldType ft = searcher.getSchema().getFieldType(fieldName); NamedList<Integer> res = new NamedList<Integer>(); FieldCache.DocTermsIndex si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), fieldName); final BytesRef prefixRef; if (prefix == null) { prefixRef = null; } else if (prefix.length()==0) { prefix = null; prefixRef = null; } else { prefixRef = new BytesRef(prefix); } final BytesRef br = new BytesRef(); int startTermIndex, endTermIndex; if (prefix!=null) { startTermIndex = si.binarySearchLookup(prefixRef, br); if (startTermIndex<0) startTermIndex=-startTermIndex-1; prefixRef.append(UnicodeUtil.BIG_TERM); endTermIndex = si.binarySearchLookup(prefixRef, br); assert endTermIndex < 0; endTermIndex = -endTermIndex-1; } else { startTermIndex=0; endTermIndex=si.numOrd(); } final int nTerms=endTermIndex-startTermIndex; int missingCount = -1; final CharsRef charsRef = new CharsRef(10); if (nTerms>0 && docs.size() >= mincount) { // count collection array only needs to be as big as the number of terms we are // going to collect counts for. final int[] counts = new int[nTerms]; DocIterator iter = docs.iterator(); PackedInts.Reader ordReader = si.getDocToOrd(); final Object arr; if (ordReader.hasArray()) { arr = ordReader.getArray(); } else { arr = null; } if (arr instanceof int[]) { int[] ords = (int[]) arr; if (prefix==null) { while (iter.hasNext()) { counts[ords[iter.nextDoc()]]++; } } else { while (iter.hasNext()) { int term = ords[iter.nextDoc()]; int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } else if (arr instanceof short[]) { short[] ords = (short[]) arr; if (prefix==null) { while (iter.hasNext()) { counts[ords[iter.nextDoc()] & 0xffff]++; } } else { while (iter.hasNext()) { int term = ords[iter.nextDoc()] & 0xffff; int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } else if (arr instanceof byte[]) { byte[] ords = (byte[]) arr; if (prefix==null) { while (iter.hasNext()) { counts[ords[iter.nextDoc()] & 0xff]++; } } else { while (iter.hasNext()) { int term = ords[iter.nextDoc()] & 0xff; int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } else { while (iter.hasNext()) { int term = si.getOrd(iter.nextDoc()); int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } if (startTermIndex == 0) { missingCount = counts[0]; } // IDEA: we could also maintain a count of "other"... everything that fell outside // of the top 'N' int off=offset; int lim=limit>=0 ? limit : Integer.MAX_VALUE; if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) { int maxsize = limit>0 ? 
offset+limit : Integer.MAX_VALUE-1; maxsize = Math.min(maxsize, nTerms); LongPriorityQueue queue = new LongPriorityQueue(Math.min(maxsize,1000), maxsize, Long.MIN_VALUE); int min=mincount-1; // the smallest value in the top 'N' values for (int i=(startTermIndex==0)?1:0; i<nTerms; i++) { int c = counts[i]; if (c>min) { // NOTE: we use c>min rather than c>=min as an optimization because we are going in // index order, so we already know that the keys are ordered. This can be very // important if a lot of the counts are repeated (like zero counts would be). // smaller term numbers sort higher, so subtract the term number instead long pair = (((long)c)<<32) + (Integer.MAX_VALUE - i); boolean displaced = queue.insert(pair); if (displaced) min=(int)(queue.top() >>> 32); } } // if we are deep paging, we don't have to order the highest "offset" counts. int collectCount = Math.max(0, queue.size() - off); assert collectCount <= lim; // the start and end indexes of our list "sorted" (starting with the highest value) int sortedIdxStart = queue.size() - (collectCount - 1); int sortedIdxEnd = queue.size() + 1; final long[] sorted = queue.sort(collectCount); for (int i=sortedIdxStart; i<sortedIdxEnd; i++) { long pair = sorted[i]; int c = (int)(pair >>> 32); int tnum = Integer.MAX_VALUE - (int)pair; ft.indexedToReadable(si.lookup(startTermIndex+tnum, br), charsRef); res.add(charsRef.toString(), c); } } else { // add results in index order int i=(startTermIndex==0)?1:0; if (mincount<=0) { // if mincount<=0, then we won't discard any terms and we know exactly // where to start. i+=off; off=0; } for (; i<nTerms; i++) { int c = counts[i]; if (c<mincount || --off>=0) continue; if (--lim<0) break; ft.indexedToReadable(si.lookup(startTermIndex+i, br), charsRef); res.add(charsRef.toString(), c); } } } if (missing) { if (missingCount < 0) { missingCount = getFieldMissingCount(searcher,docs,fieldName); } res.add(null, missingCount); } return res; }
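getFieldCacheCounts() above packs a term count and a term ordinal into a single long so that one LongPriorityQueue can order by count and break ties at the same time. A self-contained illustration of that encoding (class and method names are ours, not Solr's; run with -ea for the asserts):

public class PackedCountOrd {
  // The count occupies the high 32 bits, so comparing packed longs compares
  // counts first. The ordinal is stored as (Integer.MAX_VALUE - ord) so that,
  // for equal counts, the smaller ordinal (earlier term) packs to the larger
  // long and therefore wins the tie.
  static long pack(int count, int ord) {
    return (((long) count) << 32) + (Integer.MAX_VALUE - ord);
  }
  static int count(long pair) { return (int) (pair >>> 32); }
  static int ord(long pair)   { return Integer.MAX_VALUE - (int) pair; }

  public static void main(String[] args) {
    long a = pack(5, 10);   // count=5, term #10
    long b = pack(5, 3);    // same count, earlier term
    long c = pack(7, 99);   // higher count
    assert c > b && b > a;  // count dominates; ties go to the smaller ordinal
    assert count(c) == 7 && ord(c) == 99;
  }
}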
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getFacetTermEnumCounts(SolrIndexSearcher searcher, DocSet docs, String field, int offset, int limit, int mincount, boolean missing, String sort, String prefix) throws IOException { /* :TODO: potential optimization... * cache the Terms with the highest docFreq and try them first * don't enum if we get our max from them */ // Minimum term docFreq in order to use the filterCache for that term. int minDfFilterCache = params.getFieldInt(field, FacetParams.FACET_ENUM_CACHE_MINDF, 0); // make sure we have a set that is fast for random access, if we will use it for that DocSet fastForRandomSet = docs; if (minDfFilterCache>0 && docs instanceof SortedIntDocSet) { SortedIntDocSet sset = (SortedIntDocSet)docs; fastForRandomSet = new HashDocSet(sset.getDocs(), 0, sset.size()); } IndexSchema schema = searcher.getSchema(); AtomicReader r = searcher.getAtomicReader(); FieldType ft = schema.getFieldType(field); boolean sortByCount = sort.equals("count") || sort.equals("true"); final int maxsize = limit>=0 ? offset+limit : Integer.MAX_VALUE-1; final BoundedTreeSet<CountPair<BytesRef,Integer>> queue = sortByCount ? new BoundedTreeSet<CountPair<BytesRef,Integer>>(maxsize) : null; final NamedList<Integer> res = new NamedList<Integer>(); int min=mincount-1; // the smallest value in the top 'N' values int off=offset; int lim=limit>=0 ? limit : Integer.MAX_VALUE; BytesRef startTermBytes = null; if (prefix != null) { String indexedPrefix = ft.toInternal(prefix); startTermBytes = new BytesRef(indexedPrefix); } Fields fields = r.fields(); Terms terms = fields==null ? null : fields.terms(field); TermsEnum termsEnum = null; SolrIndexSearcher.DocsEnumState deState = null; BytesRef term = null; if (terms != null) { termsEnum = terms.iterator(null); // TODO: OPT: if seek(ord) is supported for this termsEnum, then we could use it for // facet.offset when sorting by index order. if (startTermBytes != null) { if (termsEnum.seekCeil(startTermBytes, true) == TermsEnum.SeekStatus.END) { termsEnum = null; } else { term = termsEnum.term(); } } else { // position termsEnum on first term term = termsEnum.next(); } } DocsEnum docsEnum = null; CharsRef charsRef = new CharsRef(10); if (docs.size() >= mincount) { while (term != null) { if (startTermBytes != null && !StringHelper.startsWith(term, startTermBytes)) break; int df = termsEnum.docFreq(); // If we are sorting, we can use df>min (rather than >=) since we // are going in index order. For certain term distributions this can // make a large difference (for example, many terms with df=1). if (df>0 && df>min) { int c; if (df >= minDfFilterCache) { // use the filter cache if (deState==null) { deState = new SolrIndexSearcher.DocsEnumState(); deState.fieldName = field; deState.liveDocs = r.getLiveDocs(); deState.termsEnum = termsEnum; deState.docsEnum = docsEnum; } c = searcher.numDocs(docs, deState); docsEnum = deState.docsEnum; } else { // iterate over TermDocs to calculate the intersection // TODO: specialize when base docset is a bitset or hash set (skipDocs)? or does it matter for this? // TODO: do this per-segment for better efficiency (MultiDocsEnum just uses base class impl) // TODO: would passing deleted docs lead to better efficiency over checking the fastForRandomSet? 
docsEnum = termsEnum.docs(null, docsEnum, false); c=0; if (docsEnum instanceof MultiDocsEnum) { MultiDocsEnum.EnumWithSlice[] subs = ((MultiDocsEnum)docsEnum).getSubs(); int numSubs = ((MultiDocsEnum)docsEnum).getNumSubs(); for (int subindex = 0; subindex<numSubs; subindex++) { MultiDocsEnum.EnumWithSlice sub = subs[subindex]; if (sub.docsEnum == null) continue; int base = sub.slice.start; int docid; while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { if (fastForRandomSet.exists(docid+base)) c++; } } } else { int docid; while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { if (fastForRandomSet.exists(docid)) c++; } } } if (sortByCount) { if (c>min) { BytesRef termCopy = BytesRef.deepCopyOf(term); queue.add(new CountPair<BytesRef,Integer>(termCopy, c)); if (queue.size()>=maxsize) min=queue.last().val; } } else { if (c >= mincount && --off<0) { if (--lim<0) break; ft.indexedToReadable(term, charsRef); res.add(charsRef.toString(), c); } } } term = termsEnum.next(); } } if (sortByCount) { for (CountPair<BytesRef,Integer> p : queue) { if (--off>=0) continue; if (--lim<0) break; ft.indexedToReadable(p.key, charsRef); res.add(charsRef.toString(), p.val); } } if (missing) { res.add(null, getFieldMissingCount(searcher,docs,field)); } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated
public NamedList<Object> getFacetDateCounts() throws IOException, ParseException {
  final NamedList<Object> resOuter = new SimpleOrderedMap<Object>();
  final String[] fields = params.getParams(FacetParams.FACET_DATE);
  if (null == fields || 0 == fields.length) return resOuter;
  for (String f : fields) {
    getFacetDateCounts(f, resOuter);
  }
  return resOuter;
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
Deprecated public void getFacetDateCounts(String dateFacet, NamedList<Object> resOuter) throws IOException, ParseException { final IndexSchema schema = searcher.getSchema(); parseParams(FacetParams.FACET_DATE, dateFacet); String f = facetValue; final NamedList<Object> resInner = new SimpleOrderedMap<Object>(); resOuter.add(key, resInner); final SchemaField sf = schema.getField(f); if (! (sf.getType() instanceof DateField)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Can not date facet on a field which is not a DateField: " + f); } final DateField ft = (DateField) sf.getType(); final String startS = required.getFieldParam(f,FacetParams.FACET_DATE_START); final Date start; try { start = ft.parseMath(null, startS); } catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); } final String endS = required.getFieldParam(f,FacetParams.FACET_DATE_END); Date end; // not final, hardend may change this try { end = ft.parseMath(null, endS); } catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); } if (end.before(start)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' comes before 'start': "+endS+" < "+startS); } final String gap = required.getFieldParam(f,FacetParams.FACET_DATE_GAP); final DateMathParser dmp = new DateMathParser(); final int minCount = params.getFieldInt(f,FacetParams.FACET_MINCOUNT, 0); String[] iStrs = params.getFieldParams(f,FacetParams.FACET_DATE_INCLUDE); // Legacy support for default of [lower,upper,edge] for date faceting // this is not handled by FacetRangeInclude.parseParam because // range faceting has differnet defaults final EnumSet<FacetRangeInclude> include = (null == iStrs || 0 == iStrs.length ) ? 
EnumSet.of(FacetRangeInclude.LOWER, FacetRangeInclude.UPPER, FacetRangeInclude.EDGE) : FacetRangeInclude.parseParam(iStrs); try { Date low = start; while (low.before(end)) { dmp.setNow(low); String label = ft.toExternal(low); Date high = dmp.parseMath(gap); if (end.before(high)) { if (params.getFieldBool(f,FacetParams.FACET_DATE_HARD_END,false)) { high = end; } else { end = high; } } if (high.before(low)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet infinite loop (is gap negative?)"); } final boolean includeLower = (include.contains(FacetRangeInclude.LOWER) || (include.contains(FacetRangeInclude.EDGE) && low.equals(start))); final boolean includeUpper = (include.contains(FacetRangeInclude.UPPER) || (include.contains(FacetRangeInclude.EDGE) && high.equals(end))); final int count = rangeCount(sf,low,high,includeLower,includeUpper); if (count >= minCount) { resInner.add(label, count); } low = high; } } catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); } // explicitly return the gap and end so all the counts // (including before/after/between) are meaningful - even if mincount // has removed the neighboring ranges resInner.add("gap", gap); resInner.add("start", start); resInner.add("end", end); final String[] othersP = params.getFieldParams(f,FacetParams.FACET_DATE_OTHER); if (null != othersP && 0 < othersP.length ) { final Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class); for (final String o : othersP) { others.add(FacetRangeOther.get(o)); } // no matter what other values are listed, we don't do // anything if "none" is specified. if (! others.contains(FacetRangeOther.NONE) ) { boolean all = others.contains(FacetRangeOther.ALL); if (all || others.contains(FacetRangeOther.BEFORE)) { // include upper bound if "outer" or if first gap doesn't already include it resInner.add(FacetRangeOther.BEFORE.toString(), rangeCount(sf,null,start, false, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)))))); } if (all || others.contains(FacetRangeOther.AFTER)) { // include lower bound if "outer" or if last gap doesn't already include it resInner.add(FacetRangeOther.AFTER.toString(), rangeCount(sf,end,null, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))), false)); } if (all || others.contains(FacetRangeOther.BETWEEN)) { resInner.add(FacetRangeOther.BETWEEN.toString(), rangeCount(sf,start,end, (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)), (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))); } } } }
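The method above is a dense example of the catch-and-wrap idiom that drives this application's high nTh-in-catch count: a low-level parse failure is converted into a SolrException carrying ErrorCode.BAD_REQUEST and a message naming the offending parameter. Distilled into a standalone sketch (parseDate() is a hypothetical stand-in for DateField.parseMath()):

import java.util.Date;
import org.apache.solr.common.SolrException;

public class DateParamParsing {
  // Placeholder for DateField.parseMath(null, s), which is assumed to throw
  // SolrException on malformed input, as the real method does.
  static Date parseDate(String s) { return new Date(); }

  static Date parseStartParam(String startS) {
    try {
      return parseDate(startS);
    } catch (SolrException e) {
      // re-wrap so the client sees an HTTP 400 with a pointed message,
      // keeping the original exception as the cause
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "date facet 'start' is not a valid Date string: " + startS, e);
    }
  }
}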
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetRangeCounts() throws IOException, ParseException {
  final NamedList<Object> resOuter = new SimpleOrderedMap<Object>();
  final String[] fields = params.getParams(FacetParams.FACET_RANGE);
  if (null == fields || 0 == fields.length) return resOuter;
  for (String f : fields) {
    getFacetRangeCounts(f, resOuter);
  }
  return resOuter;
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void getFacetRangeCounts(String facetRange, NamedList<Object> resOuter)
    throws IOException, ParseException {
  final IndexSchema schema = searcher.getSchema();
  parseParams(FacetParams.FACET_RANGE, facetRange);
  String f = facetValue;
  final SchemaField sf = schema.getField(f);
  final FieldType ft = sf.getType();
  RangeEndpointCalculator<?> calc = null;
  if (ft instanceof TrieField) {
    final TrieField trie = (TrieField) ft;
    switch (trie.getType()) {
      case FLOAT:   calc = new FloatRangeEndpointCalculator(sf); break;
      case DOUBLE:  calc = new DoubleRangeEndpointCalculator(sf); break;
      case INTEGER: calc = new IntegerRangeEndpointCalculator(sf); break;
      case LONG:    calc = new LongRangeEndpointCalculator(sf); break;
      default:
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Unable to range facet on trie field of unexpected type:" + f);
    }
  } else if (ft instanceof DateField) {
    calc = new DateRangeEndpointCalculator(sf, null);
  } else if (ft instanceof SortableIntField) {
    calc = new IntegerRangeEndpointCalculator(sf);
  } else if (ft instanceof SortableLongField) {
    calc = new LongRangeEndpointCalculator(sf);
  } else if (ft instanceof SortableFloatField) {
    calc = new FloatRangeEndpointCalculator(sf);
  } else if (ft instanceof SortableDoubleField) {
    calc = new DoubleRangeEndpointCalculator(sf);
  } else {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Unable to range facet on field:" + sf);
  }
  resOuter.add(key, getFacetRangeCounts(sf, calc));
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
private <T extends Comparable<T>> NamedList getFacetRangeCounts (final SchemaField sf, final RangeEndpointCalculator<T> calc) throws IOException { final String f = sf.getName(); final NamedList<Object> res = new SimpleOrderedMap<Object>(); final NamedList<Integer> counts = new NamedList<Integer>(); res.add("counts", counts); final T start = calc.getValue(required.getFieldParam(f,FacetParams.FACET_RANGE_START)); // not final, hardend may change this T end = calc.getValue(required.getFieldParam(f,FacetParams.FACET_RANGE_END)); if (end.compareTo(start) < 0) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "range facet 'end' comes before 'start': "+end+" < "+start); } final String gap = required.getFieldParam(f, FacetParams.FACET_RANGE_GAP); // explicitly return the gap. compute this early so we are more // likely to catch parse errors before attempting math res.add("gap", calc.getGap(gap)); final int minCount = params.getFieldInt(f,FacetParams.FACET_MINCOUNT, 0); final EnumSet<FacetRangeInclude> include = FacetRangeInclude.parseParam (params.getFieldParams(f,FacetParams.FACET_RANGE_INCLUDE)); T low = start; while (low.compareTo(end) < 0) { T high = calc.addGap(low, gap); if (end.compareTo(high) < 0) { if (params.getFieldBool(f,FacetParams.FACET_RANGE_HARD_END,false)) { high = end; } else { end = high; } } if (high.compareTo(low) < 0) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "range facet infinite loop (is gap negative? did the math overflow?)"); } final boolean includeLower = (include.contains(FacetRangeInclude.LOWER) || (include.contains(FacetRangeInclude.EDGE) && 0 == low.compareTo(start))); final boolean includeUpper = (include.contains(FacetRangeInclude.UPPER) || (include.contains(FacetRangeInclude.EDGE) && 0 == high.compareTo(end))); final String lowS = calc.formatValue(low); final String highS = calc.formatValue(high); final int count = rangeCount(sf, lowS, highS, includeLower,includeUpper); if (count >= minCount) { counts.add(lowS, count); } low = high; } // explicitly return the start and end so all the counts // (including before/after/between) are meaningful - even if mincount // has removed the neighboring ranges res.add("start", start); res.add("end", end); final String[] othersP = params.getFieldParams(f,FacetParams.FACET_RANGE_OTHER); if (null != othersP && 0 < othersP.length ) { Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class); for (final String o : othersP) { others.add(FacetRangeOther.get(o)); } // no matter what other values are listed, we don't do // anything if "none" is specified. if (! others.contains(FacetRangeOther.NONE) ) { boolean all = others.contains(FacetRangeOther.ALL); final String startS = calc.formatValue(start); final String endS = calc.formatValue(end); if (all || others.contains(FacetRangeOther.BEFORE)) { // include upper bound if "outer" or if first gap doesn't already include it res.add(FacetRangeOther.BEFORE.toString(), rangeCount(sf,null,startS, false, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)))))); } if (all || others.contains(FacetRangeOther.AFTER)) { // include lower bound if "outer" or if last gap doesn't already include it res.add(FacetRangeOther.AFTER.toString(), rangeCount(sf,endS,null, (include.contains(FacetRangeInclude.OUTER) || (! 
(include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))), false)); } if (all || others.contains(FacetRangeOther.BETWEEN)) { res.add(FacetRangeOther.BETWEEN.toString(), rangeCount(sf,startS,endS, (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)), (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))); } } } return res; }
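The range loop above carries two guards worth isolating: the "hard end" flag that either clips the final bucket at 'end' or stretches 'end' to the bucket edge, and the check that the gap actually advanced the cursor, without which a bad gap would loop forever. A reduced sketch under stated assumptions (addGap stands in for RangeEndpointCalculator.addGap(); note we use <= 0 where the snippet checks < 0, which additionally rejects a zero-sized gap):

import java.util.function.UnaryOperator;
import org.apache.solr.common.SolrException;

public class RangeWalk {
  static <T extends Comparable<T>> void walkRanges(T start, T end,
      UnaryOperator<T> addGap, boolean hardEnd) {
    T low = start;
    while (low.compareTo(end) < 0) {
      T high = addGap.apply(low);
      if (end.compareTo(high) < 0) {
        if (hardEnd) high = end;  // clip the final bucket at 'end'
        else end = high;          // or stretch 'end' out to the bucket edge
      }
      if (high.compareTo(low) <= 0) {  // gap did not advance: zero, negative, or overflow
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "range facet infinite loop (is gap negative? did the math overflow?)");
      }
      // ... count documents in [low, high) here ...
      low = high;
    }
  }

  public static void main(String[] args) {
    walkRanges(0, 95, x -> x + 10, true);  // buckets 0-10, ..., 90-95 (clipped)
  }
}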
// in core/src/java/org/apache/solr/request/SimpleFacets.java
protected int rangeCount(SchemaField sf, String low, String high, boolean iLow, boolean iHigh)
    throws IOException {
  Query rangeQ = sf.getType().getRangeQuery(null, sf, low, high, iLow, iHigh);
  return searcher.numDocs(rangeQ, base);
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated
protected int rangeCount(SchemaField sf, Date low, Date high, boolean iLow, boolean iHigh)
    throws IOException {
  Query rangeQ = ((DateField) (sf.getType())).getRangeQuery(null, sf, low, high, iLow, iHigh);
  return searcher.numDocs(rangeQ, base);
}
// in core/src/java/org/apache/solr/request/UnInvertedField.java
@Override
protected void visitTerm(TermsEnum te, int termNum) throws IOException {
  if (termNum >= maxTermCounts.length) {
    // resize by doubling - for very large number of unique terms, expanding
    // by 4K and resultant GC will dominate uninvert times.  Resize at end if material
    int[] newMaxTermCounts = new int[maxTermCounts.length * 2];
    System.arraycopy(maxTermCounts, 0, newMaxTermCounts, 0, termNum);
    maxTermCounts = newMaxTermCounts;
  }
  final BytesRef term = te.term();
  if (te.docFreq() > maxTermDocFreq) {
    TopTerm topTerm = new TopTerm();
    topTerm.term = BytesRef.deepCopyOf(term);
    topTerm.termNum = termNum;
    bigTerms.put(topTerm.termNum, topTerm);
    if (deState == null) {
      deState = new SolrIndexSearcher.DocsEnumState();
      deState.fieldName = field;
      // deState.termsEnum = te.tenum;
      deState.termsEnum = te;  // TODO: check for MultiTermsEnum in SolrIndexSearcher could now fail?
      deState.docsEnum = docsEnum;
      deState.minSetSizeCached = maxTermDocFreq;
    }
    docsEnum = deState.docsEnum;
    DocSet set = searcher.getDocSet(deState);
    maxTermCounts[termNum] = set.size();
  }
}
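The "resize by doubling" comment above is the standard amortization argument: geometric growth makes repeated expansion cost amortized O(1) per element, where fixed 4K increments would cost O(n) per step plus GC churn for fields with very many unique terms. The growth policy in isolation (a helper of ours, not Solr's):

static int[] grow(int[] arr, int neededIndex) {
  if (neededIndex < arr.length) return arr;
  // at least double; jump straight to neededIndex+1 if doubling is not enough
  int[] bigger = new int[Math.max(arr.length * 2, neededIndex + 1)];
  System.arraycopy(arr, 0, bigger, 0, arr.length);
  return bigger;
}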
// in core/src/java/org/apache/solr/request/UnInvertedField.java
public NamedList<Integer> getCounts(SolrIndexSearcher searcher, DocSet baseDocs, int offset, int limit, Integer mincount, boolean missing, String sort, String prefix) throws IOException { use.incrementAndGet(); FieldType ft = searcher.getSchema().getFieldType(field); NamedList<Integer> res = new NamedList<Integer>(); // order is important DocSet docs = baseDocs; int baseSize = docs.size(); int maxDoc = searcher.maxDoc(); //System.out.println("GET COUNTS field=" + field + " baseSize=" + baseSize + " minCount=" + mincount + " maxDoc=" + maxDoc + " numTermsInField=" + numTermsInField); if (baseSize >= mincount) { final int[] index = this.index; // tricky: we add more more element than we need because we will reuse this array later // for ordering term ords before converting to term labels. final int[] counts = new int[numTermsInField + 1]; // // If there is prefix, find it's start and end term numbers // int startTerm = 0; int endTerm = numTermsInField; // one past the end TermsEnum te = getOrdTermsEnum(searcher.getAtomicReader()); if (te != null && prefix != null && prefix.length() > 0) { final BytesRef prefixBr = new BytesRef(prefix); if (te.seekCeil(prefixBr, true) == TermsEnum.SeekStatus.END) { startTerm = numTermsInField; } else { startTerm = (int) te.ord(); } prefixBr.append(UnicodeUtil.BIG_TERM); if (te.seekCeil(prefixBr, true) == TermsEnum.SeekStatus.END) { endTerm = numTermsInField; } else { endTerm = (int) te.ord(); } } /*********** // Alternative 2: get the docSet of the prefix (could take a while) and // then do the intersection with the baseDocSet first. if (prefix != null && prefix.length() > 0) { docs = searcher.getDocSet(new ConstantScorePrefixQuery(new Term(field, ft.toInternal(prefix))), docs); // The issue with this method are problems of returning 0 counts for terms w/o // the prefix. We can't just filter out those terms later because it may // mean that we didn't collect enough terms in the queue (in the sorted case). } ***********/ boolean doNegative = baseSize > maxDoc >> 1 && termInstances > 0 && startTerm==0 && endTerm==numTermsInField && docs instanceof BitDocSet; if (doNegative) { OpenBitSet bs = (OpenBitSet)((BitDocSet)docs).getBits().clone(); bs.flip(0, maxDoc); // TODO: when iterator across negative elements is available, use that // instead of creating a new bitset and inverting. docs = new BitDocSet(bs, maxDoc - baseSize); // simply negating will mean that we have deleted docs in the set. // that should be OK, as their entries in our table should be empty. //System.out.println(" NEG"); } // For the biggest terms, do straight set intersections for (TopTerm tt : bigTerms.values()) { //System.out.println(" do big termNum=" + tt.termNum + " term=" + tt.term.utf8ToString()); // TODO: counts could be deferred if sorted==false if (tt.termNum >= startTerm && tt.termNum < endTerm) { counts[tt.termNum] = searcher.numDocs(new TermQuery(new Term(field, tt.term)), docs); //System.out.println(" count=" + counts[tt.termNum]); } else { //System.out.println("SKIP term=" + tt.termNum); } } // TODO: we could short-circuit counting altogether for sorted faceting // where we already have enough terms from the bigTerms // TODO: we could shrink the size of the collection array, and // additionally break when the termNumber got above endTerm, but // it would require two extra conditionals in the inner loop (although // they would be predictable for the non-prefix case). // Perhaps a different copy of the code would be warranted. 
if (termInstances > 0) { DocIterator iter = docs.iterator(); while (iter.hasNext()) { int doc = iter.nextDoc(); //System.out.println("iter doc=" + doc); int code = index[doc]; if ((code & 0xff)==1) { //System.out.println(" ptr"); int pos = code>>>8; int whichArray = (doc >>> 16) & 0xff; byte[] arr = tnums[whichArray]; int tnum = 0; for(;;) { int delta = 0; for(;;) { byte b = arr[pos++]; delta = (delta << 7) | (b & 0x7f); if ((b & 0x80) == 0) break; } if (delta == 0) break; tnum += delta - TNUM_OFFSET; //System.out.println(" tnum=" + tnum); counts[tnum]++; } } else { //System.out.println(" inlined"); int tnum = 0; int delta = 0; for (;;) { delta = (delta << 7) | (code & 0x7f); if ((code & 0x80)==0) { if (delta==0) break; tnum += delta - TNUM_OFFSET; //System.out.println(" tnum=" + tnum); counts[tnum]++; delta = 0; } code >>>= 8; } } } } final CharsRef charsRef = new CharsRef(); int off=offset; int lim=limit>=0 ? limit : Integer.MAX_VALUE; if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) { int maxsize = limit>0 ? offset+limit : Integer.MAX_VALUE-1; maxsize = Math.min(maxsize, numTermsInField); LongPriorityQueue queue = new LongPriorityQueue(Math.min(maxsize,1000), maxsize, Long.MIN_VALUE); int min=mincount-1; // the smallest value in the top 'N' values //System.out.println("START=" + startTerm + " END=" + endTerm); for (int i=startTerm; i<endTerm; i++) { int c = doNegative ? maxTermCounts[i] - counts[i] : counts[i]; if (c>min) { // NOTE: we use c>min rather than c>=min as an optimization because we are going in // index order, so we already know that the keys are ordered. This can be very // important if a lot of the counts are repeated (like zero counts would be). // smaller term numbers sort higher, so subtract the term number instead long pair = (((long)c)<<32) + (Integer.MAX_VALUE - i); boolean displaced = queue.insert(pair); if (displaced) min=(int)(queue.top() >>> 32); } } // now select the right page from the results // if we are deep paging, we don't have to order the highest "offset" counts. int collectCount = Math.max(0, queue.size() - off); assert collectCount <= lim; // the start and end indexes of our list "sorted" (starting with the highest value) int sortedIdxStart = queue.size() - (collectCount - 1); int sortedIdxEnd = queue.size() + 1; final long[] sorted = queue.sort(collectCount); final int[] indirect = counts; // reuse the counts array for the index into the tnums array assert indirect.length >= sortedIdxEnd; for (int i=sortedIdxStart; i<sortedIdxEnd; i++) { long pair = sorted[i]; int c = (int)(pair >>> 32); int tnum = Integer.MAX_VALUE - (int)pair; indirect[i] = i; // store the index for indirect sorting sorted[i] = tnum; // reuse the "sorted" array to store the term numbers for indirect sorting // add a null label for now... we'll fill it in later. 
res.add(null, c); } // now sort the indexes by the term numbers PrimUtils.sort(sortedIdxStart, sortedIdxEnd, indirect, new PrimUtils.IntComparator() { @Override public int compare(int a, int b) { return (int)sorted[a] - (int)sorted[b]; } @Override public boolean lessThan(int a, int b) { return sorted[a] < sorted[b]; } @Override public boolean equals(int a, int b) { return sorted[a] == sorted[b]; } }); // convert the term numbers to term values and set // as the label //System.out.println("sortStart=" + sortedIdxStart + " end=" + sortedIdxEnd); for (int i=sortedIdxStart; i<sortedIdxEnd; i++) { int idx = indirect[i]; int tnum = (int)sorted[idx]; final String label = getReadableValue(getTermValue(te, tnum), ft, charsRef); //System.out.println(" label=" + label); res.setName(idx - sortedIdxStart, label); } } else { // add results in index order int i=startTerm; if (mincount<=0) { // if mincount<=0, then we won't discard any terms and we know exactly // where to start. i=startTerm+off; off=0; } for (; i<endTerm; i++) { int c = doNegative ? maxTermCounts[i] - counts[i] : counts[i]; if (c<mincount || --off>=0) continue; if (--lim<0) break; final String label = getReadableValue(getTermValue(te, i), ft, charsRef); res.add(label, c); } } }
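The inner loops of getCounts() above decode a byte-packed list of term numbers: each term number is stored as a delta from the previous one, biased by TNUM_OFFSET, in big-endian 7-bit groups where the high bit of a byte means "more bytes follow", and a delta of zero terminates the list. A standalone decoder for that layout (our class; TNUM_OFFSET=2 matches the constant used by UnInvertedField):

public class TermNumDecoder {
  static final int TNUM_OFFSET = 2;

  static java.util.List<Integer> decode(byte[] arr, int pos) {
    java.util.List<Integer> tnums = new java.util.ArrayList<>();
    int tnum = 0;
    for (;;) {
      int delta = 0;
      for (;;) {                      // accumulate big-endian 7-bit groups
        byte b = arr[pos++];
        delta = (delta << 7) | (b & 0x7f);
        if ((b & 0x80) == 0) break;   // high bit clear: last byte of this delta
      }
      if (delta == 0) break;          // zero delta terminates the list
      tnum += delta - TNUM_OFFSET;    // deltas are stored biased by TNUM_OFFSET
      tnums.add(tnum);
    }
    return tnums;
  }

  public static void main(String[] args) {
    // term numbers 5 and 9: deltas 5+2=7 and 4+2=6, then the 0 terminator
    System.out.println(decode(new byte[]{7, 6, 0}, 0));  // prints [5, 9]
  }
}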
// in core/src/java/org/apache/solr/request/UnInvertedField.java
public StatsValues getStats(SolrIndexSearcher searcher, DocSet baseDocs, String[] facet) throws IOException { //this function is ripped off nearly wholesale from the getCounts function to use //for multiValued fields within the StatsComponent. may be useful to find common //functionality between the two and refactor code somewhat use.incrementAndGet(); SchemaField sf = searcher.getSchema().getField(field); // FieldType ft = sf.getType(); StatsValues allstats = StatsValuesFactory.createStatsValues(sf); DocSet docs = baseDocs; int baseSize = docs.size(); int maxDoc = searcher.maxDoc(); if (baseSize <= 0) return allstats; DocSet missing = docs.andNot( searcher.getDocSet(new TermRangeQuery(field, null, null, false, false)) ); int i = 0; final FieldFacetStats[] finfo = new FieldFacetStats[facet.length]; //Initialize facetstats, if facets have been passed in FieldCache.DocTermsIndex si; for (String f : facet) { SchemaField facet_sf = searcher.getSchema().getField(f); try { si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), f); } catch (IOException e) { throw new RuntimeException("failed to open field cache for: " + f, e); } finfo[i] = new FieldFacetStats(f, si, sf, facet_sf, numTermsInField); i++; } final int[] index = this.index; final int[] counts = new int[numTermsInField];//keep track of the number of times we see each word in the field for all the documents in the docset TermsEnum te = getOrdTermsEnum(searcher.getAtomicReader()); boolean doNegative = false; if (finfo.length == 0) { //if we're collecting statistics with a facet field, can't do inverted counting doNegative = baseSize > maxDoc >> 1 && termInstances > 0 && docs instanceof BitDocSet; } if (doNegative) { OpenBitSet bs = (OpenBitSet) ((BitDocSet) docs).getBits().clone(); bs.flip(0, maxDoc); // TODO: when iterator across negative elements is available, use that // instead of creating a new bitset and inverting. docs = new BitDocSet(bs, maxDoc - baseSize); // simply negating will mean that we have deleted docs in the set. // that should be OK, as their entries in our table should be empty. 
} // For the biggest terms, do straight set intersections for (TopTerm tt : bigTerms.values()) { // TODO: counts could be deferred if sorted==false if (tt.termNum >= 0 && tt.termNum < numTermsInField) { final Term t = new Term(field, tt.term); if (finfo.length == 0) { counts[tt.termNum] = searcher.numDocs(new TermQuery(t), docs); } else { //COULD BE VERY SLOW //if we're collecting stats for facet fields, we need to iterate on all matching documents DocSet bigTermDocSet = searcher.getDocSet(new TermQuery(t)).intersection(docs); DocIterator iter = bigTermDocSet.iterator(); while (iter.hasNext()) { int doc = iter.nextDoc(); counts[tt.termNum]++; for (FieldFacetStats f : finfo) { f.facetTermNum(doc, tt.termNum); } } } } } if (termInstances > 0) { DocIterator iter = docs.iterator(); while (iter.hasNext()) { int doc = iter.nextDoc(); int code = index[doc]; if ((code & 0xff) == 1) { int pos = code >>> 8; int whichArray = (doc >>> 16) & 0xff; byte[] arr = tnums[whichArray]; int tnum = 0; for (; ;) { int delta = 0; for (; ;) { byte b = arr[pos++]; delta = (delta << 7) | (b & 0x7f); if ((b & 0x80) == 0) break; } if (delta == 0) break; tnum += delta - TNUM_OFFSET; counts[tnum]++; for (FieldFacetStats f : finfo) { f.facetTermNum(doc, tnum); } } } else { int tnum = 0; int delta = 0; for (; ;) { delta = (delta << 7) | (code & 0x7f); if ((code & 0x80) == 0) { if (delta == 0) break; tnum += delta - TNUM_OFFSET; counts[tnum]++; for (FieldFacetStats f : finfo) { f.facetTermNum(doc, tnum); } delta = 0; } code >>>= 8; } } } } // add results in index order for (i = 0; i < numTermsInField; i++) { int c = doNegative ? maxTermCounts[i] - counts[i] : counts[i]; if (c == 0) continue; BytesRef value = getTermValue(te, i); allstats.accumulate(value, c); //as we've parsed the termnum into a value, lets also accumulate fieldfacet statistics for (FieldFacetStats f : finfo) { f.accumulateTermNum(i, value); } } int c = missing.size(); allstats.addMissing(c); if (finfo.length > 0) { for (FieldFacetStats f : finfo) { Map<String, StatsValues> facetStatsValues = f.facetStatsValues; FieldType facetType = searcher.getSchema().getFieldType(f.name); for (Map.Entry<String,StatsValues> entry : facetStatsValues.entrySet()) { String termLabel = entry.getKey(); int missingCount = searcher.numDocs(new TermQuery(new Term(f.name, facetType.toInternal(termLabel))), missing); entry.getValue().addMissing(missingCount); } allstats.addFacet(f.name, facetStatsValues); } } return allstats; }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
BytesRef getTermValue(TermsEnum te, int termNum) throws IOException {
  //System.out.println("getTermValue termNum=" + termNum + " this=" + this + " numTerms=" + numTermsInField);
  if (bigTerms.size() > 0) {
    // see if the term is one of our big terms.
    TopTerm tt = bigTerms.get(termNum);
    if (tt != null) {
      //System.out.println("  return big " + tt.term);
      return tt.term;
    }
  }
  return lookupTerm(te, termNum);
}
// in core/src/java/org/apache/solr/request/UnInvertedField.java
public static UnInvertedField getUnInvertedField(String field, SolrIndexSearcher searcher)
    throws IOException {
  SolrCache<String,UnInvertedField> cache = searcher.getFieldValueCache();
  if (cache == null) {
    return new UnInvertedField(field, searcher);
  }
  UnInvertedField uif = cache.get(field);
  if (uif == null) {
    synchronized (cache) {
      uif = cache.get(field);
      if (uif == null) {
        uif = new UnInvertedField(field, searcher);
        cache.put(field, uif);
      }
    }
  }
  return uif;
}
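getUnInvertedField() is the sheet's clearest instance of the check/lock/re-check caching idiom: the first unsynchronized get() keeps the hot path lock-free, and the second get() under the lock prevents two threads from both building the expensive value. Its skeleton, with generic names of ours (this assumes, as SolrCache guarantees, that get() and put() are individually thread-safe; with a plain HashMap you would need a ConcurrentMap instead):

static <K, V> V computeIfAbsent(java.util.Map<K, V> cache, K key,
                                java.util.function.Function<K, V> builder) {
  V v = cache.get(key);               // fast path: no lock taken
  if (v == null) {
    synchronized (cache) {
      v = cache.get(key);             // re-check under the lock
      if (v == null) {
        v = builder.apply(key);       // build the expensive value exactly once
        cache.put(key, v);
      }
    }
  }
  return v;
}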
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException { if( abortErrorMessage != null ) { ((HttpServletResponse)response).sendError( 500, abortErrorMessage ); return; } if (this.cores == null) { ((HttpServletResponse)response).sendError( 403, "Server is shutting down" ); return; } CoreContainer cores = this.cores; SolrCore core = null; SolrQueryRequest solrReq = null; if( request instanceof HttpServletRequest) { HttpServletRequest req = (HttpServletRequest)request; HttpServletResponse resp = (HttpServletResponse)response; SolrRequestHandler handler = null; String corename = ""; try { // put the core container in request attribute req.setAttribute("org.apache.solr.CoreContainer", cores); String path = req.getServletPath(); if( req.getPathInfo() != null ) { // this lets you handle /update/commit when /update is a servlet path += req.getPathInfo(); } if( pathPrefix != null && path.startsWith( pathPrefix ) ) { path = path.substring( pathPrefix.length() ); } // check for management path String alternate = cores.getManagementPath(); if (alternate != null && path.startsWith(alternate)) { path = path.substring(0, alternate.length()); } // unused feature ? int idx = path.indexOf( ':' ); if( idx > 0 ) { // save the portion after the ':' for a 'handler' path parameter path = path.substring( 0, idx ); } // Check for the core admin page if( path.equals( cores.getAdminPath() ) ) { handler = cores.getMultiCoreHandler(); solrReq = adminRequestParser.parse(null,path, req); handleAdminRequest(req, response, handler, solrReq); return; } else { //otherwise, we should find a core from the path idx = path.indexOf( "/", 1 ); if( idx > 1 ) { // try to get the corename as a request parameter first corename = path.substring( 1, idx ); core = cores.getCore(corename); if (core != null) { path = path.substring( idx ); } } if (core == null) { if (!cores.isZooKeeperAware() ) { core = cores.getCore(""); } } } if (core == null && cores.isZooKeeperAware()) { // we couldn't find the core - lets make sure a collection was not specified instead core = getCoreByCollection(cores, corename, path); if (core != null) { // we found a core, update the path path = path.substring( idx ); } else { // try the default core core = cores.getCore(""); } // TODO: if we couldn't find it locally, look on other nodes } // With a valid core... 
if( core != null ) { final SolrConfig config = core.getSolrConfig(); // get or create/cache the parser for the core SolrRequestParsers parser = null; parser = parsers.get(config); if( parser == null ) { parser = new SolrRequestParsers(config); parsers.put(config, parser ); } // Determine the handler from the url path if not set // (we might already have selected the cores handler) if( handler == null && path.length() > 1 ) { // don't match "" or "/" as valid path handler = core.getRequestHandler( path ); // no handler yet but allowed to handle select; let's check if( handler == null && parser.isHandleSelect() ) { if( "/select".equals( path ) || "/select/".equals( path ) ) { solrReq = parser.parse( core, path, req ); String qt = solrReq.getParams().get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } if( qt != null && qt.startsWith("/") && (handler instanceof ContentStreamHandlerBase)) { //For security reasons it's a bad idea to allow a leading '/', ex: /select?qt=/update see SOLR-3161 //There was no restriction from Solr 1.4 thru 3.5 and it's not supported for update handlers. throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid query type. Do not use /select to access: "+qt); } } } } // With a valid handler and a valid core... if( handler != null ) { // if not a /select, create the request if( solrReq == null ) { solrReq = parser.parse( core, path, req ); } final Method reqMethod = Method.getMethod(req.getMethod()); HttpCacheHeaderUtil.setCacheControlHeader(config, resp, reqMethod); // unless we have been explicitly told not to, do cache validation // if we fail cache validation, execute the query if (config.getHttpCachingConfig().isNever304() || !HttpCacheHeaderUtil.doCacheHeaderValidation(solrReq, req, reqMethod, resp)) { SolrQueryResponse solrRsp = new SolrQueryResponse(); /* even for HEAD requests, we need to execute the handler to * ensure we don't get an error (and to make sure the correct * QueryResponseWriter is selected and we get the correct * Content-Type) */ SolrRequestInfo.setRequestInfo(new SolrRequestInfo(solrReq, solrRsp)); this.execute( req, handler, solrReq, solrRsp ); HttpCacheHeaderUtil.checkHttpCachingVeto(solrRsp, resp, reqMethod); // add info to http headers //TODO: See SOLR-232 and SOLR-267. /*try { NamedList solrRspHeader = solrRsp.getResponseHeader(); for (int i=0; i<solrRspHeader.size(); i++) { ((javax.servlet.http.HttpServletResponse) response).addHeader(("Solr-" + solrRspHeader.getName(i)), String.valueOf(solrRspHeader.getVal(i))); } } catch (ClassCastException cce) { log.log(Level.WARNING, "exception adding response header log information", cce); }*/ QueryResponseWriter responseWriter = core.getQueryResponseWriter(solrReq); writeResponse(solrRsp, response, responseWriter, solrReq, reqMethod); } return; // we are done with a valid handler } } log.debug("no handler or core retrieved for " + path + ", follow through..."); } catch (Throwable ex) { sendError( core, solrReq, request, (HttpServletResponse)response, ex ); return; } finally { if( solrReq != null ) { solrReq.close(); } if (core != null) { core.close(); } SolrRequestInfo.clearRequestInfo(); } } // Otherwise let the webapp handle the request chain.doFilter(request, response); }
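doFilter() above is the application's central error funnel: every failure in dispatch, checked or unchecked, falls into one catch (Throwable) that routes to sendError(), while the finally block releases the per-request SolrQueryRequest and SolrCore. This single choke point is a large part of why catch blocks can stay so centralized here. A compressed sketch of the shape (handle(), sendError(), and cleanup() are placeholders of ours, not Solr's methods):

import java.io.IOException;
import javax.servlet.*;
import javax.servlet.http.*;

public abstract class FunnelFilter implements Filter {
  protected abstract void handle(HttpServletRequest req, HttpServletResponse resp) throws Exception;
  protected abstract void sendError(HttpServletResponse resp, Throwable ex) throws IOException;
  protected abstract void cleanup();

  public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
      throws IOException, ServletException {
    if (!(request instanceof HttpServletRequest)) {
      chain.doFilter(request, response);  // not HTTP: let the webapp handle it
      return;
    }
    try {
      handle((HttpServletRequest) request, (HttpServletResponse) response);
    } catch (Throwable ex) {
      sendError((HttpServletResponse) response, ex);  // one choke point for all failures
    } finally {
      cleanup();  // release per-request resources no matter what happened
    }
  }

  public void init(FilterConfig config) {}
  public void destroy() {}
}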
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
private void handleAdminRequest(HttpServletRequest req, ServletResponse response,
    SolrRequestHandler handler, SolrQueryRequest solrReq) throws IOException {
  SolrQueryResponse solrResp = new SolrQueryResponse();
  final NamedList<Object> responseHeader = new SimpleOrderedMap<Object>();
  solrResp.add("responseHeader", responseHeader);
  NamedList toLog = solrResp.getToLog();
  toLog.add("webapp", req.getContextPath());
  toLog.add("path", solrReq.getContext().get("path"));
  toLog.add("params", "{" + solrReq.getParamString() + "}");
  handler.handleRequest(solrReq, solrResp);
  SolrCore.setResponseHeaderValues(handler, solrReq, solrResp);
  StringBuilder sb = new StringBuilder();
  for (int i = 0; i < toLog.size(); i++) {
    String name = toLog.getName(i);
    Object val = toLog.getVal(i);
    sb.append(name).append("=").append(val).append(" ");
  }
  QueryResponseWriter respWriter = SolrCore.DEFAULT_RESPONSE_WRITERS.get(solrReq.getParams().get(CommonParams.WT));
  if (respWriter == null) respWriter = SolrCore.DEFAULT_RESPONSE_WRITERS.get("standard");
  writeResponse(solrResp, response, respWriter, solrReq, Method.getMethod(req.getMethod()));
}
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
private void writeResponse(SolrQueryResponse solrRsp, ServletResponse response,
    QueryResponseWriter responseWriter, SolrQueryRequest solrReq, Method reqMethod)
    throws IOException {
  // Now write it out
  final String ct = responseWriter.getContentType(solrReq, solrRsp);
  // don't call setContentType on null
  if (null != ct) response.setContentType(ct);
  if (solrRsp.getException() != null) {
    NamedList info = new SimpleOrderedMap();
    int code = getErrorInfo(solrRsp.getException(), info);
    solrRsp.add("error", info);
    ((HttpServletResponse) response).setStatus(code);
  }
  if (Method.HEAD != reqMethod) {
    if (responseWriter instanceof BinaryQueryResponseWriter) {
      BinaryQueryResponseWriter binWriter = (BinaryQueryResponseWriter) responseWriter;
      binWriter.write(response.getOutputStream(), solrReq, solrRsp);
    } else {
      String charset = ContentStreamBase.getCharsetFromContentType(ct);
      Writer out = (charset == null || charset.equalsIgnoreCase("UTF-8"))
          ? new OutputStreamWriter(response.getOutputStream(), UTF8)
          : new OutputStreamWriter(response.getOutputStream(), charset);
      out = new FastWriter(out);
      responseWriter.write(out, solrReq, solrRsp);
      out.flush();
    }
  }
  //else http HEAD request, nothing to write out, waited this long just to get ContentType
}
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
protected void sendError(SolrCore core, SolrQueryRequest req, ServletRequest request,
    HttpServletResponse response, Throwable ex) throws IOException {
  try {
    SolrQueryResponse solrResp = new SolrQueryResponse();
    if (ex instanceof Exception) {
      solrResp.setException((Exception) ex);
    } else {
      solrResp.setException(new RuntimeException(ex));
    }
    if (core == null) {
      core = cores.getCore("");  // default core
    }
    if (req == null) {
      req = new SolrQueryRequestBase(core, new ServletSolrParams(request)) {};
    }
    QueryResponseWriter writer = core.getQueryResponseWriter(req);
    writeResponse(solrResp, response, writer, req, Method.GET);
  } catch (Throwable t) {
    // This error really does not matter
    SimpleOrderedMap info = new SimpleOrderedMap();
    int code = getErrorInfo(ex, info);
    response.sendError(code, info.toString());
  }
}
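Note that sendError() guards its own error path: rendering the rich error response can itself fail (no default core, a broken writer), so the whole body sits in a try/catch that degrades to the container's plain error page. The belt-and-braces shape, restated with a hypothetical helper of ours:

static void safeSendError(javax.servlet.http.HttpServletResponse resp, Throwable ex)
    throws java.io.IOException {
  try {
    writeRichErrorResponse(resp, ex);     // may itself fail: no default core, broken writer...
  } catch (Throwable t) {
    // the rich error path failed; fall back to the container's plain error page
    resp.sendError(500, ex.toString());
  }
}

// placeholder for the SolrQueryResponse/QueryResponseWriter sequence above
static void writeRichErrorResponse(javax.servlet.http.HttpServletResponse resp, Throwable ex)
    throws java.io.IOException { }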
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override
public void doGet(HttpServletRequest request, HttpServletResponse response)
    throws IOException, ServletException {
  response.setCharacterEncoding("UTF-8");
  response.setContentType("application/json");
  // This attribute is set by the SolrDispatchFilter
  CoreContainer cores = (CoreContainer) request.getAttribute("org.apache.solr.CoreContainer");
  String path = request.getParameter("path");
  String addr = request.getParameter("addr");
  if (addr != null && addr.length() == 0) {
    addr = null;
  }
  String detailS = request.getParameter("detail");
  boolean detail = detailS != null && detailS.equals("true");
  String dumpS = request.getParameter("dump");
  boolean dump = dumpS != null && dumpS.equals("true");
  PrintWriter out = response.getWriter();
  ZKPrinter printer = new ZKPrinter(response, out, cores.getZkController(), addr);
  printer.detail = detail;
  printer.dump = dump;
  try {
    printer.print(path);
  } finally {
    printer.close();
  }
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override
public void doPost(HttpServletRequest request, HttpServletResponse response)
    throws IOException, ServletException {
  doGet(request, response);
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
void print(String path) throws IOException {
  if (zkClient == null) {
    return;
  }
  // normalize path
  if (path == null) {
    path = "/";
  } else {
    path = path.trim();
    if (path.length() == 0) {
      path = "/";
    }
  }
  if (path.endsWith("/") && path.length() > 1) {
    path = path.substring(0, path.length() - 1);
  }
  int idx = path.lastIndexOf('/');
  String parent = idx >= 0 ? path.substring(0, idx) : path;
  if (parent.length() == 0) {
    parent = "/";
  }
  CharArr chars = new CharArr();
  JSONWriter json = new JSONWriter(chars, 2);
  json.startObject();
  if (detail) {
    if (!printZnode(json, path)) {
      return;
    }
    json.writeValueSeparator();
  }
  json.writeString("tree");
  json.writeNameSeparator();
  json.startArray();
  if (!printTree(json, path)) {
    return;  // there was an error
  }
  json.endArray();
  json.endObject();
  out.println(chars.toString());
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
boolean printTree(JSONWriter json, String path) throws IOException { String label = path; if (!fullpath) { int idx = path.lastIndexOf('/'); label = idx > 0 ? path.substring(idx + 1) : path; } json.startObject(); //writeKeyValue(json, "data", label, true ); json.writeString("data"); json.writeNameSeparator(); json.startObject(); writeKeyValue(json, "title", label, true); json.writeValueSeparator(); json.writeString("attr"); json.writeNameSeparator(); json.startObject(); writeKeyValue(json, "href", "zookeeper?detail=true&path=" + URLEncoder.encode(path, "UTF-8"), true); json.endObject(); json.endObject(); Stat stat = new Stat(); try { // Trickily, the call to zkClient.getData fills in the stat variable byte[] data = zkClient.getData(path, null, stat, true); if (stat.getEphemeralOwner() != 0) { writeKeyValue(json, "ephemeral", true, false); writeKeyValue(json, "version", stat.getVersion(), false); } if (dump) { json.writeValueSeparator(); printZnode(json, path); } /* if (stat.getNumChildren() != 0) { writeKeyValue(json, "children_count", stat.getNumChildren(), false ); out.println(", \"children_count\" : \"" + stat.getNumChildren() + "\""); } */ //if (stat.getDataLength() != 0) if (data != null) { String str = new BytesRef(data).utf8ToString(); //?? writeKeyValue(json, "content", str, false ); // Does nothing now, but on the assumption this will be used later we'll leave it in. If it comes out // the catches below need to be restructured. } } catch (IllegalArgumentException e) { // path doesn't exist (must have been removed) writeKeyValue(json, "warning", "(path gone)", false); } catch (KeeperException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("Keeper Exception", e); } catch (InterruptedException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("InterruptedException", e); } if (stat.getNumChildren() > 0) { json.writeValueSeparator(); if (indent) { json.indent(); } json.writeString("children"); json.writeNameSeparator(); json.startArray(); try { List<String> children = zkClient.getChildren(path, null, true); java.util.Collections.sort(children); boolean first = true; for (String child : children) { if (!first) { json.writeValueSeparator(); } String childPath = path + (path.endsWith("/") ? "" : "/") + child; if (!printTree(json, childPath)) { return false; } first = false; } } catch (KeeperException e) { writeError(500, e.toString()); return false; } catch (InterruptedException e) { writeError(500, e.toString()); return false; } catch (IllegalArgumentException e) { // path doesn't exist (must have been removed) json.writeString("(children gone)"); } json.endArray(); } json.endObject(); return true; }
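printTree() maps each failure to a different severity: a vanished node (IllegalArgumentException) becomes a "warning" entry in the JSON, KeeperException is logged and reported inline, and a failure while listing children aborts with writeError(500, ...). One detail worth noting is that InterruptedException is caught without restoring the thread's interrupt status. A common hardening of that branch, shown as a sketch against the same surrounding members (zkClient, json; writeWarning() is an illustrative helper of ours):

void listChildren(JSONWriter json, String path) {
  try {
    java.util.List<String> children = zkClient.getChildren(path, null, true);
    // ... recurse into the children as printTree() does ...
  } catch (KeeperException e) {
    writeWarning(json, e.toString());
  } catch (InterruptedException e) {
    Thread.currentThread().interrupt();  // restore the flag so callers still see the cancellation
    writeWarning(json, e.toString());
  }
}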
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
boolean printZnode(JSONWriter json, String path) throws IOException { try { Stat stat = new Stat(); // Trickily, the call to zkClient.getData fills in the stat variable byte[] data = zkClient.getData(path, null, stat, true); json.writeString("znode"); json.writeNameSeparator(); json.startObject(); writeKeyValue(json, "path", path, true); json.writeValueSeparator(); json.writeString("prop"); json.writeNameSeparator(); json.startObject(); writeKeyValue(json, "version", stat.getVersion(), true); writeKeyValue(json, "aversion", stat.getAversion(), false); writeKeyValue(json, "children_count", stat.getNumChildren(), false); writeKeyValue(json, "ctime", time(stat.getCtime()), false); writeKeyValue(json, "cversion", stat.getCversion(), false); writeKeyValue(json, "czxid", stat.getCzxid(), false); writeKeyValue(json, "dataLength", stat.getDataLength(), false); writeKeyValue(json, "ephemeralOwner", stat.getEphemeralOwner(), false); writeKeyValue(json, "mtime", time(stat.getMtime()), false); writeKeyValue(json, "mzxid", stat.getMzxid(), false); writeKeyValue(json, "pzxid", stat.getPzxid(), false); json.endObject(); if (data != null) { writeKeyValue(json, "data", new BytesRef(data).utf8ToString(), false); } json.endObject(); } catch (KeeperException e) { writeError(500, e.toString()); return false; } catch (InterruptedException e) { writeError(500, e.toString()); return false; } return true; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public InputStream getStream() throws IOException { return req.getInputStream(); }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public InputStream getStream() throws IOException { return item.getInputStream(); }
// in core/src/java/org/apache/solr/servlet/LoadAdminUiServlet.java
@Override
public void doGet(HttpServletRequest request, HttpServletResponse response)
    throws IOException, ServletException {
  response.setCharacterEncoding("UTF-8");
  response.setContentType("text/html");
  PrintWriter out = response.getWriter();
  InputStream in = getServletContext().getResourceAsStream("/admin.html");
  if (in != null) {
    try {
      // This attribute is set by the SolrDispatchFilter
      CoreContainer cores = (CoreContainer) request.getAttribute("org.apache.solr.CoreContainer");
      String html = IOUtils.toString(in, "UTF-8");
      String[] search = new String[] { "${contextPath}", "${adminPath}" };
      String[] replace = new String[] {
          StringEscapeUtils.escapeJavaScript(request.getContextPath()),
          StringEscapeUtils.escapeJavaScript(cores.getAdminPath()) };
      out.println(StringUtils.replaceEach(html, search, replace));
    } finally {
      IOUtils.closeQuietly(in);
    }
  } else {
    out.println("solr");
  }
}
// in core/src/java/org/apache/solr/servlet/LoadAdminUiServlet.java
@Override
public void doPost(HttpServletRequest request, HttpServletResponse response)
    throws IOException, ServletException {
  doGet(request, response);
}
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void doGet(HttpServletRequest req, HttpServletResponse res) throws ServletException,IOException { res.setStatus(code); res.setHeader("Location", destination); }
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void doPost(HttpServletRequest req, HttpServletResponse res) throws ServletException,IOException { doGet(req,res); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static void sendNotModified(HttpServletResponse res) throws IOException { res.setStatus(HttpServletResponse.SC_NOT_MODIFIED); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static void sendPreconditionFailed(HttpServletResponse res) throws IOException { res.setStatus(HttpServletResponse.SC_PRECONDITION_FAILED); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static boolean doCacheHeaderValidation(final SolrQueryRequest solrReq,
    final HttpServletRequest req, final Method reqMethod, final HttpServletResponse resp)
    throws IOException {
  if (Method.POST == reqMethod || Method.OTHER == reqMethod) {
    return false;
  }
  final long lastMod = HttpCacheHeaderUtil.calcLastModified(solrReq);
  final String etag = HttpCacheHeaderUtil.calcEtag(solrReq);
  resp.setDateHeader("Last-Modified", lastMod);
  resp.setHeader("ETag", etag);
  if (checkETagValidators(req, resp, reqMethod, etag)) {
    return true;
  }
  if (checkLastModValidators(req, resp, lastMod)) {
    return true;
  }
  return false;
}
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static boolean checkLastModValidators(final HttpServletRequest req,
    final HttpServletResponse resp, final long lastMod) throws IOException {
  try {
    // Check If-Modified-Since first because it is the header most
    // commonly used by HTTP clients
    final long modifiedSince = req.getDateHeader("If-Modified-Since");
    if (modifiedSince != -1L && lastMod <= modifiedSince) {
      // Send a "not-modified"
      sendNotModified(resp);
      return true;
    }
    final long unmodifiedSince = req.getDateHeader("If-Unmodified-Since");
    if (unmodifiedSince != -1L && lastMod > unmodifiedSince) {
      // Send a "precondition failed"
      sendPreconditionFailed(resp);
      return true;
    }
  } catch (IllegalArgumentException iae) {
    // one of our date headers was not formatted properly, ignore it
    /* NOOP */
  }
  return false;
}
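The catch block above is one of the commented empty catches counted in the statistics: HttpServletRequest.getDateHeader() returns -1 when a header is absent, but throws IllegalArgumentException when the header is present and unparseable, and this method deliberately treats the two cases alike. The bare pattern, extracted into a helper of ours:

static long dateHeaderOrMissing(HttpServletRequest req, String name) {
  try {
    return req.getDateHeader(name);  // -1 if absent
  } catch (IllegalArgumentException iae) {
    return -1L;  // treat a malformed header exactly like a missing one
  }
}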
// in core/src/java/org/apache/solr/schema/BCDIntField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  writer.writeInt(name, toExternal(f));
}
// in core/src/java/org/apache/solr/schema/SortableIntField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  String sval = f.stringValue();
  writer.writeInt(name, NumberUtils.SortableStr2int(sval, 0, sval.length()));
}
// in core/src/java/org/apache/solr/schema/SortableIntField.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final int def = defVal; return new DocTermsIndexDocValues(this, readerContext, field) { private final BytesRef spare = new BytesRef(); @Override protected String toTerm(String readableValue) { return NumberUtils.int2sortableStr(readableValue); } @Override public float floatVal(int doc) { return (float)intVal(doc); } @Override public int intVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? def : NumberUtils.SortableStr2int(termsIndex.lookup(ord, spare),0,3); } @Override public long longVal(int doc) { return (long)intVal(doc); } @Override public double doubleVal(int doc) { return (double)intVal(doc); } @Override public String strVal(int doc) { return Integer.toString(intVal(doc)); } @Override public String toString(int doc) { return description() + '=' + intVal(doc); } @Override public Object objectVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? null : NumberUtils.SortableStr2int(termsIndex.lookup(ord, spare)); } @Override public ValueFiller getValueFiller() { return new ValueFiller() { private final MutableValueInt mval = new MutableValueInt(); @Override public MutableValue getValue() { return mval; } @Override public void fillValue(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { mval.value = def; mval.exists = false; } else { mval.value = NumberUtils.SortableStr2int(termsIndex.lookup(ord, spare),0,3); mval.exists = true; } } }; } }; }
// in core/src/java/org/apache/solr/schema/StrField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  writer.writeStr(name, f.stringValue(), true);
}
// in core/src/java/org/apache/solr/schema/DoubleField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  String s = f.stringValue();
  // these values may be from a legacy lucene index, which may
  // not be properly formatted in some output formats, or may
  // incorrectly have a zero length.
  if (s.length() == 0) {
    // zero length value means someone mistakenly indexed the value
    // instead of simply leaving it out.  Write a null value instead of a numeric.
    writer.writeNull(name);
    return;
  }
  try {
    double val = Double.parseDouble(s);
    writer.writeDouble(name, val);
  } catch (NumberFormatException e) {
    // can't parse - write out the contents as a string so nothing is lost and
    // clients don't get a parse error.
    writer.writeStr(name, s, true);
  }
}
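DoubleField.write() chooses graceful degradation over propagation: a stored value that fails numeric parsing is emitted as a string rather than turning the whole response into an error. The same shape as a standalone sketch (writeLenientDouble is our name; TextResponseWriter and its writeNull/writeDouble/writeStr methods are the real Solr API used above):

static void writeLenientDouble(TextResponseWriter writer, String name, String s)
    throws IOException {
  if (s.length() == 0) {
    writer.writeNull(name);  // an empty value was mistakenly indexed; emit null
    return;
  }
  try {
    writer.writeDouble(name, Double.parseDouble(s));
  } catch (NumberFormatException e) {
    writer.writeStr(name, s, true);  // keep the raw value; don't fail the response
  }
}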
// in core/src/java/org/apache/solr/schema/RandomSortField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
}
// in core/src/java/org/apache/solr/schema/RandomSortField.java
@Override
public FieldComparator<Integer> newComparator(final String fieldname, final int numHits,
    int sortPos, boolean reversed) throws IOException {
  return new FieldComparator<Integer>() {
    int seed;
    private final int[] values = new int[numHits];
    int bottomVal;

    @Override
    public int compare(int slot1, int slot2) {
      return values[slot1] - values[slot2];  // values will be positive... no overflow possible.
    }

    @Override
    public void setBottom(int slot) {
      bottomVal = values[slot];
    }

    @Override
    public int compareBottom(int doc) throws IOException {
      return bottomVal - hash(doc + seed);
    }

    @Override
    public void copy(int slot, int doc) throws IOException {
      values[slot] = hash(doc + seed);
    }

    @Override
    public FieldComparator setNextReader(AtomicReaderContext context) throws IOException {
      seed = getSeed(fieldname, context);
      return this;
    }

    @Override
    public Integer value(int slot) {
      return values[slot];
    }

    @Override
    public int compareDocToValue(int doc, Integer valueObj) {
      // values will be positive... no overflow possible.
      return hash(doc + seed) - valueObj.intValue();
    }
  };
}
// in core/src/java/org/apache/solr/schema/RandomSortField.java
@Override
public int compareBottom(int doc) throws IOException {
  return bottomVal - hash(doc + seed);
}
// in core/src/java/org/apache/solr/schema/RandomSortField.java
@Override
public void copy(int slot, int doc) throws IOException {
  values[slot] = hash(doc + seed);
}
// in core/src/java/org/apache/solr/schema/RandomSortField.java
@Override
public FieldComparator setNextReader(AtomicReaderContext context) throws IOException {
  seed = getSeed(fieldname, context);
  return this;
}
// in core/src/java/org/apache/solr/schema/RandomSortField.java
@Override public FunctionValues getValues(Map context, final AtomicReaderContext readerContext) throws IOException { return new IntDocValues(this) { private final int seed = getSeed(field, readerContext); @Override public int intVal(int doc) { return hash(doc+seed); } }; }
// in core/src/java/org/apache/solr/schema/DateField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeDate(name, toExternal(f)); }
// in core/src/java/org/apache/solr/schema/DateField.java
@Override
public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException {
  return new DocTermsIndexDocValues(this, readerContext, field) {
    @Override
    protected String toTerm(String readableValue) {
      // needed for frange queries to work properly
      return ft.toInternal(readableValue);
    }
    @Override
    public float floatVal(int doc) { return (float)intVal(doc); }
    @Override
    public int intVal(int doc) {
      int ord = termsIndex.getOrd(doc);
      return ord;
    }
    @Override
    public long longVal(int doc) { return (long)intVal(doc); }
    @Override
    public double doubleVal(int doc) { return (double)intVal(doc); }
    @Override
    public String strVal(int doc) {
      int ord = termsIndex.getOrd(doc);
      if (ord == 0) {
        return null;
      } else {
        final BytesRef br = termsIndex.lookup(ord, spare);
        return ft.indexedToReadable(br, spareChars).toString();
      }
    }
    @Override
    public Object objectVal(int doc) {
      int ord = termsIndex.getOrd(doc);
      if (ord == 0) {
        return null;
      } else {
        final BytesRef br = termsIndex.lookup(ord, new BytesRef());
        return ft.toObject(null, br);
      }
    }
    @Override
    public String toString(int doc) { return description() + '=' + intVal(doc); }
  };
}
// in core/src/java/org/apache/solr/schema/CurrencyField.java
public void write(XMLWriter xmlWriter, String name, IndexableField field) throws IOException { xmlWriter.writeStr(name, field.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField field) throws IOException { writer.writeStr(name, field.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
public FunctionValues getValues(Map context, AtomicReaderContext reader) throws IOException { final FunctionValues amounts = amountValues.getValues(context, reader); final FunctionValues currencies = currencyValues.getValues(context, reader); return new FunctionValues() { private final int MAX_CURRENCIES_TO_CACHE = 256; private final int[] fractionDigitCache = new int[MAX_CURRENCIES_TO_CACHE]; private final String[] currencyOrdToCurrencyCache = new String[MAX_CURRENCIES_TO_CACHE]; private final double[] exchangeRateCache = new double[MAX_CURRENCIES_TO_CACHE]; private int targetFractionDigits = -1; private int targetCurrencyOrd = -1; private boolean initializedCache; private String getDocCurrencyCode(int doc, int currencyOrd) { if (currencyOrd < MAX_CURRENCIES_TO_CACHE) { String currency = currencyOrdToCurrencyCache[currencyOrd]; if (currency == null) { currencyOrdToCurrencyCache[currencyOrd] = currency = currencies.strVal(doc); } if (currency == null) { currency = defaultCurrency; } if (targetCurrencyOrd == -1 && currency.equals(targetCurrencyCode)) { targetCurrencyOrd = currencyOrd; } return currency; } else { return currencies.strVal(doc); } } public long longVal(int doc) { if (!initializedCache) { for (int i = 0; i < fractionDigitCache.length; i++) { fractionDigitCache[i] = -1; } initializedCache = true; } long amount = amounts.longVal(doc); int currencyOrd = currencies.ordVal(doc); if (currencyOrd == targetCurrencyOrd) { return amount; } double exchangeRate; int sourceFractionDigits; if (targetFractionDigits == -1) { targetFractionDigits = Currency.getInstance(targetCurrencyCode).getDefaultFractionDigits(); } if (currencyOrd < MAX_CURRENCIES_TO_CACHE) { exchangeRate = exchangeRateCache[currencyOrd]; if (exchangeRate <= 0.0) { String sourceCurrencyCode = getDocCurrencyCode(doc, currencyOrd); exchangeRate = exchangeRateCache[currencyOrd] = provider.getExchangeRate(sourceCurrencyCode, targetCurrencyCode); } sourceFractionDigits = fractionDigitCache[currencyOrd]; if (sourceFractionDigits == -1) { String sourceCurrencyCode = getDocCurrencyCode(doc, currencyOrd); sourceFractionDigits = fractionDigitCache[currencyOrd] = Currency.getInstance(sourceCurrencyCode).getDefaultFractionDigits(); } } else { String sourceCurrencyCode = getDocCurrencyCode(doc, currencyOrd); exchangeRate = provider.getExchangeRate(sourceCurrencyCode, targetCurrencyCode); sourceFractionDigits = Currency.getInstance(sourceCurrencyCode).getDefaultFractionDigits(); } return CurrencyValue.convertAmount(exchangeRate, sourceFractionDigits, amount, targetFractionDigits); } public int intVal(int doc) { return (int) longVal(doc); } public double doubleVal(int doc) { return (double) longVal(doc); } public float floatVal(int doc) { return (float) longVal(doc); } public String strVal(int doc) { return Long.toString(longVal(doc)); } public String toString(int doc) { return name() + '(' + amounts.toString(doc) + ',' + currencies.toString(doc) + ')'; } }; }
// in core/src/java/org/apache/solr/schema/SortableFloatField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String sval = f.stringValue(); writer.writeFloat(name, NumberUtils.SortableStr2float(sval)); }
// in core/src/java/org/apache/solr/schema/SortableFloatField.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final float def = defVal; return new DocTermsIndexDocValues(this, readerContext, field) { private final BytesRef spare = new BytesRef(); @Override protected String toTerm(String readableValue) { return NumberUtils.float2sortableStr(readableValue); } @Override public float floatVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? def : NumberUtils.SortableStr2float(termsIndex.lookup(ord, spare)); } @Override public int intVal(int doc) { return (int)floatVal(doc); } @Override public long longVal(int doc) { return (long)floatVal(doc); } @Override public double doubleVal(int doc) { return (double)floatVal(doc); } @Override public String strVal(int doc) { return Float.toString(floatVal(doc)); } @Override public String toString(int doc) { return description() + '=' + floatVal(doc); } @Override public Object objectVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? null : NumberUtils.SortableStr2float(termsIndex.lookup(ord, spare)); } @Override public ValueFiller getValueFiller() { return new ValueFiller() { private final MutableValueFloat mval = new MutableValueFloat(); @Override public MutableValue getValue() { return mval; } @Override public void fillValue(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { mval.value = def; mval.exists = false; } else { mval.value = NumberUtils.SortableStr2float(termsIndex.lookup(ord, spare)); mval.exists = true; } } }; } }; }
// in core/src/java/org/apache/solr/schema/IntField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  String s = f.stringValue();
  // these values may be from a legacy lucene index, which may
  // not be properly formatted in some output formats, or may
  // incorrectly have a zero length.
  if (s.length()==0) {
    // zero length value means someone mistakenly indexed the value
    // instead of simply leaving it out. Write a null value instead of a numeric.
    writer.writeNull(name);
    return;
  }
  try {
    int val = Integer.parseInt(s);
    writer.writeInt(name, val);
  } catch (NumberFormatException e) {
    // can't parse - write out the contents as a string so nothing is lost and
    // clients don't get a parse error.
    writer.writeStr(name, s, true);
  }
}
// in core/src/java/org/apache/solr/schema/FloatField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  String s = f.stringValue();
  // these values may be from a legacy lucene index, which may
  // not be properly formatted in some output formats, or may
  // incorrectly have a zero length.
  if (s.length()==0) {
    // zero length value means someone mistakenly indexed the value
    // instead of simply leaving it out. Write a null value instead of a numeric.
    writer.writeNull(name);
    return;
  }
  try {
    float fval = Float.parseFloat(s);
    writer.writeFloat(name, fval);
  } catch (NumberFormatException e) {
    // can't parse - write out the contents as a string so nothing is lost and
    // clients don't get a parse error.
    writer.writeStr(name, s, true);
  }
}
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeVal(name, toObject(f)); }
// in core/src/java/org/apache/solr/schema/BoolField.java
@Override public TokenStreamComponents createComponents(String fieldName, Reader reader) { Tokenizer tokenizer = new Tokenizer(reader) { final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class); boolean done = false; @Override public void reset(Reader input) throws IOException { done = false; super.reset(input); } @Override public boolean incrementToken() throws IOException { clearAttributes(); if (done) return false; done = true; int ch = input.read(); if (ch==-1) return false; termAtt.copyBuffer( ((ch=='t' || ch=='T' || ch=='1') ? TRUE_TOKEN : FALSE_TOKEN) ,0,1); return true; } }; return new TokenStreamComponents(tokenizer); }
// in core/src/java/org/apache/solr/schema/BoolField.java
@Override public void reset(Reader input) throws IOException { done = false; super.reset(input); }
// in core/src/java/org/apache/solr/schema/BoolField.java
@Override public boolean incrementToken() throws IOException { clearAttributes(); if (done) return false; done = true; int ch = input.read(); if (ch==-1) return false; termAtt.copyBuffer( ((ch=='t' || ch=='T' || ch=='1') ? TRUE_TOKEN : FALSE_TOKEN) ,0,1); return true; }
// in core/src/java/org/apache/solr/schema/BoolField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeBool(name, f.stringValue().charAt(0) == 'T'); }
// in core/src/java/org/apache/solr/schema/BoolField.java
@Override
public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException {
  final FieldCache.DocTermsIndex sindex = FieldCache.DEFAULT.getTermsIndex(readerContext.reader(), field);
  // figure out what ord maps to true
  int nord = sindex.numOrd();
  BytesRef br = new BytesRef();
  int tord = -1;
  for (int i=1; i<nord; i++) {
    sindex.lookup(i, br);
    if (br.length==1 && br.bytes[br.offset]=='T') {
      tord = i;
      break;
    }
  }
  final int trueOrd = tord;
  return new BoolDocValues(this) {
    @Override
    public boolean boolVal(int doc) { return sindex.getOrd(doc) == trueOrd; }
    @Override
    public boolean exists(int doc) { return sindex.getOrd(doc) != 0; }
    @Override
    public ValueFiller getValueFiller() {
      return new ValueFiller() {
        private final MutableValueBool mval = new MutableValueBool();
        @Override
        public MutableValue getValue() { return mval; }
        @Override
        public void fillValue(int doc) {
          int ord = sindex.getOrd(doc);
          mval.value = (ord == trueOrd);
          mval.exists = (ord != 0);
        }
      };
    }
  };
}
// in core/src/java/org/apache/solr/schema/BinaryField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, toBase64String(toObject(f)), false); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public Query rewrite(IndexReader reader) throws IOException { return bboxQuery != null ? bboxQuery.rewrite(reader) : this; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public float getValueForNormalization() throws IOException { queryWeight = getBoost(); return queryWeight * queryWeight; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public Scorer scorer(AtomicReaderContext context, boolean scoreDocsInOrder, boolean topScorer, Bits acceptDocs) throws IOException { return new SpatialScorer(context, acceptDocs, this, queryWeight); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public Explanation explain(AtomicReaderContext context, int doc) throws IOException { return ((SpatialScorer)scorer(context, true, true, context.reader().getLiveDocs())).explain(doc); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public int nextDoc() throws IOException { for(;;) { ++doc; if (doc>=maxDoc) { return doc=NO_MORE_DOCS; } if (acceptDocs != null && !acceptDocs.get(doc)) continue; if (!match()) continue; return doc; } }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override
public int advance(int target) throws IOException {
  // this will work even if target==NO_MORE_DOCS
  doc = target - 1;
  return nextDoc();
}
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public float score() throws IOException { double dist = (doc == lastDistDoc) ? lastDist : dist(latVals.doubleVal(doc), lonVals.doubleVal(doc)); return (float)(dist * qWeight); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
public Explanation explain(int doc) throws IOException {
  advance(doc);
  boolean matched = this.doc == doc;
  this.doc = doc;
  float sc = matched ? score() : 0;
  double dist = dist(latVals.doubleVal(doc), lonVals.doubleVal(doc));
  String description = SpatialDistanceQuery.this.toString();
  Explanation result = new ComplexExplanation(this.doc == doc, sc, description + " product of:");
  // result.addDetail(new Explanation((float)dist, "hsin("+latVals.explain(doc)+","+lonVals.explain(doc)));
  result.addDetail(new Explanation((float)dist, "hsin("+latVals.doubleVal(doc)+","+lonVals.doubleVal(doc)));
  result.addDetail(new Explanation(getBoost(), "boost"));
  result.addDetail(new Explanation(weight.queryNorm, "queryNorm"));
  return result;
}
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public void collect(int doc) throws IOException { spatialScorer.doc = doc; if (spatialScorer.match()) delegate.collect(doc); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { maxdoc = context.reader().maxDoc(); spatialScorer = new SpatialScorer(context, null, weight, 1.0f); super.setNextReader(context); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override
public Weight createWeight(IndexSearcher searcher) throws IOException {
  // if we were supposed to use bboxQuery, then we should have been rewritten using that query
  assert bboxQuery == null;
  return new SpatialWeight(searcher);
}
// in core/src/java/org/apache/solr/schema/ByteField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  String s = f.stringValue();
  // these values may be from a legacy lucene index, which may
  // not be properly formatted in some output formats, or may
  // incorrectly have a zero length.
  if (s.length()==0) {
    // zero length value means someone mistakenly indexed the value
    // instead of simply leaving it out. Write a null value instead of a numeric.
    writer.writeNull(name);
    return;
  }
  try {
    byte val = Byte.parseByte(s);
    writer.writeInt(name, val);
  } catch (NumberFormatException e) {
    // can't parse - write out the contents as a string so nothing is lost and
    // clients don't get a parse error.
    writer.writeStr(name, s, true);
  }
}
// in core/src/java/org/apache/solr/schema/PointType.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/UUIDField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/SimplePreAnalyzedParser.java
@Override
public ParseResult parse(Reader reader, AttributeSource parent) throws IOException {
  ParseResult res = new ParseResult();
  StringBuilder sb = new StringBuilder();
  char[] buf = new char[128];
  int cnt;
  while ((cnt = reader.read(buf)) > 0) {
    sb.append(buf, 0, cnt);
  }
  String val = sb.toString();
  // empty string - accept even without version number
  if (val.length() == 0) {
    return res;
  }
  // first consume the version
  int idx = val.indexOf(' ');
  if (idx == -1) {
    throw new IOException("Missing VERSION token");
  }
  String version = val.substring(0, idx);
  if (!VERSION.equals(version)) {
    throw new IOException("Unknown VERSION " + version);
  }
  val = val.substring(idx + 1);
  // then consume the optional stored part
  int tsStart = 0;
  boolean hasStored = false;
  StringBuilder storedBuf = new StringBuilder();
  if (val.charAt(0) == '=') {
    hasStored = true;
    if (val.length() > 1) {
      for (int i = 1; i < val.length(); i++) {
        char c = val.charAt(i);
        if (c == '\\') {
          if (i < val.length() - 1) {
            c = val.charAt(++i);
            if (c == '=') { // we recognize only \= escape in the stored part
              storedBuf.append('=');
            } else {
              storedBuf.append('\\');
              storedBuf.append(c);
              continue;
            }
          } else {
            storedBuf.append(c);
            continue;
          }
        } else if (c == '=') {
          // end of stored text
          tsStart = i + 1;
          break;
        } else {
          storedBuf.append(c);
        }
      }
      if (tsStart == 0) { // missing end-of-stored marker
        throw new IOException("Missing end marker of stored part");
      }
    } else {
      throw new IOException("Unexpected end of stored field");
    }
  }
  if (hasStored) {
    res.str = storedBuf.toString();
  }
  Tok tok = new Tok();
  StringBuilder attName = new StringBuilder();
  StringBuilder attVal = new StringBuilder();
  // parser state
  S s = S.UNDEF;
  int lastPos = 0;
  for (int i = tsStart; i < val.length(); i++) {
    char c = val.charAt(i);
    if (c == ' ') {
      // collect leftovers
      switch (s) {
        case VALUE :
          if (attVal.length() == 0) {
            throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute.");
          }
          if (attName.length() > 0) {
            tok.attr.put(attName.toString(), attVal.toString());
          }
          break;
        case NAME: // attr name without a value ?
          if (attName.length() > 0) {
            throw new IOException("Unexpected character '" + c + "' at position " + i + " - missing attribute value.");
          } else {
            // accept missing att name and value
          }
          break;
        case TOKEN:
        case UNDEF:
          // do nothing, advance to next token
      }
      attName.setLength(0);
      attVal.setLength(0);
      if (!tok.isEmpty() || s == S.NAME) {
        AttributeSource.State state = createState(parent, tok, lastPos);
        if (state != null) res.states.add(state.clone());
      }
      // reset tok
      s = S.UNDEF;
      tok.reset();
      // skip
      lastPos++;
      continue;
    }
    StringBuilder tgt = null;
    switch (s) {
      case TOKEN: tgt = tok.token; break;
      case NAME: tgt = attName; break;
      case VALUE: tgt = attVal; break;
      case UNDEF:
        tgt = tok.token;
        s = S.TOKEN;
    }
    if (c == '\\') {
      if (s == S.TOKEN) lastPos++;
      if (i >= val.length() - 1) { // end
        tgt.append(c);
        continue;
      } else {
        c = val.charAt(++i);
        switch (c) {
          case '\\' :
          case '=' :
          case ',' :
          case ' ' :
            tgt.append(c);
            break;
          case 'n': tgt.append('\n'); break;
          case 'r': tgt.append('\r'); break;
          case 't': tgt.append('\t'); break;
          default:
            tgt.append('\\');
            tgt.append(c);
            lastPos++;
        }
      }
    } else {
      // state switch
      if (c == ',') {
        if (s == S.TOKEN) {
          s = S.NAME;
        } else if (s == S.VALUE) {
          // end of value, start of next attr
          if (attVal.length() == 0) {
            throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute.");
          }
          if (attName.length() > 0 && attVal.length() > 0) {
            tok.attr.put(attName.toString(), attVal.toString());
          }
          // reset
          attName.setLength(0);
          attVal.setLength(0);
          s = S.NAME;
        } else {
          throw new IOException("Unexpected character '" + c + "' at position " + i + " - missing attribute value.");
        }
      } else if (c == '=') {
        if (s == S.NAME) {
          s = S.VALUE;
        } else {
          throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute.");
        }
      } else {
        tgt.append(c);
        if (s == S.TOKEN) lastPos++;
      }
    }
  }
  // collect leftovers
  if (!tok.isEmpty() || s == S.NAME || s == S.VALUE) {
    // remaining attrib?
    if (s == S.VALUE) {
      if (attName.length() > 0 && attVal.length() > 0) {
        tok.attr.put(attName.toString(), attVal.toString());
      }
    }
    AttributeSource.State state = createState(parent, tok, lastPos);
    if (state != null) res.states.add(state.clone());
  }
  return res;
}
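For reference, the syntax this parser accepts is: the VERSION token, an optional stored part delimited by '=' (with \= as the only recognized escape inside it), then space-separated tokens whose comma-separated attributes use the short keys emitted by toFormattedString below (s/e for offsets, i for position increment, y for type, f for flags, p for payload). Two illustrative well-formed inputs, derived from the state machine above rather than from Solr documentation:

public class SimplePreAnalyzedExamples {
  public static void main(String[] args) {
    // stored value "foo=bar" (equals sign escaped as \=), no tokens
    String storedOnly = "1 =foo\\=bar=";
    // two tokens; the second carries start/end offset attributes s and e
    String twoTokens = "1 one two,s=4,e=7";
    System.out.println(storedOnly);
    System.out.println(twoTokens);
  }
}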
// in core/src/java/org/apache/solr/schema/SimplePreAnalyzedParser.java
public String toFormattedString(Field f) throws IOException {
  StringBuilder sb = new StringBuilder();
  sb.append(VERSION + " ");
  if (f.fieldType().stored()) {
    String s = f.stringValue();
    if (s != null) {
      // encode the equals sign
      s = s.replaceAll("=", "\\\\=");
      sb.append('=');
      sb.append(s);
      sb.append('=');
    }
  }
  TokenStream ts = f.tokenStreamValue();
  if (ts != null) {
    StringBuilder tok = new StringBuilder();
    boolean next = false;
    while (ts.incrementToken()) {
      if (next) {
        sb.append(' ');
      } else {
        next = true;
      }
      tok.setLength(0);
      Iterator<Class<? extends Attribute>> it = ts.getAttributeClassesIterator();
      String cTerm = null;
      String tTerm = null;
      while (it.hasNext()) {
        Class<? extends Attribute> cl = it.next();
        if (!ts.hasAttribute(cl)) {
          continue;
        }
        Attribute att = ts.getAttribute(cl);
        if (cl.isAssignableFrom(CharTermAttribute.class)) {
          CharTermAttribute catt = (CharTermAttribute)att;
          cTerm = escape(catt.buffer(), catt.length());
        } else if (cl.isAssignableFrom(TermToBytesRefAttribute.class)) {
          TermToBytesRefAttribute tatt = (TermToBytesRefAttribute)att;
          char[] tTermChars = tatt.getBytesRef().utf8ToString().toCharArray();
          tTerm = escape(tTermChars, tTermChars.length);
        } else {
          if (tok.length() > 0) tok.append(',');
          if (cl.isAssignableFrom(FlagsAttribute.class)) {
            tok.append("f=" + Integer.toHexString(((FlagsAttribute)att).getFlags()));
          } else if (cl.isAssignableFrom(OffsetAttribute.class)) {
            tok.append("s=" + ((OffsetAttribute)att).startOffset() + ",e=" + ((OffsetAttribute)att).endOffset());
          } else if (cl.isAssignableFrom(PayloadAttribute.class)) {
            Payload p = ((PayloadAttribute)att).getPayload();
            if (p != null && p.length() > 0) {
              tok.append("p=" + bytesToHex(p.getData(), p.getOffset(), p.length()));
            } else if (tok.length() > 0) {
              tok.setLength(tok.length() - 1); // remove the last comma
            }
          } else if (cl.isAssignableFrom(PositionIncrementAttribute.class)) {
            tok.append("i=" + ((PositionIncrementAttribute)att).getPositionIncrement());
          } else if (cl.isAssignableFrom(TypeAttribute.class)) {
            tok.append("y=" + escape(((TypeAttribute)att).type()));
          } else {
            tok.append(cl.getName() + "=" + escape(att.toString()));
          }
        }
      }
      String term = null;
      if (cTerm != null) {
        term = cTerm;
      } else {
        term = tTerm;
      }
      if (term != null && term.length() > 0) {
        if (tok.length() > 0) {
          tok.insert(0, term + ",");
        } else {
          tok.insert(0, term);
        }
      }
      sb.append(tok);
    }
  }
  return sb.toString();
}
// in core/src/java/org/apache/solr/schema/LongField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  String s = f.stringValue();
  // these values may be from a legacy lucene index, which may
  // not be properly formatted in some output formats, or may
  // incorrectly have a zero length.
  if (s.length()==0) {
    // zero length value means someone mistakenly indexed the value
    // instead of simply leaving it out. Write a null value instead of a numeric.
    writer.writeNull(name);
    return;
  }
  try {
    long val = Long.parseLong(s);
    writer.writeLong(name, val);
  } catch (NumberFormatException e) {
    // can't parse - write out the contents as a string so nothing is lost and
    // clients don't get a parse error.
    writer.writeStr(name, s, true);
  }
}
// in core/src/java/org/apache/solr/schema/GeoHashField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, toExternal(f), false); }
// in core/src/java/org/apache/solr/schema/ExternalFileField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/schema/ShortField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  String s = f.stringValue();
  // these values may be from a legacy lucene index, which may
  // not be properly formatted in some output formats, or may
  // incorrectly have a zero length.
  if (s.length()==0) {
    // zero length value means someone mistakenly indexed the value
    // instead of simply leaving it out. Write a null value instead of a numeric.
    writer.writeNull(name);
    return;
  }
  try {
    short val = Short.parseShort(s);
    writer.writeInt(name, val);
  } catch (NumberFormatException e) {
    // can't parse - write out the contents as a string so nothing is lost and
    // clients don't get a parse error.
    writer.writeStr(name, s, true);
  }
}
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void write(TextResponseWriter writer, String name, IndexableField val) throws IOException {
  // name is passed in because it may be null if name should not be used.
  type.write(writer, name, val);
}
// in core/src/java/org/apache/solr/schema/StrFieldSource.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { return new DocTermsIndexDocValues(this, readerContext, field) { @Override protected String toTerm(String readableValue) { return readableValue; } @Override public int ordVal(int doc) { return termsIndex.getOrd(doc); } @Override public int numOrd() { return termsIndex.numOrd(); } @Override public Object objectVal(int doc) { return strVal(doc); } @Override public String toString(int doc) { return description() + '=' + strVal(doc); } }; }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), true); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
public String toFormattedString(Field f) throws IOException { return parser.toFormattedString(f); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
public final boolean incrementToken() throws IOException {
  // lazy init the iterator
  if (it == null) {
    it = cachedStates.iterator();
  }
  if (!it.hasNext()) {
    return false;
  }
  AttributeSource.State state = (State) it.next();
  restoreState(state.clone());
  return true;
}
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
@Override public void reset(Reader input) throws IOException { super.reset(input); cachedStates.clear(); stringValue = null; binaryValue = null; ParseResult res = parser.parse(input, this); if (res != null) { stringValue = res.str; binaryValue = res.bin; if (res.states != null) { cachedStates.addAll(res.states); } } }
// in core/src/java/org/apache/solr/schema/TextField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), true); }
// in core/src/java/org/apache/solr/schema/TrieDateField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { wrappedField.write(writer, name, f); }
// in core/src/java/org/apache/solr/schema/SortableLongField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String sval = f.stringValue(); writer.writeLong(name, NumberUtils.SortableStr2long(sval,0,sval.length())); }
// in core/src/java/org/apache/solr/schema/SortableLongField.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final long def = defVal; return new DocTermsIndexDocValues(this, readerContext, field) { private final BytesRef spare = new BytesRef(); @Override protected String toTerm(String readableValue) { return NumberUtils.long2sortableStr(readableValue); } @Override public float floatVal(int doc) { return (float)longVal(doc); } @Override public int intVal(int doc) { return (int)longVal(doc); } @Override public long longVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? def : NumberUtils.SortableStr2long(termsIndex.lookup(ord, spare),0,5); } @Override public double doubleVal(int doc) { return (double)longVal(doc); } @Override public String strVal(int doc) { return Long.toString(longVal(doc)); } @Override public Object objectVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? null : NumberUtils.SortableStr2long(termsIndex.lookup(ord, spare)); } @Override public String toString(int doc) { return description() + '=' + longVal(doc); } @Override public ValueFiller getValueFiller() { return new ValueFiller() { private final MutableValueLong mval = new MutableValueLong(); @Override public MutableValue getValue() { return mval; } @Override public void fillValue(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { mval.value = def; mval.exists = false; } else { mval.value = NumberUtils.SortableStr2long(termsIndex.lookup(ord, spare),0,5); mval.exists = true; } } }; } }; }
// in core/src/java/org/apache/solr/schema/SortableDoubleField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String sval = f.stringValue(); writer.writeDouble(name, NumberUtils.SortableStr2double(sval)); }
// in core/src/java/org/apache/solr/schema/SortableDoubleField.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final double def = defVal; return new DocTermsIndexDocValues(this, readerContext, field) { private final BytesRef spare = new BytesRef(); @Override protected String toTerm(String readableValue) { return NumberUtils.double2sortableStr(readableValue); } @Override public float floatVal(int doc) { return (float)doubleVal(doc); } @Override public int intVal(int doc) { return (int)doubleVal(doc); } @Override public long longVal(int doc) { return (long)doubleVal(doc); } @Override public double doubleVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? def : NumberUtils.SortableStr2double(termsIndex.lookup(ord, spare)); } @Override public String strVal(int doc) { return Double.toString(doubleVal(doc)); } @Override public Object objectVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? null : NumberUtils.SortableStr2double(termsIndex.lookup(ord, spare)); } @Override public String toString(int doc) { return description() + '=' + doubleVal(doc); } @Override public ValueFiller getValueFiller() { return new ValueFiller() { private final MutableValueDouble mval = new MutableValueDouble(); @Override public MutableValue getValue() { return mval; } @Override public void fillValue(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { mval.value = def; mval.exists = false; } else { mval.value = NumberUtils.SortableStr2double(termsIndex.lookup(ord, spare)); mval.exists = true; } } }; } }; }
// in core/src/java/org/apache/solr/schema/CollationField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), true); }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
@Override public String toFormattedString(Field f) throws IOException { Map<String,Object> map = new HashMap<String,Object>(); map.put(VERSION_KEY, VERSION); if (f.fieldType().stored()) { String stringValue = f.stringValue(); if (stringValue != null) { map.put(STRING_KEY, stringValue); } BytesRef binaryValue = f.binaryValue(); if (binaryValue != null) { map.put(BINARY_KEY, Base64.byteArrayToBase64(binaryValue.bytes, binaryValue.offset, binaryValue.length)); } } TokenStream ts = f.tokenStreamValue(); if (ts != null) { List<Map<String,Object>> tokens = new LinkedList<Map<String,Object>>(); while (ts.incrementToken()) { Iterator<Class<? extends Attribute>> it = ts.getAttributeClassesIterator(); String cTerm = null; String tTerm = null; Map<String,Object> tok = new TreeMap<String,Object>(); while (it.hasNext()) { Class<? extends Attribute> cl = it.next(); if (!ts.hasAttribute(cl)) { continue; } Attribute att = ts.getAttribute(cl); if (cl.isAssignableFrom(CharTermAttribute.class)) { CharTermAttribute catt = (CharTermAttribute)att; cTerm = new String(catt.buffer(), 0, catt.length()); } else if (cl.isAssignableFrom(TermToBytesRefAttribute.class)) { TermToBytesRefAttribute tatt = (TermToBytesRefAttribute)att; tTerm = tatt.getBytesRef().utf8ToString(); } else { if (cl.isAssignableFrom(FlagsAttribute.class)) { tok.put(FLAGS_KEY, Integer.toHexString(((FlagsAttribute)att).getFlags())); } else if (cl.isAssignableFrom(OffsetAttribute.class)) { tok.put(OFFSET_START_KEY, ((OffsetAttribute)att).startOffset()); tok.put(OFFSET_END_KEY, ((OffsetAttribute)att).endOffset()); } else if (cl.isAssignableFrom(PayloadAttribute.class)) { Payload p = ((PayloadAttribute)att).getPayload(); if (p != null && p.length() > 0) { tok.put(PAYLOAD_KEY, Base64.byteArrayToBase64(p.getData(), p.getOffset(), p.length())); } } else if (cl.isAssignableFrom(PositionIncrementAttribute.class)) { tok.put(POSINCR_KEY, ((PositionIncrementAttribute)att).getPositionIncrement()); } else if (cl.isAssignableFrom(TypeAttribute.class)) { tok.put(TYPE_KEY, ((TypeAttribute)att).type()); } else { tok.put(cl.getName(), att.toString()); } } } String term = null; if (cTerm != null) { term = cTerm; } else { term = tTerm; } if (term != null && term.length() > 0) { tok.put(TOKEN_KEY, term); } tokens.add(tok); } map.put(TOKENS_KEY, tokens); } return JSONUtil.toJSON(map, -1); }
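The JSON emitted here mirrors the attribute keys used by SimplePreAnalyzedParser above. The actual key constants (VERSION_KEY, STRING_KEY, TOKENS_KEY, TOKEN_KEY, ...) are not shown in this excerpt; assuming the conventional values "v", "str", "tokens" and "t", the output looks roughly like the sketch below. Treat the key names as assumptions:

public class JsonPreAnalyzedShape {
  // Approximate shape only; key names are assumed, not taken from this excerpt.
  static final String EXAMPLE =
      "{\"v\":\"1\",\"str\":\"Hello\"," +
      "\"tokens\":[{\"t\":\"hello\",\"s\":0,\"e\":5,\"i\":1,\"y\":\"word\"}]}";
  public static void main(String[] args) { System.out.println(EXAMPLE); }
}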
// in core/src/java/org/apache/solr/schema/FieldType.java
@Override public TokenStreamComponents createComponents(String fieldName, Reader reader) { Tokenizer ts = new Tokenizer(reader) { final char[] cbuf = new char[maxChars]; final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class); final OffsetAttribute offsetAtt = addAttribute(OffsetAttribute.class); @Override public boolean incrementToken() throws IOException { clearAttributes(); int n = input.read(cbuf,0,maxChars); if (n<=0) return false; String s = toInternal(new String(cbuf,0,n)); termAtt.setEmpty().append(s); offsetAtt.setOffset(correctOffset(0),correctOffset(n)); return true; } }; return new TokenStreamComponents(ts); }
// in core/src/java/org/apache/solr/schema/FieldType.java
@Override public boolean incrementToken() throws IOException { clearAttributes(); int n = input.read(cbuf,0,maxChars); if (n<=0) return false; String s = toInternal(new String(cbuf,0,n)); termAtt.setEmpty().append(s); offsetAtt.setOffset(correctOffset(0),correctOffset(n)); return true; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String[][] getAllValues() throws IOException { ArrayList records = new ArrayList(); String[] values; String[][] ret = null; while ((values = getLine()) != null) { records.add(values); } if (records.size() > 0) { ret = new String[records.size()][]; records.toArray(ret); } return ret; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String nextValue() throws IOException {
  Token tkn = nextToken();
  String ret = null;
  switch (tkn.type) {
    case TT_TOKEN:
    case TT_EORECORD:
      ret = tkn.content.toString();
      break;
    case TT_EOF:
      ret = null;
      break;
    case TT_INVALID:
    default:
      // error no token available (or error)
      throw new IOException("(line " + getLineNumber() + ") invalid parse sequence");
      // unreachable: break;
  }
  return ret;
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String[] getLine() throws IOException {
  String[] ret = EMPTY_STRING_ARRAY;
  record.clear();
  while (true) {
    reusableToken.reset();
    nextToken(reusableToken);
    switch (reusableToken.type) {
      case TT_TOKEN:
        record.add(reusableToken.content.toString());
        break;
      case TT_EORECORD:
        record.add(reusableToken.content.toString());
        break;
      case TT_EOF:
        if (reusableToken.isReady) {
          record.add(reusableToken.content.toString());
        } else {
          ret = null;
        }
        break;
      case TT_INVALID:
      default:
        // error: throw IOException
        throw new IOException("(line " + getLineNumber() + ") invalid parse sequence");
        // unreachable: break;
    }
    if (reusableToken.type != TT_TOKEN) {
      break;
    }
  }
  if (!record.isEmpty()) {
    ret = (String[]) record.toArray(new String[record.size()]);
  }
  return ret;
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected Token nextToken() throws IOException { return nextToken(new Token()); }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected Token nextToken(Token tkn) throws IOException {
  wsBuf.clear(); // reuse

  // get the last read char (required for empty line detection)
  int lastChar = in.readAgain();

  // read the next char and set eol
  /* note: unfortunately isEndOfLine may consume a character silently.
   * this has no effect outside of the method. so a simple workaround
   * is to call 'readAgain' on the stream...
   * uh: might use objects instead of base-types (jdk1.5 autoboxing!)
   */
  int c = in.read();
  boolean eol = isEndOfLine(c);
  c = in.readAgain();

  // empty line detection: eol AND (last char was EOL or beginning)
  while (strategy.getIgnoreEmptyLines() && eol
      && (lastChar == '\n' || lastChar == ExtendedBufferedReader.UNDEFINED)
      && !isEndOfFile(lastChar)) {
    // go on char ahead ...
    lastChar = c;
    c = in.read();
    eol = isEndOfLine(c);
    c = in.readAgain();
    // reached end of file without any content (empty line at the end)
    if (isEndOfFile(c)) {
      tkn.type = TT_EOF;
      return tkn;
    }
  }

  // did we reach eof during the last iteration already ? TT_EOF
  if (isEndOfFile(lastChar) || (lastChar != strategy.getDelimiter() && isEndOfFile(c))) {
    tkn.type = TT_EOF;
    return tkn;
  }

  // important: make sure a new char gets consumed in each iteration
  while (!tkn.isReady && tkn.type != TT_EOF) {
    // ignore whitespaces at beginning of a token
    while (strategy.getIgnoreLeadingWhitespaces() && isWhitespace(c) && !eol) {
      wsBuf.append((char) c);
      c = in.read();
      eol = isEndOfLine(c);
    }
    // ok, start of token reached: comment, encapsulated, or token
    if (c == strategy.getCommentStart()) {
      // ignore everything till end of line and continue (incr linecount)
      in.readLine();
      tkn = nextToken(tkn.reset());
    } else if (c == strategy.getDelimiter()) {
      // empty token return TT_TOKEN("")
      tkn.type = TT_TOKEN;
      tkn.isReady = true;
    } else if (eol) {
      // empty token return TT_EORECORD("")
      //noop: tkn.content.append("");
      tkn.type = TT_EORECORD;
      tkn.isReady = true;
    } else if (c == strategy.getEncapsulator()) {
      // consume encapsulated token
      encapsulatedTokenLexer(tkn, c);
    } else if (isEndOfFile(c)) {
      // end of file return TT_EOF()
      //noop: tkn.content.append("");
      tkn.type = TT_EOF;
      tkn.isReady = true;
    } else {
      // next token must be a simple token
      // add removed blanks when not ignoring whitespace chars...
      if (!strategy.getIgnoreLeadingWhitespaces()) {
        tkn.content.append(wsBuf);
      }
      simpleTokenLexer(tkn, c);
    }
  }
  return tkn;
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private Token simpleTokenLexer(Token tkn, int c) throws IOException {
  for (;;) {
    if (isEndOfLine(c)) {
      // end of record
      tkn.type = TT_EORECORD;
      tkn.isReady = true;
      break;
    } else if (isEndOfFile(c)) {
      // end of file
      tkn.type = TT_EOF;
      tkn.isReady = true;
      break;
    } else if (c == strategy.getDelimiter()) {
      // end of token
      tkn.type = TT_TOKEN;
      tkn.isReady = true;
      break;
    } else if (c == '\\' && strategy.getUnicodeEscapeInterpretation() && in.lookAhead() == 'u') {
      // interpret unicode escaped chars (like \u0070 -> p)
      tkn.content.append((char) unicodeEscapeLexer(c));
    } else if (c == strategy.getEscape()) {
      tkn.content.append((char) readEscape(c));
    } else {
      tkn.content.append((char) c);
    }
    c = in.read();
  }
  if (strategy.getIgnoreTrailingWhitespaces()) {
    tkn.content.trimTrailingWhitespace();
  }
  return tkn;
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private Token encapsulatedTokenLexer(Token tkn, int c) throws IOException {
  // save current line
  int startLineNumber = getLineNumber();
  // ignore the given delimiter
  // assert c == delimiter;
  for (;;) {
    c = in.read();
    if (c == '\\' && strategy.getUnicodeEscapeInterpretation() && in.lookAhead() == 'u') {
      tkn.content.append((char) unicodeEscapeLexer(c));
    } else if (c == strategy.getEscape()) {
      tkn.content.append((char) readEscape(c));
    } else if (c == strategy.getEncapsulator()) {
      if (in.lookAhead() == strategy.getEncapsulator()) {
        // double or escaped encapsulator -> add single encapsulator to token
        c = in.read();
        tkn.content.append((char) c);
      } else {
        // token finish mark (encapsulator) reached: ignore whitespace till delimiter
        for (;;) {
          c = in.read();
          if (c == strategy.getDelimiter()) {
            tkn.type = TT_TOKEN;
            tkn.isReady = true;
            return tkn;
          } else if (isEndOfFile(c)) {
            tkn.type = TT_EOF;
            tkn.isReady = true;
            return tkn;
          } else if (isEndOfLine(c)) {
            // ok eo token reached
            tkn.type = TT_EORECORD;
            tkn.isReady = true;
            return tkn;
          } else if (!isWhitespace(c)) {
            // error invalid char between token and next delimiter
            throw new IOException("(line " + getLineNumber() + ") invalid char between encapsulated token end delimiter");
          }
        }
      }
    } else if (isEndOfFile(c)) {
      // error condition (end of file before end of token)
      throw new IOException("(startline " + startLineNumber + ")" + "eof reached before encapsulated token finished");
    } else {
      // consume character
      tkn.content.append((char) c);
    }
  }
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected int unicodeEscapeLexer(int c) throws IOException {
  int ret = 0;
  // ignore 'u' (assume c==\ now) and read 4 hex digits
  c = in.read();
  code.clear();
  try {
    for (int i = 0; i < 4; i++) {
      c = in.read();
      if (isEndOfFile(c) || isEndOfLine(c)) {
        throw new NumberFormatException("number too short");
      }
      code.append((char) c);
    }
    ret = Integer.parseInt(code.toString(), 16);
  } catch (NumberFormatException e) {
    throw new IOException("(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString());
  }
  return ret;
}
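The core of this lexer is the four-hex-digit conversion noted in simpleTokenLexer's comment (\u0070 -> p). The round trip in isolation:

public class UnicodeEscapeDemo {
  public static void main(String[] args) {
    String hex = "0070";                         // the four digits following "\u"
    char decoded = (char) Integer.parseInt(hex, 16);
    System.out.println(decoded);                 // prints 'p'
  }
}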
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private int readEscape(int c) throws IOException {
  // assume c is the escape char (normally a backslash)
  c = in.read();
  int out;
  switch (c) {
    case 'r': out='\r'; break;
    case 'n': out='\n'; break;
    case 't': out='\t'; break;
    case 'b': out='\b'; break;
    case 'f': out='\f'; break;
    default : out=c;
  }
  return out;
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private boolean isEndOfLine(int c) throws IOException {
  // check if we have \r\n...
  if (c == '\r') {
    if (in.lookAhead() == '\n') {
      // note: does not change c outside of this method !!
      c = in.read();
    }
  }
  return (c == '\n');
}
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
public static String[][] parse(String s) throws IOException {
  if (s == null) {
    throw new IllegalArgumentException("Null argument not allowed.");
  }
  String[][] result = (new CSVParser(new StringReader(s))).getAllValues();
  if (result == null) {
    // since CSVStrategy ignores empty lines an empty array is returned
    // (i.e. not "result = new String[][] {{""}};")
    result = EMPTY_DOUBLE_STRING_ARRAY;
  }
  return result;
}
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
public static String[] parseLine(String s) throws IOException {
  if (s == null) {
    throw new IllegalArgumentException("Null argument not allowed.");
  }
  // uh,jh: make sure that parseLine("").length == 0
  if (s.length() == 0) {
    return EMPTY_STRING_ARRAY;
  }
  return (new CSVParser(new StringReader(s))).getLine();
}
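A usage sketch for the two helpers above, following their stated contracts (empty input yields an empty array, never null) and assuming the default CSVStrategy treats '"' as the encapsulator:

import java.io.IOException;
import org.apache.solr.internal.csv.CSVUtils;

public class CSVUtilsDemo {
  public static void main(String[] args) throws IOException {
    String[] line = CSVUtils.parseLine("a,b,\"c,d\"");
    // line -> ["a", "b", "c,d"]: the encapsulated third value keeps its comma
    System.out.println(line.length);             // 3

    String[][] all = CSVUtils.parse("1,2\n3,4");
    // all -> [["1","2"], ["3","4"]]
    System.out.println(all.length);              // 2
  }
}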
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void println() throws IOException { out.write(strategy.getPrinterNewline()); newLine = true; }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void flush() throws IOException { out.flush(); }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void println(String[] values) throws IOException { for (int i = 0; i < values.length; i++) { print(values[i]); } println(); }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void printlnComment(String comment) throws IOException {
  if (this.strategy.isCommentingDisabled()) {
    return;
  }
  if (!newLine) {
    println();
  }
  out.write(this.strategy.getCommentStart());
  out.write(' ');
  for (int i = 0; i < comment.length(); i++) {
    char c = comment.charAt(i);
    switch (c) {
      case '\r' :
        if (i + 1 < comment.length() && comment.charAt(i + 1) == '\n') {
          i++;
        }
        // break intentionally excluded.
      case '\n' :
        println();
        out.write(this.strategy.getCommentStart());
        out.write(' ');
        break;
      default :
        out.write(c);
        break;
    }
  }
  println();
}
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void print(char[] value, int offset, int len, boolean checkForEscape) throws IOException { if (!checkForEscape) { printSep(); out.write(value, offset, len); return; } if (strategy.getEncapsulator() != CSVStrategy.ENCAPSULATOR_DISABLED) { printAndEncapsulate(value, offset, len); } else if (strategy.getEscape() != CSVStrategy.ESCAPE_DISABLED) { printAndEscape(value, offset, len); } else { printSep(); out.write(value, offset, len); } }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
void printSep() throws IOException { if (newLine) { newLine = false; } else { out.write(this.strategy.getDelimiter()); } }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
void printAndEscape(char[] value, int offset, int len) throws IOException {
  int start = offset;
  int pos = offset;
  int end = offset + len;
  printSep();
  char delim = this.strategy.getDelimiter();
  char escape = this.strategy.getEscape();
  while (pos < end) {
    char c = value[pos];
    if (c == '\r' || c=='\n' || c==delim || c==escape) {
      // write out segment up until this char
      int l = pos-start;
      if (l>0) {
        out.write(value, start, l);
      }
      if (c=='\n') c='n';
      else if (c=='\r') c='r';
      out.write(escape);
      out.write(c);
      start = pos+1; // start on the current char after this one
    }
    pos++;
  }
  // write last segment
  int l = pos-start;
  if (l>0) {
    out.write(value, start, l);
  }
}
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
void printAndEncapsulate(char[] value, int offset, int len) throws IOException {
  boolean first = newLine;  // is this the first value on this line?
  boolean quote = false;
  int start = offset;
  int pos = offset;
  int end = offset + len;
  printSep();
  char delim = this.strategy.getDelimiter();
  char encapsulator = this.strategy.getEncapsulator();
  if (len <= 0) {
    // always quote an empty token that is the first
    // on the line, as it may be the only thing on the
    // line. If it were not quoted in that case,
    // an empty line has no tokens.
    if (first) {
      quote = true;
    }
  } else {
    char c = value[pos];
    // Hmmm, where did this rule come from?
    if (first && (c < '0' || (c > '9' && c < 'A') || (c > 'Z' && c < 'a') || (c > 'z'))) {
      quote = true;
      // } else if (c == ' ' || c == '\f' || c == '\t') {
    } else if (c <= '#') {
      // Some other chars at the start of a value caused the parser to fail, so for now
      // encapsulate if we start in anything less than '#'. We are being conservative
      // by including the default comment char too.
      quote = true;
    } else {
      while (pos < end) {
        c = value[pos];
        if (c=='\n' || c=='\r' || c==encapsulator || c==delim) {
          quote = true;
          break;
        }
        pos++;
      }
      if (!quote) {
        pos = end-1;
        c = value[pos];
        // if (c == ' ' || c == '\f' || c == '\t') {
        // Some other chars at the end caused the parser to fail, so for now
        // encapsulate if we end in anything less than ' '
        if (c <= ' ') {
          quote = true;
        }
      }
    }
  }
  if (!quote) {
    // no encapsulation needed - write out the original value
    out.write(value, offset, len);
    return;
  }
  // we hit something that needed encapsulation
  out.write(encapsulator);
  // Pick up where we left off: pos should be positioned on the first character that caused
  // the need for encapsulation.
  while (pos<end) {
    char c = value[pos];
    if (c==encapsulator) {
      // write out the chunk up until this point
      // add 1 to the length to write out the encapsulator also
      out.write(value, start, pos-start+1);
      // put the next starting position on the encapsulator so we will
      // write it out again with the next string (effectively doubling it)
      start = pos;
    }
    pos++;
  }
  // write the last segment
  out.write(value, start, pos-start);
  out.write(encapsulator);
}
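The quoting rule at the heart of printAndEncapsulate is: wrap the value in encapsulators and double any embedded encapsulator. A standalone sketch of just that rule (the helper below is hypothetical, not part of CSVPrinter):

public class EncapsulateDemo {
  static String encapsulate(String value, char q) {
    StringBuilder sb = new StringBuilder().append(q);
    for (int i = 0; i < value.length(); i++) {
      char c = value.charAt(i);
      if (c == q) sb.append(q);   // double the embedded encapsulator
      sb.append(c);
    }
    return sb.append(q).toString();
  }
  public static void main(String[] args) {
    System.out.println(encapsulate("say \"hi\", bye", '"')); // "say ""hi"", bye"
  }
}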
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void print(String value, boolean checkForEscape) throws IOException {
  if (!checkForEscape) {
    // write directly from string
    printSep();
    out.write(value);
    return;
  }
  if (buf.length < value.length()) {
    buf = new char[value.length()];
  }
  value.getChars(0, value.length(), buf, 0);
  print(buf, 0, value.length(), checkForEscape);
}
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void print(String value) throws IOException { print(value, true); }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public int read() throws IOException {
  // initialize the lookahead
  if (lookaheadChar == UNDEFINED) {
    lookaheadChar = super.read();
  }
  lastChar = lookaheadChar;
  if (super.ready()) {
    lookaheadChar = super.read();
  } else {
    lookaheadChar = UNDEFINED;
  }
  if (lastChar == '\n') {
    lineCounter++;
  }
  return lastChar;
}
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public int read(char[] buf, int off, int len) throws IOException {
  // do not claim if len == 0
  if (len == 0) {
    return 0;
  }
  // init lookahead, but do not block !!
  if (lookaheadChar == UNDEFINED) {
    if (ready()) {
      lookaheadChar = super.read();
    } else {
      return -1;
    }
  }
  // 'first read of underlying stream'
  if (lookaheadChar == -1) {
    return -1;
  }
  // continue until the lookaheadChar would block
  int cOff = off;
  while (len > 0 && ready()) {
    if (lookaheadChar == -1) {
      // eof stream reached, do not continue
      return cOff - off;
    } else {
      buf[cOff++] = (char) lookaheadChar;
      if (lookaheadChar == '\n') {
        lineCounter++;
      }
      lastChar = lookaheadChar;
      lookaheadChar = super.read();
      len--;
    }
  }
  return cOff - off;
}
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public String readUntil(char c) throws IOException {
  if (lookaheadChar == UNDEFINED) {
    lookaheadChar = super.read();
  }
  line.clear(); // reuse
  while (lookaheadChar != c && lookaheadChar != END_OF_STREAM) {
    line.append((char) lookaheadChar);
    if (lookaheadChar == '\n') {
      lineCounter++;
    }
    lastChar = lookaheadChar;
    lookaheadChar = super.read();
  }
  return line.toString();
}
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public String readLine() throws IOException {
  if (lookaheadChar == UNDEFINED) {
    lookaheadChar = super.read();
  }
  line.clear(); // reuse
  // return null if end of stream has been reached
  if (lookaheadChar == END_OF_STREAM) {
    return null;
  }
  // do we have a line termination already
  char laChar = (char) lookaheadChar;
  if (laChar == '\n' || laChar == '\r') {
    lastChar = lookaheadChar;
    lookaheadChar = super.read();
    // ignore '\r\n' as well
    if ((char) lookaheadChar == '\n') {
      lastChar = lookaheadChar;
      lookaheadChar = super.read();
    }
    lineCounter++;
    return line.toString();
  }
  // create the rest-of-line return and update the lookahead
  line.append(laChar);
  String restOfLine = super.readLine(); // TODO involves copying
  lastChar = lookaheadChar;
  lookaheadChar = super.read();
  if (restOfLine != null) {
    line.append(restOfLine);
  }
  lineCounter++;
  return line.toString();
}
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skip(long n) throws IllegalArgumentException, IOException {
  if (lookaheadChar == UNDEFINED) {
    lookaheadChar = super.read();
  }
  // illegal argument
  if (n < 0) {
    throw new IllegalArgumentException("negative argument not supported");
  }
  // no skipping
  if (n == 0 || lookaheadChar == END_OF_STREAM) {
    return 0;
  }
  // skip and reread the lookahead-char
  long skiped = 0;
  if (n > 1) {
    skiped = super.skip(n - 1);
  }
  lookaheadChar = super.read();
  // fixme uh: we should check the skipped sequence for line-terminations...
  lineCounter = Integer.MIN_VALUE;
  return skiped + 1;
}
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skipUntil(char c) throws IllegalArgumentException, IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } long counter = 0; while (lookaheadChar != c && lookaheadChar != END_OF_STREAM) { if (lookaheadChar == '\n') { lineCounter++; } lookaheadChar = super.read(); counter++; } return counter; }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public int lookAhead() throws IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } return lookaheadChar; }
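ExtendedBufferedReader maintains a single-character lookahead by always reading one character ahead of the caller. For comparison, the JDK's PushbackReader achieves the same peek behavior by unreading; a minimal sketch:

import java.io.IOException;
import java.io.PushbackReader;
import java.io.StringReader;

public class LookaheadDemo {
  static int lookAhead(PushbackReader in) throws IOException {
    int c = in.read();
    if (c != -1) in.unread(c);  // put it back so the next read() still sees it
    return c;
  }
  public static void main(String[] args) throws IOException {
    PushbackReader in = new PushbackReader(new StringReader("ab"));
    System.out.println((char) lookAhead(in)); // a
    System.out.println((char) in.read());     // a (lookahead did not consume)
  }
}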
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public final int docFreq(Term term) throws IOException { return reader.docFreq(term); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void close() throws IOException {
  if (debug) {
    if (cachingEnabled) {
      StringBuilder sb = new StringBuilder();
      sb.append("Closing ").append(name);
      for (SolrCache cache : cacheList) {
        sb.append("\n\t");
        sb.append(cache);
      }
      log.debug(sb.toString());
    } else {
      if (debug) log.debug("Closing " + name);
    }
  }
  core.getInfoRegistry().remove(name);
  // super.close();
  // can't use super.close() since it just calls reader.close() and that may only be called once
  // per reader (even if incRef() was previously called).
  if (closeReader) reader.decRef();
  for (SolrCache cache : cacheList) {
    cache.close();
  }
  directoryFactory.release(getIndexReader().directory());
  // do this at the end so it only gets done if there are no exceptions
  numCloses.incrementAndGet();
}
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public static void initRegenerators(SolrConfig solrConfig) { if (solrConfig.fieldValueCacheConfig != null && solrConfig.fieldValueCacheConfig.getRegenerator() == null) { solrConfig.fieldValueCacheConfig.setRegenerator( new CacheRegenerator() { public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException { if (oldVal instanceof UnInvertedField) { UnInvertedField.getUnInvertedField((String)oldKey, newSearcher); } return true; } } ); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException { if (oldVal instanceof UnInvertedField) { UnInvertedField.getUnInvertedField((String)oldKey, newSearcher); } return true; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException { newSearcher.cacheDocSet((Query)oldKey, null, false); return true; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException {
  QueryResultKey key = (QueryResultKey)oldKey;
  int nDocs = 1;
  // request 1 doc and let caching round up to the next window size...
  // unless the window size is <=1, in which case we will pick
  // the minimum of the number of documents requested last time and
  // a reasonable number such as 40.
  // TODO: make more configurable later...
  if (queryResultWindowSize <= 1) {
    DocList oldList = (DocList)oldVal;
    int oldnDocs = oldList.offset() + oldList.size();
    // 40 has factors of 2,4,5,10,20
    nDocs = Math.min(oldnDocs, 40);
  }
  int flags = NO_CHECK_QCACHE | key.nc_flags;
  QueryCommand qc = new QueryCommand();
  qc.setQuery(key.query)
    .setFilterList(key.filters)
    .setSort(key.sort)
    .setLen(nDocs)
    .setSupersetMaxDoc(nDocs)
    .setFlags(flags);
  QueryResult qr = new QueryResult();
  newSearcher.getDocListC(qr, qc);
  return true;
}
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public QueryResult search(QueryResult qr, QueryCommand cmd) throws IOException { getDocListC(qr,cmd); return qr; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void binaryField(FieldInfo fieldInfo, byte[] value, int offset, int length) throws IOException { doc.add(new StoredField(fieldInfo.name, value)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void stringField(FieldInfo fieldInfo, String value) throws IOException { final FieldType ft = new FieldType(TextField.TYPE_STORED); ft.setStoreTermVectors(fieldInfo.hasVectors()); ft.setIndexed(fieldInfo.isIndexed()); ft.setOmitNorms(fieldInfo.omitsNorms()); ft.setIndexOptions(fieldInfo.getIndexOptions()); doc.add(new Field(fieldInfo.name, value, ft)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
Override public Document doc(int i) throws IOException { return doc(i, (Set<String>)null); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
Override public void doc(int n, StoredFieldVisitor visitor) throws IOException { getIndexReader().document(n, visitor); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public Document doc(int i, Set<String> fields) throws IOException { Document d; if (documentCache != null) { d = documentCache.get(i); if (d!=null) return d; } if(!enableLazyFieldLoading || fields == null) { d = getIndexReader().document(i); } else { final SetNonLazyFieldSelector visitor = new SetNonLazyFieldSelector(fields, getIndexReader(), i); getIndexReader().document(i, visitor); d = visitor.doc; } if (documentCache != null) { documentCache.put(i, d); } return d; }
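doc(int, Set) above is a read-through cache: consult the documentCache, fall back to the index on a miss, then populate the cache for subsequent requests. The same shape in isolation (generic names, not Solr API):

    import java.util.Map;
    import java.util.function.Function;

    class ReadThroughExample {
      /** Generic read-through lookup mirroring doc(int, Set) above. */
      static <K, V> V readThrough(Map<K, V> cache, K key, Function<K, V> compute) {
        V v = cache.get(key);
        if (v != null) return v;   // cache hit: skip the stored-fields read
        v = compute.apply(key);    // miss: do the expensive IndexReader.document() work
        cache.put(key, v);         // populate so later requests for the same doc are cheap
        return v;
      }
    }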
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void readDocs(Document[] docs, DocList ids) throws IOException { readDocs(docs, ids, null); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void readDocs(Document[] docs, DocList ids, Set<String> fields) throws IOException { DocIterator iter = ids.iterator(); for (int i=0; i<docs.length; i++) { docs[i] = doc(iter.nextDoc(), fields); } }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public Sort weightSort(Sort sort) throws IOException { return (sort != null) ? sort.rewrite(this) : null; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public int getFirstMatch(Term t) throws IOException { Fields fields = atomicReader.fields(); if (fields == null) return -1; Terms terms = fields.terms(t.field()); if (terms == null) return -1; BytesRef termBytes = t.bytes(); final TermsEnum termsEnum = terms.iterator(null); if (!termsEnum.seekExact(termBytes, false)) { return -1; } DocsEnum docs = termsEnum.docs(atomicReader.getLiveDocs(), null, false); if (docs == null) return -1; int id = docs.nextDoc(); return id == DocIdSetIterator.NO_MORE_DOCS ? -1 : id; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public long lookupId(BytesRef idBytes) throws IOException { String field = schema.getUniqueKeyField().getName(); final AtomicReaderContext[] leaves = leafContexts; for (int i=0; i<leaves.length; i++) { final AtomicReaderContext leaf = leaves[i]; final AtomicReader reader = leaf.reader(); final Fields fields = reader.fields(); if (fields == null) continue; final Bits liveDocs = reader.getLiveDocs(); final DocsEnum docs = reader.termDocsEnum(liveDocs, field, idBytes, false); if (docs == null) continue; int id = docs.nextDoc(); if (id == DocIdSetIterator.NO_MORE_DOCS) continue; assert docs.nextDoc() == DocIdSetIterator.NO_MORE_DOCS; return (((long)i) << 32) | id; } return -1; }
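lookupId packs the leaf index into the high 32 bits of the returned long and the segment-local document id into the low 32 bits, with -1 as the not-found sentinel. Decoding is the mirror image of the `(((long)i) << 32) | id` expression above:

    // Decoding the value packed by lookupId (-1 means "not found").
    static int leafIndex(long packed)    { return (int) (packed >>> 32); } // which AtomicReaderContext
    static int segmentDocId(long packed) { return (int) packed; }         // doc id within that leaf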
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void cacheDocSet(Query query, DocSet optionalAnswer, boolean mustCache) throws IOException { // Even if the cache is null, still compute the DocSet as it may serve to warm the Lucene // or OS disk cache. if (optionalAnswer != null) { if (filterCache!=null) { filterCache.put(query,optionalAnswer); } return; } // Throw away the result, relying on the fact that getDocSet // will currently always cache what it found. If getDocSet() starts // using heuristics about what to cache, and mustCache==true, (or if we // want this method to start using heuristics too) then // this needs to change. getDocSet(query); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocSet getDocSet(Query query) throws IOException { if (query instanceof ExtendedQuery) { ExtendedQuery eq = (ExtendedQuery)query; if (!eq.getCache()) { if (query instanceof WrappedQuery) { query = ((WrappedQuery)query).getWrappedQuery(); } query = QueryUtils.makeQueryable(query); return getDocSetNC(query, null); } } // Get the absolute value (positive version) of this query. If we // get back the same reference, we know it's positive. Query absQ = QueryUtils.getAbs(query); boolean positive = query==absQ; if (filterCache != null) { DocSet absAnswer = filterCache.get(absQ); if (absAnswer!=null) { if (positive) return absAnswer; else return getPositiveDocSet(matchAllDocsQuery).andNot(absAnswer); } } DocSet absAnswer = getDocSetNC(absQ, null); DocSet answer = positive ? absAnswer : getPositiveDocSet(matchAllDocsQuery).andNot(absAnswer); if (filterCache != null) { // cache negative queries as positive filterCache.put(absQ, absAnswer); } return answer; }
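getDocSet caches only the positive form of a query: a purely negative query is answered as the complement of the cached positive set, so one cache entry serves both signs. A sketch of that lookup (cacheOrCompute and allLiveDocs are hypothetical helpers standing in for the filterCache logic and getPositiveDocSet(matchAllDocsQuery)):

    DocSet docSetFor(Query q) throws IOException {
      Query abs = QueryUtils.getAbs(q);       // strip a leading NOT, if any
      DocSet positive = cacheOrCompute(abs);  // only the positive form is ever cached
      return (q == abs) ? positive
                        : allLiveDocs().andNot(positive); // negative query = complement
    }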
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
DocSet getPositiveDocSet(Query q) throws IOException { DocSet answer; if (filterCache != null) { answer = filterCache.get(q); if (answer!=null) return answer; } answer = getDocSetNC(q,null); if (filterCache != null) filterCache.put( q,answer); return answer; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocSet getDocSet(List<Query> queries) throws IOException { ProcessedFilter pf = getProcessedFilter(null, queries); if (pf.answer != null) return pf.answer; DocSetCollector setCollector = new DocSetCollector(maxDoc()>>6, maxDoc()); Collector collector = setCollector; if (pf.postFilter != null) { pf.postFilter.setLastDelegate(collector); collector = pf.postFilter; } final AtomicReaderContext[] leaves = leafContexts; for (int i=0; i<leaves.length; i++) { final AtomicReaderContext leaf = leaves[i]; final AtomicReader reader = leaf.reader(); final Bits liveDocs = reader.getLiveDocs(); // TODO: the filter may already only have liveDocs... DocIdSet idSet = null; if (pf.filter != null) { idSet = pf.filter.getDocIdSet(leaf, liveDocs); if (idSet == null) continue; } DocIdSetIterator idIter = null; if (idSet != null) { idIter = idSet.iterator(); if (idIter == null) continue; } collector.setNextReader(leaf); int max = reader.maxDoc(); if (idIter == null) { for (int docid = 0; docid<max; docid++) { if (liveDocs != null && !liveDocs.get(docid)) continue; collector.collect(docid); } } else { for (int docid = -1; (docid = idIter.advance(docid+1)) < max; ) { collector.collect(docid); } } } return setCollector.getDocSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public ProcessedFilter getProcessedFilter(DocSet setFilter, List<Query> queries) throws IOException { ProcessedFilter pf = new ProcessedFilter(); if (queries==null || queries.size()==0) { if (setFilter != null) pf.filter = setFilter.getTopFilter(); return pf; } DocSet answer=null; boolean[] neg = new boolean[queries.size()+1]; DocSet[] sets = new DocSet[queries.size()+1]; List<Query> notCached = null; List<Query> postFilters = null; int end = 0; int smallestIndex = -1; if (setFilter != null) { answer = sets[end++] = setFilter; smallestIndex = end; } int smallestCount = Integer.MAX_VALUE; for (Query q : queries) { if (q instanceof ExtendedQuery) { ExtendedQuery eq = (ExtendedQuery)q; if (!eq.getCache()) { if (eq.getCost() >= 100 && eq instanceof PostFilter) { if (postFilters == null) postFilters = new ArrayList<Query>(sets.length-end); postFilters.add(q); } else { if (notCached == null) notCached = new ArrayList<Query>(sets.length-end); notCached.add(q); } continue; } } Query posQuery = QueryUtils.getAbs(q); sets[end] = getPositiveDocSet(posQuery); // Negative query if absolute value different from original if (q==posQuery) { neg[end] = false; // keep track of the smallest positive set. // This optimization is only worth it if size() is cached, which it would // be if we don't do any set operations. int sz = sets[end].size(); if (sz<smallestCount) { smallestCount=sz; smallestIndex=end; answer = sets[end]; } } else { neg[end] = true; } end++; } // Are all of our normal cached filters negative? if (end > 0 && answer==null) { answer = getPositiveDocSet(matchAllDocsQuery); } // do negative queries first to shrink set size for (int i=0; i<end; i++) { if (neg[i]) answer = answer.andNot(sets[i]); } for (int i=0; i<end; i++) { if (!neg[i] && i!=smallestIndex) answer = answer.intersection(sets[i]); } if (notCached != null) { Collections.sort(notCached, sortByCost); List<Weight> weights = new ArrayList<Weight>(notCached.size()); for (Query q : notCached) { Query qq = QueryUtils.makeQueryable(q); weights.add(createNormalizedWeight(qq)); } pf.filter = new FilterImpl(answer, weights); } else { if (postFilters == null) { if (answer == null) { answer = getPositiveDocSet(matchAllDocsQuery); } // "answer" is the only part of the filter, so set it. pf.answer = answer; } if (answer != null) { pf.filter = answer.getTopFilter(); } } if (postFilters != null) { Collections.sort(postFilters, sortByCost); for (int i=postFilters.size()-1; i>=0; i--) { DelegatingCollector prev = pf.postFilter; pf.postFilter = ((PostFilter)postFilters.get(i)).getFilterCollector(this); if (prev != null) pf.postFilter.setDelegate(prev); } } return pf; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocSet getDocSet(DocsEnumState deState) throws IOException { int largestPossible = deState.termsEnum.docFreq(); boolean useCache = filterCache != null && largestPossible >= deState.minSetSizeCached; TermQuery key = null; if (useCache) { key = new TermQuery(new Term(deState.fieldName, BytesRef.deepCopyOf(deState.termsEnum.term()))); DocSet result = filterCache.get(key); if (result != null) return result; } int smallSetSize = maxDoc()>>6; int scratchSize = Math.min(smallSetSize, largestPossible); if (deState.scratch == null || deState.scratch.length < scratchSize) deState.scratch = new int[scratchSize]; final int[] docs = deState.scratch; int upto = 0; int bitsSet = 0; OpenBitSet obs = null; DocsEnum docsEnum = deState.termsEnum.docs(deState.liveDocs, deState.docsEnum, false); if (deState.docsEnum == null) { deState.docsEnum = docsEnum; } if (docsEnum instanceof MultiDocsEnum) { MultiDocsEnum.EnumWithSlice[] subs = ((MultiDocsEnum)docsEnum).getSubs(); int numSubs = ((MultiDocsEnum)docsEnum).getNumSubs(); for (int subindex = 0; subindex<numSubs; subindex++) { MultiDocsEnum.EnumWithSlice sub = subs[subindex]; if (sub.docsEnum == null) continue; int base = sub.slice.start; int docid; if (largestPossible > docs.length) { if (obs == null) obs = new OpenBitSet(maxDoc()); while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { obs.fastSet(docid + base); bitsSet++; } } else { while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { docs[upto++] = docid + base; } } } } else { int docid; if (largestPossible > docs.length) { if (obs == null) obs = new OpenBitSet(maxDoc()); while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { obs.fastSet(docid); bitsSet++; } } else { while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { docs[upto++] = docid; } } } DocSet result; if (obs != null) { for (int i=0; i<upto; i++) { obs.fastSet(docs[i]); } bitsSet += upto; result = new BitDocSet(obs, bitsSet); } else { result = upto==0 ? DocSet.EMPTY : new SortedIntDocSet(Arrays.copyOf(docs, upto)); } if (useCache) { filterCache.put(key, result); } return result; }
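Both getDocSet(DocsEnumState) and getDocSetNC size their structures around maxDoc()>>6: a sorted int[] costs 32 bits per match, while an OpenBitSet costs 1 bit per document in the whole index, so maxDoc/64 is a conservative cut-over in favour of the array. A sketch of the sizing rule:

    // The sizing rule shared by getDocSet(DocsEnumState) and getDocSetNC above.
    static boolean preferIntArray(int expectedMatches, int maxDoc) {
      // int[]: 32 bits per match; OpenBitSet: maxDoc bits regardless of match count.
      return expectedMatches <= (maxDoc >> 6);
    }
    // e.g. maxDoc = 1,000,000: up to 15,625 matches fit in an int[] (at most 500,000
    // bits), anything larger goes into a bit set (exactly 1,000,000 bits either way).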
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
protected DocSet getDocSetNC(Query query, DocSet filter) throws IOException { DocSetCollector collector = new DocSetCollector(maxDoc()>>6, maxDoc()); if (filter==null) { if (query instanceof TermQuery) { Term t = ((TermQuery)query).getTerm(); final AtomicReaderContext[] leaves = leafContexts; for (int i=0; i<leaves.length; i++) { final AtomicReaderContext leaf = leaves[i]; final AtomicReader reader = leaf.reader(); collector.setNextReader(leaf); Fields fields = reader.fields(); Terms terms = fields.terms(t.field()); BytesRef termBytes = t.bytes(); Bits liveDocs = reader.getLiveDocs(); DocsEnum docsEnum = null; if (terms != null) { final TermsEnum termsEnum = terms.iterator(null); if (termsEnum.seekExact(termBytes, false)) { docsEnum = termsEnum.docs(liveDocs, null, false); } } if (docsEnum != null) { int docid; while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { collector.collect(docid); } } } } else { super.search(query,null,collector); } return collector.getDocSet(); } else { Filter luceneFilter = filter.getTopFilter(); super.search(query, luceneFilter, collector); return collector.getDocSet(); } }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocSet getDocSet(Query query, DocSet filter) throws IOException { if (filter==null) return getDocSet(query); if (query instanceof ExtendedQuery) { ExtendedQuery eq = (ExtendedQuery)query; if (!eq.getCache()) { if (query instanceof WrappedQuery) { query = ((WrappedQuery)query).getWrappedQuery(); } query = QueryUtils.makeQueryable(query); return getDocSetNC(query, filter); } } // Negative query if absolute value different from original Query absQ = QueryUtils.getAbs(query); boolean positive = absQ==query; DocSet first; if (filterCache != null) { first = filterCache.get(absQ); if (first==null) { first = getDocSetNC(absQ,null); filterCache.put(absQ,first); } return positive ? first.intersection(filter) : filter.andNot(first); } // If there isn't a cache, then do a single filtered query if positive. return positive ? getDocSetNC(absQ,filter) : filter.andNot(getPositiveDocSet(absQ)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocList getDocList(Query query, Query filter, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filter) .setSort(lsort) .setOffset(offset) .setLen(len); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocList(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocList getDocList(Query query, List<Query> filterList, Sort lsort, int offset, int len, int flags) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filterList) .setSort(lsort) .setOffset(offset) .setLen(len) .setFlags(flags); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocList(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
private void getDocListC(QueryResult qr, QueryCommand cmd) throws IOException { DocListAndSet out = new DocListAndSet(); qr.setDocListAndSet(out); QueryResultKey key=null; int maxDocRequested = cmd.getOffset() + cmd.getLen(); // check for overflow, and check for # docs in index if (maxDocRequested < 0 || maxDocRequested > maxDoc()) maxDocRequested = maxDoc(); int supersetMaxDoc= maxDocRequested; DocList superset = null; int flags = cmd.getFlags(); Query q = cmd.getQuery(); if (q instanceof ExtendedQuery) { ExtendedQuery eq = (ExtendedQuery)q; if (!eq.getCache()) { flags |= (NO_CHECK_QCACHE | NO_SET_QCACHE | NO_CHECK_FILTERCACHE); } } // we can try and look up the complete query in the cache. // we can't do that if filter!=null though (we don't want to // do hashCode() and equals() for a big DocSet). if (queryResultCache != null && cmd.getFilter()==null && (flags & (NO_CHECK_QCACHE|NO_SET_QCACHE)) != ((NO_CHECK_QCACHE|NO_SET_QCACHE))) { // all of the current flags can be reused during warming, // so set all of them on the cache key. key = new QueryResultKey(q, cmd.getFilterList(), cmd.getSort(), flags); if ((flags & NO_CHECK_QCACHE)==0) { superset = queryResultCache.get(key); if (superset != null) { // check that the cache entry has scores recorded if we need them if ((flags & GET_SCORES)==0 || superset.hasScores()) { // NOTE: subset() returns null if the DocList has fewer docs than // requested out.docList = superset.subset(cmd.getOffset(),cmd.getLen()); } } if (out.docList != null) { // found the docList in the cache... now check if we need the docset too. // OPT: possible future optimization - if the doclist contains all the matches, // use it to make the docset instead of rerunning the query. if (out.docSet==null && ((flags & GET_DOCSET)!=0) ) { if (cmd.getFilterList()==null) { out.docSet = getDocSet(cmd.getQuery()); } else { List<Query> newList = new ArrayList<Query>(cmd.getFilterList().size()+1); newList.add(cmd.getQuery()); newList.addAll(cmd.getFilterList()); out.docSet = getDocSet(newList); } } return; } } // If we are going to generate the result, bump up to the // next resultWindowSize for better caching. if ((flags & NO_SET_QCACHE) == 0) { // handle 0 special case as well as avoid idiv in the common case. if (maxDocRequested < queryResultWindowSize) { supersetMaxDoc=queryResultWindowSize; } else { supersetMaxDoc = ((maxDocRequested -1)/queryResultWindowSize + 1)*queryResultWindowSize; if (supersetMaxDoc < 0) supersetMaxDoc=maxDocRequested; } } else { key = null; // we won't be caching the result } } // OK, so now we need to generate an answer. // One way to do that would be to check if we have an unordered list // of results for the base query. If so, we can apply the filters and then // sort by the resulting set. This can only be used if: // - the sort doesn't contain score // - we don't want score returned. // check if we should try and use the filter cache boolean useFilterCache=false; if ((flags & (GET_SCORES|NO_CHECK_FILTERCACHE))==0 && useFilterForSortedQuery && cmd.getSort() != null && filterCache != null) { useFilterCache=true; SortField[] sfields = cmd.getSort().getSort(); for (SortField sf : sfields) { if (sf.getType() == SortField.Type.SCORE) { useFilterCache=false; break; } } } // disable useFilterCache optimization temporarily if (useFilterCache) { // now actually use the filter cache. // for large filters that match few documents, this may be // slower than simply re-executing the query. 
if (out.docSet == null) { out.docSet = getDocSet(cmd.getQuery(),cmd.getFilter()); DocSet bigFilt = getDocSet(cmd.getFilterList()); if (bigFilt != null) out.docSet = out.docSet.intersection(bigFilt); } // todo: there could be a sortDocSet that could take a list of // the filters instead of anding them first... // perhaps there should be a multi-docset-iterator superset = sortDocSet(out.docSet,cmd.getSort(),supersetMaxDoc); out.docList = superset.subset(cmd.getOffset(),cmd.getLen()); } else { // do it the normal way... cmd.setSupersetMaxDoc(supersetMaxDoc); if ((flags & GET_DOCSET)!=0) { // this currently conflates returning the docset for the base query vs // the base query and all filters. DocSet qDocSet = getDocListAndSetNC(qr,cmd); // cache the docSet matching the query w/o filtering if (qDocSet!=null && filterCache!=null && !qr.isPartialResults()) filterCache.put(cmd.getQuery(),qDocSet); } else { getDocListNC(qr,cmd); //Parameters: cmd.getQuery(),theFilt,cmd.getSort(),0,supersetMaxDoc,cmd.getFlags(),cmd.getTimeAllowed(),responseHeader); } superset = out.docList; out.docList = superset.subset(cmd.getOffset(),cmd.getLen()); } // lastly, put the superset in the cache if the size is less than or equal // to queryResultMaxDocsCached if (key != null && superset.size() <= queryResultMaxDocsCached && !qr.isPartialResults()) { queryResultCache.put(key, superset); } }
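The supersetMaxDoc computation in getDocListC rounds the requested number of documents up to the next multiple of queryResultWindowSize, so that nearby pages of the same query hit one cached superset instead of each running the search. The arithmetic, extracted and worked through:

    // Rounding up to the caching window, as in getDocListC above.
    static int supersetMaxDoc(int maxDocRequested, int windowSize) {
      if (maxDocRequested < windowSize) return windowSize;     // handles 0 and small requests
      int rounded = ((maxDocRequested - 1) / windowSize + 1) * windowSize;
      return rounded < 0 ? maxDocRequested : rounded;          // int-overflow guard
    }
    // supersetMaxDoc(25, 20) == 40; supersetMaxDoc(40, 20) == 40; supersetMaxDoc(41, 20) == 60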
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
private void getDocListNC(QueryResult qr,QueryCommand cmd) throws IOException { final long timeAllowed = cmd.getTimeAllowed(); int len = cmd.getSupersetMaxDoc(); int last = len; if (last < 0 || last > maxDoc()) last=maxDoc(); final int lastDocRequested = last; int nDocsReturned; int totalHits; float maxScore; int[] ids; float[] scores; boolean needScores = (cmd.getFlags() & GET_SCORES) != 0; Query query = QueryUtils.makeQueryable(cmd.getQuery()); ProcessedFilter pf = getProcessedFilter(cmd.getFilter(), cmd.getFilterList()); final Filter luceneFilter = pf.filter; // handle zero case... if (lastDocRequested<=0) { final float[] topscore = new float[] { Float.NEGATIVE_INFINITY }; final int[] numHits = new int[1]; Collector collector; if (!needScores) { collector = new Collector () { @Override public void setScorer(Scorer scorer) throws IOException { } @Override public void collect(int doc) throws IOException { numHits[0]++; } @Override public void setNextReader(AtomicReaderContext context) throws IOException { } @Override public boolean acceptsDocsOutOfOrder() { return true; } }; } else { collector = new Collector() { Scorer scorer; @Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; } @Override public void collect(int doc) throws IOException { numHits[0]++; float score = scorer.score(); if (score > topscore[0]) topscore[0]=score; } @Override public void setNextReader(AtomicReaderContext context) throws IOException { } @Override public boolean acceptsDocsOutOfOrder() { return true; } }; } if( timeAllowed > 0 ) { collector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), timeAllowed); } if (pf.postFilter != null) { pf.postFilter.setLastDelegate(collector); collector = pf.postFilter; } try { super.search(query, luceneFilter, collector); } catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); } nDocsReturned=0; ids = new int[nDocsReturned]; scores = new float[nDocsReturned]; totalHits = numHits[0]; maxScore = totalHits>0 ? topscore[0] : 0.0f; } else { TopDocsCollector topCollector; if (cmd.getSort() == null) { if(cmd.getScoreDoc() != null) { topCollector = TopScoreDocCollector.create(len, cmd.getScoreDoc(), true); //create the Collector with InOrderPagingCollector } else { topCollector = TopScoreDocCollector.create(len, true); } } else { topCollector = TopFieldCollector.create(weightSort(cmd.getSort()), len, false, needScores, needScores, true); } Collector collector = topCollector; if( timeAllowed > 0 ) { collector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), timeAllowed); } if (pf.postFilter != null) { pf.postFilter.setLastDelegate(collector); collector = pf.postFilter; } try { super.search(query, luceneFilter, collector); } catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); } totalHits = topCollector.getTotalHits(); TopDocs topDocs = topCollector.topDocs(0, len); maxScore = totalHits>0 ? topDocs.getMaxScore() : 0.0f; nDocsReturned = topDocs.scoreDocs.length; ids = new int[nDocsReturned]; scores = (cmd.getFlags()&GET_SCORES)!=0 ? 
new float[nDocsReturned] : null; for (int i=0; i<nDocsReturned; i++) { ScoreDoc scoreDoc = topDocs.scoreDocs[i]; ids[i] = scoreDoc.doc; if (scores != null) scores[i] = scoreDoc.score; } } int sliceLen = Math.min(lastDocRequested,nDocsReturned); if (sliceLen < 0) sliceLen=0; qr.setDocList(new DocSlice(0,sliceLen,ids,scores,totalHits,maxScore)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setScorer(Scorer scorer) throws IOException { }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void collect(int doc) throws IOException { numHits[0]++; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void collect(int doc) throws IOException { numHits[0]++; float score = scorer.score(); if (score > topscore[0]) topscore[0]=score; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
private DocSet getDocListAndSetNC(QueryResult qr,QueryCommand cmd) throws IOException { int len = cmd.getSupersetMaxDoc(); int last = len; if (last < 0 || last > maxDoc()) last=maxDoc(); final int lastDocRequested = last; int nDocsReturned; int totalHits; float maxScore; int[] ids; float[] scores; DocSet set; boolean needScores = (cmd.getFlags() & GET_SCORES) != 0; int maxDoc = maxDoc(); int smallSetSize = maxDoc>>6; ProcessedFilter pf = getProcessedFilter(cmd.getFilter(), cmd.getFilterList()); final Filter luceneFilter = pf.filter; Query query = QueryUtils.makeQueryable(cmd.getQuery()); final long timeAllowed = cmd.getTimeAllowed(); // handle zero case... if (lastDocRequested<=0) { final float[] topscore = new float[] { Float.NEGATIVE_INFINITY }; Collector collector; DocSetCollector setCollector; if (!needScores) { collector = setCollector = new DocSetCollector(smallSetSize, maxDoc); } else { collector = setCollector = new DocSetDelegateCollector(smallSetSize, maxDoc, new Collector() { Scorer scorer; @Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; } @Override public void collect(int doc) throws IOException { float score = scorer.score(); if (score > topscore[0]) topscore[0]=score; } @Override public void setNextReader(AtomicReaderContext context) throws IOException { } @Override public boolean acceptsDocsOutOfOrder() { return false; } }); } if( timeAllowed > 0 ) { collector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), timeAllowed); } if (pf.postFilter != null) { pf.postFilter.setLastDelegate(collector); collector = pf.postFilter; } try { super.search(query, luceneFilter, collector); } catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); } set = setCollector.getDocSet(); nDocsReturned = 0; ids = new int[nDocsReturned]; scores = new float[nDocsReturned]; totalHits = set.size(); maxScore = totalHits>0 ? topscore[0] : 0.0f; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void collect(int doc) throws IOException { float score = scorer.score(); if (score > topscore[0]) topscore[0]=score; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocList getDocList(Query query, DocSet filter, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilter(filter) .setSort(lsort) .setOffset(offset) .setLen(len); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocList(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, Query filter, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filter) .setSort(lsort) .setOffset(offset) .setLen(len) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, Query filter, Sort lsort, int offset, int len, int flags) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filter) .setSort(lsort) .setOffset(offset) .setLen(len) .setFlags(flags) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, List<Query> filterList, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filterList) .setSort(lsort) .setOffset(offset) .setLen(len) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, List<Query> filterList, Sort lsort, int offset, int len, int flags) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filterList) .setSort(lsort) .setOffset(offset) .setLen(len) .setFlags(flags) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, DocSet filter, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilter(filter) .setSort(lsort) .setOffset(offset) .setLen(len) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, DocSet filter, Sort lsort, int offset, int len, int flags) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilter(filter) .setSort(lsort) .setOffset(offset) .setLen(len) .setFlags(flags) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
protected DocList sortDocSet(DocSet set, Sort sort, int nDocs) throws IOException { if (nDocs == 0) { // SOLR-2923 return new DocSlice(0, 0, new int[0], null, 0, 0f); } // bit of a hack to tell if a set is sorted - do it better in the future. boolean inOrder = set instanceof BitDocSet || set instanceof SortedIntDocSet; TopDocsCollector topCollector = TopFieldCollector.create(weightSort(sort), nDocs, false, false, false, inOrder); DocIterator iter = set.iterator(); int base=0; int end=0; int readerIndex = 0; while (iter.hasNext()) { int doc = iter.nextDoc(); while (doc>=end) { AtomicReaderContext leaf = leafContexts[readerIndex++]; base = leaf.docBase; end = base + leaf.reader().maxDoc(); topCollector.setNextReader(leaf); // we should never need to set the scorer given the settings for the collector } topCollector.collect(doc-base); } TopDocs topDocs = topCollector.topDocs(0, nDocs); int nDocsReturned = topDocs.scoreDocs.length; int[] ids = new int[nDocsReturned]; for (int i=0; i<nDocsReturned; i++) { ScoreDoc scoreDoc = topDocs.scoreDocs[i]; ids[i] = scoreDoc.doc; } return new DocSlice(0,nDocsReturned,ids,null,topDocs.totalHits,0.0f); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public int numDocs(Query a, DocSet b) throws IOException { // Negative query if absolute value different from original Query absQ = QueryUtils.getAbs(a); DocSet positiveA = getPositiveDocSet(absQ); return a==absQ ? b.intersectionSize(positiveA) : b.andNotSize(positiveA); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public int numDocs(DocSet a, DocsEnumState deState) throws IOException { // Negative query if absolute value different from original return a.intersectionSize(getDocSet(deState)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public int numDocs(Query a, Query b) throws IOException { Query absA = QueryUtils.getAbs(a); Query absB = QueryUtils.getAbs(b); DocSet positiveA = getPositiveDocSet(absA); DocSet positiveB = getPositiveDocSet(absB); // Negative query if absolute value different from original if (a==absA) { if (b==absB) return positiveA.intersectionSize(positiveB); return positiveA.andNotSize(positiveB); } if (b==absB) return positiveB.andNotSize(positiveA); // if both negative, we need to create a temp DocSet since we // don't have a counting method that takes three. DocSet all = getPositiveDocSet(matchAllDocsQuery); // -a -b == *:*.andNot(a).andNotSize(b) == *.*.andNotSize(a.union(b)) // we use the last form since the intermediate DocSet should normally be smaller. return all.andNotSize(positiveA.union(positiveB)); }
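When both queries are negative, numDocs rewrites the count with set algebra so that only one temporary DocSet is built: |not A and not B| = |all| - |A union B|. For example, with 100 live documents, |A| = 30, |B| = 20 and |A intersect B| = 5: |A union B| = 30 + 20 - 5 = 45, so |not A and not B| = 100 - 45 = 55.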
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public Document[] readDocs(DocList ids) throws IOException { Document[] docs = new Document[ids.size()]; readDocs(docs,ids); return docs; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void warm(SolrIndexSearcher old) throws IOException { // Make sure this is first! filters can help queryResults execute! long warmingStartTime = System.currentTimeMillis(); // warm the caches in order... ModifiableSolrParams params = new ModifiableSolrParams(); params.add("warming","true"); for (int i=0; i<cacheList.length; i++) { if (debug) log.debug("autowarming " + this + " from " + old + "\n\t" + old.cacheList[i]); SolrQueryRequest req = new LocalSolrQueryRequest(core,params) { @Override public SolrIndexSearcher getSearcher() { return SolrIndexSearcher.this; } @Override public void close() { } }; SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); try { this.cacheList[i].warm(this, old.cacheList[i]); } finally { try { req.close(); } finally { SolrRequestInfo.clearRequestInfo(); } } if (debug) log.debug("autowarming result for " + this + "\n\t" + this.cacheList[i]); } warmupTime = System.currentTimeMillis() - warmingStartTime; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public Explanation explain(Query query, int doc) throws IOException { return super.explain(QueryUtils.makeQueryable(query), doc); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public DocIdSet getDocIdSet(AtomicReaderContext context, Bits acceptDocs) throws IOException { DocIdSet sub = topFilter == null ? null : topFilter.getDocIdSet(context, acceptDocs); if (weights.size() == 0) return sub; return new FilterSet(sub, context); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public DocIdSetIterator iterator() throws IOException { List<DocIdSetIterator> iterators = new ArrayList<DocIdSetIterator>(weights.size()+1); if (docIdSet != null) { DocIdSetIterator iter = docIdSet.iterator(); if (iter == null) return null; iterators.add(iter); } for (Weight w : weights) { Scorer scorer = w.scorer(context, true, false, context.reader().getLiveDocs()); if (scorer == null) return null; iterators.add(scorer); } if (iterators.size()==0) return null; if (iterators.size()==1) return iterators.get(0); if (iterators.size()==2) return new DualFilterIterator(iterators.get(0), iterators.get(1)); return new FilterIterator(iterators.toArray(new DocIdSetIterator[iterators.size()])); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public Bits bits() throws IOException { return null; /* don't use random access */ }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
private int doNext(int doc) throws IOException { int which=0; // index of the iterator with the highest id int i=1; outer: for(;;) { for (; i<iterators.length; i++) { if (i == which) continue; DocIdSetIterator iter = iterators[i]; int next = iter.advance(doc); if (next != doc) { doc = next; which = i; i = 0; continue outer; } } return doc; } }
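doNext is a classic leapfrog conjunction: every iterator is advanced to the current candidate doc, and the loop only exits once all iterators agree on the same id. A self-contained sketch of the same algorithm over plain sorted arrays (illustrative helper, not Solr code):

    import java.util.ArrayList;
    import java.util.List;

    class LeapfrogExample {
      /** Leapfrog intersection over sorted doc-id lists, mirroring doNext above. */
      static List<Integer> intersect(int[][] lists) {
        List<Integer> out = new ArrayList<Integer>();
        for (int[] l : lists) if (l.length == 0) return out; // an empty list means no matches
        int[] pos = new int[lists.length];
        int candidate = lists[0][0];
        while (true) {
          boolean agreed = true;
          for (int i = 0; i < lists.length; i++) {
            while (pos[i] < lists[i].length && lists[i][pos[i]] < candidate) pos[i]++;
            if (pos[i] == lists[i].length) return out;  // one iterator exhausted: done
            if (lists[i][pos[i]] > candidate) {         // overshot: adopt the larger candidate
              candidate = lists[i][pos[i]];
              agreed = false;
              break;                                    // restart the scan, like "continue outer"
            }
          }
          if (agreed) {             // every list contains the candidate: emit it
            out.add(candidate);
            candidate++;            // and move past it
          }
        }
      }
    }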
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public int nextDoc() throws IOException { return doNext(first.nextDoc()); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public int advance(int target) throws IOException { return doNext(first.advance(target)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public int nextDoc() throws IOException { int doc = a.nextDoc(); for(;;) { int other = b.advance(doc); if (other == doc) return doc; doc = a.advance(other); if (other == doc) return doc; } }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public int advance(int target) throws IOException { int doc = a.advance(target); for(;;) { int other = b.advance(doc); if (other == doc) return doc; doc = a.advance(other); if (other == doc) return doc; } }
// in core/src/java/org/apache/solr/search/LFUCache.java
public void warm(SolrIndexSearcher searcher, SolrCache old) throws IOException { if (regenerator == null) return; long warmingStartTime = System.currentTimeMillis(); LFUCache other = (LFUCache) old; // warm entries if (autowarmCount != 0) { int sz = other.size(); if (autowarmCount != -1) sz = Math.min(sz, autowarmCount); Map items = other.cache.getMostUsedItems(sz); Map.Entry[] itemsArr = new Map.Entry[items.size()]; int counter = 0; for (Object mapEntry : items.entrySet()) { itemsArr[counter++] = (Map.Entry) mapEntry; } for (int i = itemsArr.length - 1; i >= 0; i--) { try { boolean continueRegen = regenerator.regenerateItem(searcher, this, old, itemsArr[i].getKey(), itemsArr[i].getValue()); if (!continueRegen) break; } catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); } } } warmupTime = System.currentTimeMillis() - warmingStartTime; }
// in core/src/java/org/apache/solr/search/DelegatingCollector.java
@Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; delegate.setScorer(scorer); }
// in core/src/java/org/apache/solr/search/DelegatingCollector.java
@Override public void collect(int doc) throws IOException { delegate.collect(doc); }
// in core/src/java/org/apache/solr/search/DelegatingCollector.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { this.context = context; this.docBase = context.docBase; delegate.setNextReader(context); }
// in core/src/java/org/apache/solr/search/Grouping.java
public void execute() throws IOException { if (commands.isEmpty()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Specify at least one field, function or query to group by."); } DocListAndSet out = new DocListAndSet(); qr.setDocListAndSet(out); SolrIndexSearcher.ProcessedFilter pf = searcher.getProcessedFilter(cmd.getFilter(), cmd.getFilterList()); final Filter luceneFilter = pf.filter; maxDoc = searcher.maxDoc(); needScores = (cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0; boolean cacheScores = false; // NOTE: Change this when groupSort can be specified per group if (!needScores && !commands.isEmpty()) { if (commands.get(0).groupSort == null) { cacheScores = true; } else { for (SortField field : commands.get(0).groupSort.getSort()) { if (field.getType() == SortField.Type.SCORE) { cacheScores = true; break; } } } } else if (needScores) { cacheScores = needScores; } getDocSet = (cmd.getFlags() & SolrIndexSearcher.GET_DOCSET) != 0; getDocList = (cmd.getFlags() & SolrIndexSearcher.GET_DOCLIST) != 0; query = QueryUtils.makeQueryable(cmd.getQuery()); for (Command cmd : commands) { cmd.prepare(); } AbstractAllGroupHeadsCollector<?> allGroupHeadsCollector = null; List<Collector> collectors = new ArrayList<Collector>(commands.size()); for (Command cmd : commands) { Collector collector = cmd.createFirstPassCollector(); if (collector != null) { collectors.add(collector); } if (getGroupedDocSet && allGroupHeadsCollector == null) { collectors.add(allGroupHeadsCollector = cmd.createAllGroupCollector()); } } Collector allCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()])); DocSetCollector setCollector = null; if (getDocSet && allGroupHeadsCollector == null) { setCollector = new DocSetDelegateCollector(maxDoc >> 6, maxDoc, allCollectors); allCollectors = setCollector; } CachingCollector cachedCollector = null; if (cacheSecondPassSearch && allCollectors != null) { int maxDocsToCache = (int) Math.round(maxDoc * (maxDocsPercentageToCache / 100.0d)); // Only makes sense to cache if we cache more than zero. // Maybe we should have a minimum and a maximum, that defines the window we would like caching for. 
if (maxDocsToCache > 0) { allCollectors = cachedCollector = CachingCollector.create(allCollectors, cacheScores, maxDocsToCache); } } if (pf.postFilter != null) { pf.postFilter.setLastDelegate(allCollectors); allCollectors = pf.postFilter; } if (allCollectors != null) { searchWithTimeLimiter(luceneFilter, allCollectors); } if (getGroupedDocSet && allGroupHeadsCollector != null) { FixedBitSet fixedBitSet = allGroupHeadsCollector.retrieveGroupHeads(maxDoc); long[] bits = fixedBitSet.getBits(); OpenBitSet openBitSet = new OpenBitSet(bits, bits.length); qr.setDocSet(new BitDocSet(openBitSet)); } else if (getDocSet) { qr.setDocSet(setCollector.getDocSet()); } collectors.clear(); for (Command cmd : commands) { Collector collector = cmd.createSecondPassCollector(); if (collector != null) collectors.add(collector); } if (!collectors.isEmpty()) { Collector secondPhaseCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()])); if (collectors.size() > 0) { if (cachedCollector != null) { if (cachedCollector.isCached()) { cachedCollector.replay(secondPhaseCollectors); } else { signalCacheWarning = true; logger.warn(String.format("The grouping cache is active, but not used because it exceeded the max cache limit of %d percent", maxDocsPercentageToCache)); logger.warn("Please increase cache size or disable group caching."); searchWithTimeLimiter(luceneFilter, secondPhaseCollectors); } } else { if (pf.postFilter != null) { pf.postFilter.setLastDelegate(secondPhaseCollectors); secondPhaseCollectors = pf.postFilter; } searchWithTimeLimiter(luceneFilter, secondPhaseCollectors); } } } for (Command cmd : commands) { cmd.finish(); } qr.groupedResults = grouped; if (getDocList) { int sz = idSet.size(); int[] ids = new int[sz]; int idx = 0; for (int val : idSet) { ids[idx++] = val; } qr.setDocList(new DocSlice(0, sz, ids, null, maxMatches, maxScore)); } }
// in core/src/java/org/apache/solr/search/Grouping.java
private void searchWithTimeLimiter(final Filter luceneFilter, Collector collector) throws IOException { if (cmd.getTimeAllowed() > 0) { if (timeLimitingCollector == null) { timeLimitingCollector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), cmd.getTimeAllowed()); } else { /* * This is so the same timer can be used for grouping's multiple phases. * We don't want to create a new TimeLimitingCollector for each phase because that would * reset the timer for each phase. If time runs out during the first phase, the * second phase should timeout quickly. */ timeLimitingCollector.setCollector(collector); } collector = timeLimitingCollector; } try { searcher.search(query, luceneFilter, collector); } catch (TimeLimitingCollector.TimeExceededException x) { logger.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); } }
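searchWithTimeLimiter turns a hard Lucene timeout into a soft "partial results" outcome: the TimeExceededException is caught, logged, and recorded on the QueryResult rather than propagated. A minimal standalone use of the same collector wrapper (assuming a collector, query, filter and time budget already in scope; Lucene 4.x API as used throughout this file):

    // Soft timeout: collect until the budget is spent, then flag the response
    // as partial instead of failing it outright.
    Collector limited = new TimeLimitingCollector(
        collector, TimeLimitingCollector.getGlobalCounter(), timeAllowedMillis);
    try {
      searcher.search(query, luceneFilter, limited);
    } catch (TimeLimitingCollector.TimeExceededException e) {
      partialResults = true;  // whatever was collected before time ran out is kept
    }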
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createSecondPassCollector() throws IOException { return null; }
// in core/src/java/org/apache/solr/search/Grouping.java
public AbstractAllGroupHeadsCollector<?> createAllGroupCollector() throws IOException { return null; }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void prepare() throws IOException { actualGroupsToFind = getMax(offset, numGroups, maxDoc); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createFirstPassCollector() throws IOException { // Ok we don't want groups, but do want a total count if (actualGroupsToFind <= 0) { fallBackCollector = new TotalHitCountCollector(); return fallBackCollector; } sort = sort == null ? Sort.RELEVANCE : sort; firstPass = new TermFirstPassGroupingCollector(groupBy, sort, actualGroupsToFind); return firstPass; }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createSecondPassCollector() throws IOException { if (actualGroupsToFind <= 0) { allGroupsCollector = new TermAllGroupsCollector(groupBy); return totalCount == TotalCount.grouped ? allGroupsCollector : null; } topGroups = format == Format.grouped ? firstPass.getTopGroups(offset, false) : firstPass.getTopGroups(0, false); if (topGroups == null) { if (totalCount == TotalCount.grouped) { allGroupsCollector = new TermAllGroupsCollector(groupBy); fallBackCollector = new TotalHitCountCollector(); return MultiCollector.wrap(allGroupsCollector, fallBackCollector); } else { fallBackCollector = new TotalHitCountCollector(); return fallBackCollector; } } int groupedDocsToCollect = getMax(groupOffset, docsPerGroup, maxDoc); groupedDocsToCollect = Math.max(groupedDocsToCollect, 1); secondPass = new TermSecondPassGroupingCollector( groupBy, topGroups, sort, groupSort, groupedDocsToCollect, needScores, needScores, false ); if (totalCount == TotalCount.grouped) { allGroupsCollector = new TermAllGroupsCollector(groupBy); return MultiCollector.wrap(secondPass, allGroupsCollector); } else { return secondPass; } }
// in core/src/java/org/apache/solr/search/Grouping.java
@Override public AbstractAllGroupHeadsCollector<?> createAllGroupCollector() throws IOException { Sort sortWithinGroup = groupSort != null ? groupSort : new Sort(); return TermAllGroupHeadsCollector.create(groupBy, sortWithinGroup); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void finish() throws IOException { result = secondPass != null ? secondPass.getTopGroups(0) : null; if (main) { mainResult = createSimpleResponse(); return; } NamedList groupResult = commonResponse(); if (format == Format.simple) { groupResult.add("doclist", createSimpleResponse()); return; } List groupList = new ArrayList(); groupResult.add("groups", groupList); // grouped={ key={ groups=[ if (result == null) { return; } // handle case of rows=0 if (numGroups == 0) return; for (GroupDocs<BytesRef> group : result.groups) { NamedList nl = new SimpleOrderedMap(); groupList.add(nl); // grouped={ key={ groups=[ { // To keep the response format compatible with trunk. // In trunk MutableValue can convert an indexed value to its native type. E.g. string to int // The only option I currently see is to use the FieldType for this if (group.groupValue != null) { SchemaField schemaField = searcher.getSchema().getField(groupBy); FieldType fieldType = schemaField.getType(); String readableValue = fieldType.indexedToReadable(group.groupValue.utf8ToString()); IndexableField field = schemaField.createField(readableValue, 0.0f); nl.add("groupValue", fieldType.toObject(field)); } else { nl.add("groupValue", null); } addDocList(nl, group); } }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void prepare() throws IOException { actualGroupsToFind = getMax(offset, numGroups, maxDoc); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createFirstPassCollector() throws IOException { DocSet groupFilt = searcher.getDocSet(query); topCollector = newCollector(groupSort, needScores); collector = new FilterCollector(groupFilt, topCollector); return collector; }
// in core/src/java/org/apache/solr/search/Grouping.java
TopDocsCollector newCollector(Sort sort, boolean needScores) throws IOException { int groupDocsToCollect = getMax(groupOffset, docsPerGroup, maxDoc); if (sort == null || sort == Sort.RELEVANCE) { return TopScoreDocCollector.create(groupDocsToCollect, true); } else { return TopFieldCollector.create(searcher.weightSort(sort), groupDocsToCollect, false, needScores, needScores, true); } }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void finish() throws IOException { TopDocsCollector topDocsCollector = (TopDocsCollector) collector.getDelegate(); TopDocs topDocs = topDocsCollector.topDocs(); GroupDocs<String> groupDocs = new GroupDocs<String>(topDocs.getMaxScore(), topDocs.totalHits, topDocs.scoreDocs, query.toString(), null); if (main) { mainResult = getDocList(groupDocs); } else { NamedList rsp = commonResponse(); addDocList(rsp, groupDocs); } }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void prepare() throws IOException { Map context = ValueSource.newContext(searcher); groupBy.createWeight(context, searcher); actualGroupsToFind = getMax(offset, numGroups, maxDoc); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createFirstPassCollector() throws IOException { // Ok we don't want groups, but do want a total count if (actualGroupsToFind <= 0) { fallBackCollector = new TotalHitCountCollector(); return fallBackCollector; } sort = sort == null ? Sort.RELEVANCE : sort; firstPass = new FunctionFirstPassGroupingCollector(groupBy, context, searcher.weightSort(sort), actualGroupsToFind); return firstPass; }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createSecondPassCollector() throws IOException { if (actualGroupsToFind <= 0) { allGroupsCollector = new FunctionAllGroupsCollector(groupBy, context); return totalCount == TotalCount.grouped ? allGroupsCollector : null; } topGroups = format == Format.grouped ? firstPass.getTopGroups(offset, false) : firstPass.getTopGroups(0, false); if (topGroups == null) { if (totalCount == TotalCount.grouped) { allGroupsCollector = new FunctionAllGroupsCollector(groupBy, context); fallBackCollector = new TotalHitCountCollector(); return MultiCollector.wrap(allGroupsCollector, fallBackCollector); } else { fallBackCollector = new TotalHitCountCollector(); return fallBackCollector; } } int groupdDocsToCollect = getMax(groupOffset, docsPerGroup, maxDoc); groupdDocsToCollect = Math.max(groupdDocsToCollect, 1); secondPass = new FunctionSecondPassGroupingCollector( topGroups, sort, groupSort, groupdDocsToCollect, needScores, needScores, false, groupBy, context ); if (totalCount == TotalCount.grouped) { allGroupsCollector = new FunctionAllGroupsCollector(groupBy, context); return MultiCollector.wrap(secondPass, allGroupsCollector); } else { return secondPass; } }
// in core/src/java/org/apache/solr/search/Grouping.java
@Override public AbstractAllGroupHeadsCollector<?> createAllGroupCollector() throws IOException { Sort sortWithinGroup = groupSort != null ? groupSort : new Sort(); return new FunctionAllGroupHeadsCollector(groupBy, context, sortWithinGroup); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void finish() throws IOException { result = secondPass != null ? secondPass.getTopGroups(0) : null; if (main) { mainResult = createSimpleResponse(); return; } NamedList groupResult = commonResponse(); if (format == Format.simple) { groupResult.add("doclist", createSimpleResponse()); return; } List groupList = new ArrayList(); groupResult.add("groups", groupList); // grouped={ key={ groups=[ if (result == null) { return; } // handle case of rows=0 if (numGroups == 0) return; for (GroupDocs<MutableValue> group : result.groups) { NamedList nl = new SimpleOrderedMap(); groupList.add(nl); // grouped={ key={ groups=[ { nl.add("groupValue", group.groupValue.toObject()); addDocList(nl, group); } }
// in core/src/java/org/apache/solr/search/FunctionRangeQuery.java
@Override public void collect(int doc) throws IOException { if (doc<maxdoc && scorer.matches(doc)) { delegate.collect(doc); } }
// in core/src/java/org/apache/solr/search/FunctionRangeQuery.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { maxdoc = context.reader().maxDoc(); FunctionValues dv = rangeFilt.getValueSource().getValues(fcontext, context); scorer = dv.getRangeScorer(context.reader(), rangeFilt.getLowerVal(), rangeFilt.getUpperVal(), rangeFilt.isIncludeLower(), rangeFilt.isIncludeUpper()); super.setNextReader(context); }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public Filter getTopFilter() { final OpenBitSet bs = bits; // TODO: if cardinality isn't cached, do a quick measure of sparseness // and return null from bits() if too sparse. return new Filter() { @Override public DocIdSet getDocIdSet(final AtomicReaderContext context, final Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); if (context.isTopLevel) { return BitsFilteredDocIdSet.wrap(bs, acceptDocs); } final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { return new Bits() { @Override public boolean get(int index) { return bs.fastGet(index + base); } @Override public int length() { return maxDoc; } }; } }, acceptDocs2); } }; }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public DocIdSet getDocIdSet(final AtomicReaderContext context, final Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); if (context.isTopLevel) { return BitsFilteredDocIdSet.wrap(bs, acceptDocs); } final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { return new Bits() { @Override public boolean get(int index) { return bs.fastGet(index + base); } @Override public int length() { return maxDoc; } }; } }, acceptDocs2); }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public Bits bits() throws IOException { return new Bits() { @Override public boolean get(int index) { return bs.fastGet(index + base); } @Override public int length() { return maxDoc; } }; }
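The BitDocSet iterators above walk one shared top-level OpenBitSet but must report segment-local ids, so they translate with the leaf's docBase in both directions:

    // Global <-> segment-local doc-id translation used by the iterators above:
    // the top-level bit set is indexed by global ids, Lucene hands out local ids.
    static int toGlobal(int localDoc, int docBase) { return docBase + localDoc; }
    static int toLocal(int globalDoc, int docBase) { return globalDoc - docBase; }
    // nextDoc() scans the bit set from base-1 upward and reports pos - base,
    // returning NO_MORE_DOCS once pos reaches base + maxDoc for the segment.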
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public Query rewrite(IndexReader reader) throws IOException { return this; }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public float getValueForNormalization() throws IOException { queryWeight = getBoost(); return queryWeight * queryWeight; }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public Scorer scorer(AtomicReaderContext context, boolean scoreDocsInOrder, boolean topScorer, Bits acceptDocs) throws IOException { return new ConstantScorer(context, this, queryWeight, acceptDocs); }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public Explanation explain(AtomicReaderContext context, int doc) throws IOException { ConstantScorer cs = new ConstantScorer(context, this, queryWeight, context.reader().getLiveDocs()); boolean exists = cs.docIdSetIterator.advance(doc) == doc; ComplexExplanation result = new ComplexExplanation(); if (exists) { result.setDescription("ConstantScoreQuery(" + filter + "), product of:"); result.setValue(queryWeight); result.setMatch(Boolean.TRUE); result.addDetail(new Explanation(getBoost(), "boost")); result.addDetail(new Explanation(queryNorm,"queryNorm")); } else { result.setDescription("ConstantScoreQuery(" + filter + ") doesn't match id " + doc); result.setValue(0); result.setMatch(Boolean.FALSE); } return result; }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public int nextDoc() throws IOException { return docIdSetIterator.nextDoc(); }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public float score() throws IOException { return theScore; }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public int advance(int target) throws IOException { return docIdSetIterator.advance(target); }
// in core/src/java/org/apache/solr/search/LRUCache.java
public void warm(SolrIndexSearcher searcher, SolrCache<K,V> old) throws IOException { if (regenerator==null) return; long warmingStartTime = System.currentTimeMillis(); LRUCache<K,V> other = (LRUCache<K,V>)old; // warm entries if (isAutowarmingOn()) { Object[] keys,vals = null; // Don't do the autowarming in the synchronized block, just pull out the keys and values. synchronized (other.map) { int sz = autowarm.getWarmCount(other.map.size()); keys = new Object[sz]; vals = new Object[sz]; Iterator<Map.Entry<K, V>> iter = other.map.entrySet().iterator(); // iteration goes from oldest (least recently used) to most recently used, // so we need to skip over the oldest entries. int skip = other.map.size() - sz; for (int i=0; i<skip; i++) iter.next(); for (int i=0; i<sz; i++) { Map.Entry<K,V> entry = iter.next(); keys[i]=entry.getKey(); vals[i]=entry.getValue(); } } // autowarm from the oldest to the newest entries so that the ordering will be // correct in the new cache. for (int i=0; i<keys.length; i++) { try { boolean continueRegen = regenerator.regenerateItem(searcher, this, old, keys[i], vals[i]); if (!continueRegen) break; } catch (Throwable e) { SolrException.log(log,"Error during auto-warming of key:" + keys[i], e); } } } warmupTime = System.currentTimeMillis() - warmingStartTime; }
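Note: warm() above is a good example of the catch-and-continue design counted in the statistics: each cache entry is regenerated in its own try block, and even Throwable is caught and logged so that one bad entry cannot abort autowarming. A self-contained sketch of the pattern (all names hypothetical; java.util.logging stands in for Solr's logger):

  import java.util.Map;
  import java.util.logging.Level;
  import java.util.logging.Logger;

  class WarmLoopSketch {
    private static final Logger log = Logger.getLogger("warm");

    interface Regenerator {
      boolean regenerateItem(Object key, Object val) throws Exception;
    }

    static void warm(Map<Object, Object> oldEntries, Regenerator regenerator) {
      for (Map.Entry<Object, Object> e : oldEntries.entrySet()) {
        try {
          // stop early if the regenerator asks to, as in LRUCache.warm()
          if (!regenerator.regenerateItem(e.getKey(), e.getValue())) break;
        } catch (Throwable t) {
          // a single failed entry must not abort warming of the rest
          log.log(Level.WARNING, "Error during auto-warming of key:" + e.getKey(), t);
        }
      }
    }
  }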
// in core/src/java/org/apache/solr/search/QueryParsing.java
static FieldType writeFieldName(String name, IndexSchema schema, Appendable out, int flags) throws IOException { FieldType ft = null; ft = schema.getFieldTypeNoEx(name); out.append(name); if (ft == null) { out.append("(UNKNOWN FIELD " + name + ')'); } out.append(':'); return ft; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
static void writeFieldVal(String val, FieldType ft, Appendable out, int flags) throws IOException { if (ft != null) { try { out.append(ft.indexedToReadable(val)); } catch (Exception e) { out.append("EXCEPTION(val="); out.append(val); out.append(")"); } } else { out.append(val); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
static void writeFieldVal(BytesRef val, FieldType ft, Appendable out, int flags) throws IOException { if (ft != null) { try { CharsRef readable = new CharsRef(); ft.indexedToReadable(val, readable); out.append(readable); } catch (Exception e) { out.append("EXCEPTION(val="); out.append(val.utf8ToString()); out.append(")"); } } else { out.append(val.utf8ToString()); } }
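Note: both writeFieldVal() overloads deliberately swallow any Exception from indexedToReadable() and emit an inline EXCEPTION(val=...) marker instead, so debug output never fails on a single malformed value. A minimal sketch of the idiom (toReadable() is a hypothetical stand-in for FieldType.indexedToReadable):

  import java.io.IOException;

  class FieldValWriterSketch {
    static void writeReadable(String indexedVal, Appendable out) throws IOException {
      try {
        out.append(toReadable(indexedVal)); // may throw for malformed values
      } catch (Exception e) {
        out.append("EXCEPTION(val=").append(indexedVal).append(")");
      }
    }

    private static String toReadable(String indexedVal) {
      return indexedVal; // stand-in conversion
    }
  }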
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static void toString(Query query, IndexSchema schema, Appendable out, int flags) throws IOException { boolean writeBoost = true; if (query instanceof TermQuery) { TermQuery q = (TermQuery) query; Term t = q.getTerm(); FieldType ft = writeFieldName(t.field(), schema, out, flags); writeFieldVal(t.bytes(), ft, out, flags); } else if (query instanceof TermRangeQuery) { TermRangeQuery q = (TermRangeQuery) query; String fname = q.getField(); FieldType ft = writeFieldName(fname, schema, out, flags); out.append(q.includesLower() ? '[' : '{'); BytesRef lt = q.getLowerTerm(); BytesRef ut = q.getUpperTerm(); if (lt == null) { out.append('*'); } else { writeFieldVal(lt, ft, out, flags); } out.append(" TO "); if (ut == null) { out.append('*'); } else { writeFieldVal(ut, ft, out, flags); } out.append(q.includesUpper() ? ']' : '}'); } else if (query instanceof NumericRangeQuery) { NumericRangeQuery q = (NumericRangeQuery) query; String fname = q.getField(); FieldType ft = writeFieldName(fname, schema, out, flags); out.append(q.includesMin() ? '[' : '{'); Number lt = q.getMin(); Number ut = q.getMax(); if (lt == null) { out.append('*'); } else { out.append(lt.toString()); } out.append(" TO "); if (ut == null) { out.append('*'); } else { out.append(ut.toString()); } out.append(q.includesMax() ? ']' : '}'); } else if (query instanceof BooleanQuery) { BooleanQuery q = (BooleanQuery) query; boolean needParens = false; if (q.getBoost() != 1.0 || q.getMinimumNumberShouldMatch() != 0 || q.isCoordDisabled()) { needParens = true; } if (needParens) { out.append('('); } boolean first = true; for (BooleanClause c : q.clauses()) { if (!first) { out.append(' '); } else { first = false; } if (c.isProhibited()) { out.append('-'); } else if (c.isRequired()) { out.append('+'); } Query subQuery = c.getQuery(); boolean wrapQuery = false; // TODO: may need to put parens around other types // of queries too, depending on future syntax. if (subQuery instanceof BooleanQuery) { wrapQuery = true; } if (wrapQuery) { out.append('('); } toString(subQuery, schema, out, flags); if (wrapQuery) { out.append(')'); } } if (needParens) { out.append(')'); } if (q.getMinimumNumberShouldMatch() > 0) { out.append('~'); out.append(Integer.toString(q.getMinimumNumberShouldMatch())); } if (q.isCoordDisabled()) { out.append("/no_coord"); } } else if (query instanceof PrefixQuery) { PrefixQuery q = (PrefixQuery) query; Term prefix = q.getPrefix(); FieldType ft = writeFieldName(prefix.field(), schema, out, flags); out.append(prefix.text()); out.append('*'); } else if (query instanceof WildcardQuery) { out.append(query.toString()); writeBoost = false; } else if (query instanceof FuzzyQuery) { out.append(query.toString()); writeBoost = false; } else if (query instanceof ConstantScoreQuery) { out.append(query.toString()); writeBoost = false; } else { out.append(query.getClass().getSimpleName() + '(' + query.toString() + ')'); writeBoost = false; } if (writeBoost && query.getBoost() != 1.0f) { out.append("^"); out.append(Float.toString(query.getBoost())); } }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/TopGroupsFieldCommand.java
public List<Collector> create() throws IOException { if (firstPhaseGroups.isEmpty()) { return Collections.emptyList(); } List<Collector> collectors = new ArrayList<Collector>(); secondPassCollector = new TermSecondPassGroupingCollector( field.getName(), firstPhaseGroups, groupSort, sortWithinGroup, maxDocPerGroup, needScores, needMaxScore, true ); collectors.add(secondPassCollector); return collectors; }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/SearchGroupsFieldCommand.java
public List<Collector> create() throws IOException { List<Collector> collectors = new ArrayList<Collector>(); if (topNGroups > 0) { firstPassGroupingCollector = new TermFirstPassGroupingCollector(field.getName(), groupSort, topNGroups); collectors.add(firstPassGroupingCollector); } if (includeGroupCount) { allGroupsCollector = new TermAllGroupsCollector(field.getName()); collectors.add(allGroupsCollector); } return collectors; }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/QueryCommand.java
public Builder setDocSet(SolrIndexSearcher searcher) throws IOException { return setDocSet(searcher.getDocSet(query)); }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/QueryCommand.java
public List<Collector> create() throws IOException { if (sort == null || sort == Sort.RELEVANCE) { collector = TopScoreDocCollector.create(docsToCollect, true); } else { collector = TopFieldCollector.create(sort, docsToCollect, true, needScores, needScores, true); } filterCollector = new FilterCollector(docSet, collector); return Arrays.asList((Collector) filterCollector); }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/SearchGroupsResultTransformer.java
public NamedList transform(List<Command> data) throws IOException { NamedList<NamedList> result = new NamedList<NamedList>(); for (Command command : data) { final NamedList<Object> commandResult = new NamedList<Object>(); if (SearchGroupsFieldCommand.class.isInstance(command)) { SearchGroupsFieldCommand fieldCommand = (SearchGroupsFieldCommand) command; Pair<Integer, Collection<SearchGroup<BytesRef>>> pair = fieldCommand.result(); Integer groupedCount = pair.getA(); Collection<SearchGroup<BytesRef>> searchGroups = pair.getB(); if (searchGroups != null) { commandResult.add("topGroups", serializeSearchGroup(searchGroups, fieldCommand.getGroupSort())); } if (groupedCount != null) { commandResult.add("groupCount", groupedCount); } } else { continue; } result.add(command.getKey(), commandResult); } return result; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/SearchGroupsResultTransformer.java
public Map<String, Pair<Integer, Collection<SearchGroup<BytesRef>>>> transformToNative(NamedList<NamedList> shardResponse, Sort groupSort, Sort sortWithinGroup, String shard) throws IOException { Map<String, Pair<Integer, Collection<SearchGroup<BytesRef>>>> result = new HashMap<String, Pair<Integer, Collection<SearchGroup<BytesRef>>>>(); for (Map.Entry<String, NamedList> command : shardResponse) { List<SearchGroup<BytesRef>> searchGroups = new ArrayList<SearchGroup<BytesRef>>(); NamedList topGroupsAndGroupCount = command.getValue(); @SuppressWarnings("unchecked") NamedList<List<Comparable>> rawSearchGroups = (NamedList<List<Comparable>>) topGroupsAndGroupCount.get("topGroups"); if (rawSearchGroups != null) { for (Map.Entry<String, List<Comparable>> rawSearchGroup : rawSearchGroups){ SearchGroup<BytesRef> searchGroup = new SearchGroup<BytesRef>(); searchGroup.groupValue = rawSearchGroup.getKey() != null ? new BytesRef(rawSearchGroup.getKey()) : null; searchGroup.sortValues = rawSearchGroup.getValue().toArray(new Comparable[rawSearchGroup.getValue().size()]); searchGroups.add(searchGroup); } } Integer groupCount = (Integer) topGroupsAndGroupCount.get("groupCount"); result.put(command.getKey(), new Pair<Integer, Collection<SearchGroup<BytesRef>>>(groupCount, searchGroups)); } return result; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/TopGroupsResultTransformer.java
public NamedList transform(List<Command> data) throws IOException { NamedList<NamedList> result = new NamedList<NamedList>(); for (Command command : data) { NamedList commandResult; if (TopGroupsFieldCommand.class.isInstance(command)) { TopGroupsFieldCommand fieldCommand = (TopGroupsFieldCommand) command; SchemaField groupField = rb.req.getSearcher().getSchema().getField(fieldCommand.getKey()); commandResult = serializeTopGroups(fieldCommand.result(), groupField); } else if (QueryCommand.class.isInstance(command)) { QueryCommand queryCommand = (QueryCommand) command; commandResult = serializeTopDocs(queryCommand.result()); } else { commandResult = null; } result.add(command.getKey(), commandResult); } return result; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/TopGroupsResultTransformer.java
protected NamedList serializeTopGroups(TopGroups<BytesRef> data, SchemaField groupField) throws IOException { NamedList<Object> result = new NamedList<Object>(); result.add("totalGroupedHitCount", data.totalGroupedHitCount); result.add("totalHitCount", data.totalHitCount); if (data.totalGroupCount != null) { result.add("totalGroupCount", data.totalGroupCount); } CharsRef spare = new CharsRef(); SchemaField uniqueField = rb.req.getSearcher().getSchema().getUniqueKeyField(); for (GroupDocs<BytesRef> searchGroup : data.groups) { NamedList<Object> groupResult = new NamedList<Object>(); groupResult.add("totalHits", searchGroup.totalHits); if (!Float.isNaN(searchGroup.maxScore)) { groupResult.add("maxScore", searchGroup.maxScore); } List<NamedList<Object>> documents = new ArrayList<NamedList<Object>>(); for (int i = 0; i < searchGroup.scoreDocs.length; i++) { NamedList<Object> document = new NamedList<Object>(); documents.add(document); Document doc = retrieveDocument(uniqueField, searchGroup.scoreDocs[i].doc); document.add("id", uniqueField.getType().toExternal(doc.getField(uniqueField.getName()))); if (!Float.isNaN(searchGroup.scoreDocs[i].score)) { document.add("score", searchGroup.scoreDocs[i].score); } if (!(searchGroup.scoreDocs[i] instanceof FieldDoc)) { continue; } FieldDoc fieldDoc = (FieldDoc) searchGroup.scoreDocs[i]; Object[] convertedSortValues = new Object[fieldDoc.fields.length]; for (int j = 0; j < fieldDoc.fields.length; j++) { Object sortValue = fieldDoc.fields[j]; Sort sortWithinGroup = rb.getGroupingSpec().getSortWithinGroup(); SchemaField field = sortWithinGroup.getSort()[j].getField() != null ? rb.req.getSearcher().getSchema().getFieldOrNull(sortWithinGroup.getSort()[j].getField()) : null; if (field != null) { FieldType fieldType = field.getType(); if (sortValue instanceof BytesRef) { UnicodeUtil.UTF8toUTF16((BytesRef)sortValue, spare); String indexedValue = spare.toString(); sortValue = fieldType.toObject(field.createField(fieldType.indexedToReadable(indexedValue), 0.0f)); } else if (sortValue instanceof String) { sortValue = fieldType.toObject(field.createField(fieldType.indexedToReadable((String) sortValue), 0.0f)); } } convertedSortValues[j] = sortValue; } document.add("sortValues", convertedSortValues); } groupResult.add("documents", documents); String groupValue = searchGroup.groupValue != null ? groupField.getType().indexedToReadable(searchGroup.groupValue.utf8ToString()): null; result.add(groupValue, groupResult); } return result; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/TopGroupsResultTransformer.java
protected NamedList serializeTopDocs(QueryCommandResult result) throws IOException { NamedList<Object> queryResult = new NamedList<Object>(); queryResult.add("matches", result.getMatches()); queryResult.add("totalHits", result.getTopDocs().totalHits); if (rb.getGroupingSpec().isNeedScore()) { queryResult.add("maxScore", result.getTopDocs().getMaxScore()); } List<NamedList> documents = new ArrayList<NamedList>(); queryResult.add("documents", documents); SchemaField uniqueField = rb.req.getSearcher().getSchema().getUniqueKeyField(); CharsRef spare = new CharsRef(); for (ScoreDoc scoreDoc : result.getTopDocs().scoreDocs) { NamedList<Object> document = new NamedList<Object>(); documents.add(document); Document doc = retrieveDocument(uniqueField, scoreDoc.doc); document.add("id", uniqueField.getType().toExternal(doc.getField(uniqueField.getName()))); if (rb.getGroupingSpec().isNeedScore()) { document.add("score", scoreDoc.score); } if (!FieldDoc.class.isInstance(scoreDoc)) { continue; } FieldDoc fieldDoc = (FieldDoc) scoreDoc; Object[] convertedSortValues = new Object[fieldDoc.fields.length]; for (int j = 0; j < fieldDoc.fields.length; j++) { Object sortValue = fieldDoc.fields[j]; Sort groupSort = rb.getGroupingSpec().getGroupSort(); SchemaField field = groupSort.getSort()[j].getField() != null ? rb.req.getSearcher().getSchema().getFieldOrNull(groupSort.getSort()[j].getField()) : null; if (field != null) { FieldType fieldType = field.getType(); if (sortValue instanceof BytesRef) { UnicodeUtil.UTF8toUTF16((BytesRef)sortValue, spare); String indexedValue = spare.toString(); sortValue = fieldType.toObject(field.createField(fieldType.indexedToReadable(indexedValue), 0.0f)); } else if (sortValue instanceof String) { sortValue = fieldType.toObject(field.createField(fieldType.indexedToReadable((String) sortValue), 0.0f)); } } convertedSortValues[j] = sortValue; } document.add("sortValues", convertedSortValues); } return queryResult; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/TopGroupsResultTransformer.java
private Document retrieveDocument(final SchemaField uniqueField, int doc) throws IOException { DocumentStoredFieldVisitor visitor = new DocumentStoredFieldVisitor(uniqueField.getName()); rb.req.getSearcher().doc(doc, visitor); return visitor.getDocument(); }
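Note: retrieveDocument() loads only the unique-key field through a DocumentStoredFieldVisitor rather than materializing every stored field of the hit. A hedged sketch against the plain Lucene 4.x IndexSearcher API:

  import java.io.IOException;
  import org.apache.lucene.document.Document;
  import org.apache.lucene.document.DocumentStoredFieldVisitor;
  import org.apache.lucene.search.IndexSearcher;

  class SingleFieldFetchSketch {
    static Document fetchIdOnly(IndexSearcher searcher, String uniqueKeyField, int docId)
        throws IOException {
      // the visitor accepts only the named field; everything else is skipped on read
      DocumentStoredFieldVisitor visitor = new DocumentStoredFieldVisitor(uniqueKeyField);
      searcher.doc(docId, visitor);
      return visitor.getDocument();
    }
  }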
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
private DocSet computeGroupedDocSet(Query query, Filter luceneFilter, List<Collector> collectors) throws IOException { Command firstCommand = commands.get(0); AbstractAllGroupHeadsCollector termAllGroupHeadsCollector = TermAllGroupHeadsCollector.create(firstCommand.getKey(), firstCommand.getSortWithinGroup()); if (collectors.isEmpty()) { searchWithTimeLimiter(query, luceneFilter, termAllGroupHeadsCollector); } else { collectors.add(termAllGroupHeadsCollector); searchWithTimeLimiter(query, luceneFilter, MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()]))); } int maxDoc = searcher.maxDoc(); long[] bits = termAllGroupHeadsCollector.retrieveGroupHeads(maxDoc).getBits(); return new BitDocSet(new OpenBitSet(bits, bits.length)); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
private DocSet computeDocSet(Query query, Filter luceneFilter, List<Collector> collectors) throws IOException { int maxDoc = searcher.maxDoc(); DocSetCollector docSetCollector; if (collectors.isEmpty()) { docSetCollector = new DocSetCollector(maxDoc >> 6, maxDoc); } else { Collector wrappedCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()])); docSetCollector = new DocSetDelegateCollector(maxDoc >> 6, maxDoc, wrappedCollectors); } searchWithTimeLimiter(query, luceneFilter, docSetCollector); return docSetCollector.getDocSet(); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
private void searchWithTimeLimiter(final Query query, final Filter luceneFilter, Collector collector) throws IOException { if (queryCommand.getTimeAllowed() > 0 ) { collector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), queryCommand.getTimeAllowed()); } TotalHitCountCollector hitCountCollector = new TotalHitCountCollector(); if (includeHitCount) { collector = MultiCollector.wrap(collector, hitCountCollector); } try { searcher.search(query, luceneFilter, collector); } catch (TimeLimitingCollector.TimeExceededException x) { partialResults = true; logger.warn( "Query: " + query + "; " + x.getMessage() ); } if (includeHitCount) { totalHitCount = hitCountCollector.getTotalHits(); } }
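Note: searchWithTimeLimiter() above converts a thrown TimeExceededException into a partialResults flag plus a warning, keeping whatever the collector gathered before the deadline instead of failing the request. A hedged sketch of the same recovery against the Lucene 4.x API (class name hypothetical; with the global counter, timeAllowed is in counter ticks, milliseconds with the default timer):

  import java.io.IOException;
  import org.apache.lucene.search.Collector;
  import org.apache.lucene.search.IndexSearcher;
  import org.apache.lucene.search.Query;
  import org.apache.lucene.search.TimeLimitingCollector;

  class TimeLimitedSearchSketch {
    boolean partialResults = false;

    void search(IndexSearcher searcher, Query query, Collector collector, long timeAllowed)
        throws IOException {
      if (timeAllowed > 0) {
        collector = new TimeLimitingCollector(
            collector, TimeLimitingCollector.getGlobalCounter(), timeAllowed);
      }
      try {
        searcher.search(query, collector);
      } catch (TimeLimitingCollector.TimeExceededException x) {
        partialResults = true; // keep whatever the collector saw before the timeout
      }
    }
  }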
// in core/src/java/org/apache/solr/search/grouping/collector/FilterCollector.java
public void setScorer(Scorer scorer) throws IOException { delegate.setScorer(scorer); }
// in core/src/java/org/apache/solr/search/grouping/collector/FilterCollector.java
public void collect(int doc) throws IOException { matches++; if (filter.exists(doc + docBase)) { delegate.collect(doc); } }
// in core/src/java/org/apache/solr/search/grouping/collector/FilterCollector.java
public void setNextReader(AtomicReaderContext context) throws IOException { this.docBase = context.docBase; delegate.setNextReader(context); }
// in core/src/java/org/apache/solr/search/DocSetDelegateCollector.java
Override public void collect(int doc) throws IOException { collector.collect(doc); doc += base; // optimistically collect the first docs in an array // in case the total number will be small enough to represent // as a small set like SortedIntDocSet instead... // Storing in this array will be quicker to convert // than scanning through a potentially huge bit vector. // FUTURE: when search methods all start returning docs in order, maybe // we could have a ListDocSet() and use the collected array directly. if (pos < scratch.length) { scratch[pos]=doc; } else { // this conditional could be removed if BitSet was preallocated, but that // would take up more memory, and add more GC time... if (bits==null) bits = new OpenBitSet(maxDoc); bits.fastSet(doc); } pos++; }
// in core/src/java/org/apache/solr/search/DocSetDelegateCollector.java
Override public void setScorer(Scorer scorer) throws IOException { collector.setScorer(scorer); }
// in core/src/java/org/apache/solr/search/DocSetDelegateCollector.java
Override public void setNextReader(AtomicReaderContext context) throws IOException { collector.setNextReader(context); this.base = context.docBase; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
Override public Query rewrite(IndexReader reader) throws IOException { // don't rewrite the subQuery return this; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public Weight createWeight(IndexSearcher searcher) throws IOException { return new JoinQueryWeight((SolrIndexSearcher)searcher); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
Override public void close() throws IOException { ref.decref(); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
Override public void close() throws IOException { fromCore.close(); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
Override public float getValueForNormalization() throws IOException { queryWeight = getBoost(); return queryWeight * queryWeight; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
Override public Scorer scorer(AtomicReaderContext context, boolean scoreDocsInOrder, boolean topScorer, Bits acceptDocs) throws IOException { if (filter == null) { boolean debug = rb != null && rb.isDebug(); long start = debug ? System.currentTimeMillis() : 0; resultSet = getDocSet(); long end = debug ? System.currentTimeMillis() : 0; if (debug) { SimpleOrderedMap<Object> dbg = new SimpleOrderedMap<Object>(); dbg.add("time", (end-start)); dbg.add("fromSetSize", fromSetSize); // the input dbg.add("toSetSize", resultSet.size()); // the output dbg.add("fromTermCount", fromTermCount); dbg.add("fromTermTotalDf", fromTermTotalDf); dbg.add("fromTermDirectCount", fromTermDirectCount); dbg.add("fromTermHits", fromTermHits); dbg.add("fromTermHitsTotalDf", fromTermHitsTotalDf); dbg.add("toTermHits", toTermHits); dbg.add("toTermHitsTotalDf", toTermHitsTotalDf); dbg.add("toTermDirectCount", toTermDirectCount); dbg.add("smallSetsDeferred", smallSetsDeferred); dbg.add("toSetDocsAdded", resultListDocs); // TODO: perhaps synchronize addDebug in the future... rb.addDebug(dbg, "join", JoinQuery.this.toString()); } filter = resultSet.getTopFilter(); } // Although this set only includes live docs, other filters can be pushed down to queries. DocIdSet readerSet = filter.getDocIdSet(context, acceptDocs); if (readerSet == null) readerSet=DocIdSet.EMPTY_DOCIDSET; return new JoinScorer(this, readerSet.iterator(), getBoost()); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public DocSet getDocSet() throws IOException { OpenBitSet resultBits = null; // minimum docFreq to use the cache int minDocFreqFrom = Math.max(5, fromSearcher.maxDoc() >> 13); int minDocFreqTo = Math.max(5, toSearcher.maxDoc() >> 13); // use a smaller size than normal since we will need to sort and dedup the results int maxSortedIntSize = Math.max(10, toSearcher.maxDoc() >> 10); DocSet fromSet = fromSearcher.getDocSet(q); fromSetSize = fromSet.size(); List<DocSet> resultList = new ArrayList<DocSet>(10); // make sure we have a set that is fast for random access, if we will use it for that DocSet fastForRandomSet = fromSet; if (minDocFreqFrom>0 && fromSet instanceof SortedIntDocSet) { SortedIntDocSet sset = (SortedIntDocSet)fromSet; fastForRandomSet = new HashDocSet(sset.getDocs(), 0, sset.size()); } Fields fromFields = fromSearcher.getAtomicReader().fields(); Fields toFields = fromSearcher==toSearcher ? fromFields : toSearcher.getAtomicReader().fields(); if (fromFields == null) return DocSet.EMPTY; Terms terms = fromFields.terms(fromField); Terms toTerms = toFields.terms(toField); if (terms == null || toTerms==null) return DocSet.EMPTY; String prefixStr = TrieField.getMainValuePrefix(fromSearcher.getSchema().getFieldType(fromField)); BytesRef prefix = prefixStr == null ? null : new BytesRef(prefixStr); BytesRef term = null; TermsEnum termsEnum = terms.iterator(null); TermsEnum toTermsEnum = toTerms.iterator(null); SolrIndexSearcher.DocsEnumState fromDeState = null; SolrIndexSearcher.DocsEnumState toDeState = null; if (prefix == null) { term = termsEnum.next(); } else { if (termsEnum.seekCeil(prefix, true) != TermsEnum.SeekStatus.END) { term = termsEnum.term(); } } Bits fromLiveDocs = fromSearcher.getAtomicReader().getLiveDocs(); Bits toLiveDocs = fromSearcher == toSearcher ? fromLiveDocs : toSearcher.getAtomicReader().getLiveDocs(); fromDeState = new SolrIndexSearcher.DocsEnumState(); fromDeState.fieldName = fromField; fromDeState.liveDocs = fromLiveDocs; fromDeState.termsEnum = termsEnum; fromDeState.docsEnum = null; fromDeState.minSetSizeCached = minDocFreqFrom; toDeState = new SolrIndexSearcher.DocsEnumState(); toDeState.fieldName = toField; toDeState.liveDocs = toLiveDocs; toDeState.termsEnum = toTermsEnum; toDeState.docsEnum = null; toDeState.minSetSizeCached = minDocFreqTo; while (term != null) { if (prefix != null && !StringHelper.startsWith(term, prefix)) break; fromTermCount++; boolean intersects = false; int freq = termsEnum.docFreq(); fromTermTotalDf++; if (freq < minDocFreqFrom) { fromTermDirectCount++; // OK to skip liveDocs, since we check for intersection with docs matching query fromDeState.docsEnum = fromDeState.termsEnum.docs(null, fromDeState.docsEnum, false); DocsEnum docsEnum = fromDeState.docsEnum; if (docsEnum instanceof MultiDocsEnum) { MultiDocsEnum.EnumWithSlice[] subs = ((MultiDocsEnum)docsEnum).getSubs(); int numSubs = ((MultiDocsEnum)docsEnum).getNumSubs(); outer: for (int subindex = 0; subindex<numSubs; subindex++) { MultiDocsEnum.EnumWithSlice sub = subs[subindex]; if (sub.docsEnum == null) continue; int base = sub.slice.start; int docid; while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { if (fastForRandomSet.exists(docid+base)) { intersects = true; break outer; } } } } else { int docid; while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { if (fastForRandomSet.exists(docid)) { intersects = true; break; } } } } else { // use the filter cache DocSet fromTermSet = fromSearcher.getDocSet(fromDeState); intersects = fromSet.intersects(fromTermSet); } if (intersects) { fromTermHits++; fromTermHitsTotalDf++; TermsEnum.SeekStatus status = toTermsEnum.seekCeil(term); if (status == TermsEnum.SeekStatus.END) break; if (status == TermsEnum.SeekStatus.FOUND) { toTermHits++; int df = toTermsEnum.docFreq(); toTermHitsTotalDf += df; if (resultBits==null && df + resultListDocs > maxSortedIntSize && resultList.size() > 0) { resultBits = new OpenBitSet(toSearcher.maxDoc()); } // if we don't have a bitset yet, or if the resulting set will be too large // use the filterCache to get a DocSet if (toTermsEnum.docFreq() >= minDocFreqTo || resultBits == null) { // use filter cache DocSet toTermSet = toSearcher.getDocSet(toDeState); resultListDocs += toTermSet.size(); if (resultBits != null) { toTermSet.setBitsOn(resultBits); } else { if (toTermSet instanceof BitDocSet) { resultBits = (OpenBitSet)((BitDocSet)toTermSet).bits.clone(); } else { resultList.add(toTermSet); } } } else { toTermDirectCount++; // need to use liveDocs here so we don't map to any deleted ones toDeState.docsEnum = toDeState.termsEnum.docs(toDeState.liveDocs, toDeState.docsEnum, false); DocsEnum docsEnum = toDeState.docsEnum; if (docsEnum instanceof MultiDocsEnum) { MultiDocsEnum.EnumWithSlice[] subs = ((MultiDocsEnum)docsEnum).getSubs(); int numSubs = ((MultiDocsEnum)docsEnum).getNumSubs(); for (int subindex = 0; subindex<numSubs; subindex++) { MultiDocsEnum.EnumWithSlice sub = subs[subindex]; if (sub.docsEnum == null) continue; int base = sub.slice.start; int docid; while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { resultListDocs++; resultBits.fastSet(docid + base); } } } else { int docid; while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { resultListDocs++; resultBits.fastSet(docid); } } } } } term = termsEnum.next(); } smallSetsDeferred = resultList.size(); if (resultBits != null) { for (DocSet set : resultList) { set.setBitsOn(resultBits); } return new BitDocSet(resultBits); } if (resultList.size()==0) { return DocSet.EMPTY; } if (resultList.size() == 1) { return resultList.get(0); } int sz = 0; for (DocSet set : resultList) sz += set.size(); int[] docs = new int[sz]; int pos = 0; for (DocSet set : resultList) { System.arraycopy(((SortedIntDocSet)set).getDocs(), 0, docs, pos, set.size()); pos += set.size(); } Arrays.sort(docs); int[] dedup = new int[sz]; pos = 0; int last = -1; for (int doc : docs) { if (doc != last) dedup[pos++] = doc; last = doc; } if (pos != dedup.length) { dedup = Arrays.copyOf(dedup, pos); } return new SortedIntDocSet(dedup, dedup.length); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
Override public Explanation explain(AtomicReaderContext context, int doc) throws IOException { Scorer scorer = scorer(context, true, false, context.reader().getLiveDocs()); boolean exists = scorer.advance(doc) == doc; ComplexExplanation result = new ComplexExplanation(); if (exists) { result.setDescription(this.toString() + " , product of:"); result.setValue(queryWeight); result.setMatch(Boolean.TRUE); result.addDetail(new Explanation(getBoost(), "boost")); result.addDetail(new Explanation(queryNorm,"queryNorm")); } else { result.setDescription(this.toString() + " doesn't match id " + doc); result.setValue(0); result.setMatch(Boolean.FALSE); } return result; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
Override public int nextDoc() throws IOException { return iter.nextDoc(); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
Override public float score() throws IOException { return score; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
Override public int advance(int target) throws IOException { return iter.advance(target); }
// in core/src/java/org/apache/solr/search/function/distance/StringDistanceFunction.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues str1DV = str1.getValues(context, readerContext); final FunctionValues str2DV = str2.getValues(context, readerContext); return new FloatDocValues(this) { @Override public float floatVal(int doc) { return dist.getDistance(str1DV.strVal(doc), str2DV.strVal(doc)); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append("strdist").append('('); sb.append(str1DV.toString(doc)).append(',').append(str2DV.toString(doc)) .append(", dist=").append(dist.getClass().getName()); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/GeohashHaversineFunction.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues gh1DV = geoHash1.getValues(context, readerContext); final FunctionValues gh2DV = geoHash2.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return distance(doc, gh1DV, gh2DV); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append(name()).append('('); sb.append(gh1DV.toString(doc)).append(',').append(gh2DV.toString(doc)); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/GeohashHaversineFunction.java
Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { geoHash1.createWeight(context, searcher); geoHash2.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/function/distance/GeohashFunction.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues latDV = lat.getValues(context, readerContext); final FunctionValues lonDV = lon.getValues(context, readerContext); return new FunctionValues() { @Override public String strVal(int doc) { return GeohashUtils.encodeLatLon(latDV.doubleVal(doc), lonDV.doubleVal(doc)); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append(name()).append('('); sb.append(latDV.toString(doc)).append(',').append(lonDV.toString(doc)); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/VectorDistanceFunction.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues vals1 = source1.getValues(context, readerContext); final FunctionValues vals2 = source2.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return distance(doc, vals1, vals2); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append(name()).append('(').append(power).append(','); boolean firstTime = true; sb.append(vals1.toString(doc)).append(','); sb.append(vals2.toString(doc)); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/VectorDistanceFunction.java
Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { source1.createWeight(context, searcher); source2.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineFunction.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues vals1 = p1.getValues(context, readerContext); final FunctionValues vals2 = p2.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return distance(doc, vals1, vals2); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append(name()).append('('); sb.append(vals1.toString(doc)).append(',').append(vals2.toString(doc)); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/HaversineFunction.java
Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { p1.createWeight(context, searcher); p2.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues latVals = latSource.getValues(context, readerContext); final FunctionValues lonVals = lonSource.getValues(context, readerContext); final double latCenterRad = this.latCenter * DEGREES_TO_RADIANS; final double lonCenterRad = this.lonCenter * DEGREES_TO_RADIANS; final double latCenterRad_cos = this.latCenterRad_cos; return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { double latRad = latVals.doubleVal(doc) * DEGREES_TO_RADIANS; double lonRad = lonVals.doubleVal(doc) * DEGREES_TO_RADIANS; double diffX = latCenterRad - latRad; double diffY = lonCenterRad - lonRad; double hsinX = Math.sin(diffX * 0.5); double hsinY = Math.sin(diffY * 0.5); double h = hsinX * hsinX + (latCenterRad_cos * Math.cos(latRad) * hsinY * hsinY); return (EARTH_MEAN_DIAMETER * Math.atan2(Math.sqrt(h), Math.sqrt(1 - h))); } @Override public String toString(int doc) { return name() + '(' + latVals.toString(doc) + ',' + lonVals.toString(doc) + ',' + latCenter + ',' + lonCenter + ')'; } }; }
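Note: doubleVal() above inlines the haversine great-circle formula: h = sin²(Δlat/2) + cos(lat1)·cos(lat2)·sin²(Δlon/2) and distance = d·atan2(√h, √(1−h)), with d the mean Earth diameter. A stand-alone version (the 6371.0087714 km mean radius is an assumption matching common geo utilities, not taken from this file):

  class HaversineSketch {
    static final double DEGREES_TO_RADIANS = Math.PI / 180;
    static final double EARTH_MEAN_DIAMETER = 2 * 6371.0087714; // km; assumed mean radius * 2

    static double distanceKm(double lat1Deg, double lon1Deg, double lat2Deg, double lon2Deg) {
      double lat1 = lat1Deg * DEGREES_TO_RADIANS, lon1 = lon1Deg * DEGREES_TO_RADIANS;
      double lat2 = lat2Deg * DEGREES_TO_RADIANS, lon2 = lon2Deg * DEGREES_TO_RADIANS;
      double hsinX = Math.sin((lat1 - lat2) * 0.5);
      double hsinY = Math.sin((lon1 - lon2) * 0.5);
      double h = hsinX * hsinX + Math.cos(lat1) * Math.cos(lat2) * hsinY * hsinY;
      return EARTH_MEAN_DIAMETER * Math.atan2(Math.sqrt(h), Math.sqrt(1 - h));
    }
  }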
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { latSource.createWeight(context, searcher); lonSource.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final int off = readerContext.docBase; IndexReaderContext topLevelContext = ReaderUtil.getTopLevelContext(readerContext); final float[] arr = getCachedFloats(topLevelContext.reader()); return new FloatDocValues(this) { @Override public float floatVal(int doc) { return arr[doc + off]; } @Override public Object objectVal(int doc) { return floatVal(doc); // TODO: keep track of missing values } }; }
// in core/src/java/org/apache/solr/search/function/ValueSourceRangeFilter.java
Override public DocIdSet getDocIdSet(final Map context, final AtomicReaderContext readerContext, Bits acceptDocs) throws IOException { return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return valueSource.getValues(context, readerContext).getRangeScorer(readerContext.reader(), lowerVal, upperVal, includeLower, includeUpper); } @Override public Bits bits() throws IOException { return null; // don't use random access } }, acceptDocs); }
// in core/src/java/org/apache/solr/search/function/ValueSourceRangeFilter.java
Override public DocIdSetIterator iterator() throws IOException { return valueSource.getValues(context, readerContext).getRangeScorer(readerContext.reader(), lowerVal, upperVal, includeLower, includeUpper); }
// in core/src/java/org/apache/solr/search/function/ValueSourceRangeFilter.java
Override public Bits bits() throws IOException { return null; // don't use random access }
// in core/src/java/org/apache/solr/search/function/ValueSourceRangeFilter.java
Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { valueSource.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
Override public Filter getTopFilter() { return new Filter() { int lastEndIdx = 0; @Override public DocIdSet getDocIdSet(final AtomicReaderContext context, final Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. int sidx = Math.max(0,lastEndIdx); if (sidx > 0 && docs[sidx-1] >= base) { // oops, the lastEndIdx isn't correct... we must have been used // in a multi-threaded context, or the indexreaders are being // used out-of-order. start at 0. sidx = 0; } if (sidx < docs.length && docs[sidx] < base) { // if docs[sidx] is < base, we need to seek to find the real start. sidx = findIndex(docs, base, sidx, docs.length-1); } final int startIdx = sidx; // Largest possible end index is limited to the start index // plus the number of docs contained in the segment. Subtract 1 since // the end index is inclusive. int eidx = Math.min(docs.length, startIdx + maxDoc) - 1; // find the real end eidx = findIndex(docs, max, startIdx, eidx) - 1; final int endIdx = eidx; lastEndIdx = endIdx; return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int idx = startIdx; int adjustedDoc = -1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { return adjustedDoc = (idx > endIdx) ? NO_MORE_DOCS : (docs[idx++] - base); } @Override public int advance(int target) throws IOException { if (idx > endIdx || target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; target += base; // probe next int rawDoc = docs[idx++]; if (rawDoc >= target) return adjustedDoc=rawDoc-base; int high = endIdx; // TODO: probe more before resorting to binary search? // binary search while (idx <= high) { int mid = (idx+high) >>> 1; rawDoc = docs[mid]; if (rawDoc < target) { idx = mid+1; } else if (rawDoc > target) { high = mid-1; } else { idx=mid+1; return adjustedDoc=rawDoc - base; } } // low is on the insertion point... if (idx <= endIdx) { return adjustedDoc = docs[idx++] - base; } else { return adjustedDoc=NO_MORE_DOCS; } } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { // random access is expensive for this set return null; } }, acceptDocs2); } }; }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
Override public DocIdSet getDocIdSet(final AtomicReaderContext context, final Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. int sidx = Math.max(0,lastEndIdx); if (sidx > 0 && docs[sidx-1] >= base) { // oops, the lastEndIdx isn't correct... we must have been used // in a multi-threaded context, or the indexreaders are being // used out-of-order. start at 0. sidx = 0; } if (sidx < docs.length && docs[sidx] < base) { // if docs[sidx] is < base, we need to seek to find the real start. sidx = findIndex(docs, base, sidx, docs.length-1); } final int startIdx = sidx; // Largest possible end index is limited to the start index // plus the number of docs contained in the segment. Subtract 1 since // the end index is inclusive. int eidx = Math.min(docs.length, startIdx + maxDoc) - 1; // find the real end eidx = findIndex(docs, max, startIdx, eidx) - 1; final int endIdx = eidx; lastEndIdx = endIdx; return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int idx = startIdx; int adjustedDoc = -1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { return adjustedDoc = (idx > endIdx) ? NO_MORE_DOCS : (docs[idx++] - base); } @Override public int advance(int target) throws IOException { if (idx > endIdx || target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; target += base; // probe next int rawDoc = docs[idx++]; if (rawDoc >= target) return adjustedDoc=rawDoc-base; int high = endIdx; // TODO: probe more before resorting to binary search? // binary search while (idx <= high) { int mid = (idx+high) >>> 1; rawDoc = docs[mid]; if (rawDoc < target) { idx = mid+1; } else if (rawDoc > target) { high = mid-1; } else { idx=mid+1; return adjustedDoc=rawDoc - base; } } // low is on the insertion point... if (idx <= endIdx) { return adjustedDoc = docs[idx++] - base; } else { return adjustedDoc=NO_MORE_DOCS; } } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { // random access is expensive for this set return null; } }, acceptDocs2); }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int idx = startIdx; int adjustedDoc = -1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { return adjustedDoc = (idx > endIdx) ? NO_MORE_DOCS : (docs[idx++] - base); } @Override public int advance(int target) throws IOException { if (idx > endIdx || target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; target += base; // probe next int rawDoc = docs[idx++]; if (rawDoc >= target) return adjustedDoc=rawDoc-base; int high = endIdx; // TODO: probe more before resorting to binary search? // binary search while (idx <= high) { int mid = (idx+high) >>> 1; rawDoc = docs[mid]; if (rawDoc < target) { idx = mid+1; } else if (rawDoc > target) { high = mid-1; } else { idx=mid+1; return adjustedDoc=rawDoc - base; } } // low is on the insertion point... if (idx <= endIdx) { return adjustedDoc = docs[idx++] - base; } else { return adjustedDoc=NO_MORE_DOCS; } } }; }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
Override public int nextDoc() throws IOException { return adjustedDoc = (idx > endIdx) ? NO_MORE_DOCS : (docs[idx++] - base); }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
Override public int advance(int target) throws IOException { if (idx > endIdx || target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; target += base; // probe next int rawDoc = docs[idx++]; if (rawDoc >= target) return adjustedDoc=rawDoc-base; int high = endIdx; // TODO: probe more before resorting to binary search? // binary search while (idx <= high) { int mid = (idx+high) >>> 1; rawDoc = docs[mid]; if (rawDoc < target) { idx = mid+1; } else if (rawDoc > target) { high = mid-1; } else { idx=mid+1; return adjustedDoc=rawDoc - base; } } // low is on the insertion point... if (idx <= endIdx) { return adjustedDoc = docs[idx++] - base; } else { return adjustedDoc=NO_MORE_DOCS; } }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
Override public Bits bits() throws IOException { // random access is expensive for this set return null; }
// in core/src/java/org/apache/solr/search/LuceneQueryOptimizer.java
public TopDocs optimize(BooleanQuery original, SolrIndexSearcher searcher, int numHits, Query[] queryOut, Filter[] filterOut ) throws IOException { BooleanQuery query = new BooleanQuery(); BooleanQuery filterQuery = null; for (BooleanClause c : original.clauses()) { /*** System.out.println("required="+c.required); System.out.println("boost="+c.query.getBoost()); System.out.println("isTermQuery="+(c.query instanceof TermQuery)); if (c.query instanceof TermQuery) { System.out.println("term="+((TermQuery)c.query).getTerm()); System.out.println("docFreq="+searcher.docFreq(((TermQuery)c.query).getTerm())); } ***/ Query q = c.getQuery(); if (c.isRequired() // required && q.getBoost() == 0.0f // boost is zero && q instanceof TermQuery // TermQuery && (searcher.docFreq(((TermQuery)q).getTerm()) / (float)searcher.maxDoc()) >= threshold) { // check threshold if (filterQuery == null) filterQuery = new BooleanQuery(); filterQuery.add(q, BooleanClause.Occur.MUST); // filter it //System.out.println("WooHoo... qualified to be hoisted to a filter!"); } else { query.add(c); // query it } } Filter filter = null; if (filterQuery != null) { synchronized (cache) { // check cache filter = (Filter)cache.get(filterQuery); } if (filter == null) { // miss filter = new CachingWrapperFilter(new QueryWrapperFilter(filterQuery)); // construct new entry synchronized (cache) { cache.put(filterQuery, filter); // cache it } } } // YCS: added code to pass out optimized query and filter // so they can be used with Hits if (queryOut != null && filterOut != null) { queryOut[0] = query; filterOut[0] = filter; return null; } else { return searcher.search(query, filter, numHits); } }
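Note: the filter cache in optimize() is locked only around the get and the put; construction happens outside the lock, so two threads may occasionally build the same filter, accepted as cheap duplicated work. A condensed sketch of that check-miss-construct idiom (names hypothetical; buildFilter stands in for CachingWrapperFilter(new QueryWrapperFilter(...))):

  import java.util.HashMap;
  import java.util.Map;

  class FilterCacheSketch {
    private final Map<Object, Object> cache = new HashMap<Object, Object>();

    Object getOrBuild(Object key) {
      Object filter;
      synchronized (cache) {            // short critical section: lookup only
        filter = cache.get(key);
      }
      if (filter == null) {
        filter = buildFilter(key);      // potentially expensive, done unlocked
        synchronized (cache) {
          cache.put(key, filter);       // last writer wins; duplicates are harmless
        }
      }
      return filter;
    }

    private Object buildFilter(Object key) {
      return new Object();              // stand-in for the wrapped filter
    }
  }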
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
Override public FieldComparator newComparator(String fieldname, int numHits, int sortPos, boolean reversed) throws IOException { return new TermOrdValComparator_SML(numHits, fieldname, sortPos, reversed, missingValueProxy); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { return TermOrdValComparator_SML.createComparator(context.reader(), this); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { return TermOrdValComparator_SML.createComparator(context.reader(), parent); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
public static FieldComparator createComparator(AtomicReader reader, TermOrdValComparator_SML parent) throws IOException { parent.termsIndex = FieldCache.DEFAULT.getTermsIndex(reader, parent.field); final PackedInts.Reader docToOrd = parent.termsIndex.getDocToOrd(); PerSegmentComparator perSegComp = null; if (docToOrd.hasArray()) { final Object arr = docToOrd.getArray(); if (arr instanceof byte[]) { perSegComp = new ByteOrdComparator((byte[]) arr, parent); } else if (arr instanceof short[]) { perSegComp = new ShortOrdComparator((short[]) arr, parent); } else if (arr instanceof int[]) { perSegComp = new IntOrdComparator((int[]) arr, parent); } } if (perSegComp == null) { perSegComp = new AnyOrdComparator(docToOrd, parent); } if (perSegComp.bottomSlot != -1) { perSegComp.setBottom(perSegComp.bottomSlot); } parent.current = perSegComp; return perSegComp; }
// in core/src/java/org/apache/solr/search/WrappedQuery.java
Override public Weight createWeight(IndexSearcher searcher) throws IOException { return q.createWeight(searcher); }
// in core/src/java/org/apache/solr/search/WrappedQuery.java
Override public Query rewrite(IndexReader reader) throws IOException { // currently no need to continue wrapping at this point. return q.rewrite(reader); }
// in core/src/java/org/apache/solr/search/DocSetBase.java
public Filter getTopFilter() { final OpenBitSet bs = getBits(); return new Filter() { @Override public DocIdSet getDocIdSet(final AtomicReaderContext context, Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); if (context.isTopLevel) { return BitsFilteredDocIdSet.wrap(bs, acceptDocs); } final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { // sparse filters should not use random access return null; } }, acceptDocs2); } }; }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public DocIdSet getDocIdSet(final AtomicReaderContext context, Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); if (context.isTopLevel) { return BitsFilteredDocIdSet.wrap(bs, acceptDocs); } final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { // sparse filters should not use random access return null; } }, acceptDocs2); }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public Bits bits() throws IOException { // sparse filters should not use random access return null; }
// in core/src/java/org/apache/solr/search/DocSetCollector.java
Override public void collect(int doc) throws IOException { doc += base; // optimistically collect the first docs in an array // in case the total number will be small enough to represent // as a small set like SortedIntDocSet instead... // Storing in this array will be quicker to convert // than scanning through a potentially huge bit vector. // FUTURE: when search methods all start returning docs in order, maybe // we could have a ListDocSet() and use the collected array directly. if (pos < scratch.length) { scratch[pos]=doc; } else { // this conditional could be removed if BitSet was preallocated, but that // would take up more memory, and add more GC time... if (bits==null) bits = new OpenBitSet(maxDoc); bits.fastSet(doc); } pos++; }
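Note: collect() above buffers the first hits in a pre-sized int array (sized maxDoc >> 6 by the callers in CommandHandler) and only falls back to a lazily allocated OpenBitSet once the array overflows, so small results convert cheaply to a SortedIntDocSet. A condensed sketch of the two-tier strategy (class name hypothetical):

  import org.apache.lucene.util.OpenBitSet;

  class TwoTierCollectSketch {
    private final int[] scratch;
    private OpenBitSet bits;
    private int pos;
    private final int maxDoc;

    TwoTierCollectSketch(int smallSetSize, int maxDoc) {
      this.scratch = new int[smallSetSize];
      this.maxDoc = maxDoc;
    }

    void collect(int doc) {
      if (pos < scratch.length) {
        scratch[pos] = doc;                              // fast path: small-set buffer
      } else {
        if (bits == null) bits = new OpenBitSet(maxDoc); // lazy bit-set allocation
        bits.fastSet(doc);
      }
      pos++;
    }
  }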
// in core/src/java/org/apache/solr/search/DocSetCollector.java
Override public void setScorer(Scorer scorer) throws IOException { }
// in core/src/java/org/apache/solr/search/DocSetCollector.java
Override public void setNextReader(AtomicReaderContext context) throws IOException { this.base = context.docBase; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { return new LongDocValues(this) { @Override public float floatVal(int doc) { return fv; } @Override public int intVal(int doc) { return (int) constant; } @Override public long longVal(int doc) { return constant; } @Override public double doubleVal(int doc) { return dv; } @Override public String toString(int doc) { return description(); } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues vals = source.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return func(doc, vals); } @Override public String toString(int doc) { return name() + '(' + vals.toString(doc) + ')'; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues aVals = a.getValues(context, readerContext); final FunctionValues bVals = b.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return func(doc, aVals, bVals); } @Override public String toString(int doc) { return name() + '(' + aVals.toString(doc) + ',' + bVals.toString(doc) + ')'; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { return new BoolDocValues(this) { @Override public boolean boolVal(int doc) { return constant; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { if (context.get(this) == null) { SolrRequestInfo requestInfo = SolrRequestInfo.getRequestInfo(); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "testfunc: unweighted value source detected. delegate="+source + " request=" + (requestInfo==null ? "null" : requestInfo.getReq())); } return source.getValues(context, readerContext); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { context.put(this, this); }
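Note: the testfunc pair above encodes a contract check: createWeight() stores a marker in the shared context map, and getValues() throws SolrException(BAD_REQUEST) when the marker is missing, failing fast on unweighted value sources. A distilled sketch (IllegalStateException stands in for SolrException; names hypothetical):

  import java.util.HashMap;
  import java.util.Map;

  class WeightContractSketch {
    private final Map<Object, Object> context = new HashMap<Object, Object>();

    void createWeight() {
      context.put(this, this); // mark this source as weighted
    }

    void getValues() {
      if (context.get(this) == null) {
        throw new IllegalStateException("unweighted value source detected");
      }
      // ... produce values as usual
    }
  }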
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public SortField getSortField(boolean reverse) throws IOException { return super.getSortField(reverse); }
// in core/src/java/org/apache/solr/search/SolrFilter.java
@Override public DocIdSet getDocIdSet(AtomicReaderContext context, Bits acceptDocs) throws IOException { return getDocIdSet(null, context, acceptDocs); }
// in core/src/java/org/apache/solr/search/FastLRUCache.java
public void warm(SolrIndexSearcher searcher, SolrCache old) throws IOException { if (regenerator == null) return; long warmingStartTime = System.currentTimeMillis(); FastLRUCache other = (FastLRUCache) old; // warm entries if (isAutowarmingOn()) { int sz = autowarm.getWarmCount(other.size()); Map items = other.cache.getLatestAccessedItems(sz); Map.Entry[] itemsArr = new Map.Entry[items.size()]; int counter = 0; for (Object mapEntry : items.entrySet()) { itemsArr[counter++] = (Map.Entry) mapEntry; } for (int i = itemsArr.length - 1; i >= 0; i--) { try { boolean continueRegen = regenerator.regenerateItem(searcher, this, old, itemsArr[i].getKey(), itemsArr[i].getValue()); if (!continueRegen) break; } catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); } } } warmupTime = System.currentTimeMillis() - warmingStartTime; }
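warm() above is a good example of failure isolation: each cache entry is regenerated inside its own catch (Throwable), so one bad key is logged via SolrException.log and skipped instead of aborting the whole warm-up. A generic sketch of the pattern, assuming a hypothetical Regenerator interface and plain java.util.logging:

    import java.util.List;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    class Warmer {
      interface Regenerator { boolean regenerate(Object key) throws Exception; } // hypothetical
      private static final Logger log = Logger.getLogger(Warmer.class.getName());

      void warm(List<Object> keys, Regenerator regen) {
        for (Object key : keys) {
          try {
            if (!regen.regenerate(key)) break; // the regenerator may ask to stop early
          } catch (Throwable t) {
            // isolate the failure: log it and continue with the next entry
            log.log(Level.SEVERE, "Error during auto-warming of key:" + key, t);
          }
        }
      }
    }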
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected Highlighter getPhraseHighlighter(Query query, String fieldName, SolrQueryRequest request, CachingTokenFilter tokenStream) throws IOException { SolrParams params = request.getParams(); Highlighter highlighter = null; highlighter = new Highlighter( getFormatter(fieldName, params), getEncoder(fieldName, params), getSpanQueryScorer(query, fieldName, tokenStream, request)); highlighter.setTextFragmenter(getFragmenter(fieldName, params)); return highlighter; }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private QueryScorer getSpanQueryScorer(Query query, String fieldName, TokenStream tokenStream, SolrQueryRequest request) throws IOException { boolean reqFieldMatch = request.getParams().getFieldBool(fieldName, HighlightParams.FIELD_MATCH, false); Boolean highlightMultiTerm = request.getParams().getBool(HighlightParams.HIGHLIGHT_MULTI_TERM, true); if(highlightMultiTerm == null) { highlightMultiTerm = false; } QueryScorer scorer; if (reqFieldMatch) { scorer = new QueryScorer(query, fieldName); } else { scorer = new QueryScorer(query, null); } scorer.setExpandMultiTermQuery(highlightMultiTerm); return scorer; }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private void doHighlightingByHighlighter( Query query, SolrQueryRequest req, NamedList docSummaries, int docId, Document doc, String fieldName ) throws IOException { final SolrIndexSearcher searcher = req.getSearcher(); final IndexSchema schema = searcher.getSchema(); // TODO: Currently in trunk highlighting numeric fields is broken (Lucene) - // so we disable them until fixed (see LUCENE-3080)! // BEGIN: Hack final SchemaField schemaField = schema.getFieldOrNull(fieldName); if (schemaField != null && ( (schemaField.getType() instanceof org.apache.solr.schema.TrieField) || (schemaField.getType() instanceof org.apache.solr.schema.TrieDateField) )) return; // END: Hack SolrParams params = req.getParams(); IndexableField[] docFields = doc.getFields(fieldName); List<String> listFields = new ArrayList<String>(); for (IndexableField field : docFields) { listFields.add(field.stringValue()); } String[] docTexts = (String[]) listFields.toArray(new String[listFields.size()]); // according to Document javadoc, doc.getValues() never returns null. check empty instead of null if (docTexts.length == 0) return; TokenStream tstream = null; int numFragments = getMaxSnippets(fieldName, params); boolean mergeContiguousFragments = isMergeContiguousFragments(fieldName, params); String[] summaries = null; List<TextFragment> frags = new ArrayList<TextFragment>(); TermOffsetsTokenStream tots = null; // to be non-null iff we're using TermOffsets optimization try { TokenStream tvStream = TokenSources.getTokenStream(searcher.getIndexReader(), docId, fieldName); if (tvStream != null) { tots = new TermOffsetsTokenStream(tvStream); } } catch (IllegalArgumentException e) { // No problem. But we can't use TermOffsets optimization. } for (int j = 0; j < docTexts.length; j++) { if( tots != null ) { // if we're using TermOffsets optimization, then get the next // field value's TokenStream (i.e. get field j's TokenStream) from tots: tstream = tots.getMultiValuedTokenStream( docTexts[j].length() ); } else { // fall back to analyzer tstream = createAnalyzerTStream(schema, fieldName, docTexts[j]); } int maxCharsToAnalyze = params.getFieldInt(fieldName, HighlightParams.MAX_CHARS, Highlighter.DEFAULT_MAX_CHARS_TO_ANALYZE); Highlighter highlighter; if (Boolean.valueOf(req.getParams().get(HighlightParams.USE_PHRASE_HIGHLIGHTER, "true"))) { if (maxCharsToAnalyze < 0) { tstream = new CachingTokenFilter(tstream); } else { tstream = new CachingTokenFilter(new OffsetLimitTokenFilter(tstream, maxCharsToAnalyze)); } // get highlighter highlighter = getPhraseHighlighter(query, fieldName, req, (CachingTokenFilter) tstream); // after highlighter initialization, reset tstream since construction of highlighter already used it tstream.reset(); } else { // use "the old way" highlighter = getHighlighter(query, fieldName, req); } if (maxCharsToAnalyze < 0) { highlighter.setMaxDocCharsToAnalyze(docTexts[j].length()); } else { highlighter.setMaxDocCharsToAnalyze(maxCharsToAnalyze); } try { TextFragment[] bestTextFragments = highlighter.getBestTextFragments(tstream, docTexts[j], mergeContiguousFragments, numFragments); for (int k = 0; k < bestTextFragments.length; k++) { if ((bestTextFragments[k] != null) && (bestTextFragments[k].getScore() > 0)) { frags.add(bestTextFragments[k]); } } } catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } // sort such that the fragments with the highest score come first Collections.sort(frags, new Comparator<TextFragment>() { public int compare(TextFragment arg0, TextFragment arg1) { return Math.round(arg1.getScore() - arg0.getScore()); } }); // convert fragments back into text // TODO: we can include score and position information in output as snippet attributes if (frags.size() > 0) { ArrayList<String> fragTexts = new ArrayList<String>(); for (TextFragment fragment: frags) { if ((fragment != null) && (fragment.getScore() > 0)) { fragTexts.add(fragment.toString()); } if (fragTexts.size() >= numFragments) break; } summaries = fragTexts.toArray(new String[0]); if (summaries.length > 0) docSummaries.add(fieldName, summaries); } // no summaries made, copy text from alternate field if (summaries == null || summaries.length == 0) { alternateField( docSummaries, params, doc, fieldName ); } }
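Note the two boundary translations in doHighlightingByHighlighter: the IllegalArgumentException from TokenSources is swallowed (the TermOffsets optimization is simply skipped), while the checked InvalidTokenOffsetsException is wrapped into the unchecked SolrException with SERVER_ERROR. The wrap-and-rethrow half in isolation, with a stand-in checked exception instead of the Lucene type:

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    class HighlightBoundary {
      static class TokenOffsetsException extends Exception {} // stand-in for the library's checked type

      String highlight(String text) {
        try {
          return doHighlight(text);
        } catch (TokenOffsetsException e) {
          // convert the checked library exception into the application's runtime type;
          // callers up the stack then only have to deal with SolrException
          throw new SolrException(ErrorCode.SERVER_ERROR, e);
        }
      }

      private String doHighlight(String text) throws TokenOffsetsException { return text; }
    }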
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private void doHighlightingByFastVectorHighlighter( FastVectorHighlighter highlighter, FieldQuery fieldQuery, SolrQueryRequest req, NamedList docSummaries, int docId, Document doc, String fieldName ) throws IOException { SolrParams params = req.getParams(); SolrFragmentsBuilder solrFb = getSolrFragmentsBuilder( fieldName, params ); String[] snippets = highlighter.getBestFragments( fieldQuery, req.getSearcher().getIndexReader(), docId, fieldName, params.getFieldInt( fieldName, HighlightParams.FRAGSIZE, 100 ), params.getFieldInt( fieldName, HighlightParams.SNIPPETS, 1 ), getFragListBuilder( fieldName, params ), getFragmentsBuilder( fieldName, params ), solrFb.getPreTags( params, fieldName ), solrFb.getPostTags( params, fieldName ), getEncoder( fieldName, params ) ); if( snippets != null && snippets.length > 0 ) docSummaries.add( fieldName, snippets ); else alternateField( docSummaries, params, doc, fieldName ); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private TokenStream createAnalyzerTStream(IndexSchema schema, String fieldName, String docText) throws IOException { TokenStream tstream; TokenStream ts = schema.getAnalyzer().tokenStream(fieldName, new StringReader(docText)); ts.reset(); tstream = new TokenOrderingFilter(ts, 10); return tstream; }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
@Override public boolean incrementToken() throws IOException { while (!done && queue.size() < windowSize) { if (!input.incrementToken()) { done = true; break; } // reverse iterating for better efficiency since we know the // list is already sorted, and most token start offsets will be too. ListIterator<OrderedToken> iter = queue.listIterator(queue.size()); while(iter.hasPrevious()) { if (offsetAtt.startOffset() >= iter.previous().startOffset) { // insertion will be before what next() would return (what // we just compared against), so move back one so the insertion // will be after. iter.next(); break; } } OrderedToken ot = new OrderedToken(); ot.state = captureState(); ot.startOffset = offsetAtt.startOffset(); iter.add(ot); } if (queue.isEmpty()) { return false; } else { restoreState(queue.removeFirst().state); return true; } }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
@Override public boolean incrementToken() throws IOException { while( true ){ if( bufferedToken == null ) { if (!bufferedTokenStream.incrementToken()) return false; bufferedToken = bufferedTokenStream.captureState(); bufferedStartOffset = bufferedOffsetAtt.startOffset(); bufferedEndOffset = bufferedOffsetAtt.endOffset(); } if( startOffset <= bufferedStartOffset && bufferedEndOffset <= endOffset ){ restoreState(bufferedToken); bufferedToken = null; offsetAtt.setOffset( offsetAtt.startOffset() - startOffset, offsetAtt.endOffset() - startOffset ); return true; } else if( bufferedEndOffset > endOffset ){ startOffset += length + 1; return false; } bufferedToken = null; } }
// in core/src/java/org/apache/solr/spelling/SpellingQueryConverter.java
protected void analyze(Collection<Token> result, Reader text, int offset) throws IOException { TokenStream stream = analyzer.tokenStream("", text); // TODO: support custom attributes CharTermAttribute termAtt = stream.addAttribute(CharTermAttribute.class); FlagsAttribute flagsAtt = stream.addAttribute(FlagsAttribute.class); TypeAttribute typeAtt = stream.addAttribute(TypeAttribute.class); PayloadAttribute payloadAtt = stream.addAttribute(PayloadAttribute.class); PositionIncrementAttribute posIncAtt = stream.addAttribute(PositionIncrementAttribute.class); OffsetAttribute offsetAtt = stream.addAttribute(OffsetAttribute.class); stream.reset(); while (stream.incrementToken()) { Token token = new Token(); token.copyBuffer(termAtt.buffer(), 0, termAtt.length()); token.setStartOffset(offset + offsetAtt.startOffset()); token.setEndOffset(offset + offsetAtt.endOffset()); token.setFlags(flagsAtt.getFlags()); token.setType(typeAtt.type()); token.setPayload(payloadAtt.getPayload()); token.setPositionIncrement(posIncAtt.getPositionIncrement()); result.add(token); } stream.end(); stream.close(); }
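analyze() consumes the TokenStream with the usual reset()/incrementToken()/end()/close() sequence, but end() and close() are not in a finally block, so an exception thrown mid-loop would leak the stream. A hedged variant of the same contract that guarantees cleanup (assuming Lucene 4.x signatures):

    import java.io.IOException;
    import java.io.Reader;
    import org.apache.lucene.analysis.Analyzer;
    import org.apache.lucene.analysis.TokenStream;
    import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

    class StreamConsumer {
      void consume(Analyzer analyzer, Reader text) throws IOException {
        TokenStream stream = analyzer.tokenStream("", text);
        CharTermAttribute termAtt = stream.addAttribute(CharTermAttribute.class);
        try {
          stream.reset();
          while (stream.incrementToken()) {
            // process termAtt.buffer() / termAtt.length() here
          }
          stream.end(); // records the final offset state
        } finally {
          stream.close(); // runs even if incrementToken() throws
        }
      }
    }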
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
@Override public SpellingResult getSuggestions(SpellingOptions options) throws IOException { SpellingResult result = new SpellingResult(options.tokens); IndexReader reader = determineReader(options.reader); Term term = field != null ? new Term(field, "") : null; float theAccuracy = (options.accuracy == Float.MIN_VALUE) ? spellChecker.getAccuracy() : options.accuracy; int count = Math.max(options.count, AbstractLuceneSpellChecker.DEFAULT_SUGGESTION_COUNT); for (Token token : options.tokens) { String tokenText = new String(token.buffer(), 0, token.length()); term = new Term(field, tokenText); int docFreq = 0; if (reader != null) { docFreq = reader.docFreq(term); } String[] suggestions = spellChecker.suggestSimilar(tokenText, ((options.alternativeTermCount == null || docFreq == 0) ? count : options.alternativeTermCount), field != null ? reader : null, // workaround LUCENE-1295 field, options.suggestMode, theAccuracy); if (suggestions.length == 1 && suggestions[0].equals(tokenText) && options.alternativeTermCount == null) { // These are spelled the same, continue on continue; } // If considering alternatives to "correctly-spelled" terms, then add the // original as a viable suggestion. if (options.alternativeTermCount != null && docFreq > 0) { boolean foundOriginal = false; String[] suggestionsWithOrig = new String[suggestions.length + 1]; for (int i = 0; i < suggestions.length; i++) { if (suggestions[i].equals(tokenText)) { foundOriginal = true; break; } suggestionsWithOrig[i + 1] = suggestions[i]; } if (!foundOriginal) { suggestionsWithOrig[0] = tokenText; suggestions = suggestionsWithOrig; } } if (options.extendedResults == true && reader != null && field != null) { result.addFrequency(token, docFreq); int countLimit = Math.min(options.count, suggestions.length); if(countLimit>0) { for (int i = 0; i < countLimit; i++) { term = new Term(field, suggestions[i]); result.add(token, suggestions[i], reader.docFreq(term)); } } else { List<String> suggList = Collections.emptyList(); result.add(token, suggList); } } else { if (suggestions.length > 0) { List<String> suggList = Arrays.asList(suggestions); if (suggestions.length > options.count) { suggList = suggList.subList(0, options.count); } result.add(token, suggList); } else { List<String> suggList = Collections.emptyList(); result.add(token, suggList); } } } return result; }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
@Override public void reload(SolrCore core, SolrIndexSearcher searcher) throws IOException { spellChecker.setSpellIndex(index); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
protected void initIndex() throws IOException { if (indexDir != null) { index = FSDirectory.open(new File(indexDir)); } else { index = new RAMDirectory(); } }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
@Override public void reload(SolrCore core, SolrIndexSearcher searcher) throws IOException { LOG.info("reload()"); if (dictionary == null && storeDir != null) { // this may be a firstSearcher event, try loading it if (lookup.load(new FileInputStream(new File(storeDir, factory.storeFileName())))) { return; // loaded ok } LOG.debug("load failed, need to build Lookup again"); } // loading was unsuccessful - build it again build(core, searcher); }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
@Override public SpellingResult getSuggestions(SpellingOptions options) throws IOException { LOG.debug("getSuggestions: " + options.tokens); if (lookup == null) { LOG.info("Lookup is null - invoke spellchecker.build first"); return EMPTY_RESULT; } SpellingResult res = new SpellingResult(); CharsRef scratch = new CharsRef(); for (Token t : options.tokens) { scratch.chars = t.buffer(); scratch.offset = 0; scratch.length = t.length(); List<LookupResult> suggestions = lookup.lookup(scratch, (options.suggestMode == SuggestMode.SUGGEST_MORE_POPULAR), options.count); if (suggestions == null) { continue; } if (options.suggestMode != SuggestMode.SUGGEST_MORE_POPULAR) { Collections.sort(suggestions); } for (LookupResult lr : suggestions) { res.add(t, lr.key.toString(), (int)lr.value); } } return res; }
// in core/src/java/org/apache/solr/spelling/DirectSolrSpellChecker.java
@Override public void reload(SolrCore core, SolrIndexSearcher searcher) throws IOException {}
// in core/src/java/org/apache/solr/spelling/DirectSolrSpellChecker.java
@Override public SpellingResult getSuggestions(SpellingOptions options) throws IOException { LOG.debug("getSuggestions: " + options.tokens); SpellingResult result = new SpellingResult(); float accuracy = (options.accuracy == Float.MIN_VALUE) ? checker.getAccuracy() : options.accuracy; for (Token token : options.tokens) { String tokenText = token.toString(); Term term = new Term(field, tokenText); int freq = options.reader.docFreq(term); int count = (options.alternativeTermCount != null && freq > 0) ? options.alternativeTermCount: options.count; SuggestWord[] suggestions = checker.suggestSimilar(term, count,options.reader, options.suggestMode, accuracy); result.addFrequency(token, freq); // If considering alternatives to "correctly-spelled" terms, then add the // original as a viable suggestion. if (options.alternativeTermCount != null && freq > 0) { boolean foundOriginal = false; SuggestWord[] suggestionsWithOrig = new SuggestWord[suggestions.length + 1]; for (int i = 0; i < suggestions.length; i++) { if (suggestions[i].string.equals(tokenText)) { foundOriginal = true; break; } suggestionsWithOrig[i + 1] = suggestions[i]; } if (!foundOriginal) { SuggestWord orig = new SuggestWord(); orig.freq = freq; orig.string = tokenText; suggestionsWithOrig[0] = orig; suggestions = suggestionsWithOrig; } } if(suggestions.length==0 && freq==0) { List<String> empty = Collections.emptyList(); result.add(token, empty); } else { for (SuggestWord suggestion : suggestions) { result.add(token, suggestion.string, suggestion.freq); } } } return result; }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
@Override public void reload(SolrCore core, SolrIndexSearcher searcher) throws IOException { super.reload(core, searcher); //reload the source initSourceReader(); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private boolean syncWithReplicas(ZkController zkController, SolrCore core, ZkNodeProps props, String collection, String shardId) throws MalformedURLException, SolrServerException, IOException { List<ZkCoreNodeProps> nodes = zkController.getZkStateReader() .getReplicaProps(collection, shardId, props.get(ZkStateReader.NODE_NAME_PROP), props.get(ZkStateReader.CORE_NAME_PROP), ZkStateReader.ACTIVE); // TODO: should there be a state filter? if (nodes == null) { // I have no replicas return true; } List<String> syncWith = new ArrayList<String>(); for (ZkCoreNodeProps node : nodes) { // if we see a leader, must be stale state, and this is the guy that went down if (!node.getNodeProps().keySet().contains(ZkStateReader.LEADER_PROP)) { syncWith.add(node.getCoreUrl()); } } PeerSync peerSync = new PeerSync(core, syncWith, core.getUpdateHandler().getUpdateLog().numRecordsToKeep); return peerSync.sync(); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private void syncToMe(ZkController zkController, String collection, String shardId, ZkNodeProps leaderProps) throws MalformedURLException, SolrServerException, IOException { // sync everyone else // TODO: we should do this in parallel at least List<ZkCoreNodeProps> nodes = zkController .getZkStateReader() .getReplicaProps(collection, shardId, leaderProps.get(ZkStateReader.NODE_NAME_PROP), leaderProps.get(ZkStateReader.CORE_NAME_PROP), ZkStateReader.ACTIVE); if (nodes == null) { // System.out.println("I have no replicas"); // I have no replicas return; } //System.out.println("tell my replicas to sync"); ZkCoreNodeProps zkLeader = new ZkCoreNodeProps(leaderProps); for (ZkCoreNodeProps node : nodes) { try { // System.out // .println("try and ask " + node.getCoreUrl() + " to sync"); log.info("try and ask " + node.getCoreUrl() + " to sync"); requestSync(zkLeader.getCoreUrl(), node.getCoreName()); } catch (Exception e) { SolrException.log(log, "Error syncing replica to leader", e); } } for(;;) { ShardResponse srsp = shardHandler.takeCompletedOrError(); if (srsp == null) break; boolean success = handleResponse(srsp); //System.out.println("got response:" + success); if (!success) { try { log.info("Sync failed - asking replica to recover."); //System.out.println("Sync failed - asking replica to recover."); RequestRecovery recoverRequestCmd = new RequestRecovery(); recoverRequestCmd.setAction(CoreAdminAction.REQUESTRECOVERY); recoverRequestCmd.setCoreName(((SyncShardRequest)srsp.getShardRequest()).coreName); HttpSolrServer server = new HttpSolrServer(zkLeader.getBaseUrl()); server.request(recoverRequestCmd); } catch (Exception e) { log.info("Could not tell a replica to recover", e); } shardHandler.cancelAll(); break; } } }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { try { zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } }
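This runLeaderProcess override reads NodeExistsException as a stale ephemeral left behind by a dead leader session: it deletes the node and retries the create once. A generic sketch of the same delete-and-retry pattern against the plain ZooKeeper client (path and payload are placeholders):

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    class LeaderNode {
      void claim(ZooKeeper zk, String path, byte[] props)
          throws KeeperException, InterruptedException {
        try {
          zk.create(path, props, ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.EPHEMERAL);
        } catch (KeeperException.NodeExistsException e) {
          // an ephemeral from an expired session can linger briefly; replace it
          zk.delete(path, -1); // -1 matches any version
          zk.create(path, props, ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.EPHEMERAL);
        }
      }
    }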
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { if (cc != null) { String coreName = leaderProps.get(ZkStateReader.CORE_NAME_PROP); SolrCore core = null; try { // the first time we are run, we will get a startupCore - after // we will get null and must use cc.getCore core = cc.getCore(coreName); if (core == null) { cancelElection(); throw new SolrException(ErrorCode.SERVER_ERROR, "Fatal Error, SolrCore not found:" + coreName + " in " + cc.getCoreNames()); } // should I be leader? if (weAreReplacement && !shouldIBeLeader(leaderProps)) { // System.out.println("there is a better leader candidate it appears"); rejoinLeaderElection(leaderSeqPath, core); return; } if (weAreReplacement) { if (zkClient.exists(leaderPath, true)) { zkClient.delete(leaderPath, -1, true); } // System.out.println("I may be the new Leader:" + leaderPath // + " - I need to try and sync"); boolean success = syncStrategy.sync(zkController, core, leaderProps); if (!success && anyoneElseActive()) { rejoinLeaderElection(leaderSeqPath, core); return; } } // If I am going to be the leader I have to be active // System.out.println("I am leader go active"); core.getUpdateHandler().getSolrCoreState().cancelRecovery(); zkController.publish(core.getCoreDescriptor(), ZkStateReader.ACTIVE); } finally { if (core != null ) { core.close(); } } } super.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
private void rejoinLeaderElection(String leaderSeqPath, SolrCore core) throws InterruptedException, KeeperException, IOException { // remove our ephemeral and re join the election // System.out.println("sync failed, delete our election node:" // + leaderSeqPath); zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); cancelElection(); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, core.getName()); leaderElector.joinElection(this); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
@Override public void parseProperties(Properties zkProp) throws IOException, ConfigException { for (Entry<Object, Object> entry : zkProp.entrySet()) { String key = entry.getKey().toString().trim(); String value = entry.getValue().toString().trim(); if (key.equals("dataDir")) { dataDir = value; } else if (key.equals("dataLogDir")) { dataLogDir = value; } else if (key.equals("clientPort")) { setClientPort(Integer.parseInt(value)); } else if (key.equals("tickTime")) { tickTime = Integer.parseInt(value); } else if (key.equals("initLimit")) { initLimit = Integer.parseInt(value); } else if (key.equals("syncLimit")) { syncLimit = Integer.parseInt(value); } else if (key.equals("electionAlg")) { electionAlg = Integer.parseInt(value); } else if (key.equals("maxClientCnxns")) { maxClientCnxns = Integer.parseInt(value); } else if (key.startsWith("server.")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); String parts[] = value.split(":"); if ((parts.length != 2) && (parts.length != 3)) { LOG.error(value + " does not have the form host:port or host:port:port"); } InetSocketAddress addr = new InetSocketAddress(parts[0], Integer.parseInt(parts[1])); if (parts.length == 2) { servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr)); } else if (parts.length == 3) { InetSocketAddress electionAddr = new InetSocketAddress( parts[0], Integer.parseInt(parts[2])); servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr, electionAddr)); } } else if (key.startsWith("group")) { int dot = key.indexOf('.'); long gid = Long.parseLong(key.substring(dot + 1)); numGroups++; String parts[] = value.split(":"); for(String s : parts){ long sid = Long.parseLong(s); if(serverGroup.containsKey(sid)) throw new ConfigException("Server " + sid + " is in multiple groups"); else serverGroup.put(sid, gid); } } else if(key.startsWith("weight")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); serverWeight.put(sid, Long.parseLong(value)); } else { System.setProperty("zookeeper." + key, value); } } if (dataDir == null) { throw new IllegalArgumentException("dataDir is not set"); } if (dataLogDir == null) { dataLogDir = dataDir; } else { if (!new File(dataLogDir).isDirectory()) { throw new IllegalArgumentException("dataLogDir " + dataLogDir + " is missing."); } } if (tickTime == 0) { throw new IllegalArgumentException("tickTime is not set"); } if (servers.size() > 1) { if (initLimit == 0) { throw new IllegalArgumentException("initLimit is not set"); } if (syncLimit == 0) { throw new IllegalArgumentException("syncLimit is not set"); } /* * If using FLE, then every server requires a separate election port. */ if (electionAlg != 0) { for (QuorumPeer.QuorumServer s : servers.values()) { if (s.electionAddr == null) throw new IllegalArgumentException( "Missing election port for server: " + s.id); } } /* * Default of quorum config is majority */ if(serverGroup.size() > 0){ if(servers.size() != serverGroup.size()) throw new ConfigException("Every server must be in exactly one group"); /* * The default weight of a server is 1 */ for(QuorumPeer.QuorumServer s : servers.values()){ if(!serverWeight.containsKey(s.id)) serverWeight.put(s.id, (long) 1); } /* * Set the quorumVerifier to be QuorumHierarchical */ quorumVerifier = new QuorumHierarchical(numGroups, serverWeight, serverGroup); } else { /* * The default QuorumVerifier is QuorumMaj */ LOG.info("Defaulting to majority quorums"); quorumVerifier = new QuorumMaj(servers.size()); } File myIdFile = new File(dataDir, "myid"); if (!myIdFile.exists()) { ///////////////// ADDED FOR SOLR ////// Long myid = getMySeverId(); if (myid != null) { serverId = myid; return; } if (zkRun == null) return; //////////////// END ADDED FOR SOLR ////// throw new IllegalArgumentException(myIdFile.toString() + " file is missing"); } BufferedReader br = new BufferedReader(new FileReader(myIdFile)); String myIdString; try { myIdString = br.readLine(); } finally { br.close(); } try { serverId = Long.parseLong(myIdString); } catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); } } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void replicate(String nodeName, SolrCore core, ZkNodeProps leaderprops, String baseUrl) throws SolrServerException, IOException { String leaderBaseUrl = leaderprops.get(ZkStateReader.BASE_URL_PROP); ZkCoreNodeProps leaderCNodeProps = new ZkCoreNodeProps(leaderprops); String leaderUrl = leaderCNodeProps.getCoreUrl(); log.info("Attempting to replicate from " + leaderUrl); // if we are the leader, either we are trying to recover faster // then our ephemeral timed out or we are the only node if (!leaderBaseUrl.equals(baseUrl)) { // send commit commitOnLeader(leaderUrl); // use rep handler directly, so we can do this sync rather than async SolrRequestHandler handler = core.getRequestHandler(REPLICATION_HANDLER); if (handler instanceof LazyRequestHandlerWrapper) { handler = ((LazyRequestHandlerWrapper)handler).getWrappedHandler(); } ReplicationHandler replicationHandler = (ReplicationHandler) handler; if (replicationHandler == null) { throw new SolrException(ErrorCode.SERVICE_UNAVAILABLE, "Skipping recovery, no " + REPLICATION_HANDLER + " handler found"); } ModifiableSolrParams solrParams = new ModifiableSolrParams(); solrParams.set(ReplicationHandler.MASTER_URL, leaderUrl + "replication"); if (isClosed()) retries = INTERRUPTED; boolean success = replicationHandler.doFetch(solrParams, true); // TODO: look into making sure force=true does not download files we already have if (!success) { throw new SolrException(ErrorCode.SERVER_ERROR, "Replication for recovery failed."); } // solrcloud_debug // try { // RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replicated " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " from " + leaderUrl + " gen:" + core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void commitOnLeader(String leaderUrl) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderUrl); server.setConnectionTimeout(30000); server.setSoTimeout(30000); UpdateRequest ureq = new UpdateRequest(); ureq.setParams(new ModifiableSolrParams()); ureq.getParams().set(DistributedUpdateProcessor.COMMIT_END_POINT, true); ureq.setAction(AbstractUpdateRequest.ACTION.COMMIT, false, true).process( server); server.shutdown(); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void sendPrepRecoveryCmd(String leaderBaseUrl, String leaderCoreName) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderBaseUrl); server.setConnectionTimeout(45000); server.setSoTimeout(45000); WaitForState prepCmd = new WaitForState(); prepCmd.setCoreName(leaderCoreName); prepCmd.setNodeName(zkController.getNodeName()); prepCmd.setCoreNodeName(coreZkNodeName); prepCmd.setState(ZkStateReader.RECOVERING); prepCmd.setCheckLive(true); prepCmd.setPauseFor(6000); server.request(prepCmd); server.shutdown(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private String getHostAddress(String host) throws IOException { if (host == null) { host = "http://" + InetAddress.getLocalHost().getHostName(); } else { Matcher m = URL_PREFIX.matcher(host); if (m.matches()) { String prefix = m.group(1); host = prefix + host; } else { host = "http://" + host; } } return host; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String readConfigName(String collection) throws KeeperException, InterruptedException, IOException { String configName = null; String path = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; if (log.isInfoEnabled()) { log.info("Load collection config from:" + path); } byte[] data = zkClient.getData(path, null, null, true); if(data != null) { ZkNodeProps props = ZkNodeProps.load(data); configName = props.get(CONFIGNAME_PROP); } if (configName != null && !zkClient.exists(CONFIGS_ZKNODE + "/" + configName, true)) { log.error("Specified config does not exist in ZooKeeper:" + configName); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Specified config does not exist in ZooKeeper:" + configName); } return configName; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void joinElection(CoreDescriptor cd) throws InterruptedException, KeeperException, IOException { String shardId = cd.getCloudDescriptor().getShardId(); Map<String,String> props = new HashMap<String,String>(); // we only put a subset of props into the leader node props.put(ZkStateReader.BASE_URL_PROP, getBaseUrl()); props.put(ZkStateReader.CORE_NAME_PROP, cd.getName()); props.put(ZkStateReader.NODE_NAME_PROP, getNodeName()); final String coreZkNodeName = getNodeName() + "_" + cd.getName(); ZkNodeProps ourProps = new ZkNodeProps(props); String collection = cd.getCloudDescriptor() .getCollectionName(); ElectionContext context = new ShardLeaderElectionContext(leaderElector, shardId, collection, coreZkNodeName, ourProps, this, cc); leaderElector.setup(context); electionContexts.put(coreZkNodeName, context); leaderElector.joinElection(context); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private boolean checkRecovery(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores, final boolean isLeader, final CloudDescriptor cloudDesc, final String collection, final String shardZkNodeName, String shardId, ZkNodeProps leaderProps, SolrCore core, CoreContainer cc) throws InterruptedException, KeeperException, IOException, ExecutionException { if (SKIP_AUTO_RECOVERY) { log.warn("Skipping recovery according to sys prop solrcloud.skip.autorecovery"); return false; } boolean doRecovery = true; if (!isLeader) { if (core.isReloaded() && !recoverReloadedCores) { doRecovery = false; } if (doRecovery) { log.info("Core needs to recover:" + core.getName()); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, coreName); return true; } } else { log.info("I am the leader, no recovery necessary"); } return false; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadToZK(File dir, String zkPath) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, zkPath); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadConfigDir(File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void createCollectionZkNode(CloudDescriptor cd) throws KeeperException, InterruptedException, IOException { String collection = cd.getCollectionName(); log.info("Check for collection zkNode:" + collection); String collectionPath = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; try { if(!zkClient.exists(collectionPath, true)) { log.info("Creating collection in ZooKeeper:" + collection); SolrParams params = cd.getParams(); try { Map<String,String> collectionProps = new HashMap<String,String>(); // TODO: if collection.configName isn't set, and there isn't already a conf in zk, just use that? String defaultConfigName = System.getProperty(COLLECTION_PARAM_PREFIX+CONFIGNAME_PROP, collection); // params passed in - currently only done via core admin (create core command). if (params != null) { Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String paramName = iter.next(); if (paramName.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(paramName.substring(COLLECTION_PARAM_PREFIX.length()), params.get(paramName)); } } // if the config name wasn't passed in, use the default if (!collectionProps.containsKey(CONFIGNAME_PROP)) getConfName(collection, collectionPath, collectionProps); } else if(System.getProperty("bootstrap_confdir") != null) { // if we are bootstrapping a collection, default the config for // a new collection to the collection we are bootstrapping log.info("Setting config for collection:" + collection + " to " + defaultConfigName); Properties sysProps = System.getProperties(); for (String sprop : System.getProperties().stringPropertyNames()) { if (sprop.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(sprop.substring(COLLECTION_PARAM_PREFIX.length()), sysProps.getProperty(sprop)); } } // if the config name wasn't passed in, use the default if (!collectionProps.containsKey(CONFIGNAME_PROP)) collectionProps.put(CONFIGNAME_PROP, defaultConfigName); } else if (Boolean.getBoolean("bootstrap_conf")) { // the conf name should be the collection name of this core collectionProps.put(CONFIGNAME_PROP, cd.getCollectionName()); } else { getConfName(collection, collectionPath, collectionProps); } ZkNodeProps zkProps = new ZkNodeProps(collectionProps); zkClient.makePath(collectionPath, ZkStateReader.toJSON(zkProps), CreateMode.PERSISTENT, null, true); // ping that there is a new collection zkClient.setData(ZkStateReader.COLLECTIONS_ZKNODE, (byte[])null, true); } catch (KeeperException e) { // it's okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } } else { log.info("Collection zkNode exists"); } } catch (KeeperException e) { // it's okay if another beats us creating the node if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } }
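createCollectionZkNode is also notable for how it filters KeeperException by code: NODEEXISTS is benign (another node simply won the race to create the same path) and is swallowed, while every other code is rethrown. The filter in isolation:

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    class EnsurePath {
      void ensure(ZooKeeper zk, String path) throws KeeperException, InterruptedException {
        try {
          zk.create(path, null, ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);
        } catch (KeeperException e) {
          // fine if another node beat us to it; anything else is a real failure
          if (e.code() != KeeperException.Code.NODEEXISTS) throw e;
        }
      }
    }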
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadToZK(SolrZkClient zkClient, File dir, String zkPath) throws IOException, KeeperException, InterruptedException { File[] files = dir.listFiles(); if (files == null) { throw new IllegalArgumentException("Illegal directory: " + dir); } for(File file : files) { if (!file.getName().startsWith(".")) { if (!file.isDirectory()) { zkClient.makePath(zkPath + "/" + file.getName(), file, false, true); } else { uploadToZK(zkClient, file, zkPath + "/" + file.getName()); } } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadConfigDir(SolrZkClient zkClient, File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void bootstrapConf(SolrZkClient zkClient, Config cfg, String solrHome) throws IOException, KeeperException, InterruptedException { NodeList nodes = (NodeList)cfg.evaluate("solr/cores/core", XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); String rawName = DOMUtil.getAttr(node, "name", null); String instanceDir = DOMUtil.getAttr(node, "instanceDir", null); File idir = new File(instanceDir); if (!idir.isAbsolute()) { idir = new File(solrHome, instanceDir); } String confName = DOMUtil.getAttr(node, "collection", null); if (confName == null) { confName = rawName; } ZkController.uploadConfigDir(zkClient, new File(idir, "conf"), confName); } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
private void checkIfIamLeader(final int seq, final ElectionContext context, boolean replacement) throws KeeperException, InterruptedException, IOException { // get all other numbers... final String holdElectionPath = context.electionPath + ELECTION_NODE; List<String> seqs = zkClient.getChildren(holdElectionPath, null, true); sortSeqs(seqs); List<Integer> intSeqs = getSeqs(seqs); if (seq <= intSeqs.get(0)) { runIamLeaderProcess(context, replacement); } else { // I am not the leader - watch the node below me int i = 1; for (; i < intSeqs.size(); i++) { int s = intSeqs.get(i); if (seq < s) { // we found who we come before - watch the guy in front break; } } int index = i - 2; if (index < 0) { log.warn("Our node is no longer in line to be leader"); return; } try { zkClient.getData(holdElectionPath + "/" + seqs.get(index), new Watcher() { @Override public void process(WatchedEvent event) { // am I the next leader? try { checkIfIamLeader(seq, context, true); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); } catch (IOException e) { log.warn("", e); } catch (Exception e) { log.warn("", e); } } }, null, true); } catch (KeeperException.SessionExpiredException e) { throw e; } catch (KeeperException e) { // we couldn't set our watch - the node before us may already be down? // we need to check if we are the leader again checkIfIamLeader(seq, context, true); } } }
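The Watcher callback in checkIfIamLeader cannot let anything escape, since ZooKeeper invokes it on its event thread, so every exception type is caught and logged, and InterruptedException additionally re-asserts the thread's interrupt flag before being swallowed. That idiom in isolation:

    class InterruptSafeCallback implements Runnable {
      @Override public void run() {
        try {
          doWork();
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt(); // restore the flag; swallowing it would hide the stop request
          System.err.println("interrupted: " + e);
        } catch (Exception e) {
          System.err.println("callback failed: " + e); // never let it kill the event thread
        }
      }

      private void doWork() throws InterruptedException { /* ... */ }
    }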
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
protected void runIamLeaderProcess(final ElectionContext context, boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { context.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public int joinElection(ElectionContext context) throws KeeperException, InterruptedException, IOException { final String shardsElectZkPath = context.electionPath + LeaderElector.ELECTION_NODE; long sessionId = zkClient.getSolrZooKeeper().getSessionId(); String id = sessionId + "-" + context.id; String leaderSeqPath = null; boolean cont = true; int tries = 0; while (cont) { try { leaderSeqPath = zkClient.create(shardsElectZkPath + "/" + id + "-n_", null, CreateMode.EPHEMERAL_SEQUENTIAL, false); context.leaderSeqPath = leaderSeqPath; cont = false; } catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } } catch (KeeperException.NoNodeException e) { // we must have failed in creating the election node - someone else must // be working on it, lets try again if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); } } int seq = getSeq(leaderSeqPath); checkIfIamLeader(seq, context, false); return seq; }
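joinElection retries NoNodeException up to ten times, since another node may still be creating the election parent, and only then wraps the failure in ZooKeeperException. A generic bounded-retry sketch, with a hypothetical TransientException standing in for KeeperException.NoNodeException:

    class BoundedRetry {
      static class TransientException extends Exception {} // hypothetical stand-in

      void callWithRetry() throws Exception {
        int tries = 0;
        while (true) {
          try {
            attempt();
            return;
          } catch (TransientException e) {
            if (tries++ > 9) {
              // out of budget: surface the last cause (the original wraps it in ZooKeeperException)
              throw new RuntimeException("giving up after " + tries + " tries", e);
            }
            Thread.sleep(50); // brief pause before the next attempt
          }
        }
      }

      private void attempt() throws TransientException { /* ... */ }
    }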
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
private void deleteAll() throws IOException { SolrCore.log.info(core.getLogId()+"REMOVING ALL DOCUMENTS FROM INDEX"); solrCoreState.getIndexWriter(core).deleteAll(); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
protected void rollbackWriter() throws IOException { numDocsPending.set(0); solrCoreState.rollbackIndexWriter(core); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public int addDoc(AddUpdateCommand cmd) throws IOException { IndexWriter writer = solrCoreState.getIndexWriter(core); addCommands.incrementAndGet(); addCommandsCumulative.incrementAndGet(); int rc=-1; // if there is no ID field, don't overwrite if( idField == null ) { cmd.overwrite = false; } try { if (cmd.overwrite) { Term updateTerm; Term idTerm = new Term(idField.getName(), cmd.getIndexedId()); boolean del = false; if (cmd.updateTerm == null) { updateTerm = idTerm; } else { del = true; updateTerm = cmd.updateTerm; } Document luceneDocument = cmd.getLuceneDocument(); // SolrCore.verbose("updateDocument",updateTerm,luceneDocument,writer); writer.updateDocument(updateTerm, luceneDocument); // SolrCore.verbose("updateDocument",updateTerm,"DONE"); if(del) { // ensure id remains unique BooleanQuery bq = new BooleanQuery(); bq.add(new BooleanClause(new TermQuery(updateTerm), Occur.MUST_NOT)); bq.add(new BooleanClause(new TermQuery(idTerm), Occur.MUST)); writer.deleteDocuments(bq); } } else { // allow duplicates writer.addDocument(cmd.getLuceneDocument()); } // Add to the transaction log *after* successfully adding to the index, if there was no error. // This ordering ensures that if we log it, it's definitely been added to the index. // This also ensures that if a commit sneaks in-between, that we know everything in a particular // log version was definitely committed. if (ulog != null) ulog.add(cmd); if ((cmd.getFlags() & UpdateCommand.IGNORE_AUTOCOMMIT) == 0) { commitTracker.addedDocument( -1 ); softCommitTracker.addedDocument( cmd.commitWithin ); } rc = 1; } finally { if (rc!=1) { numErrors.incrementAndGet(); numErrorsCumulative.incrementAndGet(); } else { numDocsPending.incrementAndGet(); } } return rc; }
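addDoc uses a return-code flag checked in a finally block, so every exit path, normal or exceptional, bumps exactly one of the error/pending counters without needing a catch clause, and the exception itself still propagates. The skeleton of the idiom:

    import java.util.concurrent.atomic.AtomicLong;

    class CountingHandler {
      private final AtomicLong numErrors = new AtomicLong();
      private final AtomicLong numPending = new AtomicLong();

      int add(Runnable work) {
        int rc = -1; // pessimistic: assume failure
        try {
          work.run(); // may throw; the exception still propagates to the caller
          rc = 1;     // only reached on success
        } finally {
          if (rc != 1) numErrors.incrementAndGet();
          else numPending.incrementAndGet();
        }
        return rc;
      }
    }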
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void delete(DeleteUpdateCommand cmd) throws IOException { deleteByIdCommands.incrementAndGet(); deleteByIdCommandsCumulative.incrementAndGet(); IndexWriter writer = solrCoreState.getIndexWriter(core); Term deleteTerm = new Term(idField.getName(), cmd.getIndexedId()); // SolrCore.verbose("deleteDocuments",deleteTerm,writer); writer.deleteDocuments(deleteTerm); // SolrCore.verbose("deleteDocuments",deleteTerm,"DONE"); if (ulog != null) ulog.delete(cmd); updateDeleteTrackers(cmd); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void deleteByQuery(DeleteUpdateCommand cmd) throws IOException { deleteByQueryCommands.incrementAndGet(); deleteByQueryCommandsCumulative.incrementAndGet(); boolean madeIt=false; try { Query q; try { // TODO: move this higher in the stack? QParser parser = QParser.getParser(cmd.query, "lucene", cmd.req); q = parser.getQuery(); q = QueryUtils.makeQueryable(q); // peer-sync can cause older deleteByQueries to be executed and could // delete newer documents. We prevent this by adding a clause restricting // version. if ((cmd.getFlags() & UpdateCommand.PEER_SYNC) != 0) { BooleanQuery bq = new BooleanQuery(); bq.add(q, Occur.MUST); SchemaField sf = core.getSchema().getField(VersionInfo.VERSION_FIELD); ValueSource vs = sf.getType().getValueSource(sf, null); ValueSourceRangeFilter filt = new ValueSourceRangeFilter(vs, null, Long.toString(Math.abs(cmd.version)), true, true); FunctionRangeQuery range = new FunctionRangeQuery(filt); bq.add(range, Occur.MUST); q = bq; } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } boolean delAll = MatchAllDocsQuery.class == q.getClass(); // // synchronized to prevent deleteByQuery from running during the "open new searcher" // part of a commit. DBQ needs to signal that a fresh reader will be needed for // a realtime view of the index. When a new searcher is opened after a DBQ, that // flag can be cleared. If those things happen concurrently, it's not thread safe. // synchronized (this) { if (delAll) { deleteAll(); } else { solrCoreState.getIndexWriter(core).deleteDocuments(q); } if (ulog != null) ulog.deleteByQuery(cmd); } madeIt = true; updateDeleteTrackers(cmd); } finally { if (!madeIt) { numErrors.incrementAndGet(); numErrorsCumulative.incrementAndGet(); } } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public int mergeIndexes(MergeIndexesCommand cmd) throws IOException { mergeIndexesCommands.incrementAndGet(); int rc; log.info("start " + cmd); IndexReader[] readers = cmd.readers; if (readers != null && readers.length > 0) { solrCoreState.getIndexWriter(core).addIndexes(readers); rc = 1; } else { rc = 0; } log.info("end_mergeIndexes"); // TODO: consider soft commit issues if (rc == 1 && commitTracker.getTimeUpperBound() > 0) { commitTracker.scheduleCommitWithin(commitTracker.getTimeUpperBound()); } else if (rc == 1 && softCommitTracker.getTimeUpperBound() > 0) { softCommitTracker.scheduleCommitWithin(softCommitTracker.getTimeUpperBound()); } return rc; }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
public void prepareCommit(CommitUpdateCommand cmd) throws IOException { boolean error=true; try { log.info("start "+cmd); IndexWriter writer = solrCoreState.getIndexWriter(core); writer.prepareCommit(); log.info("end_prepareCommit"); error=false; } finally { if (error) numErrors.incrementAndGet(); } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void commit(CommitUpdateCommand cmd) throws IOException { if (cmd.prepareCommit) { prepareCommit(cmd); return; } IndexWriter writer = solrCoreState.getIndexWriter(core); if (cmd.optimize) { optimizeCommands.incrementAndGet(); } else { commitCommands.incrementAndGet(); if (cmd.expungeDeletes) expungeDeleteCommands.incrementAndGet(); } Future[] waitSearcher = null; if (cmd.waitSearcher) { waitSearcher = new Future[1]; } boolean error=true; try { // only allow one hard commit to proceed at once if (!cmd.softCommit) { commitLock.lock(); } log.info("start "+cmd); // We must cancel pending commits *before* we actually execute the commit. if (cmd.openSearcher) { // we can cancel any pending soft commits if this commit will open a new searcher softCommitTracker.cancelPendingCommit(); } if (!cmd.softCommit && (cmd.openSearcher || !commitTracker.getOpenSearcher())) { // cancel a pending hard commit if this commit is of equal or greater "strength"... // If the autoCommit has openSearcher=true, then this commit must have openSearcher=true // to cancel. commitTracker.cancelPendingCommit(); } if (cmd.optimize) { writer.forceMerge(cmd.maxOptimizeSegments); } else if (cmd.expungeDeletes) { writer.forceMergeDeletes(); } if (!cmd.softCommit) { synchronized (this) { // sync is currently needed to prevent preCommit from being called between preSoft and postSoft... see postSoft comments. if (ulog != null) ulog.preCommit(cmd); } // SolrCore.verbose("writer.commit() start writer=",writer); final Map<String,String> commitData = new HashMap<String,String>(); commitData.put(SolrIndexWriter.COMMIT_TIME_MSEC_KEY, String.valueOf(System.currentTimeMillis())); writer.commit(commitData); // SolrCore.verbose("writer.commit() end"); numDocsPending.set(0); callPostCommitCallbacks(); } else { callPostSoftCommitCallbacks(); } if (cmd.optimize) { callPostOptimizeCallbacks(); } if (cmd.softCommit) { // ulog.preSoftCommit(); synchronized (this) { if (ulog != null) ulog.preSoftCommit(cmd); core.getSearcher(true, false, waitSearcher, true); if (ulog != null) ulog.postSoftCommit(cmd); } // ulog.postSoftCommit(); } else { synchronized (this) { if (ulog != null) ulog.preSoftCommit(cmd); if (cmd.openSearcher) { core.getSearcher(true, false, waitSearcher); } else { // force open a new realtime searcher so realtime-get and versioning code can see the latest RefCounted<SolrIndexSearcher> searchHolder = core.openNewSearcher(true, true); searchHolder.decref(); } if (ulog != null) ulog.postSoftCommit(cmd); } if (ulog != null) ulog.postCommit(cmd); // postCommit currently means new searcher has // also been opened } // reset commit tracking if (cmd.softCommit) { softCommitTracker.didCommit(); } else { commitTracker.didCommit(); } log.info("end_commit_flush"); error=false; } finally { if (!cmd.softCommit) { commitLock.unlock(); } addCommands.set(0); deleteByIdCommands.set(0); deleteByQueryCommands.set(0); if (error) numErrors.incrementAndGet(); } // if we are supposed to wait for the searcher to be registered, then we should do it // outside any synchronized block so that other update operations can proceed. if (waitSearcher!=null && waitSearcher[0] != null) { try { waitSearcher[0].get(); } catch (InterruptedException e) { SolrException.log(log,e); } catch (ExecutionException e) { SolrException.log(log,e); } } }
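Note the tail of commit(): waitSearcher[0].get() runs after commitLock is released, and both InterruptedException and ExecutionException are logged rather than rethrown, because by that point the commit itself has already succeeded. In isolation (restoring the interrupt flag is an addition here, not something the original does):

    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.Future;

    class SearcherWait {
      void awaitRegistration(Future<?> waitSearcher) {
        if (waitSearcher == null) return;
        try {
          waitSearcher.get(); // block until the new searcher is registered
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt(); // added in this sketch; the original only logs
          System.err.println("interrupted while waiting for searcher: " + e);
        } catch (ExecutionException e) {
          System.err.println("searcher registration failed: " + e.getCause()); // log, don't rethrow
        }
      }
    }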
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void newIndexWriter() throws IOException { solrCoreState.newIndexWriter(core); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void rollback(RollbackUpdateCommand cmd) throws IOException { rollbackCommands.incrementAndGet(); boolean error=true; try { log.info("start "+cmd); rollbackWriter(); //callPostRollbackCallbacks(); // reset commit tracking commitTracker.didRollback(); softCommitTracker.didRollback(); log.info("end_rollback"); error=false; } finally { addCommandsCumulative.set( addCommandsCumulative.get() - addCommands.getAndSet( 0 ) ); deleteByIdCommandsCumulative.set( deleteByIdCommandsCumulative.get() - deleteByIdCommands.getAndSet( 0 ) ); deleteByQueryCommandsCumulative.set( deleteByQueryCommandsCumulative.get() - deleteByQueryCommands.getAndSet( 0 ) ); if (error) numErrors.incrementAndGet(); } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void close() throws IOException { log.info("closing " + this); commitTracker.close(); softCommitTracker.close(); numDocsPending.set(0); solrCoreState.decref(this); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void closeWriter(IndexWriter writer) throws IOException { boolean clearRequestInfo = false; commitLock.lock(); try { SolrQueryRequest req = new LocalSolrQueryRequest(core, new ModifiableSolrParams()); SolrQueryResponse rsp = new SolrQueryResponse(); if (SolrRequestInfo.getRequestInfo() == null) { clearRequestInfo = true; SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); // important for debugging } if (!commitOnClose) { if (writer != null) { writer.rollback(); } // we shouldn't close the transaction logs either, but leaving them open // means we can't delete them on windows (needed for tests) if (ulog != null) ulog.close(false); return; } // do a commit before we quit? boolean tryToCommit = writer != null && ulog != null && ulog.hasUncommittedChanges() && ulog.getState() == UpdateLog.State.ACTIVE; try { if (tryToCommit) { CommitUpdateCommand cmd = new CommitUpdateCommand(req, false); cmd.openSearcher = false; cmd.waitSearcher = false; cmd.softCommit = false; // TODO: keep other commit callbacks from being called? // this.commit(cmd); // too many test failures using this method... is it because of callbacks? synchronized (this) { ulog.preCommit(cmd); } // todo: refactor this shared code (or figure out why a real CommitUpdateCommand can't be used) final Map<String,String> commitData = new HashMap<String,String>(); commitData.put(SolrIndexWriter.COMMIT_TIME_MSEC_KEY, String.valueOf(System.currentTimeMillis())); writer.commit(commitData); synchronized (this) { ulog.postCommit(cmd); } } } catch (Throwable th) { log.error("Error in final commit", th); } // we went through the normal process to commit, so we don't have to artificially // cap any ulog files. try { if (ulog != null) ulog.close(false); } catch (Throwable th) { log.error("Error closing log files", th); } if (writer != null) writer.close(); } finally { commitLock.unlock(); if (clearRequestInfo) SolrRequestInfo.clearRequestInfo(); } }
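closeWriter stages its shutdown: the final commit and the transaction-log close are each wrapped in their own catch (Throwable), so a failure in one stage cannot keep the later, more important steps (closing the IndexWriter, unlocking commitLock) from running. Schematically:

    class StagedClose implements AutoCloseable {
      @Override public void close() {
        try {
          finalCommit(); // stage 1: best effort
        } catch (Throwable t) {
          System.err.println("Error in final commit: " + t);
        }
        try {
          closeLogs(); // stage 2: runs regardless of stage 1's outcome
        } catch (Throwable t) {
          System.err.println("Error closing log files: " + t);
        }
        releaseResources(); // must always run
      }

      private void finalCommit() {}
      private void closeLogs() {}
      private void releaseResources() {}
    }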
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { // TODO: check for id field? int hash = 0; if (zkEnabled) { zkCheck(); hash = hash(cmd); nodes = setupRequest(hash); } else { isLeader = getNonZkLeaderAssumption(req); } boolean dropCmd = false; if (!forwardToLeader) { dropCmd = versionAdd(cmd); } if (dropCmd) { // TODO: do we need to add anything to the response? return; } ModifiableSolrParams params = null; if (nodes != null) { params = new ModifiableSolrParams(req.getParams()); params.set(DISTRIB_UPDATE_PARAM, (isLeader ? DistribPhase.FROMLEADER.toString() : DistribPhase.TOLEADER.toString())); params.remove("commit"); // this will be distributed from the local commit cmdDistrib.distribAdd(cmd, nodes, params); } // TODO: what to do when no idField? if (returnVersions && rsp != null && idField != null) { if (addsResponse == null) { addsResponse = new NamedList<String>(); rsp.add("adds",addsResponse); } if (scratch == null) scratch = new CharsRef(); idField.getType().indexedToReadable(cmd.getIndexedId(), scratch); addsResponse.add(scratch.toString(), cmd.getVersion()); } // TODO: keep track of errors? needs to be done at a higher level though since // an id may fail before it gets to this processor. // Given that, it may also make sense to move the version reporting out of this // processor too. }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private void doLocalAdd(AddUpdateCommand cmd) throws IOException { super.processAdd(cmd); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private void doLocalDelete(DeleteUpdateCommand cmd) throws IOException { super.processDelete(cmd); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private boolean versionAdd(AddUpdateCommand cmd) throws IOException { BytesRef idBytes = cmd.getIndexedId(); if (vinfo == null || idBytes == null) { super.processAdd(cmd); return false; } // This is only the hash for the bucket, and must be based only on the uniqueKey (i.e. do not use a pluggable hash here) int bucketHash = Hash.murmurhash3_x86_32(idBytes.bytes, idBytes.offset, idBytes.length, 0); // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find any existing version in the document // TODO: don't reuse update commands any more! long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { SolrInputField versionField = cmd.getSolrInputDocument().getField(VersionInfo.VERSION_FIELD); if (versionField != null) { Object o = versionField.getValue(); versionOnUpdate = o instanceof Number ? ((Number) o).longValue() : Long.parseLong(o.toString()); } else { // Find the version String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } } boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; VersionBucket bucket = vinfo.bucket(bucketHash); vinfo.lockForUpdate(); try { synchronized (bucket) { // we obtain the version when synchronized and then do the add so we can ensure that // if version1 < version2 then version1 is actually added before version2. // even if we don't store the version field, synchronizing on the bucket // will enable us to know what version happened first, and thus enable // realtime-get to work reliably. // TODO: if versions aren't stored, do we need to set on the cmd anyway for some reason? // there may be other reasons in the future for a version on the commands if (versionsStored) { long bucketVersion = bucket.highest; if (leaderLogic) { boolean updated = getUpdatedDocument(cmd); if (updated && versionOnUpdate == -1) { versionOnUpdate = 1; // implied "doc must exist" for now... } if (versionOnUpdate != 0) { Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); long foundVersion = lastVersion == null ? -1 : lastVersion; if ( versionOnUpdate == foundVersion || (versionOnUpdate < 0 && foundVersion < 0) || (versionOnUpdate==1 && foundVersion > 0) ) { // we're ok if versions match, or if both are negative (all missing docs are equal), or if cmd // specified it must exist (versionOnUpdate==1) and it does. } else { throw new SolrException(ErrorCode.CONFLICT, "version conflict for " + cmd.getPrintableId() + " expected=" + versionOnUpdate + " actual=" + foundVersion); } } long version = vinfo.getNewClock(); cmd.setVersion(version); cmd.getSolrInputDocument().setField(VersionInfo.VERSION_FIELD, version); bucket.updateHighest(version); } else { // The leader forwarded us this update. cmd.setVersion(versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.add(cmd); return true; } // if we aren't the leader, then we need to check that updates were not re-ordered if (bucketVersion != 0 && bucketVersion < versionOnUpdate) { // we're OK... this update has a version higher than anything we've seen // in this bucket so far, so we know that no reordering has yet occured. bucket.updateHighest(versionOnUpdate); } else { // there have been updates higher than the current update. we need to check // the specific version for this id. Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); if (lastVersion != null && Math.abs(lastVersion) >= versionOnUpdate) { // This update is a repeat, or was reordered. We need to drop this update. return true; } } } } doLocalAdd(cmd); } // end synchronized (bucket) } finally { vinfo.unlockForUpdate(); } return false; }
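The CONFLICT throw above is the core of Solr's optimistic concurrency control: the client states the version it expects, and the update is rejected (HTTP 409 via ErrorCode.CONFLICT) when the index disagrees. A minimal standalone sketch of the acceptance rule; the helper name and parameters are illustrative, not part of Solr:

    // versionOnUpdate: version claimed by the client (1 = "doc must exist",
    // negative = "doc must not exist"); foundVersion: last indexed version, -1 if absent.
    static boolean versionCheckPasses(long versionOnUpdate, long foundVersion) {
      return versionOnUpdate == foundVersion               // exact match
          || (versionOnUpdate < 0 && foundVersion < 0)     // both missing: all missing docs are equal
          || (versionOnUpdate == 1 && foundVersion > 0);   // "must exist" and it does
    }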
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
boolean getUpdatedDocument(AddUpdateCommand cmd) throws IOException { SolrInputDocument sdoc = cmd.getSolrInputDocument(); boolean update = false; for (SolrInputField sif : sdoc.values()) { if (sif.getValue() instanceof Map) { update = true; break; } } if (!update) return false; BytesRef id = cmd.getIndexedId(); SolrInputDocument oldDoc = RealTimeGetComponent.getInputDocument(cmd.getReq().getCore(), id); if (oldDoc == null) { // not found... allow this in the future (depending on the details of the update, or if the user explicitly sets it). // could also just not change anything here and let the optimistic locking throw the error throw new SolrException(ErrorCode.CONFLICT, "Document not found for update. id=" + cmd.getPrintableId()); } oldDoc.remove(VERSION_FIELD); for (SolrInputField sif : sdoc.values()) { Object val = sif.getValue(); if (val instanceof Map) { for (Entry<String,Object> entry : ((Map<String,Object>) val).entrySet()) { String key = entry.getKey(); Object fieldVal = entry.getValue(); if ("add".equals(key)) { oldDoc.addField( sif.getName(), fieldVal, sif.getBoost()); } else if ("set".equals(key)) { oldDoc.setField(sif.getName(), fieldVal, sif.getBoost()); } else if ("inc".equals(key)) { SolrInputField numericField = oldDoc.get(sif.getName()); if (numericField == null) { oldDoc.setField(sif.getName(), fieldVal, sif.getBoost()); } else { // TODO: fieldtype needs externalToObject? String oldValS = numericField.getFirstValue().toString(); SchemaField sf = cmd.getReq().getSchema().getField(sif.getName()); BytesRef term = new BytesRef(); sf.getType().readableToIndexed(oldValS, term); Object oldVal = sf.getType().toObject(sf, term); String fieldValS = fieldVal.toString(); Number result; if (oldVal instanceof Long) { result = ((Long) oldVal).longValue() + Long.parseLong(fieldValS); } else if (oldVal instanceof Float) { result = ((Float) oldVal).floatValue() + Float.parseFloat(fieldValS); } else if (oldVal instanceof Double) { result = ((Double) oldVal).doubleValue() + Double.parseDouble(fieldValS); } else { // int, short, byte result = ((Integer) oldVal).intValue() + Integer.parseInt(fieldValS); } oldDoc.setField(sif.getName(), result, sif.getBoost()); } } } } else { // normal fields are treated as a "set" oldDoc.put(sif.getName(), sif); } } cmd.solrDoc = oldDoc; return true; }
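getUpdatedDocument recognizes "atomic updates": documents whose field values are Maps keyed by "add", "set" or "inc". A client-side sketch of a document that would take this path; the field names are illustrative:

    import java.util.Collections;
    import org.apache.solr.common.SolrInputDocument;

    public class AtomicUpdateSketch {
      public static SolrInputDocument example() {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "doc-1");                                       // uniqueKey of the existing doc
        doc.addField("tags", Collections.singletonMap("add", "new-tag"));  // append a value
        doc.addField("views", Collections.singletonMap("inc", 1));         // numeric increment
        doc.addField("title", Collections.singletonMap("set", "updated")); // replace the value
        return doc;
      }
    }

Fields whose values are not Maps fall into the method's final else branch and are treated as a plain "set".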
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
@Override public void processDelete(DeleteUpdateCommand cmd) throws IOException { if (!cmd.isDeleteById()) { doDeleteByQuery(cmd); return; } int hash = 0; if (zkEnabled) { zkCheck(); hash = hash(cmd); nodes = setupRequest(hash); } else { isLeader = getNonZkLeaderAssumption(req); } boolean dropCmd = false; if (!forwardToLeader) { dropCmd = versionDelete(cmd); } if (dropCmd) { // TODO: do we need to add anything to the response? return; } ModifiableSolrParams params = null; if (nodes != null) { params = new ModifiableSolrParams(req.getParams()); params.set(DISTRIB_UPDATE_PARAM, (isLeader ? DistribPhase.FROMLEADER.toString() : DistribPhase.TOLEADER.toString())); params.remove("commit"); // we already will have forwarded this from our local commit cmdDistrib.distribDelete(cmd, nodes, params); } // cmd.getIndexId == null when delete by query // TODO: what to do when no idField? if (returnVersions && rsp != null && cmd.getIndexedId() != null && idField != null) { if (deleteResponse == null) { deleteResponse = new NamedList<String>(); rsp.add("deletes",deleteResponse); } if (scratch == null) scratch = new CharsRef(); idField.getType().indexedToReadable(cmd.getIndexedId(), scratch); deleteResponse.add(scratch.toString(), cmd.getVersion()); // we're returning the version of the delete.. not the version of the doc we deleted. } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
public void doDeleteByQuery(DeleteUpdateCommand cmd) throws IOException { // even in non zk mode, tests simulate updates from a leader if(!zkEnabled) { isLeader = getNonZkLeaderAssumption(req); } else { zkCheck(); } // NONE: we are the first to receive this deleteByQuery // - it must be forwarded to the leader of every shard // TO: we are a leader receiving a forwarded deleteByQuery... we must: // - block all updates (use VersionInfo) // - flush *all* updates going to our replicas // - forward the DBQ to our replicas and wait for the response // - log + execute the local DBQ // FROM: we are a replica receiving a DBQ from our leader // - log + execute the local DBQ DistribPhase phase = DistribPhase.parseParam(req.getParams().get(DISTRIB_UPDATE_PARAM)); if (zkEnabled && DistribPhase.NONE == phase) { boolean leaderForAnyShard = false; // start off by assuming we are not a leader for any shard Map<String,Slice> slices = zkController.getCloudState().getSlices(collection); if (slices == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Cannot find collection:" + collection + " in " + zkController.getCloudState().getCollections()); } ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(DISTRIB_UPDATE_PARAM, DistribPhase.TOLEADER.toString()); List<Node> leaders = new ArrayList<Node>(slices.size()); for (Map.Entry<String,Slice> sliceEntry : slices.entrySet()) { String sliceName = sliceEntry.getKey(); ZkNodeProps leaderProps; try { leaderProps = zkController.getZkStateReader().getLeaderProps(collection, sliceName); } catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); } // TODO: What if leaders changed in the meantime? // should we send out slice-at-a-time and if a node returns "hey, I'm not a leader" (or we get an error because it went down) then look up the new leader? // Am I the leader for this slice? ZkCoreNodeProps coreLeaderProps = new ZkCoreNodeProps(leaderProps); String leaderNodeName = coreLeaderProps.getCoreNodeName(); String coreName = req.getCore().getName(); String coreNodeName = zkController.getNodeName() + "_" + coreName; isLeader = coreNodeName.equals(leaderNodeName); if (isLeader) { // don't forward to ourself leaderForAnyShard = true; } else { leaders.add(new StdNode(coreLeaderProps)); } } params.remove("commit"); // this will be distributed from the local commit cmdDistrib.distribDelete(cmd, leaders, params); if (!leaderForAnyShard) { return; } // change the phase to TOLEADER so we look up and forward to our own replicas (if any) phase = DistribPhase.TOLEADER; } List<Node> replicas = null; if (zkEnabled && DistribPhase.TOLEADER == phase) { // This core should be a leader replicas = setupRequest(); } if (vinfo == null) { super.processDelete(cmd); return; } // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find the version long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } versionOnUpdate = Math.abs(versionOnUpdate); // normalize to positive version boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; if (!leaderLogic && versionOnUpdate==0) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing _version_ on update from leader"); } vinfo.blockUpdates(); try { if (versionsStored) { if (leaderLogic) { long version = vinfo.getNewClock(); cmd.setVersion(-version); // TODO update versions in all buckets doLocalDelete(cmd); } else { cmd.setVersion(-versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.deleteByQuery(cmd); return; } doLocalDelete(cmd); } } // since we don't know which documents were deleted, the easiest thing to do is to invalidate // all real-time caches (i.e. UpdateLog) which involves also getting a new version of the IndexReader // (so cache misses will see up-to-date data) } finally { vinfo.unblockUpdates(); } // TODO: need to handle reorders to replicas somehow // forward to all replicas if (leaderLogic && replicas != null) { ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(VERSION_FIELD, Long.toString(cmd.getVersion())); params.set(DISTRIB_UPDATE_PARAM, DistribPhase.FROMLEADER.toString()); cmdDistrib.distribDelete(cmd, replicas, params); cmdDistrib.finish(); } if (returnVersions && rsp != null) { if (deleteByQueryResponse == null) { deleteByQueryResponse = new NamedList<String>(); rsp.add("deleteByQuery",deleteByQueryResponse); } deleteByQueryResponse.add(cmd.getQuery(), cmd.getVersion()); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private boolean versionDelete(DeleteUpdateCommand cmd) throws IOException { BytesRef idBytes = cmd.getIndexedId(); if (vinfo == null || idBytes == null) { super.processDelete(cmd); return false; } // This is only the hash for the bucket, and must be based only on the uniqueKey (i.e. do not use a pluggable hash here) int bucketHash = Hash.murmurhash3_x86_32(idBytes.bytes, idBytes.offset, idBytes.length, 0); // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find the version long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } long signedVersionOnUpdate = versionOnUpdate; versionOnUpdate = Math.abs(versionOnUpdate); // normalize to positive version boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; if (!leaderLogic && versionOnUpdate==0) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing _version_ on update from leader"); } VersionBucket bucket = vinfo.bucket(bucketHash); vinfo.lockForUpdate(); try { synchronized (bucket) { if (versionsStored) { long bucketVersion = bucket.highest; if (leaderLogic) { if (signedVersionOnUpdate != 0) { Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); long foundVersion = lastVersion == null ? -1 : lastVersion; if ( (signedVersionOnUpdate == foundVersion) || (signedVersionOnUpdate < 0 && foundVersion < 0) || (signedVersionOnUpdate == 1 && foundVersion > 0) ) { // we're ok if versions match, or if both are negative (all missing docs are equal), or if cmd // specified it must exist (versionOnUpdate==1) and it does. } else { throw new SolrException(ErrorCode.CONFLICT, "version conflict for " + cmd.getId() + " expected=" + signedVersionOnUpdate + " actual=" + foundVersion); } } long version = vinfo.getNewClock(); cmd.setVersion(-version); bucket.updateHighest(version); } else { cmd.setVersion(-versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.delete(cmd); return true; } // if we aren't the leader, then we need to check that updates were not re-ordered if (bucketVersion != 0 && bucketVersion < versionOnUpdate) { // we're OK... this update has a version higher than anything we've seen // in this bucket so far, so we know that no reordering has yet occured. bucket.updateHighest(versionOnUpdate); } else { // there have been updates higher than the current update. we need to check // the specific version for this id. Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); if (lastVersion != null && Math.abs(lastVersion) >= versionOnUpdate) { // This update is a repeat, or was reordered. We need to drop this update. return true; } } } } doLocalDelete(cmd); return false; } // end synchronized (bucket) } finally { vinfo.unlockForUpdate(); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
@Override public void processCommit(CommitUpdateCommand cmd) throws IOException { if (zkEnabled) { zkCheck(); } if (vinfo != null) { vinfo.lockForUpdate(); } try { if (ulog == null || ulog.getState() == UpdateLog.State.ACTIVE || (cmd.getFlags() & UpdateCommand.REPLAY) != 0) { super.processCommit(cmd); } else { log.info("Ignoring commit while not ACTIVE - state: " + ulog.getState() + " replay:" + (cmd.getFlags() & UpdateCommand.REPLAY)); } } finally { if (vinfo != null) { vinfo.unlockForUpdate(); } } // TODO: we should consider this? commit everyone in the current collection if (zkEnabled) { ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); if (!params.getBool(COMMIT_END_POINT, false)) { params.set(COMMIT_END_POINT, true); String nodeName = req.getCore().getCoreDescriptor().getCoreContainer() .getZkController().getNodeName(); String shardZkNodeName = nodeName + "_" + req.getCore().getName(); List<Node> nodes = getCollectionUrls(req, req.getCore().getCoreDescriptor() .getCloudDescriptor().getCollectionName(), shardZkNodeName); if (nodes != null) { cmdDistrib.distribCommit(cmd, nodes, params); finish(); } } } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
@Override public void finish() throws IOException { doFinish(); if (next != null && nodes == null) next.finish(); }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { updateHandler.addDoc(cmd); super.processAdd(cmd); changesSinceCommit = true; }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processDelete(DeleteUpdateCommand cmd) throws IOException { if( cmd.isDeleteById()) { updateHandler.delete(cmd); } else { updateHandler.deleteByQuery(cmd); } super.processDelete(cmd); changesSinceCommit = true; }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processMergeIndexes(MergeIndexesCommand cmd) throws IOException { updateHandler.mergeIndexes(cmd); super.processMergeIndexes(cmd); }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processCommit(CommitUpdateCommand cmd) throws IOException { updateHandler.commit(cmd); super.processCommit(cmd); changesSinceCommit = false; }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processRollback(RollbackUpdateCommand cmd) throws IOException { updateHandler.rollback(cmd); super.processRollback(cmd); changesSinceCommit = false; }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void finish() throws IOException { if (changesSinceCommit && updateHandler.getUpdateLog() != null) { updateHandler.getUpdateLog().finish(null); } super.finish(); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processAdd(AddUpdateCommand cmd) throws IOException { if (next != null) next.processAdd(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processDelete(DeleteUpdateCommand cmd) throws IOException { if (next != null) next.processDelete(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processMergeIndexes(MergeIndexesCommand cmd) throws IOException { if (next != null) next.processMergeIndexes(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processCommit(CommitUpdateCommand cmd) throws IOException { if (next != null) next.processCommit(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processRollback(RollbackUpdateCommand cmd) throws IOException { if (next != null) next.processRollback(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void finish() throws IOException { if (next != null) next.finish(); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { final SolrInputDocument doc = cmd.getSolrInputDocument(); // make a copy we can iterate over while mutating the doc final Collection<String> fieldNames = new ArrayList<String>(doc.getFieldNames()); for (final String fname : fieldNames) { if (! selector.shouldMutate(fname)) continue; final SolrInputField src = doc.get(fname); SolrInputField dest = null; try { dest = mutate(src); } catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); } if (null == dest) { doc.remove(fname); } else { // semantics of what happens if dest has diff name are hard // we could treat it as a copy, or a rename // for now, don't allow it. if (! fname.equals(dest.getName()) ) { throw new SolrException(SERVER_ERROR, "mutate returned field with different name: " + fname + " => " + dest.getName()); } doc.put(dest.getName(), dest); } } super.processAdd(cmd); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
@Override public void processAdd(AddUpdateCommand command) throws IOException { if (isEnabled()) { SolrInputDocument document = command.getSolrInputDocument(); if (document.containsKey(urlFieldname)) { String url = (String) document.getFieldValue(urlFieldname); try { URL normalizedURL = getNormalizedURL(url); document.setField(lengthFieldname, length(normalizedURL)); document.setField(levelsFieldname, levels(normalizedURL)); document.setField(toplevelpageFieldname, isTopLevelPage(normalizedURL) ? 1 : 0); document.setField(landingpageFieldname, isLandingPage(normalizedURL) ? 1 : 0); if (domainFieldname != null) { document.setField(domainFieldname, normalizedURL.getHost()); } if (canonicalUrlFieldname != null) { document.setField(canonicalUrlFieldname, getCanonicalUrl(normalizedURL)); } log.debug(document.toString()); } catch (MalformedURLException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); } catch (URISyntaxException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); } } } super.processAdd(command); }
// in core/src/java/org/apache/solr/update/processor/SignatureUpdateProcessorFactory.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { if (enabled) { SolrInputDocument doc = cmd.getSolrInputDocument(); List<String> currDocSigFields = null; if (sigFields == null || sigFields.size() == 0) { Collection<String> docFields = doc.getFieldNames(); currDocSigFields = new ArrayList<String>(docFields.size()); currDocSigFields.addAll(docFields); Collections.sort(currDocSigFields); } else { currDocSigFields = sigFields; } Signature sig = req.getCore().getResourceLoader().newInstance(signatureClass, Signature.class); sig.init(params); for (String field : currDocSigFields) { SolrInputField f = doc.getField(field); if (f != null) { sig.add(field); Object o = f.getValue(); if (o instanceof Collection) { for (Object oo : (Collection)o) { sig.add(String.valueOf(oo)); } } else { sig.add(String.valueOf(o)); } } } byte[] signature = sig.getSignature(); char[] arr = new char[signature.length<<1]; for (int i=0; i<signature.length; i++) { int b = signature[i]; int idx = i<<1; arr[idx]= StrUtils.HEX_DIGITS[(b >> 4) & 0xf]; arr[idx+1]= StrUtils.HEX_DIGITS[b & 0xf]; } String sigString = new String(arr); doc.addField(signatureField, sigString); if (overwriteDupes) { cmd.updateTerm = new Term(signatureField, sigString); } } if (next != null) next.processAdd(cmd); }
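The bit-twiddling loop above hex-encodes the signature two nibbles per byte; masking with 0xf after the shift keeps negative byte values from indexing out of range. The same conversion in isolation, a sketch in which HEX_DIGITS stands in for StrUtils.HEX_DIGITS:

    // Standalone sketch of the nibble-to-hex expansion used by the signature processor.
    static String toHex(byte[] signature) {
      final char[] HEX_DIGITS = "0123456789abcdef".toCharArray(); // stand-in for StrUtils.HEX_DIGITS
      char[] arr = new char[signature.length << 1];  // two hex chars per byte
      for (int i = 0; i < signature.length; i++) {
        int b = signature[i];
        int idx = i << 1;
        arr[idx] = HEX_DIGITS[(b >> 4) & 0xf];       // high nibble
        arr[idx + 1] = HEX_DIGITS[b & 0xf];          // low nibble
      }
      return new String(arr);
    }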
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } // call delegate first so we can log things like the version that get set later if (next != null) next.processAdd(cmd); // Add a list of added id's to the response if (adds == null) { adds = new ArrayList<String>(); toLog.add("add",adds); } if (adds.size() < maxNumToLog) { long version = cmd.getVersion(); String msg = cmd.getPrintableId(); if (version != 0) msg = msg + " (" + version + ')'; adds.add(msg); } numAdds++; }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processDelete( DeleteUpdateCommand cmd ) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } if (next != null) next.processDelete(cmd); if (cmd.isDeleteById()) { if (deletes == null) { deletes = new ArrayList<String>(); toLog.add("delete",deletes); } if (deletes.size() < maxNumToLog) { long version = cmd.getVersion(); String msg = cmd.getId(); if (version != 0) msg = msg + " (" + version + ')'; deletes.add(msg); } } else { if (toLog.size() < maxNumToLog) { long version = cmd.getVersion(); String msg = cmd.query; if (version != 0) msg = msg + " (" + version + ')'; toLog.add("deleteByQuery", msg); } } numDeletes++; }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processMergeIndexes(MergeIndexesCommand cmd) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } if (next != null) next.processMergeIndexes(cmd); toLog.add("mergeIndexes", cmd.toString()); }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processCommit( CommitUpdateCommand cmd ) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } if (next != null) next.processCommit(cmd); final String msg = cmd.optimize ? "optimize" : "commit"; toLog.add(msg, ""); }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processRollback( RollbackUpdateCommand cmd ) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } if (next != null) next.processRollback(cmd); toLog.add("rollback", ""); }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void finish() throws IOException { if (logDebug) { log.debug("PRE_UPDATE finish()"); } if (next != null) next.finish(); // LOG A SUMMARY WHEN ALL DONE (INFO LEVEL) NamedList<Object> stdLog = rsp.getToLog(); StringBuilder sb = new StringBuilder(req.getCore().getLogId()); for (int i=0; i<stdLog.size(); i++) { String name = stdLog.getName(i); Object val = stdLog.getVal(i); if (name != null) { sb.append(name).append('='); } sb.append(val).append(' '); } stdLog.clear(); // make it so SolrCore.exec won't log this again // if id lists were truncated, show how many more there were if (adds != null && numAdds > maxNumToLog) { adds.add("... (" + numAdds + " adds)"); } if (deletes != null && numDeletes > maxNumToLog) { deletes.add("... (" + numDeletes + " deletes)"); } long elapsed = rsp.getEndTime() - req.getStartTime(); sb.append(toLog).append(" 0 ").append(elapsed); log.info(sb.toString()); }
// in core/src/java/org/apache/solr/update/processor/UniqFieldsUpdateProcessorFactory.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { if(fields != null){ SolrInputDocument solrInputDocument = cmd.getSolrInputDocument(); List<Object> uniqList = new ArrayList<Object>(); for (String field : fields) { uniqList.clear(); Collection<Object> col = solrInputDocument.getFieldValues(field); if (col != null) { for (Object o : col) { if(!uniqList.contains(o)) uniqList.add(o); } solrInputDocument.remove(field); for (Object o : uniqList) { solrInputDocument.addField(field, o); } } } } super.processAdd(cmd); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public synchronized IndexWriter getIndexWriter(SolrCore core) throws IOException { if (indexWriter == null) { indexWriter = createMainIndexWriter(core, "DirectUpdateHandler2", false, false); } return indexWriter; }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public synchronized void newIndexWriter(SolrCore core) throws IOException { if (indexWriter != null) { indexWriter.close(); } indexWriter = createMainIndexWriter(core, "DirectUpdateHandler2", false, true); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public void decref(IndexWriterCloser closer) throws IOException { synchronized (this) { refCnt--; if (refCnt == 0) { try { if (closer != null) { closer.closeWriter(indexWriter); } else if (indexWriter != null) { indexWriter.close(); } } catch (Throwable t) { log.error("Error during shutdown of writer.", t); } try { directoryFactory.close(); } catch (Throwable t) { log.error("Error during shutdown of directory factory.", t); } try { cancelRecovery(); } catch (Throwable t) { log.error("Error cancelling recovery", t); } closed = true; } } }
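decref is a clear example of the best-effort shutdown idiom that accounts for several of the catch blocks counted in this sheet: every cleanup step catches Throwable and logs, so a failure in one step cannot keep the remaining resources from being released. The idiom in general form; Step and closeAll are illustrative names, not Solr API:

    // Generic sketch of the best-effort shutdown idiom.
    interface Step { void run() throws Exception; }

    static void closeAll(org.slf4j.Logger log, Step... steps) {
      for (Step step : steps) {
        try {
          step.run();
        } catch (Throwable t) {   // log and continue so later steps still run
          log.error("Error during shutdown step", t);
        }
      }
    }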
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public synchronized void rollbackIndexWriter(SolrCore core) throws IOException { indexWriter.rollback(); newIndexWriter(core); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
protected SolrIndexWriter createMainIndexWriter(SolrCore core, String name, boolean removeAllExisting, boolean forceNewDirectory) throws IOException { return new SolrIndexWriter(name, core.getNewIndexDir(), core.getDirectoryFactory(), removeAllExisting, core.getSchema(), core.getSolrConfig().indexConfig, core.getDeletionPolicy(), core.getCodec(), forceNewDirectory); }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
private static InfoStream toInfoStream(SolrIndexConfig config) throws IOException { String infoStreamFile = config.infoStreamFile; if (infoStreamFile != null) { File f = new File(infoStreamFile); File parent = f.getParentFile(); if (parent != null) parent.mkdirs(); FileOutputStream fos = new FileOutputStream(f, true); return new PrintStreamInfoStream(new PrintStream(fos, true)); } else { return InfoStream.NO_OUTPUT; } }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
@Override public void close() throws IOException { log.debug("Closing Writer " + name); Directory directory = getDirectory(); final InfoStream infoStream = isClosed ? null : getConfig().getInfoStream(); try { super.close(); if(infoStream != null) { infoStream.close(); } } finally { isClosed = true; directoryFactory.release(directory); numCloses.incrementAndGet(); } }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
@Override public void rollback() throws IOException { try { super.rollback(); } finally { isClosed = true; } }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
public void distribDelete(DeleteUpdateCommand cmd, List<Node> urls, ModifiableSolrParams params) throws IOException { checkResponses(false); // deletes by id and by query currently take the same buffered path doDelete(cmd, urls, params); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
public void distribAdd(AddUpdateCommand cmd, List<Node> nodes, ModifiableSolrParams params) throws IOException { checkResponses(false); // make sure any pending deletes are flushed flushDeletes(1); // TODO: this is brittle // need to make a clone since these commands may be reused AddUpdateCommand clone = new AddUpdateCommand(null); clone.solrDoc = cmd.solrDoc; clone.commitWithin = cmd.commitWithin; clone.overwrite = cmd.overwrite; clone.setVersion(cmd.getVersion()); AddRequest addRequest = new AddRequest(); addRequest.cmd = clone; addRequest.params = params; for (Node node : nodes) { List<AddRequest> alist = adds.get(node); if (alist == null) { alist = new ArrayList<AddRequest>(2); adds.put(node, alist); } alist.add(addRequest); } flushAdds(maxBufferedAddsPerServer); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
public void distribCommit(CommitUpdateCommand cmd, List<Node> nodes, ModifiableSolrParams params) throws IOException { // Wait for all outstanding responses to make sure that a commit // can't sneak in ahead of adds or deletes we already sent. // We could do this on a per-server basis, but it's more complex // and this solution will lead to commits happening closer together. checkResponses(true); // currently, we dont try to piggy back on outstanding adds or deletes UpdateRequestExt ureq = new UpdateRequestExt(); ureq.setParams(params); addCommit(ureq, cmd); for (Node node : nodes) { submit(ureq, node); } // if the command wanted to block until everything was committed, // then do that here. if (cmd.waitSearcher) { checkResponses(true); } }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
private void doDelete(DeleteUpdateCommand cmd, List<Node> nodes, ModifiableSolrParams params) throws IOException { flushAdds(1); DeleteUpdateCommand clonedCmd = clone(cmd); DeleteRequest deleteRequest = new DeleteRequest(); deleteRequest.cmd = clonedCmd; deleteRequest.params = params; for (Node node : nodes) { List<DeleteRequest> dlist = deletes.get(node); if (dlist == null) { dlist = new ArrayList<DeleteRequest>(2); deletes.put(node, dlist); } dlist.add(deleteRequest); } flushDeletes(maxBufferedDeletesPerServer); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public Object resolve(Object o, JavaBinCodec codec) throws IOException { if (o instanceof BytesRef) { BytesRef br = (BytesRef)o; codec.writeByteArray(br.bytes, br.offset, br.length); return null; } return o; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public void writeExternString(String s) throws IOException { if (s == null) { writeTag(NULL); return; } // no need to synchronize globalStringMap - it's only updated before the first record is written to the log Integer idx = globalStringMap.get(s); if (idx == null) { // write a normal string writeStr(s); } else { // write the extern string writeTag(EXTERN_STRING, idx); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public String readExternString(FastInputStream fis) throws IOException { int idx = readSize(fis); if (idx != 0) {// idx != 0 is the index of the extern string // no need to synchronize globalStringList - it's only updated before the first record is written to the log return globalStringList.get(idx - 1); } else {// idx == 0 means it has a string value // this shouldn't happen with this codec subclass. throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Corrupt transaction log"); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public boolean endsWithCommit() throws IOException { long size; synchronized (this) { fos.flush(); size = fos.size(); } // the end of the file should have the end message (added during a commit) plus a 4 byte size byte[] buf = new byte[ END_MESSAGE.length() ]; long pos = size - END_MESSAGE.length() - 4; if (pos < 0) return false; ChannelFastInputStream is = new ChannelFastInputStream(channel, pos); is.read(buf); for (int i=0; i<buf.length; i++) { if (buf[i] != END_MESSAGE.charAt(i)) return false; } return true; }
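endsWithCommit relies on the log's framing: a commit appends END_MESSAGE followed by a 4-byte record size, so a cleanly closed log can be recognized from its tail alone. A sketch of the same check with plain java.io; endMessage and endsWithMarker are illustrative:

    import java.io.IOException;
    import java.io.RandomAccessFile;

    // Sketch of the tail check: does the file end with [endMessage][4-byte size]?
    static boolean endsWithMarker(RandomAccessFile raf, String endMessage) throws IOException {
      long pos = raf.length() - endMessage.length() - 4; // marker plus trailing 4-byte size
      if (pos < 0) return false;                         // file too short to hold a commit
      byte[] buf = new byte[endMessage.length()];
      raf.seek(pos);
      raf.readFully(buf);
      for (int i = 0; i < buf.length; i++) {
        if (buf[i] != endMessage.charAt(i)) return false;
      }
      return true;
    }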
// in core/src/java/org/apache/solr/update/TransactionLog.java
public void rollback(long pos) throws IOException { synchronized (this) { assert snapshot_size == pos; fos.flush(); raf.setLength(pos); fos.setWritten(pos); assert fos.size() == pos; numRecords = snapshot_numRecords; } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
private void readHeader(FastInputStream fis) throws IOException { // read existing header fis = fis != null ? fis : new ChannelFastInputStream(channel, 0); LogCodec codec = new LogCodec(); Map header = (Map)codec.unmarshal(fis); fis.readInt(); // skip size // needed to read other records synchronized (this) { globalStringList = (List<String>)header.get("strings"); globalStringMap = new HashMap<String, Integer>(globalStringList.size()); for (int i=0; i<globalStringList.size(); i++) { globalStringMap.put( globalStringList.get(i), i+1); } } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
private void writeLogHeader(LogCodec codec) throws IOException { long pos = fos.size(); assert pos == 0; Map header = new LinkedHashMap<String,Object>(); header.put("SOLR_TLOG",1); // a magic string + version number header.put("strings",globalStringList); codec.marshal(header, fos); endRecord(pos); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
private void endRecord(long startRecordPosition) throws IOException { fos.writeInt((int)(fos.size() - startRecordPosition)); numRecords++; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public ReverseReader getReverseReader() throws IOException { return new ReverseReader(); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public Object next() throws IOException, InterruptedException { long pos = fis.position(); synchronized (TransactionLog.this) { if (trace) { log.trace("Reading log record. pos="+pos+" currentSize="+fos.size()); } if (pos >= fos.size()) { return null; } fos.flushBuffer(); } if (pos == 0) { readHeader(fis); // shouldn't currently happen - header and first record are currently written at the same time synchronized (TransactionLog.this) { if (fis.position() >= fos.size()) { return null; } pos = fis.position(); } } Object o = codec.readVal(fis); // skip over record size int size = fis.readInt(); assert size == fis.position() - pos - 4; return o; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public SolrInputDocument readSolrInputDocument(FastInputStream dis) throws IOException { // Given that the SolrInputDocument is last in an add record, it's OK to just skip // reading it completely. return null; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public Object next() throws IOException { if (prevPos <= 0) return null; long endOfThisRecord = prevPos; int thisLength = nextLength; long recordStart = prevPos - thisLength; // back up to the beginning of the next record prevPos = recordStart - 4; // back up 4 more to read the length of the next record if (prevPos <= 0) return null; // this record is the header long bufferPos = fis.getBufferPos(); if (prevPos >= bufferPos) { // nothing to do... we're within the current buffer } else { // Position buffer so that this record is at the end. // For small records, this will cause subsequent calls to next() to be within the buffer. long seekPos = endOfThisRecord - fis.getBufferSize(); seekPos = Math.min(seekPos, prevPos); // seek to the start of the record if it's larger then the block size. seekPos = Math.max(seekPos, 0); fis.seek(seekPos); fis.peek(); // cause buffer to be filled } fis.seek(prevPos); nextLength = fis.readInt(); // this is the length of the *next* record (i.e. closer to the beginning) // TODO: optionally skip document data Object o = codec.readVal(fis); // assert fis.position() == prevPos + 4 + thisLength; // this is only true if we read all the data (and we currently skip reading SolrInputDocument return o; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public int readWrappedStream(byte[] target, int offset, int len) throws IOException { ByteBuffer bb = ByteBuffer.wrap(target, offset, len); int ret = ch.read(bb, readFromStream); return ret; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public void seek(long position) throws IOException { if (position <= readFromStream && position >= getBufferPos()) { // seek within buffer pos = (int)(position - getBufferPos()); } else { // long currSize = ch.size(); // not needed - underlying read should handle (unless read never done) // if (position > currSize) throw new EOFException("Read past EOF: seeking to " + position + " on file of size " + currSize + " file=" + ch); readFromStream = position; end = pos = 0; } assert position() == position; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public void close() throws IOException { ch.close(); }
// in core/src/java/org/apache/solr/core/StandardDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return FSDirectory.open(new File(path)); }
// in core/src/java/org/apache/solr/core/RAMDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return new RAMDirectory(); }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
@Override public void close() throws IOException { synchronized (this) { for (CacheValue val : byDirectoryCache.values()) { val.directory.close(); } byDirectoryCache.clear(); byPathCache.clear(); } }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
private void close(Directory directory) throws IOException { synchronized (this) { CacheValue cacheValue = byDirectoryCache.get(directory); if (cacheValue == null) { throw new IllegalArgumentException("Unknown directory: " + directory + " " + byDirectoryCache); } cacheValue.refCnt--; if (cacheValue.refCnt == 0 && cacheValue.doneWithDir) { directory.close(); byDirectoryCache.remove(directory); byPathCache.remove(cacheValue.path); } } }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
@Override public final Directory get(String path, String rawLockType) throws IOException { return get(path, rawLockType, false); }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
@Override public final Directory get(String path, String rawLockType, boolean forceNew) throws IOException { String fullPath = new File(path).getAbsolutePath(); synchronized (this) { CacheValue cacheValue = byPathCache.get(fullPath); Directory directory = null; if (cacheValue != null) { directory = cacheValue.directory; if (forceNew) { cacheValue.doneWithDir = true; if (cacheValue.refCnt == 0) { close(cacheValue.directory); } } } if (directory == null || forceNew) { directory = create(fullPath); CacheValue newCacheValue = new CacheValue(); newCacheValue.directory = directory; newCacheValue.path = fullPath; injectLockFactory(directory, path, rawLockType); byDirectoryCache.put(directory, newCacheValue); byPathCache.put(fullPath, newCacheValue); } else { cacheValue.refCnt++; } return directory; } }
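get and release form a reference-counting pair: a cached Directory is closed only once its refCnt reaches zero and it has been marked doneWithDir. A sketch of the acquire/release discipline the cache expects of callers; the path, lock type, and helper name are illustrative:

    import java.io.IOException;
    import org.apache.lucene.store.Directory;
    import org.apache.solr.core.CachingDirectoryFactory;

    // Sketch: pair every get() with a release() in a finally block.
    static void withDirectory(CachingDirectoryFactory factory) throws IOException {
      Directory dir = factory.get("/var/solr/data/index", "native");
      try {
        // ... read or write through the directory ...
      } finally {
        factory.release(dir); // decrements refCnt; closed only at zero + doneWithDir
      }
    }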
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
@Override public void release(Directory directory) throws IOException { if (directory == null) { throw new NullPointerException(); } close(directory); }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
private static Directory injectLockFactory(Directory dir, String lockPath, String rawLockType) throws IOException { if (null == rawLockType) { // we default to "simple" for backwards compatibility log.warn("No lockType configured for " + dir + " assuming 'simple'"); rawLockType = "simple"; } final String lockType = rawLockType.toLowerCase(Locale.ENGLISH).trim(); if ("simple".equals(lockType)) { // multiple SimpleFSLockFactory instances should be OK dir.setLockFactory(new SimpleFSLockFactory(lockPath)); } else if ("native".equals(lockType)) { dir.setLockFactory(new NativeFSLockFactory(lockPath)); } else if ("single".equals(lockType)) { if (!(dir.getLockFactory() instanceof SingleInstanceLockFactory)) dir .setLockFactory(new SingleInstanceLockFactory()); } else if ("none".equals(lockType)) { // Recipe for disaster log.error("CONFIGURATION WARNING: locks are disabled on " + dir); dir.setLockFactory(NoLockFactory.getNoLockFactory()); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unrecognized lockType: " + rawLockType); } return dir; }
// in core/src/java/org/apache/solr/core/NIOFSDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return new NIOFSDirectory(new File(path)); }
// in core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { MMapDirectory mapDirectory = new MMapDirectory(new File(path)); try { mapDirectory.setUseUnmap(unmapHack); } catch (Exception e) { log.warn("Unmap not supported on this JVM, continuing on without setting unmap", e); } mapDirectory.setMaxChunkSize(maxChunk); return mapDirectory; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public SolrCore reload(SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException { // TODO - what if indexwriter settings have changed SolrConfig config = new SolrConfig(resourceLoader, getSolrConfig().getName(), null); IndexSchema schema = new IndexSchema(config, getSchema().getResourceName(), null); updateHandler.incref(); SolrCore core = new SolrCore(getName(), null, config, schema, coreDescriptor, updateHandler); return core; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public SolrIndexSearcher newSearcher(String name) throws IOException { return new SolrIndexSearcher(this, getNewIndexDir(), schema, getSolrConfig().indexConfig, name, false, directoryFactory); }
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> getSearcher(boolean forceNew, boolean returnSearcher, final Future[] waitSearcher) throws IOException { return getSearcher(forceNew, returnSearcher, waitSearcher, false); }
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> getSearcher(boolean forceNew, boolean returnSearcher, final Future[] waitSearcher, boolean updateHandlerReopens) throws IOException { // it may take some time to open an index.... we may need to make // sure that two threads aren't trying to open one at the same time // if it isn't necessary. synchronized (searcherLock) { // see if we can return the current searcher if (_searcher!=null && !forceNew) { if (returnSearcher) { _searcher.incref(); return _searcher; } else { return null; } } // check to see if we can wait for someone else's searcher to be set if (onDeckSearchers>0 && !forceNew && _searcher==null) { try { searcherLock.wait(); } catch (InterruptedException e) { log.info(SolrException.toStr(e)); } } // check again: see if we can return right now if (_searcher!=null && !forceNew) { if (returnSearcher) { _searcher.incref(); return _searcher; } else { return null; } } // At this point, we know we need to open a new searcher... // first: increment count to signal other threads that we are // opening a new searcher. onDeckSearchers++; if (onDeckSearchers < 1) { // should never happen... just a sanity check log.error(logid+"ERROR!!! onDeckSearchers is " + onDeckSearchers); onDeckSearchers=1; // reset } else if (onDeckSearchers > maxWarmingSearchers) { onDeckSearchers--; String msg="Error opening new searcher. exceeded limit of maxWarmingSearchers="+maxWarmingSearchers + ", try again later."; log.warn(logid+""+ msg); // HTTP 503==service unavailable, or 409==Conflict throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE,msg); } else if (onDeckSearchers > 1) { log.warn(logid+"PERFORMANCE WARNING: Overlapping onDeckSearchers=" + onDeckSearchers); } } // a signal to decrement onDeckSearchers if something goes wrong. final boolean[] decrementOnDeckCount=new boolean[]{true}; RefCounted<SolrIndexSearcher> currSearcherHolder = null; // searcher we are autowarming from RefCounted<SolrIndexSearcher> searchHolder = null; boolean success = false; openSearcherLock.lock(); try { searchHolder = openNewSearcher(updateHandlerReopens, false); // the searchHolder will be incremented once already (and it will eventually be assigned to _searcher when registered) // increment it again if we are going to return it to the caller. if (returnSearcher) { searchHolder.incref(); } final RefCounted<SolrIndexSearcher> newSearchHolder = searchHolder; final SolrIndexSearcher newSearcher = newSearchHolder.get(); boolean alreadyRegistered = false; synchronized (searcherLock) { if (_searcher == null) { // if there isn't a current searcher then we may // want to register this one before warming is complete instead of waiting. if (solrConfig.useColdSearcher) { registerSearcher(newSearchHolder); decrementOnDeckCount[0]=false; alreadyRegistered=true; } } else { // get a reference to the current searcher for purposes of autowarming. currSearcherHolder=_searcher; currSearcherHolder.incref(); } } final SolrIndexSearcher currSearcher = currSearcherHolder==null ? null : currSearcherHolder.get(); Future future=null; // warm the new searcher based on the current searcher. // should this go before the other event handlers or after? if (currSearcher != null) { future = searcherExecutor.submit( new Callable() { public Object call() throws Exception { try { newSearcher.warm(currSearcher); } catch (Throwable e) { SolrException.log(log,e); } return null; } } ); } if (currSearcher==null && firstSearcherListeners.size() > 0) { future = searcherExecutor.submit( new Callable() { public Object call() throws Exception { try { for (SolrEventListener listener : firstSearcherListeners) { listener.newSearcher(newSearcher,null); } } catch (Throwable e) { SolrException.log(log,null,e); } return null; } } ); }
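Both warming tasks above wrap their work in a catch (Throwable), so a failed warm-up is logged rather than propagated; otherwise the exception would escape on the executor's thread instead of reaching any caller. The idiom in isolation; warmInBackground and warmWork are illustrative stand-ins:

    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Future;

    // Sketch of the warm-in-background idiom: contain all failures inside the task.
    static Future<?> warmInBackground(ExecutorService executor, final Runnable warmWork) {
      return executor.submit(new Callable<Object>() {
        public Object call() {
          try {
            warmWork.run();
          } catch (Throwable t) {
            t.printStackTrace(); // the real code logs via SolrException.log(log, e)
          }
          return null;
        }
      });
    }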
// in core/src/java/org/apache/solr/core/SolrCore.java
private void registerSearcher(RefCounted<SolrIndexSearcher> newSearcherHolder) throws IOException { synchronized (searcherLock) { try { if (_searcher != null) { _searcher.decref(); // dec refcount for this._searcher _searcher=null; } _searcher = newSearcherHolder; SolrIndexSearcher newSearcher = newSearcherHolder.get(); /*** // a searcher may have been warming asynchronously while the core was being closed. // if this happens, just close the searcher. if (isClosed()) { // NOTE: this should not happen now - see close() for details. // *BUT* if we left it enabled, this could still happen before // close() stopped the executor - so disable this test for now. log.error("Ignoring searcher register on closed core:" + newSearcher); _searcher.decref(); } ***/ newSearcher.register(); // register subitems (caches) log.info(logid+"Registered new searcher " + newSearcher); } catch (Throwable e) { // an exception in register() shouldn't be fatal. log(e); } finally { // wake up anyone waiting for a searcher // even in the face of errors. onDeckSearchers--; searcherLock.notifyAll(); } } }
// in core/src/java/org/apache/solr/core/SolrCore.java
@Override public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { getWrappedWriter().write(writer, request, response); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public List<String> getLines(String resource) throws IOException { return getLines(resource, UTF_8); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public List<String> getLines(String resource, String encoding) throws IOException { return getLines(resource, Charset.forName(encoding)); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public List<String> getLines(String resource, Charset charset) throws IOException{ BufferedReader input = null; ArrayList<String> lines; try { input = new BufferedReader(new InputStreamReader(openResource(resource), charset.newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT))); lines = new ArrayList<String>(); for (String word=null; (word=input.readLine())!=null;) { // skip initial bom marker if (lines.isEmpty() && word.length() > 0 && word.charAt(0) == '\uFEFF') word = word.substring(1); // skip comments if (word.startsWith("#")) continue; word=word.trim(); // skip blank lines if (word.length()==0) continue; lines.add(word); } }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public CoreContainer initialize() throws IOException, ParserConfigurationException, SAXException { CoreContainer cores = null; String solrHome = SolrResourceLoader.locateSolrHome(); File fconf = new File(solrHome, containerConfigFilename == null ? "solr.xml" : containerConfigFilename); log.info("looking for solr.xml: " + fconf.getAbsolutePath()); cores = new CoreContainer(solrHome); if (fconf.exists()) { cores.load(solrHome, fconf); } else { log.info("no solr.xml file found - using default"); cores.load(solrHome, new InputSource(new ByteArrayInputStream(DEF_SOLR_XML.getBytes("UTF-8")))); cores.configFile = fconf; } containerConfigFilename = cores.getConfigFile().getName(); return cores; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, File configFile ) throws ParserConfigurationException, IOException, SAXException { this.configFile = configFile; this.load(dir, new InputSource(configFile.toURI().toASCIIString())); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, InputSource cfgis) throws ParserConfigurationException, IOException, SAXException { if (null == dir) { // don't rely on SolrResourceLoader(), determine explicitly first dir = SolrResourceLoader.locateSolrHome(); } log.info("Loading CoreContainer using Solr Home: '{}'", dir); this.loader = new SolrResourceLoader(dir); solrHome = loader.getInstanceDir(); Config cfg = new Config(loader, null, cfgis, null, false); // keep orig config for persist to consult try { this.cfg = new Config(loader, null, copyDoc(cfg.getDocument())); } catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); } cfg.substituteProperties(); // Initialize Logging if(cfg.getBool("solr/logging/@enabled",true)) { String slf4jImpl = null; String fname = cfg.get("solr/logging/watcher/@class", null); try { slf4jImpl = StaticLoggerBinder.getSingleton().getLoggerFactoryClassStr(); if(fname==null) { if( slf4jImpl.indexOf("Log4j") > 0) { log.warn("Log watching is not yet implemented for log4j" ); } else if( slf4jImpl.indexOf("JDK") > 0) { fname = "JUL"; } } } catch(Throwable ex) { log.warn("Unable to read SLF4J version. LogWatcher will be disabled: "+ex); } // Now load the framework if(fname!=null) { if("JUL".equalsIgnoreCase(fname)) { logging = new JulWatcher(slf4jImpl); } // else if( "Log4j".equals(fname) ) { // logging = new Log4jWatcher(slf4jImpl); // } else { try { logging = loader.newInstance(fname, LogWatcher.class); } catch (Throwable e) { log.warn("Unable to load LogWatcher", e); } } if( logging != null ) { ListenerConfig v = new ListenerConfig(); v.size = cfg.getInt("solr/logging/watcher/@size",50); v.threshold = cfg.get("solr/logging/watcher/@threshold",null); if(v.size>0) { log.info("Registering Log Listener"); logging.registerListener(v, this); } } } } String dcoreName = cfg.get("solr/cores/@defaultCoreName", null); if(dcoreName != null && !dcoreName.isEmpty()) { defaultCoreName = dcoreName; } persistent = cfg.getBool("solr/@persistent", false); libDir = cfg.get("solr/@sharedLib", null); zkHost = cfg.get("solr/@zkHost" , null); adminPath = cfg.get("solr/cores/@adminPath", null); shareSchema = cfg.getBool("solr/cores/@shareSchema", DEFAULT_SHARE_SCHEMA); zkClientTimeout = cfg.getInt("solr/cores/@zkClientTimeout", DEFAULT_ZK_CLIENT_TIMEOUT); hostPort = cfg.get("solr/cores/@hostPort", DEFAULT_HOST_PORT); hostContext = cfg.get("solr/cores/@hostContext", DEFAULT_HOST_CONTEXT); host = cfg.get("solr/cores/@host", null); if(shareSchema){ indexSchemaCache = new ConcurrentHashMap<String ,IndexSchema>(); } adminHandler = cfg.get("solr/cores/@adminHandler", null ); managementPath = cfg.get("solr/cores/@managementPath", null ); zkClientTimeout = Integer.parseInt(System.getProperty("zkClientTimeout", Integer.toString(zkClientTimeout))); initZooKeeper(zkHost, zkClientTimeout); if (libDir != null) { File f = FileUtils.resolvePath(new File(dir), libDir); log.info( "loading shared library: "+f.getAbsolutePath() ); libLoader = SolrResourceLoader.createClassLoader(f, null); } if (adminPath != null) { if (adminHandler == null) { coreAdminHandler = new CoreAdminHandler(this); } else { coreAdminHandler = this.createMultiCoreHandler(adminHandler); } } try { containerProperties = readProperties(cfg, ((NodeList) cfg.evaluate(DEFAULT_HOST_CONTEXT, XPathConstants.NODESET)).item(0)); } catch (Throwable e) { SolrException.log(log,null,e); } NodeList nodes = (NodeList)cfg.evaluate("solr/cores/core", XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); try { String rawName = DOMUtil.getAttr(node, "name", null); if (null == rawName) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Each core in solr.xml must have a 'name'"); } String name = rawName; CoreDescriptor p = new CoreDescriptor(this, name, DOMUtil.getAttr(node, "instanceDir", null)); // deal with optional settings String opt = DOMUtil.getAttr(node, "config", null); if (opt != null) { p.setConfigName(opt); } opt = DOMUtil.getAttr(node, "schema", null); if (opt != null) { p.setSchemaName(opt); } if (zkController != null) { opt = DOMUtil.getAttr(node, "shard", null); if (opt != null && opt.length() > 0) { p.getCloudDescriptor().setShardId(opt); } opt = DOMUtil.getAttr(node, "collection", null); if (opt != null) { p.getCloudDescriptor().setCollectionName(opt); } opt = DOMUtil.getAttr(node, "roles", null); if(opt != null){ p.getCloudDescriptor().setRoles(opt); } } opt = DOMUtil.getAttr(node, "properties", null); if (opt != null) { p.setPropertiesName(opt); } opt = DOMUtil.getAttr(node, CoreAdminParams.DATA_DIR, null); if (opt != null) { p.setDataDir(opt); } p.setCoreProperties(readProperties(cfg, node)); SolrCore core = create(p); register(name, core, false); // track original names coreToOrigName.put(core, rawName); } catch (Throwable ex) { SolrException.log(log,null,ex); } } }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore create(CoreDescriptor dcore) throws ParserConfigurationException, IOException, SAXException { // Make the instanceDir relative to the cores instanceDir if not absolute File idir = new File(dcore.getInstanceDir()); if (!idir.isAbsolute()) { idir = new File(solrHome, dcore.getInstanceDir()); } String instanceDir = idir.getPath(); log.info("Creating SolrCore '{}' using instanceDir: {}", dcore.getName(), instanceDir); // Initialize the solr config SolrResourceLoader solrLoader = null; SolrConfig config = null; String zkConfigName = null; if(zkController == null) { solrLoader = new SolrResourceLoader(instanceDir, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties())); config = new SolrConfig(solrLoader, dcore.getConfigName(), null); } else { try { String collection = dcore.getCloudDescriptor().getCollectionName(); zkController.createCollectionZkNode(dcore.getCloudDescriptor()); zkConfigName = zkController.readConfigName(collection); if (zkConfigName == null) { log.error("Could not find config name for collection:" + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection); } solrLoader = new ZkSolrResourceLoader(instanceDir, zkConfigName, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties()), zkController); config = getSolrConfigFromZk(zkConfigName, dcore.getConfigName(), solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } IndexSchema schema = null; if (indexSchemaCache != null) { if (zkController != null) { File schemaFile = new File(dcore.getSchemaName()); if (!schemaFile.isAbsolute()) { schemaFile = new File(solrLoader.getInstanceDir() + "conf" + File.separator + dcore.getSchemaName()); } if (schemaFile.exists()) { String key = schemaFile.getAbsolutePath() + ":" + new SimpleDateFormat("yyyyMMddHHmmss", Locale.US).format(new Date( schemaFile.lastModified())); schema = indexSchemaCache.get(key); if (schema == null) { log.info("creating new schema object for core: " + dcore.name); schema = new IndexSchema(config, dcore.getSchemaName(), null); indexSchemaCache.put(key, schema); } else { log.info("re-using schema object for core: " + dcore.name); } } } else { // TODO: handle caching from ZooKeeper - perhaps using ZooKeepers versioning // Don't like this cache though - how does it empty as last modified changes? } } if(schema == null){ if(zkController != null) { try { schema = getSchemaFromZk(zkConfigName, dcore.getSchemaName(), config, solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } else { schema = new IndexSchema(config, dcore.getSchemaName(), null); } } SolrCore core = new SolrCore(dcore.getName(), null, config, schema, dcore); if (zkController == null && core.getUpdateHandler().getUpdateLog() != null) { // always kick off recovery if we are in standalone mode. core.getUpdateHandler().getUpdateLog().recoverFromLog(); } return core; }
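create shows the ZooKeeper exception translation used throughout CoreContainer: KeeperException is wrapped into the domain ZooKeeperException, while InterruptedException first restores the thread's interrupt flag before being wrapped the same way. The pattern in isolation; ZkCall and withZk are illustrative names, not Solr API:

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.cloud.ZooKeeperException;
    import org.apache.zookeeper.KeeperException;

    // Sketch of the wrap-and-restore-interrupt pattern.
    interface ZkCall<T> { T call() throws KeeperException, InterruptedException; }

    static <T> T withZk(ZkCall<T> op) {
      try {
        return op.call();
      } catch (KeeperException e) {
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt(); // restore the interrupted status before wrapping
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
      }
    }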
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void reload(String name) throws ParserConfigurationException, IOException, SAXException { name= checkDefault(name); SolrCore core; synchronized(cores) { core = cores.get(name); } if (core == null) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name ); CoreDescriptor cd = core.getCoreDescriptor(); File instanceDir = new File(cd.getInstanceDir()); if (!instanceDir.isAbsolute()) { instanceDir = new File(getSolrHome(), cd.getInstanceDir()); } log.info("Reloading SolrCore '{}' using instanceDir: {}", cd.getName(), instanceDir.getAbsolutePath()); SolrResourceLoader solrLoader; if(zkController == null) { solrLoader = new SolrResourceLoader(instanceDir.getAbsolutePath(), libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties())); } else { try { String collection = cd.getCloudDescriptor().getCollectionName(); zkController.createCollectionZkNode(cd.getCloudDescriptor()); String zkConfigName = zkController.readConfigName(collection); if (zkConfigName == null) { log.error("Could not find config name for collection:" + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection); } solrLoader = new ZkSolrResourceLoader(instanceDir.getAbsolutePath(), zkConfigName, libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties()), zkController); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } SolrCore newCore = core.reload(solrLoader); // keep core to orig name link String origName = coreToOrigName.remove(core); if (origName != null) { coreToOrigName.put(newCore, origName); } register(name, newCore, false); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private SolrConfig getSolrConfigFromZk(String zkConfigName, String solrConfigFileName, SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException, KeeperException, InterruptedException { byte[] config = zkController.getConfigFileData(zkConfigName, solrConfigFileName); InputSource is = new InputSource(new ByteArrayInputStream(config)); is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(solrConfigFileName)); SolrConfig cfg = solrConfigFileName == null ? new SolrConfig( resourceLoader, SolrConfig.DEFAULT_CONF_FILE, is) : new SolrConfig( resourceLoader, solrConfigFileName, is); return cfg; }
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
public void onInit(List commits) throws IOException { log.info("SolrDeletionPolicy.onInit: commits:" + str(commits)); updateCommits((List<IndexCommit>) commits); }
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
public void onCommit(List commits) throws IOException { log.info("SolrDeletionPolicy.onCommit: commits:" + str(commits)); updateCommits((List<IndexCommit>) commits); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
public void onInit(List list) throws IOException { List<IndexCommitWrapper> wrapperList = wrap(list); deletionPolicy.onInit(wrapperList); updateCommitPoints(wrapperList); cleanReserves(); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
public void onCommit(List list) throws IOException { List<IndexCommitWrapper> wrapperList = wrap(list); deletionPolicy.onCommit(wrapperList); updateCommitPoints(wrapperList); cleanReserves(); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
@Override public Collection getFileNames() throws IOException { return delegate.getFileNames(); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
@Override public Map getUserData() throws IOException { return delegate.getUserData(); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
public static long getCommitTimestamp(IndexCommit commit) throws IOException { final Map<String,String> commitData = commit.getUserData(); String commitTime = commitData.get(SolrIndexWriter.COMMIT_TIME_MSEC_KEY); if (commitTime != null) { return Long.parseLong(commitTime); } else { return 0; } }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
void persist(Writer w, SolrXMLDef solrXMLDef) throws IOException { w.write("<?xml version=\"1.0\" encoding=\"UTF-8\" ?>\n"); w.write("<solr"); Map<String,String> rootSolrAttribs = solrXMLDef.solrAttribs; Set<String> solrAttribKeys = rootSolrAttribs.keySet(); for (String key : solrAttribKeys) { String value = rootSolrAttribs.get(key); writeAttribute(w, key, value); } w.write(">\n"); Properties containerProperties = solrXMLDef.containerProperties; if (containerProperties != null && !containerProperties.isEmpty()) { writeProperties(w, containerProperties, " "); } w.write(INDENT + "<cores"); Map<String,String> coresAttribs = solrXMLDef.coresAttribs; Set<String> coreAttribKeys = coresAttribs.keySet(); for (String key : coreAttribKeys) { String value = coresAttribs.get(key); writeAttribute(w, key, value); } w.write(">\n"); for (SolrCoreXMLDef coreDef : solrXMLDef.coresDefs) { persist(w, coreDef); } w.write(INDENT + "</cores>\n"); w.write("</solr>\n"); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
private void persist(Writer w, SolrCoreXMLDef coreDef) throws IOException { w.write(INDENT + INDENT + "<core"); Set<String> keys = coreDef.coreAttribs.keySet(); for (String key : keys) { writeAttribute(w, key, coreDef.coreAttribs.get(key)); } Properties properties = coreDef.coreProperties; if (properties == null || properties.isEmpty()) w.write("/>\n"); // core else { w.write(">\n"); writeProperties(w, properties, " "); w.write(INDENT + INDENT + "</core>\n"); } }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
private void writeProperties(Writer w, Properties props, String indent) throws IOException { for (Map.Entry<Object,Object> entry : props.entrySet()) { w.write(indent + "<property"); writeAttribute(w, "name", entry.getKey()); writeAttribute(w, "value", entry.getValue()); w.write("/>\n"); } }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
private void writeAttribute(Writer w, String name, Object value) throws IOException { if (value == null) return; w.write(" "); w.write(name); w.write("=\""); XML.escapeAttributeValue(value.toString(), w); w.write("\""); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
private static void fileCopy(File src, File dest) throws IOException { IOException xforward = null; FileInputStream fis = null; FileOutputStream fos = null; FileChannel fcin = null; FileChannel fcout = null; try { fis = new FileInputStream(src); fos = new FileOutputStream(dest); fcin = fis.getChannel(); fcout = fos.getChannel(); // do the file copy 32Mb at a time final int MB32 = 32 * 1024 * 1024; long size = fcin.size(); long position = 0; while (position < size) { position += fcin.transferTo(position, MB32, fcout); } } catch (IOException xio) { xforward = xio; } finally { if (fis != null) try { fis.close(); fis = null; } catch (IOException xio) {} if (fos != null) try { fos.close(); fos = null; } catch (IOException xio) {} if (fcin != null && fcin.isOpen()) try { fcin.close(); fcin = null; } catch (IOException xio) {} if (fcout != null && fcout.isOpen()) try { fcout.close(); fcout = null; } catch (IOException xio) {} } if (xforward != null) { throw xforward; } }
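The fileCopy helper above shows a pre-Java-7 cleanup idiom: the copy loop's IOException is parked in xforward, each of the four resources is closed in its own empty-catch block, and the saved exception is rethrown at the end. A minimal sketch of the same 32 MB chunked channel copy written with try-with-resources (Java 7+), offered as an illustrative alternative rather than Solr's code:

  import java.io.File;
  import java.io.FileInputStream;
  import java.io.FileOutputStream;
  import java.io.IOException;
  import java.nio.channels.FileChannel;

  public class FileCopySketch {
    // Same chunked transferTo() loop; close() failures become suppressed exceptions.
    static void fileCopy(File src, File dest) throws IOException {
      final int MB32 = 32 * 1024 * 1024;
      try (FileInputStream fis = new FileInputStream(src);
           FileOutputStream fos = new FileOutputStream(dest);
           FileChannel fcin = fis.getChannel();
           FileChannel fcout = fos.getChannel()) {
        long size = fcin.size();
        long position = 0;
        while (position < size) {
          position += fcin.transferTo(position, MB32, fcout);
        }
      }
    }
  }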
// in core/src/java/org/apache/solr/core/SimpleFSDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return new SimpleFSDirectory(new File(path)); }
// in core/src/java/org/apache/solr/core/NRTCachingDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return new NRTCachingDirectory(FSDirectory.open(new File(path)), 4, 48); }
// in core/src/java/org/apache/solr/core/StandardIndexReaderFactory.java
@Override public DirectoryReader newReader(Directory indexDir, SolrCore core) throws IOException { return DirectoryReader.open(indexDir, termInfosIndexDivisor); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static int numDocs(SolrIndexSearcher s, Query q, Query f) throws IOException { return (null == f) ? s.getDocSet(q).size() : s.numDocs(q,f); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static void optimizePreFetchDocs(ResponseBuilder rb, DocList docs, Query query, SolrQueryRequest req, SolrQueryResponse res) throws IOException { SolrIndexSearcher searcher = req.getSearcher(); if(!searcher.enableLazyFieldLoading) { // nothing to do return; } ReturnFields returnFields = res.getReturnFields(); if(returnFields.getLuceneFieldNames() != null) { Set<String> fieldFilter = returnFields.getLuceneFieldNames(); if (rb.doHighlights) { // copy return fields list fieldFilter = new HashSet<String>(fieldFilter); // add highlight fields SolrHighlighter highlighter = HighlightComponent.getHighlighter(req.getCore()); for (String field: highlighter.getHighlightFields(query, req, null)) fieldFilter.add(field); // fetch unique key if one exists. SchemaField keyField = req.getSearcher().getSchema().getUniqueKeyField(); if(null != keyField) fieldFilter.add(keyField.getName()); } // get documents DocIterator iter = docs.iterator(); for (int i=0; i<docs.size(); i++) { searcher.doc(iter.nextDoc(), fieldFilter); } } }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static NamedList doStandardDebug(SolrQueryRequest req, String userQuery, Query query, DocList results, boolean dbgQuery, boolean dbgResults) throws IOException { NamedList dbg = null; dbg = new SimpleOrderedMap(); SolrIndexSearcher searcher = req.getSearcher(); IndexSchema schema = req.getSchema(); boolean explainStruct = req.getParams().getBool(CommonParams.EXPLAIN_STRUCT, false); if (dbgQuery) { /* userQuery may have been pre-processed .. expose that */ dbg.add("rawquerystring", req.getParams().get(CommonParams.Q)); dbg.add("querystring", userQuery); /* QueryParsing.toString isn't perfect, use it to see converted * values, use regular toString to see any attributes of the * underlying Query it may have missed. */ dbg.add("parsedquery", QueryParsing.toString(query, schema)); dbg.add("parsedquery_toString", query.toString()); } if (dbgResults) { NamedList<Explanation> explain = getExplanations(query, results, searcher, schema); dbg.add("explain", explainStruct ? explanationsToNamedLists(explain) : explanationsToStrings(explain)); String otherQueryS = req.getParams().get(CommonParams.EXPLAIN_OTHER); if (otherQueryS != null && otherQueryS.length() > 0) { DocList otherResults = doSimpleQuery (otherQueryS, req, 0, 10); dbg.add("otherQuery", otherQueryS); NamedList<Explanation> explainO = getExplanations(query, otherResults, searcher, schema); dbg.add("explainOther", explainStruct ? explanationsToNamedLists(explainO) : explanationsToStrings(explainO)); } } return dbg; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static NamedList<Explanation> getExplanations (Query query, DocList docs, SolrIndexSearcher searcher, IndexSchema schema) throws IOException { NamedList<Explanation> explainList = new SimpleOrderedMap<Explanation>(); DocIterator iterator = docs.iterator(); for (int i=0; i<docs.size(); i++) { int id = iterator.nextDoc(); Document doc = searcher.doc(id); String strid = schema.printableUniqueKey(doc); explainList.add(strid, searcher.explain(query, id) ); } return explainList; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static DocList doSimpleQuery(String sreq, SolrQueryRequest req, int start, int limit) throws IOException { List<String> commands = StrUtils.splitSmart(sreq,';'); String qs = commands.size() >= 1 ? commands.get(0) : ""; try { Query query = QParser.getParser(qs, null, req).getQuery(); // If the first non-query, non-filter command is a simple sort on an indexed field, then // we can use the Lucene sort ability. Sort sort = null; if (commands.size() >= 2) { sort = QueryParsing.parseSort(commands.get(1), req); } DocList results = req.getSearcher().getDocList(query,(DocSet)null, sort, start, limit); return results; } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); } }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException { newCache.put(oldKey,oldVal); return true; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static SolrDocumentList docListToSolrDocumentList( DocList docs, SolrIndexSearcher searcher, Set<String> fields, Map<SolrDocument, Integer> ids ) throws IOException { IndexSchema schema = searcher.getSchema(); SolrDocumentList list = new SolrDocumentList(); list.setNumFound(docs.matches()); list.setMaxScore(docs.maxScore()); list.setStart(docs.offset()); DocIterator dit = docs.iterator(); while (dit.hasNext()) { int docid = dit.nextDoc(); Document luceneDoc = searcher.doc(docid, fields); SolrDocument doc = new SolrDocument(); for( IndexableField field : luceneDoc) { if (null == fields || fields.contains(field.name())) { SchemaField sf = schema.getField( field.name() ); doc.addField( field.name(), sf.getType().toObject( field ) ); } } if (docs.hasScores() && (null == fields || fields.contains("score"))) { doc.addField("score", dit.score()); } list.add( doc ); if( ids != null ) { ids.put( doc, new Integer(docid) ); } } return list; }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void write(int c) throws IOException { write((char)c); }
// in core/src/java/org/apache/solr/util/FastWriter.java
public void write(char c) throws IOException { if (pos >= buf.length) { sink.write(buf,0,pos); pos=0; } buf[pos++] = c; }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public FastWriter append(char c) throws IOException { if (pos >= buf.length) { sink.write(buf,0,pos); pos=0; } buf[pos++] = c; return this; }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void write(char cbuf[], int off, int len) throws IOException { int space = buf.length - pos; if (len < space) { System.arraycopy(cbuf, off, buf, pos, len); pos += len; } else if (len<BUFSIZE) { // if the data to write is small enough, buffer it. System.arraycopy(cbuf, off, buf, pos, space); sink.write(buf, 0, buf.length); pos = len-space; System.arraycopy(cbuf, off+space, buf, 0, pos); } else { sink.write(buf,0,pos); // flush pos=0; // don't buffer, just write to sink sink.write(cbuf, off, len); } }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void write(String str, int off, int len) throws IOException { int space = buf.length - pos; if (len < space) { str.getChars(off, off+len, buf, pos); pos += len; } else if (len<BUFSIZE) { // if the data to write is small enough, buffer it. str.getChars(off, off+space, buf, pos); sink.write(buf, 0, buf.length); str.getChars(off+space, off+len, buf, 0); pos = len-space; } else { sink.write(buf,0,pos); // flush pos=0; // don't buffer, just write to sink sink.write(str, off, len); } }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void flush() throws IOException { sink.write(buf,0,pos); pos=0; sink.flush(); }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void close() throws IOException { flush(); sink.close(); }
// in core/src/java/org/apache/solr/util/FastWriter.java
public void flushBuffer() throws IOException { sink.write(buf, 0, pos); pos=0; }
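FastWriter buffers single characters and small writes in a char[] and forwards to the wrapped sink only when the buffer fills, while writes of BUFSIZE or more bypass the buffer entirely. A hedged usage sketch; the single-argument constructor wrapping a plain Writer is an assumption based on how the class is used elsewhere in Solr:

  import java.io.IOException;
  import java.io.StringWriter;
  import org.apache.solr.util.FastWriter;

  public class FastWriterDemo {
    public static void main(String[] args) throws IOException {
      StringWriter sink = new StringWriter();
      FastWriter fw = new FastWriter(sink); // assumed constructor: FastWriter(Writer)
      fw.write("hello ");
      fw.append('w').append('o');
      fw.write("rld", 0, 3);
      fw.flush();                 // pushes any buffered chars into the sink
      System.out.println(sink);   // prints: hello world
    }
  }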
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
URI resolveRelativeURI(String baseURI, String systemId) throws IOException,URISyntaxException { URI uri; // special case for backwards compatibility: if relative systemId starts with "/" (we convert that to an absolute solrres:-URI) if (systemId.startsWith("/")) { uri = new URI(RESOURCE_LOADER_URI_SCHEME, RESOURCE_LOADER_AUTHORITY_ABSOLUTE, "/", null, null).resolve(systemId); } else { // simply parse as URI uri = new URI(systemId); } // do relative resolving if (baseURI != null ) { uri = new URI(baseURI).resolve(uri); } return uri; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public InputSource resolveEntity(String name, String publicId, String baseURI, String systemId) throws IOException { if (systemId == null) return null; try { final URI uri = resolveRelativeURI(baseURI, systemId); // check scheme and resolve with ResourceLoader if (RESOURCE_LOADER_URI_SCHEME.equals(uri.getScheme())) { String path = uri.getPath(), authority = uri.getAuthority(); if (!RESOURCE_LOADER_AUTHORITY_ABSOLUTE.equals(authority)) { path = path.substring(1); } try { final InputSource is = new InputSource(loader.openResource(path)); is.setSystemId(uri.toASCIIString()); is.setPublicId(publicId); return is; } catch (RuntimeException re) { // unfortunately XInclude fallback only works with IOException, but openResource() never throws that one throw (IOException) (new IOException(re.getMessage()).initCause(re)); } } else { // resolve all other URIs using the standard resolver return null; } } catch (URISyntaxException use) { log.warn("A URI syntax problem occurred while resolving the systemId, falling back to default resolver", use); return null; } }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public InputSource resolveEntity(String publicId, String systemId) throws IOException { return resolveEntity(null, publicId, null, systemId); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
public synchronized Transformer getTransformer(SolrConfig solrConfig, String filename,int cacheLifetimeSeconds) throws IOException { // For now, the Templates are blindly reloaded once cacheExpires is over. // It'd be better to check the file modification time to reload only if needed. if(lastTemplates!=null && filename.equals(lastFilename) && System.currentTimeMillis() < cacheExpires) { if(log.isDebugEnabled()) { log.debug("Using cached Templates:" + filename); } } else { lastTemplates = getTemplates(solrConfig.getResourceLoader(), filename,cacheLifetimeSeconds); } Transformer result = null; try { result = lastTemplates.newTransformer(); } catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; } return result; }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
private Templates getTemplates(ResourceLoader loader, String filename,int cacheLifetimeSeconds) throws IOException { Templates result = null; lastFilename = null; try { if(log.isDebugEnabled()) { log.debug("compiling XSLT templates:" + filename); } final String fn = "xslt/" + filename; final TransformerFactory tFactory = TransformerFactory.newInstance(); tFactory.setURIResolver(new SystemIdResolver(loader).asURIResolver()); tFactory.setErrorListener(xmllog); final StreamSource src = new StreamSource(loader.openResource(fn), SystemIdResolver.createSystemIdFromResourceName(fn)); try { result = tFactory.newTemplates(src); } finally { // some XML parsers are broken and don't close the byte stream (but they should according to spec) IOUtils.closeQuietly(src.getInputStream()); } } catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; } lastFilename = filename; lastTemplates = result; cacheExpires = System.currentTimeMillis() + (cacheLifetimeSeconds * 1000); return result; }
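Both TransformerProvider methods, like SystemIdResolver.resolveEntity above, wrap a checked library exception into an IOException via initCause, because the chaining IOException(String, Throwable) constructor only exists since Java 6. A minimal side-by-side sketch of the two forms:

  import java.io.IOException;

  public class WrapSketch {
    // The pre-Java-6 idiom used in the snippets: construct, then attach the cause.
    static IOException wrapLegacy(String msg, Exception cause) {
      final IOException ioe = new IOException(msg);
      ioe.initCause(cause);
      return ioe;
    }

    // The equivalent on Java 6+: the chaining constructor does it in one step.
    static IOException wrapModern(String msg, Exception cause) {
      return new IOException(msg, cause);
    }
  }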
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { final File[] files = new File[args.length]; for (int i = 0; i < args.length; i++) { files[i] = new File(args[i]); } final FindClasses finder = new FindClasses(files); final ClassLoader cl = finder.getClassLoader(); final Class TOKENSTREAM = cl.loadClass("org.apache.lucene.analysis.TokenStream"); final Class TOKENIZER = cl.loadClass("org.apache.lucene.analysis.Tokenizer"); final Class TOKENFILTER = cl.loadClass("org.apache.lucene.analysis.TokenFilter"); final Class TOKENIZERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenizerFactory"); final Class TOKENFILTERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenFilterFactory"); final HashSet<Class> result = new HashSet<Class>(finder.findExtends(TOKENIZER)); result.addAll(finder.findExtends(TOKENFILTER)); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENIZERFACTORY), "create", Reader.class).values()); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENFILTERFACTORY), "create", TOKENSTREAM).values()); for (final Class c : result) { System.out.println(c.getName()); } }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { FindClasses finder = new FindClasses(new File(args[1])); ClassLoader cl = finder.getClassLoader(); Class clazz = cl.loadClass(args[0]); if (args.length == 2) { System.out.println("Finding all extenders of " + clazz.getName()); for (Class c : finder.findExtends(clazz)) { System.out.println(c.getName()); } } else { String methName = args[2]; System.out.println("Finding all extenders of " + clazz.getName() + " with method: " + methName); Class[] methArgs = new Class[args.length-3]; for (int i = 3; i < args.length; i++) { methArgs[i-3] = cl.loadClass(args[i]); } Map<Class,Class> map = finder.findMethodReturns (finder.findExtends(clazz),methName, methArgs); for (Class key : map.keySet()) { System.out.println(key.getName() + " => " + map.get(key).getName()); } } }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
private static void pipe(InputStream source, OutputStream dest) throws IOException { byte[] buf = new byte[1024]; int read = 0; while ( (read = source.read(buf) ) >= 0) { if (null != dest) dest.write(buf, 0, read); } if (null != dest) dest.flush(); }
// in core/src/java/org/apache/solr/util/FileUtils.java
public static void copyFile(File src , File destination) throws IOException { FileChannel in = null; FileChannel out = null; try { in = new FileInputStream(src).getChannel(); out = new FileOutputStream(destination).getChannel(); in.transferTo(0, in.size(), out); } finally { try { if (in != null) in.close(); } catch (IOException e) {} try { if (out != null) out.close(); } catch (IOException e) {} } }
// in core/src/java/org/apache/solr/util/FileUtils.java
public static void sync(File fullFile) throws IOException { if (fullFile == null || !fullFile.exists()) throw new FileNotFoundException("File does not exist " + fullFile); boolean success = false; int retryCount = 0; IOException exc = null; while(!success && retryCount < 5) { retryCount++; RandomAccessFile file = null; try { try { file = new RandomAccessFile(fullFile, "rw"); file.getFD().sync(); success = true; } finally { if (file != null) file.close(); } } catch (IOException ioe) { if (exc == null) exc = ioe; try { // Pause 5 msec Thread.sleep(5); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); } } } if (!success) // Throw original exception throw exc; }
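FileUtils.sync retries the fsync up to five times with a 5 ms pause, remembers only the first IOException, and rethrows it if no attempt succeeds. A generic sketch of that retry-and-keep-the-original-failure idiom; the helper and its names are hypothetical, not Solr API:

  import java.io.IOException;
  import java.util.concurrent.Callable;

  public class RetrySketch {
    // Hypothetical helper: run the action up to maxTries times and,
    // if every attempt fails, rethrow the *first* IOException seen.
    static <T> T retry(Callable<T> action, int maxTries, long pauseMs) throws IOException {
      IOException first = null;
      for (int i = 0; i < maxTries; i++) {
        try {
          return action.call();
        } catch (IOException ioe) {
          if (first == null) first = ioe; // keep the original, as FileUtils.sync does
          try {
            Thread.sleep(pauseMs);
          } catch (InterruptedException ie) {
            Thread.currentThread().interrupt(); // restore the interrupted status
          }
        } catch (Exception other) {
          throw new RuntimeException(other); // non-I/O failure: fail fast
        }
      }
      throw first; // maxTries must be >= 1 so first is non-null here
    }
  }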
(Lib) IllegalStateException: thrown 19 times
              
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
public String getShard(int hash, String collection) { RangeInfo rangInfo = getRanges(collection); int cnt = 0; for (Range range : rangInfo.ranges) { if (hash < range.max) { return rangInfo.shardList.get(cnt); } cnt++; } throw new IllegalStateException("The HashPartitioner failed"); }
// in solrj/src/java/org/apache/solr/client/solrj/request/FieldAnalysisRequest.java
@Override public FieldAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { if (fieldTypes == null && fieldNames == null) { throw new IllegalStateException("At least one field type or field name needs to be specified"); } if (fieldValue == null) { throw new IllegalStateException("The field value must be set"); } long startTime = System.currentTimeMillis(); FieldAnalysisResponse res = new FieldAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SortedMapBackedCache.java
private void checkOpen(boolean shouldItBe) { if (!isOpen && shouldItBe) { throw new IllegalStateException( "Must call open() before using this cache."); } if (isOpen && !shouldItBe) { throw new IllegalStateException("The cache is already open."); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SortedMapBackedCache.java
private void checkReadOnly() { if (isReadOnly) { throw new IllegalStateException("Cache is read-only."); } }
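The two SortedMapBackedCache checks are lifecycle guard clauses: misuse of the object (using it before open(), writing when read-only) is reported as IllegalStateException at the call site. A minimal self-contained sketch of the same pattern, with an invented GuardedCache class:

  // Minimal sketch of the lifecycle-guard pattern used by SortedMapBackedCache.
  public class GuardedCache {
    private boolean isOpen;
    private final boolean isReadOnly;

    public GuardedCache(boolean readOnly) { this.isReadOnly = readOnly; }

    public void open() {
      if (isOpen) throw new IllegalStateException("The cache is already open.");
      isOpen = true;
    }

    public void put(String key, String value) {
      if (!isOpen) throw new IllegalStateException("Must call open() before using this cache.");
      if (isReadOnly) throw new IllegalStateException("Cache is read-only.");
      // ... store key -> value ...
    }
  }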
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public int getLocalPort() { if (lastPort == -1) { throw new IllegalStateException("You cannot get the port until this instance has started"); } return lastPort; }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/TopGroupsFieldCommand.java
public TopGroupsFieldCommand build() { if (field == null || groupSort == null || sortWithinGroup == null || firstPhaseGroups == null || maxDocPerGroup == null) { throw new IllegalStateException("All required fields must be set"); } return new TopGroupsFieldCommand(field, groupSort, sortWithinGroup, firstPhaseGroups, maxDocPerGroup, needScores, needMaxScore); }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/SearchGroupsFieldCommand.java
public SearchGroupsFieldCommand build() { if (field == null || groupSort == null || topNGroups == null) { throw new IllegalStateException("All fields must be set"); } return new SearchGroupsFieldCommand(field, groupSort, topNGroups, includeGroupCount); }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/QueryCommand.java
public QueryCommand build() { if (sort == null || query == null || docSet == null || docsToCollect == null) { throw new IllegalStateException("All fields must be set"); } return new QueryCommand(sort, query, docsToCollect, needScores, docSet, queryString); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
public CommandHandler build() { if (queryCommand == null || searcher == null) { throw new IllegalStateException("All fields must be set"); } return new CommandHandler(queryCommand, commands, searcher, needDocSet, truncateGroups, includeHitCount); }
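The grouping command builders above all validate in build(): construction fails with IllegalStateException the moment a required field is missing, instead of failing later with a NullPointerException mid-search. A compact sketch of the idiom; QuerySpec is a made-up example type:

  // Sketch of the validate-in-build() idiom used by the grouping command builders.
  public class QuerySpec {
    private final String query;
    private final int rows;

    private QuerySpec(String query, int rows) { this.query = query; this.rows = rows; }

    public static class Builder {
      private String query;
      private Integer rows; // boxed, so "never set" is distinguishable from 0

      public Builder setQuery(String query) { this.query = query; return this; }
      public Builder setRows(int rows) { this.rows = rows; return this; }

      public QuerySpec build() {
        if (query == null || rows == null) {
          throw new IllegalStateException("All required fields must be set");
        }
        return new QuerySpec(query, rows);
      }
    }
  }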
// in core/src/java/org/apache/solr/logging/jul/JulWatcher.java
@Override public void setThreshold(String level) { if(handler==null) { throw new IllegalStateException("Must have a handler"); } handler.setLevel( Level.parse(level) ); }
// in core/src/java/org/apache/solr/logging/jul/JulWatcher.java
@Override public String getThreshold() { if(handler==null) { throw new IllegalStateException("Must have a handler"); } return handler.getLevel().toString(); }
// in core/src/java/org/apache/solr/logging/jul/JulWatcher.java
@Override public void registerListener(ListenerConfig cfg, CoreContainer container) { if(history!=null) { throw new IllegalStateException("History already registered"); } history = new CircularList<LogRecord>(cfg.size); handler = new RecordHandler(this); if(cfg.threshold != null) { handler.setLevel(Level.parse(cfg.threshold)); } else { handler.setLevel(Level.WARNING); } Logger log = LogManager.getLogManager().getLogger(""); log.addHandler(handler); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
private int getSeq(String nStringSequence) { int seq = 0; Matcher m = LEADER_SEQ.matcher(nStringSequence); if (m.matches()) { seq = Integer.parseInt(m.group(1)); } else { throw new IllegalStateException("Could not find regex match in:" + nStringSequence); } return seq; }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
private String getNodeId(String nStringSequence) { String id; Matcher m = SESSION_ID.matcher(nStringSequence); if (m.matches()) { id = m.group(1); } else { throw new IllegalStateException("Could not find regex match in:" + nStringSequence); } return id; }
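Both LeaderElector helpers treat a node name that does not match the expected pattern as a broken internal invariant rather than user input, hence IllegalStateException instead of a request-level error. A standalone sketch of the match-or-fail extraction; the pattern below is a stand-in, not the real LEADER_SEQ regex:

  import java.util.regex.Matcher;
  import java.util.regex.Pattern;

  public class MatchOrFail {
    // Stand-in pattern; the real LEADER_SEQ regex may differ.
    private static final Pattern SEQ = Pattern.compile(".*-n_(\\d+)");

    static int getSeq(String nodeName) {
      Matcher m = SEQ.matcher(nodeName);
      if (!m.matches()) {
        // A non-matching name means a broken invariant, so fail loudly.
        throw new IllegalStateException("Could not find regex match in:" + nodeName);
      }
      return Integer.parseInt(m.group(1));
    }

    public static void main(String[] args) {
      System.out.println(getSeq("elect-n_0000000042")); // prints: 42
    }
  }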
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public synchronized void incref() { if (refCnt == 0) { throw new IllegalStateException("IndexWriter has been closed"); } refCnt++; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore register(String name, SolrCore core, boolean returnPrevNotClosed) { if( core == null ) { throw new RuntimeException( "Can not register a null core." ); } if( name == null || name.indexOf( '/' ) >= 0 || name.indexOf( '\\' ) >= 0 ){ throw new RuntimeException( "Invalid core name: "+name ); } if (zkController != null) { // this happens before we can receive requests zkController.preRegister(core.getCoreDescriptor()); } SolrCore old = null; synchronized (cores) { if (isShutDown) { core.close(); throw new IllegalStateException("This CoreContainer has been shutdown"); } old = cores.put(name, core); /* * set both the name of the descriptor and the name of the * core, since the descriptors name is used for persisting. */ core.setName(name); core.getCoreDescriptor().name = name; } if( old == null || old == core) { log.info( "registering core: "+name ); registerInZk(core); return null; } else { log.info( "replacing core: "+name ); if (!returnPrevNotClosed) { old.close(); } registerInZk(core); return old; } }
// in core/src/java/org/apache/solr/util/DateMathParser.java
public static void round(Calendar c, String unit) { Integer uu = CALENDAR_UNITS.get(unit); if (null == uu) { throw new IllegalArgumentException("Rounding Unit not recognized: " + unit); } int u = uu.intValue(); switch (u) { case Calendar.YEAR: c.clear(Calendar.MONTH); /* fall through */ case Calendar.MONTH: c.clear(Calendar.DAY_OF_MONTH); c.clear(Calendar.DAY_OF_WEEK); c.clear(Calendar.DAY_OF_WEEK_IN_MONTH); c.clear(Calendar.DAY_OF_YEAR); c.clear(Calendar.WEEK_OF_MONTH); c.clear(Calendar.WEEK_OF_YEAR); /* fall through */ case Calendar.DATE: c.clear(Calendar.HOUR_OF_DAY); c.clear(Calendar.HOUR); c.clear(Calendar.AM_PM); /* fall through */ case Calendar.HOUR_OF_DAY: c.clear(Calendar.MINUTE); /* fall through */ case Calendar.MINUTE: c.clear(Calendar.SECOND); /* fall through */ case Calendar.SECOND: c.clear(Calendar.MILLISECOND); break; default: throw new IllegalStateException ("No logic for rounding value ("+u+") " + unit); } }
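DateMathParser.round relies on deliberate switch fall-through: rounding to a coarse unit clears that unit's finer fields and then falls into the cases that clear everything below it, and any unhandled Calendar constant trips the IllegalStateException in the default arm. A trimmed standalone sketch of the same structure, with the field list reduced for brevity:

  import java.util.Calendar;

  public class RoundSketch {
    // Truncate c down to the given unit via deliberate case fall-through,
    // mirroring the shape of DateMathParser.round.
    static void round(Calendar c, int unit) {
      switch (unit) {
        case Calendar.DATE:
          c.clear(Calendar.HOUR_OF_DAY); /* fall through */
        case Calendar.HOUR_OF_DAY:
          c.clear(Calendar.MINUTE);      /* fall through */
        case Calendar.MINUTE:
          c.clear(Calendar.SECOND);      /* fall through */
        case Calendar.SECOND:
          c.clear(Calendar.MILLISECOND);
          break;
        default:
          throw new IllegalStateException("No logic for rounding value (" + unit + ")");
      }
    }
  }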
Thrown from catch blocks: 0 / Declared in throws clauses: 2
              
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public InputStream getContent() throws IOException, IllegalStateException { return new GZIPInputStream(wrappedEntity.getContent()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public InputStream getContent() throws IOException, IllegalStateException { return new InflaterInputStream(wrappedEntity.getContent()); }
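The two getContent() overrides above belong to decompressing entity wrappers: each layers a GZIPInputStream or InflaterInputStream over the wrapped entity's raw stream. A sketch of the enclosing class shape, assuming HttpClient 4's HttpEntityWrapper, which exposes the delegate as the protected wrappedEntity field:

  import java.io.IOException;
  import java.io.InputStream;
  import java.util.zip.GZIPInputStream;
  import org.apache.http.HttpEntity;
  import org.apache.http.entity.HttpEntityWrapper;

  // Assumed shape of the wrapper these overrides live in (HttpClient 4.x).
  class GzipDecompressingEntity extends HttpEntityWrapper {
    GzipDecompressingEntity(HttpEntity entity) {
      super(entity); // the delegate becomes the protected field wrappedEntity
    }

    @Override
    public InputStream getContent() throws IOException, IllegalStateException {
      return new GZIPInputStream(wrappedEntity.getContent());
    }

    @Override
    public long getContentLength() {
      return -1; // length after decompression is unknown
    }
  }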
(Domain) SolrServerException: thrown 19 times
              
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
@Override public QueryResponse process( SolrServer server ) throws SolrServerException { try { long startTime = System.currentTimeMillis(); QueryResponse res = new QueryResponse( server.request( this ), server ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; } catch (SolrServerException e){ throw e; } catch (SolrException s){ throw s; } catch (Exception e) { throw new SolrServerException("Error executing query", e); } }
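QueryRequest.process shows the selective rethrow-or-wrap idiom: SolrServerException and SolrException pass through untouched and only unknown exception types are wrapped, so callers never see double-wrapped errors. A generic sketch; the helper is invented, and since SolrException extends RuntimeException it passes through the RuntimeException arm:

  import java.util.concurrent.Callable;
  import org.apache.solr.client.solrj.SolrServerException;

  // Invented helper illustrating the rethrow-or-wrap idiom from QueryRequest.process.
  public class WrapOnce {
    static <T> T call(Callable<T> action) throws SolrServerException {
      try {
        return action.call();
      } catch (SolrServerException e) {
        throw e; // already the declared type: pass it through unchanged
      } catch (RuntimeException e) {
        throw e; // covers SolrException, which extends RuntimeException
      } catch (Exception e) {
        throw new SolrServerException("Error executing query", e); // wrap exactly once
      }
    }
  }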
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { connect(); // TODO: if you can hash here, you could favor the shard leader CloudState cloudState = zkStateReader.getCloudState(); SolrParams reqParams = request.getParams(); if (reqParams == null) { reqParams = new ModifiableSolrParams(); } String collection = reqParams.get("collection", defaultCollection); if (collection == null) { throw new SolrServerException("No collection param specified on request and no default collection has been set."); } // Extract each comma separated collection name and store in a List. List<String> collectionList = StrUtils.splitSmart(collection, ",", true); // Retrieve slices from the cloud state and, for each collection specified, // add it to the Map of slices. Map<String,Slice> slices = new HashMap<String,Slice>(); for (int i = 0; i < collectionList.size(); i++) { String coll= collectionList.get(i); ClientUtils.appendMap(coll, slices, cloudState.getSlices(coll)); } Set<String> liveNodes = cloudState.getLiveNodes(); // IDEA: have versions on various things... like a global cloudState version // or shardAddressVersion (which only changes when the shards change) // to allow caching. // build a map of unique nodes // TODO: allow filtering by group, role, etc Map<String,ZkNodeProps> nodes = new HashMap<String,ZkNodeProps>(); List<String> urlList = new ArrayList<String>(); for (Slice slice : slices.values()) { for (ZkNodeProps nodeProps : slice.getShards().values()) { ZkCoreNodeProps coreNodeProps = new ZkCoreNodeProps(nodeProps); String node = coreNodeProps.getNodeName(); if (!liveNodes.contains(coreNodeProps.getNodeName()) || !coreNodeProps.getState().equals( ZkStateReader.ACTIVE)) continue; if (nodes.put(node, nodeProps) == null) { String url = coreNodeProps.getCoreUrl(); urlList.add(url); } } } Collections.shuffle(urlList, rand); //System.out.println("########################## MAKING REQUEST TO " + urlList); LBHttpSolrServer.Req req = new LBHttpSolrServer.Req(request, urlList); LBHttpSolrServer.Rsp rsp = lbServer.request(req); return rsp.getResponse(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public NamedList<Object> request(final SolrRequest request, final ResponseParser processor) throws SolrServerException, IOException { HttpRequestBase method = null; InputStream is = null; SolrParams params = request.getParams(); Collection<ContentStream> streams = requestWriter.getContentStreams(request); String path = requestWriter.getPath(request); if (path == null || !path.startsWith("/")) { path = DEFAULT_PATH; } ResponseParser parser = request.getResponseParser(); if (parser == null) { parser = this.parser; } // The parser 'wt=' and 'version=' params are used instead of the original // params ModifiableSolrParams wparams = new ModifiableSolrParams(params); wparams.set(CommonParams.WT, parser.getWriterType()); wparams.set(CommonParams.VERSION, parser.getVersion()); if (invariantParams != null) { wparams.add(invariantParams); } params = wparams; int tries = maxRetries + 1; try { while( tries-- > 0 ) { // Note: since we aren't doing intermittent time keeping // ourselves, the potential non-timeout latency could be as // much as tries-times (plus scheduling effects) the given // timeAllowed. try { if( SolrRequest.METHOD.GET == request.getMethod() ) { if( streams != null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "GET can't send streams!" ); } method = new HttpGet( baseUrl + path + ClientUtils.toQueryString( params, false ) ); } else if( SolrRequest.METHOD.POST == request.getMethod() ) { String url = baseUrl + path; boolean isMultipart = ( streams != null && streams.size() > 1 ); LinkedList<NameValuePair> postParams = new LinkedList<NameValuePair>(); if (streams == null || isMultipart) { HttpPost post = new HttpPost(url); post.setHeader("Content-Charset", "UTF-8"); if (!this.useMultiPartPost && !isMultipart) { post.addHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8"); } List<FormBodyPart> parts = new LinkedList<FormBodyPart>(); Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String p = iter.next(); String[] vals = params.getParams(p); if (vals != null) { for (String v : vals) { if (this.useMultiPartPost || isMultipart) { parts.add(new FormBodyPart(p, new StringBody(v, Charset.forName("UTF-8")))); } else { postParams.add(new BasicNameValuePair(p, v)); } } } } if (isMultipart) { for (ContentStream content : streams) { String contentType = content.getContentType(); if(contentType==null) { contentType = "application/octet-stream"; // default } parts.add(new FormBodyPart(content.getName(), new InputStreamBody( content.getStream(), contentType, content.getName()))); } } if (parts.size() > 0) { MultipartEntity entity = new MultipartEntity(HttpMultipartMode.STRICT); for(FormBodyPart p: parts) { entity.addPart(p); } post.setEntity(entity); } else { //not using multipart post.setEntity(new UrlEncodedFormEntity(postParams, "UTF-8")); } method = post; } // If it has one stream, it is the post body; put the params in the URL else { String pstr = ClientUtils.toQueryString(params, false); HttpPost post = new HttpPost(url + pstr); // Single stream as body // Using a loop just to get the first one final ContentStream[] contentStream = new ContentStream[1]; for (ContentStream content : streams) { contentStream[0] = content; break; } if (contentStream[0] instanceof RequestWriter.LazyContentStream) { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } else { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } method = post; } } else { throw new SolrServerException("Unsupported method: "+request.getMethod() ); } } catch( NoHttpResponseException r ) { method = null; if(is != null) { is.close(); } // If out of tries then just rethrow (as normal error). if (tries < 1) { throw r; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public Rsp request(Req req) throws SolrServerException, IOException { Rsp rsp = new Rsp(); Exception ex = null; List<ServerWrapper> skipped = new ArrayList<ServerWrapper>(req.getNumDeadServersToTry()); for (String serverStr : req.getServers()) { serverStr = normalize(serverStr); // if the server is currently a zombie, just skip to the next one ServerWrapper wrapper = zombieServers.get(serverStr); if (wrapper != null) { // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr); if (skipped.size() < req.getNumDeadServersToTry()) skipped.add(wrapper); continue; } rsp.server = serverStr; HttpSolrServer server = makeServer(serverStr); try { rsp.rsp = server.request(req.getRequest()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = addZombie(server, e); } else { // Server is alive but the request was likely malformed or invalid throw e; } // TODO: consider using below above - currently does cause a problem with distrib updates: // seems to match up against a failed forward to leader exception as well... // || e.getMessage().contains("java.net.SocketException") // || e.getMessage().contains("java.net.ConnectException") } catch (SocketException e) { ex = addZombie(server, e); } catch (SocketTimeoutException e) { ex = addZombie(server, e); } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } // try the servers we previously skipped for (ServerWrapper wrapper : skipped) { try { rsp.rsp = wrapper.solrServer.request(req.getRequest()); zombieServers.remove(wrapper.getKey()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = e; // already a zombie, no need to re-add } else { // Server is alive but the request was malformed or invalid zombieServers.remove(wrapper.getKey()); throw e; } } catch (SocketException e) { ex = e; } catch (SocketTimeoutException e) { ex = e; } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; // already a zombie, no need to re-add } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } if (ex == null) { throw new SolrServerException("No live SolrServers available to handle this request"); } else { throw new SolrServerException("No live SolrServers available to handle this request:" + zombieServers.keySet(), ex); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
Override public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { Exception ex = null; ServerWrapper[] serverList = aliveServerList; int maxTries = serverList.length; Map<String,ServerWrapper> justFailed = null; for (int attempts=0; attempts<maxTries; attempts++) { int count = counter.incrementAndGet(); ServerWrapper wrapper = serverList[count % serverList.length]; wrapper.lastUsed = System.currentTimeMillis(); try { return wrapper.solrServer.request(request); } catch (SolrException e) { // Server is alive but the request was malformed or invalid throw e; } catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } // try other standard servers that we didn't try just now for (ServerWrapper wrapper : zombieServers.values()) { if (wrapper.standard==false || justFailed!=null && justFailed.containsKey(wrapper.getKey())) continue; try { NamedList<Object> rsp = wrapper.solrServer.request(request); // remove from zombie list *before* adding to alive to avoid a race that could lose a server zombieServers.remove(wrapper.getKey()); addToAlive(wrapper); return rsp; } catch (SolrException e) { // Server is alive but the request was malformed or invalid throw e; } catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; // still dead } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } if (ex == null) { throw new SolrServerException("No live SolrServers available to handle this request"); } else { throw new SolrServerException("No live SolrServers available to handle this request", ex); } }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { String path = request.getPath(); if( path == null || !path.startsWith( "/" ) ) { path = "/select"; } // Check for cores action SolrCore core = coreContainer.getCore( coreName ); if( core == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName ); } SolrParams params = request.getParams(); if( params == null ) { params = new ModifiableSolrParams(); } // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } // Perhaps the path is to manage the cores if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) { handler = coreContainer.getMultiCoreHandler(); } } if( handler == null ) { core.close(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } SolrQueryRequest req = null; try { req = _parser.buildRequestFrom( core, params, request.getContentStreams() ); req.getContext().put( "path", path ); SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); core.execute( handler, req, rsp ); if( rsp.getException() != null ) { if(rsp.getException() instanceof SolrException) { throw rsp.getException(); } throw new SolrServerException( rsp.getException() ); } // Check if this should stream results if( request.getStreamingResponseCallback() != null ) { try { final StreamingResponseCallback callback = request.getStreamingResponseCallback(); BinaryResponseWriter.Resolver resolver = new BinaryResponseWriter.Resolver( req, rsp.getReturnFields()) { @Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { /* write an empty list... */ SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); /* This will transform */ writeResultsBody( ctx, codec ); } }; ByteArrayOutputStream out = new ByteArrayOutputStream(); new JavaBinCodec(resolver) { @Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); } @Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); } }.marshal(rsp.getValues(), out); InputStream in = new ByteArrayInputStream(out.toByteArray()); return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in); } catch (Exception ex) { throw new RuntimeException(ex); } } // Now write it out NamedList<Object> normalized = getParsedResponse(req, rsp); return normalized; } catch( IOException iox ) { throw iox; } catch( SolrException sx ) { throw sx; } catch( Exception ex ) { throw new SolrServerException( ex ); } finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); } }
Thrown from catch blocks: 10
              
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (Exception e) { throw new SolrServerException("Error executing query", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException ex) { throw new SolrServerException("error reading streams", ex); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (ConnectException e) { throw new SolrServerException("Server refused connection at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (SocketTimeoutException e) { throw new SolrServerException( "Timeout occured while waiting response from server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException e) { throw new SolrServerException( "IOException occured when talking to server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( Exception ex ) { throw new SolrServerException( ex ); }
Declared in throws clauses: 57
              
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
@Override public QueryResponse process( SolrServer server ) throws SolrServerException { try { long startTime = System.currentTimeMillis(); QueryResponse res = new QueryResponse( server.request( this ), server ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; } catch (SolrServerException e){ throw e; } catch (SolrException s){ throw s; } catch (Exception e) { throw new SolrServerException("Error executing query", e); } }
// in solrj/src/java/org/apache/solr/client/solrj/request/DirectXmlRequest.java
@Override public UpdateResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); UpdateResponse res = new UpdateResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/SolrPing.java
@Override public SolrPingResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); SolrPingResponse res = new SolrPingResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/AbstractUpdateRequest.java
@Override public UpdateResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); UpdateResponse res = new UpdateResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/FieldAnalysisRequest.java
@Override public FieldAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { if (fieldTypes == null && fieldNames == null) { throw new IllegalStateException("At least one field type or field name needs to be specified"); } if (fieldValue == null) { throw new IllegalStateException("The field value must be set"); } long startTime = System.currentTimeMillis(); FieldAnalysisResponse res = new FieldAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/LukeRequest.java
@Override public LukeResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); LukeResponse res = new LukeResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/DocumentAnalysisRequest.java
@Override public DocumentAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); DocumentAnalysisResponse res = new DocumentAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public CoreAdminResponse process(SolrServer server) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); CoreAdminResponse res = new CoreAdminResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse reloadCore( String name, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName( name ); req.setAction( CoreAdminAction.RELOAD ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse unloadCore( String name, SolrServer server ) throws SolrServerException, IOException { return unloadCore(name, false, server); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse unloadCore( String name, boolean deleteIndex, SolrServer server ) throws SolrServerException, IOException { Unload req = new Unload(deleteIndex); req.setCoreName( name ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse renameCore(String coreName, String newName, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName(coreName); req.setOtherCoreName(newName); req.setAction( CoreAdminAction.RENAME ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse getStatus( String name, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName( name ); req.setAction( CoreAdminAction.STATUS ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse createCore( String name, String instanceDir, SolrServer server ) throws SolrServerException, IOException { return CoreAdminRequest.createCore(name, instanceDir, server, null, null); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse createCore( String name, String instanceDir, SolrServer server, String configFile, String schemaFile ) throws SolrServerException, IOException { CoreAdminRequest.Create req = new CoreAdminRequest.Create(); req.setCoreName( name ); req.setInstanceDir(instanceDir); if(configFile != null){ req.setConfigName(configFile); } if(schemaFile != null){ req.setSchemaName(schemaFile); } return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse persist(String fileName, SolrServer server) throws SolrServerException, IOException { CoreAdminRequest.Persist req = new CoreAdminRequest.Persist(); req.setFileName(fileName); return req.process(server); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse mergeIndexes(String name, String[] indexDirs, String[] srcCores, SolrServer server) throws SolrServerException, IOException { CoreAdminRequest.MergeIndexes req = new CoreAdminRequest.MergeIndexes(); req.setCoreName(name); req.setIndexDirs(Arrays.asList(indexDirs)); req.setSrcCores(Arrays.asList(srcCores)); return req.process(server); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { if (!(request instanceof UpdateRequest)) { return server.request(request); } UpdateRequest req = (UpdateRequest) request; /* this happens for commit... */ if (req.getDocuments() == null || req.getDocuments().isEmpty()) { blockUntilFinished(); return server.request(request); } SolrParams params = req.getParams(); if (params != null) { /* check if it is waiting for the searcher */ if (params.getBool(UpdateParams.WAIT_SEARCHER, false)) { log.info("blocking for commit/optimize"); blockUntilFinished(); /* empty the queue */ return server.request(request); } } try { CountDownLatch tmpLock = lock; if (tmpLock != null) { tmpLock.await(); } boolean success = queue.offer(req); for (;;) { synchronized (runners) { if (runners.isEmpty() || (queue.remainingCapacity() < queue.size() /* queue is half full and we can add more runners */ && runners.size() < threadCount)) { /* We need more runners, so start a new one. */ Runner r = new Runner(); runners.add(r); scheduler.execute(r); } else { /* break out of the retry loop if we added the element to the queue successfully, *and* while we are still holding the runners lock to prevent race conditions. */ if (success) break; } } /* Retry to add to the queue w/o the runners lock held (else we risk temporary deadlock). This retry could also fail because 1) existing runners were not able to take off any new elements in the queue, or 2) the queue was filled back up since our last try. If we succeed, the queue may have been completely emptied, and all runners stopped. In all cases, we should loop back to the top to see if we need to start more runners. */ if (!success) { success = queue.offer(req, 100, TimeUnit.MILLISECONDS); } } } catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); } /* RETURN A DUMMY result */ NamedList<Object> dummy = new NamedList<Object>(); dummy.add("NOTE", "the request is processed in a background stream"); return dummy; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
@Override
public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException {
  connect();
  // TODO: if you can hash here, you could favor the shard leader
  CloudState cloudState = zkStateReader.getCloudState();
  SolrParams reqParams = request.getParams();
  if (reqParams == null) {
    reqParams = new ModifiableSolrParams();
  }
  String collection = reqParams.get("collection", defaultCollection);
  if (collection == null) {
    throw new SolrServerException("No collection param specified on request and no default collection has been set.");
  }
  // Extract each comma separated collection name and store in a List.
  List<String> collectionList = StrUtils.splitSmart(collection, ",", true);
  // Retrieve slices from the cloud state and, for each collection specified,
  // add it to the Map of slices.
  Map<String,Slice> slices = new HashMap<String,Slice>();
  for (int i = 0; i < collectionList.size(); i++) {
    String coll = collectionList.get(i);
    ClientUtils.appendMap(coll, slices, cloudState.getSlices(coll));
  }
  Set<String> liveNodes = cloudState.getLiveNodes();
  // IDEA: have versions on various things... like a global cloudState version
  // or shardAddressVersion (which only changes when the shards change)
  // to allow caching.
  // build a map of unique nodes
  // TODO: allow filtering by group, role, etc
  Map<String,ZkNodeProps> nodes = new HashMap<String,ZkNodeProps>();
  List<String> urlList = new ArrayList<String>();
  for (Slice slice : slices.values()) {
    for (ZkNodeProps nodeProps : slice.getShards().values()) {
      ZkCoreNodeProps coreNodeProps = new ZkCoreNodeProps(nodeProps);
      String node = coreNodeProps.getNodeName();
      if (!liveNodes.contains(coreNodeProps.getNodeName())
          || !coreNodeProps.getState().equals(ZkStateReader.ACTIVE)) continue;
      if (nodes.put(node, nodeProps) == null) {
        String url = coreNodeProps.getCoreUrl();
        urlList.add(url);
      }
    }
  }
  Collections.shuffle(urlList, rand);
  //System.out.println("########################## MAKING REQUEST TO " + urlList);
  LBHttpSolrServer.Req req = new LBHttpSolrServer.Req(request, urlList);
  LBHttpSolrServer.Rsp rsp = lbServer.request(req);
  return rsp.getResponse();
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
@Override
public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException {
  ResponseParser responseParser = request.getResponseParser();
  if (responseParser == null) {
    responseParser = parser;
  }
  return request(request, responseParser);
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public NamedList<Object> request(final SolrRequest request, final ResponseParser processor) throws SolrServerException, IOException {
  HttpRequestBase method = null;
  InputStream is = null;
  SolrParams params = request.getParams();
  Collection<ContentStream> streams = requestWriter.getContentStreams(request);
  String path = requestWriter.getPath(request);
  if (path == null || !path.startsWith("/")) {
    path = DEFAULT_PATH;
  }
  ResponseParser parser = request.getResponseParser();
  if (parser == null) {
    parser = this.parser;
  }
  // The parser 'wt=' and 'version=' params are used instead of the original params
  ModifiableSolrParams wparams = new ModifiableSolrParams(params);
  wparams.set(CommonParams.WT, parser.getWriterType());
  wparams.set(CommonParams.VERSION, parser.getVersion());
  if (invariantParams != null) {
    wparams.add(invariantParams);
  }
  params = wparams;
  int tries = maxRetries + 1;
  try {
    while( tries-- > 0 ) {
      // Note: since we aren't doing intermittent time keeping
      // ourselves, the potential non-timeout latency could be as
      // much as tries-times (plus scheduling effects) the given
      // timeAllowed.
      try {
        if( SolrRequest.METHOD.GET == request.getMethod() ) {
          if( streams != null ) {
            throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "GET can't send streams!" );
          }
          method = new HttpGet( baseUrl + path + ClientUtils.toQueryString( params, false ) );
        }
        else if( SolrRequest.METHOD.POST == request.getMethod() ) {
          String url = baseUrl + path;
          boolean isMultipart = ( streams != null && streams.size() > 1 );
          LinkedList<NameValuePair> postParams = new LinkedList<NameValuePair>();
          if (streams == null || isMultipart) {
            HttpPost post = new HttpPost(url);
            post.setHeader("Content-Charset", "UTF-8");
            if (!this.useMultiPartPost && !isMultipart) {
              post.addHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8");
            }
            List<FormBodyPart> parts = new LinkedList<FormBodyPart>();
            Iterator<String> iter = params.getParameterNamesIterator();
            while (iter.hasNext()) {
              String p = iter.next();
              String[] vals = params.getParams(p);
              if (vals != null) {
                for (String v : vals) {
                  if (this.useMultiPartPost || isMultipart) {
                    parts.add(new FormBodyPart(p, new StringBody(v, Charset.forName("UTF-8"))));
                  } else {
                    postParams.add(new BasicNameValuePair(p, v));
                  }
                }
              }
            }
            if (isMultipart) {
              for (ContentStream content : streams) {
                String contentType = content.getContentType();
                if (contentType == null) {
                  contentType = "application/octet-stream"; // default
                }
                parts.add(new FormBodyPart(content.getName(),
                    new InputStreamBody(content.getStream(), contentType, content.getName())));
              }
            }
            if (parts.size() > 0) {
              MultipartEntity entity = new MultipartEntity(HttpMultipartMode.STRICT);
              for (FormBodyPart p : parts) {
                entity.addPart(p);
              }
              post.setEntity(entity);
            } else {
              // not using multipart
              post.setEntity(new UrlEncodedFormEntity(postParams, "UTF-8"));
            }
            method = post;
          }
          // If it has one stream, it is the post body, put the params in the URL
          else {
            String pstr = ClientUtils.toQueryString(params, false);
            HttpPost post = new HttpPost(url + pstr);
            // Single stream as body
            // Using a loop just to get the first one
            final ContentStream[] contentStream = new ContentStream[1];
            for (ContentStream content : streams) {
              contentStream[0] = content;
              break;
            }
            if (contentStream[0] instanceof RequestWriter.LazyContentStream) {
              post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) {
                @Override
                public Header getContentType() {
                  return new BasicHeader("Content-Type", contentStream[0].getContentType());
                }
                @Override
                public boolean isRepeatable() {
                  return false;
                }
              });
            } else {
              post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) {
                @Override
                public Header getContentType() {
                  return new BasicHeader("Content-Type", contentStream[0].getContentType());
                }
                @Override
                public boolean isRepeatable() {
                  return false;
                }
              });
            }
            method = post;
          }
        }
        else {
          throw new SolrServerException("Unsupported method: " + request.getMethod() );
        }
      }
      catch( NoHttpResponseException r ) {
        method = null;
        if (is != null) {
          is.close();
        }
        // If out of tries then just rethrow (as normal error).
        if (tries < 1) {
          throw r;
        }
      }
    }
  }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public UpdateResponse add(Iterator<SolrInputDocument> docIterator) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.setDocIterator(docIterator); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public UpdateResponse addBeans(final Iterator<?> beanIterator) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.setDocIterator(new Iterator<SolrInputDocument>() { public boolean hasNext() { return beanIterator.hasNext(); } public SolrInputDocument next() { Object o = beanIterator.next(); if (o == null) return null; return getBinder().toSolrInputDocument(o); } public void remove() { beanIterator.remove(); } }); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public Rsp request(Req req) throws SolrServerException, IOException { Rsp rsp = new Rsp(); Exception ex = null; List<ServerWrapper> skipped = new ArrayList<ServerWrapper>(req.getNumDeadServersToTry()); for (String serverStr : req.getServers()) { serverStr = normalize(serverStr); // if the server is currently a zombie, just skip to the next one ServerWrapper wrapper = zombieServers.get(serverStr); if (wrapper != null) { // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr); if (skipped.size() < req.getNumDeadServersToTry()) skipped.add(wrapper); continue; } rsp.server = serverStr; HttpSolrServer server = makeServer(serverStr); try { rsp.rsp = server.request(req.getRequest()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = addZombie(server, e); } else { // Server is alive but the request was likely malformed or invalid throw e; } // TODO: consider using below above - currently does cause a problem with distrib updates: // seems to match up against a failed forward to leader exception as well... // || e.getMessage().contains("java.net.SocketException") // || e.getMessage().contains("java.net.ConnectException") } catch (SocketException e) { ex = addZombie(server, e); } catch (SocketTimeoutException e) { ex = addZombie(server, e); } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } // try the servers we previously skipped for (ServerWrapper wrapper : skipped) { try { rsp.rsp = wrapper.solrServer.request(req.getRequest()); zombieServers.remove(wrapper.getKey()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = e; // already a zombie, no need to re-add } else { // Server is alive but the request was malformed or invalid zombieServers.remove(wrapper.getKey()); throw e; } } catch (SocketException e) { ex = e; } catch (SocketTimeoutException e) { ex = e; } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; // already a zombie, no need to re-add } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } if (ex == null) { throw new SolrServerException("No live SolrServers available to handle this request"); } else { throw new SolrServerException("No live SolrServers available to handle this request:" + zombieServers.keySet(), ex); } }
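The failover loop above marks servers that fail with 404/403/503/500 or an IO-rooted SolrServerException as "zombies" and retries them later. A minimal sketch of using the load balancer directly (endpoints are illustrative, not from the fact sheet):

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.LBHttpSolrServer;
    import org.apache.solr.client.solrj.response.QueryResponse;

    public class FailoverQueryExample {
      public static void main(String[] args) throws Exception {
        // two illustrative endpoints; dead ones become zombies and are retried later
        LBHttpSolrServer lb = new LBHttpSolrServer(
            "http://host1:8983/solr", "http://host2:8983/solr");
        QueryResponse rsp = lb.query(new SolrQuery("*:*"));
        System.out.println("hits: " + rsp.getResults().getNumFound());
      }
    }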
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
@Override
public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException {
  Exception ex = null;
  ServerWrapper[] serverList = aliveServerList;
  int maxTries = serverList.length;
  Map<String,ServerWrapper> justFailed = null;
  for (int attempts = 0; attempts < maxTries; attempts++) {
    int count = counter.incrementAndGet();
    ServerWrapper wrapper = serverList[count % serverList.length];
    wrapper.lastUsed = System.currentTimeMillis();
    try {
      return wrapper.solrServer.request(request);
    } catch (SolrException e) {
      // Server is alive but the request was malformed or invalid
      throw e;
    } catch (SolrServerException e) {
      if (e.getRootCause() instanceof IOException) {
        ex = e;
        moveAliveToDead(wrapper);
        if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>();
        justFailed.put(wrapper.getKey(), wrapper);
      } else {
        throw e;
      }
    } catch (Exception e) {
      throw new SolrServerException(e);
    }
  }
  // try other standard servers that we didn't try just now
  for (ServerWrapper wrapper : zombieServers.values()) {
    if (wrapper.standard == false || justFailed != null && justFailed.containsKey(wrapper.getKey())) continue;
    try {
      NamedList<Object> rsp = wrapper.solrServer.request(request);
      // remove from zombie list *before* adding to alive to avoid a race that could lose a server
      zombieServers.remove(wrapper.getKey());
      addToAlive(wrapper);
      return rsp;
    } catch (SolrException e) {
      // Server is alive but the request was malformed or invalid
      throw e;
    } catch (SolrServerException e) {
      if (e.getRootCause() instanceof IOException) {
        ex = e; // still dead
      } else {
        throw e;
      }
    } catch (Exception e) {
      throw new SolrServerException(e);
    }
  }
  if (ex == null) {
    throw new SolrServerException("No live SolrServers available to handle this request");
  } else {
    throw new SolrServerException("No live SolrServers available to handle this request", ex);
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(Collection<SolrInputDocument> docs) throws SolrServerException, IOException { return add(docs, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(Collection<SolrInputDocument> docs, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.add(docs); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBeans(Collection<?> beans ) throws SolrServerException, IOException { return addBeans(beans, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBeans(Collection<?> beans, int commitWithinMs) throws SolrServerException, IOException { DocumentObjectBinder binder = this.getBinder(); ArrayList<SolrInputDocument> docs = new ArrayList<SolrInputDocument>(beans.size()); for (Object bean : beans) { docs.add(binder.toSolrInputDocument(bean)); } return add(docs, commitWithinMs); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(SolrInputDocument doc ) throws SolrServerException, IOException { return add(doc, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(SolrInputDocument doc, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.add(doc); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBean(Object obj) throws IOException, SolrServerException { return addBean(obj, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBean(Object obj, int commitWithinMs) throws IOException, SolrServerException { return add(getBinder().toSolrInputDocument(obj),commitWithinMs); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( ) throws SolrServerException, IOException { return commit(true, true); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize( ) throws SolrServerException, IOException { return optimize(true, true, 1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( boolean waitFlush, boolean waitSearcher ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.COMMIT, waitFlush, waitSearcher ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( boolean waitFlush, boolean waitSearcher, boolean softCommit ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.COMMIT, waitFlush, waitSearcher, softCommit ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize( boolean waitFlush, boolean waitSearcher ) throws SolrServerException, IOException { return optimize(waitFlush, waitSearcher, 1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize(boolean waitFlush, boolean waitSearcher, int maxSegments ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.OPTIMIZE, waitFlush, waitSearcher, maxSegments ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse rollback() throws SolrServerException, IOException { return new UpdateRequest().rollback().process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(String id) throws SolrServerException, IOException { return deleteById(id, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(String id, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteById(id); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(List<String> ids) throws SolrServerException, IOException { return deleteById(ids, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(List<String> ids, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteById(ids); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteByQuery(String query) throws SolrServerException, IOException { return deleteByQuery(query, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteByQuery(String query, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteByQuery(query); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public SolrPingResponse ping() throws SolrServerException, IOException { return new SolrPing().process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public QueryResponse query(SolrParams params) throws SolrServerException { return new QueryRequest( params ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public QueryResponse query(SolrParams params, METHOD method) throws SolrServerException { return new QueryRequest( params, method ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public QueryResponse queryAndStreamResponse( SolrParams params, StreamingResponseCallback callback ) throws SolrServerException, IOException { ResponseParser parser = new StreamingBinaryResponseParser( callback ); QueryRequest req = new QueryRequest( params ); req.setStreamingResponseCallback( callback ); req.setResponseParser( parser ); return req.process(this); }
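The SolrServer convenience methods above all build an UpdateRequest or QueryRequest and call process(this). A hedged end-to-end sketch combining them (URL, field names, and query are illustrative assumptions):

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.client.solrj.response.QueryResponse;
    import org.apache.solr.common.SolrInputDocument;

    public class IndexAndQueryExample {
      public static void main(String[] args) throws Exception {
        SolrServer server = new HttpSolrServer("http://localhost:8983/solr"); // illustrative URL
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "book-1");
        doc.addField("title", "Solr in Practice"); // assumes a 'title' field in the schema
        server.add(doc, 5000);       // add with commitWithin of 5 seconds
        server.commit();             // commit(true, true): wait for flush and new searcher
        QueryResponse rsp = server.query(new SolrQuery("title:solr"));
        System.out.println(rsp.getResults().getNumFound() + " hits");
        server.deleteByQuery("*:*"); // delete-by-query, default commitWithin of -1
      }
    }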
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleDistribUrlAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException, SolrServerException { // TODO: finish this and tests SolrParams params = req.getParams(); final ModifiableSolrParams newParams = new ModifiableSolrParams(params); newParams.remove("action"); SolrParams required = params.required(); final String subAction = required.get("subAction"); String collection = required.get("collection"); newParams.set(CoreAdminParams.ACTION, subAction); SolrCore core = req.getCore(); ZkController zkController = core.getCoreDescriptor().getCoreContainer() .getZkController(); CloudState cloudState = zkController.getCloudState(); Map<String,Slice> slices = cloudState.getCollectionStates().get(collection); for (Map.Entry<String,Slice> entry : slices.entrySet()) { Slice slice = entry.getValue(); Map<String,ZkNodeProps> shards = slice.getShards(); Set<Map.Entry<String,ZkNodeProps>> shardEntries = shards.entrySet(); for (Map.Entry<String,ZkNodeProps> shardEntry : shardEntries) { final ZkNodeProps node = shardEntry.getValue(); if (cloudState.liveNodesContain(node.get(ZkStateReader.NODE_NAME_PROP))) { newParams.set(CoreAdminParams.CORE, node.get(ZkStateReader.CORE_NAME_PROP)); String replica = node.get(ZkStateReader.BASE_URL_PROP); ShardRequest sreq = new ShardRequest(); newParams.set("qt", "/admin/cores"); sreq.purpose = 1; // TODO: this sucks if (replica.startsWith("http://")) replica = replica.substring(7); sreq.shards = new String[]{replica}; sreq.actualShards = sreq.shards; sreq.params = newParams; shardHandler.submit(sreq, replica, sreq.params); } } } ShardResponse srsp; do { srsp = shardHandler.takeCompletedOrError(); if (srsp != null) { Throwable e = srsp.getException(); if (e != null) { log.error("Error talking to shard: " + srsp.getShard(), e); } } } while(srsp != null); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override
public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException {
  String path = request.getPath();
  if( path == null || !path.startsWith( "/" ) ) {
    path = "/select";
  }
  // Check for cores action
  SolrCore core = coreContainer.getCore( coreName );
  if( core == null ) {
    throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName );
  }
  SolrParams params = request.getParams();
  if( params == null ) {
    params = new ModifiableSolrParams();
  }
  // Extract the handler from the path or params
  SolrRequestHandler handler = core.getRequestHandler( path );
  if( handler == null ) {
    if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path ) ) {
      String qt = params.get( CommonParams.QT );
      handler = core.getRequestHandler( qt );
      if( handler == null ) {
        throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: " + qt );
      }
    }
    // Perhaps the path is to manage the cores
    if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) {
      handler = coreContainer.getMultiCoreHandler();
    }
  }
  if( handler == null ) {
    core.close();
    throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: " + path );
  }
  SolrQueryRequest req = null;
  try {
    req = _parser.buildRequestFrom( core, params, request.getContentStreams() );
    req.getContext().put( "path", path );
    SolrQueryResponse rsp = new SolrQueryResponse();
    SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp));
    core.execute( handler, req, rsp );
    if( rsp.getException() != null ) {
      if( rsp.getException() instanceof SolrException ) {
        throw rsp.getException();
      }
      throw new SolrServerException( rsp.getException() );
    }
    // Check if this should stream results
    if( request.getStreamingResponseCallback() != null ) {
      try {
        final StreamingResponseCallback callback = request.getStreamingResponseCallback();
        BinaryResponseWriter.Resolver resolver = new BinaryResponseWriter.Resolver( req, rsp.getReturnFields() ) {
          @Override
          public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException {
            // write an empty list...
            SolrDocumentList docs = new SolrDocumentList();
            docs.setNumFound( ctx.docs.matches() );
            docs.setStart( ctx.docs.offset() );
            docs.setMaxScore( ctx.docs.maxScore() );
            codec.writeSolrDocumentList( docs );
            // This will transform
            writeResultsBody( ctx, codec );
          }
        };
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        new JavaBinCodec(resolver) {
          @Override
          public void writeSolrDocument(SolrDocument doc) throws IOException {
            callback.streamSolrDocument( doc );
            //super.writeSolrDocument( doc, fields );
          }
          @Override
          public void writeSolrDocumentList(SolrDocumentList docs) throws IOException {
            if( docs.size() > 0 ) {
              SolrDocumentList tmp = new SolrDocumentList();
              tmp.setMaxScore( docs.getMaxScore() );
              tmp.setNumFound( docs.getNumFound() );
              tmp.setStart( docs.getStart() );
              docs = tmp;
            }
            callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() );
            super.writeSolrDocumentList(docs);
          }
        }.marshal(rsp.getValues(), out);
        InputStream in = new ByteArrayInputStream(out.toByteArray());
        return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in);
      } catch (Exception ex) {
        throw new RuntimeException(ex);
      }
    }
    // Now write it out
    NamedList<Object> normalized = getParsedResponse(req, rsp);
    return normalized;
  } catch( IOException iox ) {
    throw iox;
  } catch( SolrException sx ) {
    throw sx;
  } catch( Exception ex ) {
    throw new SolrServerException( ex );
  } finally {
    if (req != null) req.close();
    core.close();
    SolrRequestInfo.clearRequestInfo();
  }
}
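EmbeddedSolrServer dispatches requests to an in-process core rather than over HTTP. A sketch of one initialization pattern from this era of the code base; exact CoreContainer setup varies between Solr versions, and the home path and core name are illustrative:

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer;
    import org.apache.solr.core.CoreContainer;

    public class EmbeddedExample {
      public static void main(String[] args) throws Exception {
        // point Solr at a local home directory (illustrative path)
        System.setProperty("solr.solr.home", "/path/to/solr/home");
        CoreContainer.Initializer initializer = new CoreContainer.Initializer();
        CoreContainer container = initializer.initialize();
        // requests go straight to the in-process core; no HTTP involved
        SolrServer server = new EmbeddedSolrServer(container, "collection1");
        System.out.println(server.query(new SolrQuery("*:*")).getResults().getNumFound());
        container.shutdown();
      }
    }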
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private boolean syncWithReplicas(ZkController zkController, SolrCore core, ZkNodeProps props, String collection, String shardId) throws MalformedURLException, SolrServerException, IOException {
  List<ZkCoreNodeProps> nodes = zkController.getZkStateReader()
      .getReplicaProps(collection, shardId,
          props.get(ZkStateReader.NODE_NAME_PROP),
          props.get(ZkStateReader.CORE_NAME_PROP),
          ZkStateReader.ACTIVE); // TODO: should there be a state filter?
  if (nodes == null) {
    // I have no replicas
    return true;
  }
  List<String> syncWith = new ArrayList<String>();
  for (ZkCoreNodeProps node : nodes) {
    // if we see a leader, must be stale state, and this is the guy that went down
    if (!node.getNodeProps().keySet().contains(ZkStateReader.LEADER_PROP)) {
      syncWith.add(node.getCoreUrl());
    }
  }
  PeerSync peerSync = new PeerSync(core, syncWith, core.getUpdateHandler().getUpdateLog().numRecordsToKeep);
  return peerSync.sync();
}
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private void syncToMe(ZkController zkController, String collection, String shardId, ZkNodeProps leaderProps) throws MalformedURLException, SolrServerException, IOException {
  // sync everyone else
  // TODO: we should do this in parallel at least
  List<ZkCoreNodeProps> nodes = zkController
      .getZkStateReader()
      .getReplicaProps(collection, shardId,
          leaderProps.get(ZkStateReader.NODE_NAME_PROP),
          leaderProps.get(ZkStateReader.CORE_NAME_PROP),
          ZkStateReader.ACTIVE);
  if (nodes == null) {
    // System.out.println("I have no replicas");
    // I have no replicas
    return;
  }
  //System.out.println("tell my replicas to sync");
  ZkCoreNodeProps zkLeader = new ZkCoreNodeProps(leaderProps);
  for (ZkCoreNodeProps node : nodes) {
    try {
      // System.out.println("try and ask " + node.getCoreUrl() + " to sync");
      log.info("try and ask " + node.getCoreUrl() + " to sync");
      requestSync(zkLeader.getCoreUrl(), node.getCoreName());
    } catch (Exception e) {
      SolrException.log(log, "Error syncing replica to leader", e);
    }
  }
  for (;;) {
    ShardResponse srsp = shardHandler.takeCompletedOrError();
    if (srsp == null) break;
    boolean success = handleResponse(srsp);
    //System.out.println("got response:" + success);
    if (!success) {
      try {
        log.info("Sync failed - asking replica to recover.");
        //System.out.println("Sync failed - asking replica to recover.");
        RequestRecovery recoverRequestCmd = new RequestRecovery();
        recoverRequestCmd.setAction(CoreAdminAction.REQUESTRECOVERY);
        recoverRequestCmd.setCoreName(((SyncShardRequest) srsp.getShardRequest()).coreName);
        HttpSolrServer server = new HttpSolrServer(zkLeader.getBaseUrl());
        server.request(recoverRequestCmd);
      } catch (Exception e) {
        log.info("Could not tell a replica to recover", e);
      }
      shardHandler.cancelAll();
      break;
    }
  }
}
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void replicate(String nodeName, SolrCore core, ZkNodeProps leaderprops, String baseUrl) throws SolrServerException, IOException { String leaderBaseUrl = leaderprops.get(ZkStateReader.BASE_URL_PROP); ZkCoreNodeProps leaderCNodeProps = new ZkCoreNodeProps(leaderprops); String leaderUrl = leaderCNodeProps.getCoreUrl(); log.info("Attempting to replicate from " + leaderUrl); // if we are the leader, either we are trying to recover faster // then our ephemeral timed out or we are the only node if (!leaderBaseUrl.equals(baseUrl)) { // send commit commitOnLeader(leaderUrl); // use rep handler directly, so we can do this sync rather than async SolrRequestHandler handler = core.getRequestHandler(REPLICATION_HANDLER); if (handler instanceof LazyRequestHandlerWrapper) { handler = ((LazyRequestHandlerWrapper)handler).getWrappedHandler(); } ReplicationHandler replicationHandler = (ReplicationHandler) handler; if (replicationHandler == null) { throw new SolrException(ErrorCode.SERVICE_UNAVAILABLE, "Skipping recovery, no " + REPLICATION_HANDLER + " handler found"); } ModifiableSolrParams solrParams = new ModifiableSolrParams(); solrParams.set(ReplicationHandler.MASTER_URL, leaderUrl + "replication"); if (isClosed()) retries = INTERRUPTED; boolean success = replicationHandler.doFetch(solrParams, true); // TODO: look into making sure force=true does not download files we already have if (!success) { throw new SolrException(ErrorCode.SERVER_ERROR, "Replication for recovery failed."); } // solrcloud_debug // try { // RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replicated " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " from " + leaderUrl + " gen:" + core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void commitOnLeader(String leaderUrl) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderUrl); server.setConnectionTimeout(30000); server.setSoTimeout(30000); UpdateRequest ureq = new UpdateRequest(); ureq.setParams(new ModifiableSolrParams()); ureq.getParams().set(DistributedUpdateProcessor.COMMIT_END_POINT, true); ureq.setAction(AbstractUpdateRequest.ACTION.COMMIT, false, true).process( server); server.shutdown(); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void sendPrepRecoveryCmd(String leaderBaseUrl, String leaderCoreName) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderBaseUrl); server.setConnectionTimeout(45000); server.setSoTimeout(45000); WaitForState prepCmd = new WaitForState(); prepCmd.setCoreName(leaderCoreName); prepCmd.setNodeName(zkController.getNodeName()); prepCmd.setCoreNodeName(coreZkNodeName); prepCmd.setState(ZkStateReader.RECOVERING); prepCmd.setCheckLive(true); prepCmd.setPauseFor(6000); server.request(prepCmd); server.shutdown(); }
(Domain) BindingException 8
              
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
private <T> T getBean(Class<T> clazz, List<DocField> fields, SolrDocument solrDoc) { if (fields == null) { fields = getDocFields(clazz); } try { T obj = clazz.newInstance(); for (DocField docField : fields) { docField.inject(obj, solrDoc); } return obj; } catch (Exception e) { throw new BindingException("Could not instantiate object of " + clazz, e); } }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
public SolrInputDocument toSolrInputDocument(Object obj) { List<DocField> fields = getDocFields(obj.getClass()); if (fields.isEmpty()) { throw new BindingException("class: " + obj.getClass() + " does not define any fields."); } SolrInputDocument doc = new SolrInputDocument(); for (DocField field : fields) { if (field.dynamicFieldNamePatternMatcher != null && field.get(obj) != null && field.isContainedInMap) { Map<String, Object> mapValue = (Map<String, Object>) field.get(obj); for (Map.Entry<String, Object> e : mapValue.entrySet()) { doc.setField(e.getKey(), e.getValue(), 1.0f); } } else { doc.setField(field.name, field.get(obj), 1.0f); } } return doc; }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
private void storeType() { if (field != null) { type = field.getType(); } else { Class[] params = setter.getParameterTypes(); if (params.length != 1) { throw new BindingException("Invalid setter method. Must have one and only one parameter"); } type = params[0]; } if(type == Collection.class || type == List.class || type == ArrayList.class) { type = Object.class; isList = true; } else if (type == byte[].class) { //no op } else if (type.isArray()) { isArray = true; type = type.getComponentType(); } else if (type == Map.class || type == HashMap.class) { //corresponding to the support for dynamicFields isContainedInMap = true; //assigned a default type type = Object.class; if (field != null) { if (field.getGenericType() instanceof ParameterizedType) { //check what are the generic values ParameterizedType parameterizedType = (ParameterizedType) field.getGenericType(); Type[] types = parameterizedType.getActualTypeArguments(); if (types != null && types.length == 2 && types[0] == String.class) { //the key should always be String //Raw and primitive types if (types[1] instanceof Class) { //the value could be multivalued then it is a List, Collection, ArrayList if (types[1]== Collection.class || types[1] == List.class || types[1] == ArrayList.class) { type = Object.class; isList = true; } else { //else assume it is a primitive and put in the source type itself type = (Class) types[1]; } } else if (types[1] instanceof ParameterizedType) { //Of all the Parameterized types, only List is supported Type rawType = ((ParameterizedType)types[1]).getRawType(); if(rawType== Collection.class || rawType == List.class || rawType == ArrayList.class){ type = Object.class; isList = true; } } else if (types[1] instanceof GenericArrayType) { //Array types type = (Class) ((GenericArrayType) types[1]).getGenericComponentType(); isArray = true; } else { //Throw an Exception if types are not known throw new BindingException("Allowed type for values of mapping a dynamicField are : " + "Object, Object[] and List"); } } } } } }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
private void set(Object obj, Object v) { if (v != null && type == ByteBuffer.class && v.getClass() == byte[].class) { v = ByteBuffer.wrap((byte[]) v); } try { if (field != null) { field.set(obj, v); } else if (setter != null) { setter.invoke(obj, v); } } catch (Exception e) { throw new BindingException("Exception while setting value : " + v + " on " + (field != null ? field : setter), e); } }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
public Object get(final Object obj) { if (field != null) { try { return field.get(obj); } catch (Exception e) { throw new BindingException("Exception while getting value: " + field, e); } } else if (getter == null) { throw new BindingException("Missing getter for field: " + name + " -- You can only call the 'get' for fields that have a field of 'get' method"); } try { return getter.invoke(obj, (Object[]) null); } catch (Exception e) { throw new BindingException("Exception while getting value: " + getter, e); } }
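The binder reflects over @Field annotations (on members or setters) and throws the BindingException seen above when instantiation, injection, or extraction fails, or when a class defines no fields at all. A minimal sketch of a bindable bean; the class and field names are illustrative:

    import java.util.List;
    import org.apache.solr.client.solrj.beans.Field;

    public class Book {
      @Field("id")   // maps this member to the 'id' field in the index
      public String id;

      @Field("title")
      public String title;

      @Field("cat")  // multi-valued fields bind to List/Collection members
      public List<String> categories;
    }

Such a bean can then be indexed with server.addBean(book), which routes through toSolrInputDocument above.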
4
              
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Could not instantiate object of " + clazz, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while setting value : " + v + " on " + (field != null ? field : setter), e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + field, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + getter, e); }
0
(Lib) XMLStreamException 7
              
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected NamedList<Object> readNamedList( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } StringBuilder builder = new StringBuilder(); NamedList<Object> nl = new SimpleOrderedMap<Object>(); KnownType type = null; String name = null; // just eat up the events... int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); // reset the text type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } /** The name in a NamedList can actually be null if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } **/ if( !type.isLeaf ) { switch( type ) { case LST: nl.add( name, readNamedList( parser ) ); depth--; continue; case ARR: nl.add( name, readArray( parser ) ); depth--; continue; case RESULT: nl.add( name, readDocuments( parser ) ); depth--; continue; case DOC: nl.add( name, readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return nl; } //System.out.println( "NL:ELEM:"+type+"::"+name+"::"+builder ); nl.add( name, type.read( builder.toString().trim() ) ); break; case XMLStreamConstants.SPACE: // TODO? should this be trimmed? make sure it only gets one/two space? case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected List<Object> readArray( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"arr".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'arr', not: "+parser.getLocalName() ); } StringBuilder builder = new StringBuilder(); KnownType type = null; List<Object> vals = new ArrayList<Object>(); int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; KnownType t = KnownType.get( parser.getLocalName() ); if( t == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } if( type == null ) { type = t; } /*** actually, there is no rule that arrays need the same type else if( type != t && !(t == KnownType.NULL || type == KnownType.NULL)) { throw new RuntimeException( "arrays must have the same type! ("+type+"!="+t+") "+parser.getLocalName() ); } ***/ type = t; builder.setLength( 0 ); // reset the text if( !type.isLeaf ) { switch( type ) { case LST: vals.add( readNamedList( parser ) ); depth--; continue; case ARR: vals.add( readArray( parser ) ); depth--; continue; case RESULT: vals.add( readDocuments( parser ) ); depth--; continue; case DOC: vals.add( readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return vals; // the last element is itself } //System.out.println( "ARR:"+type+"::"+builder ); Object val = type.read( builder.toString().trim() ); if( val == null && type != KnownType.NULL) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } vals.add( val ); break; case XMLStreamConstants.SPACE: // TODO? should this be trimmed? make sure it only gets one/two space? case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocument readDocument( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"doc".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'lst', not: "+parser.getLocalName() ); } SolrDocument doc = new SolrDocument(); StringBuilder builder = new StringBuilder(); KnownType type = null; String name = null; // just eat up the events... int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); // reset the text type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } // Handle multi-valued fields if( type == KnownType.ARR ) { for( Object val : readArray( parser ) ) { doc.addField( name, val ); } depth--; // the array reading clears out the 'endElement' } else if( type == KnownType.LST ) { doc.addField( name, readNamedList( parser ) ); depth--; } else if( !type.isLeaf ) { System.out.println("nbot leaf!:" + type); throw new XMLStreamException( "must be value or array", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return doc; } //System.out.println( "FIELD:"+type+"::"+name+"::"+builder ); Object val = type.read( builder.toString().trim() ); if( val == null ) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } doc.addField( name, val ); break; case XMLStreamConstants.SPACE: // TODO? should this be trimmed? make sure it only gets one/two space? case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
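These readNamedList/readArray/readDocument loops decode Solr's wt=xml response format. A short sketch of how the parser is typically plugged in, replacing the default binary codec (the URL is illustrative):

    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.client.solrj.impl.XMLResponseParser;

    public class XmlParserExample {
      public static void main(String[] args) {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr"); // illustrative URL
        // request wt=xml responses and decode them with the loops shown above
        server.setParser(new XMLResponseParser());
      }
    }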
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public Object resolveEntity(String publicId, String systemId, String baseURI, String namespace) throws XMLStreamException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, publicId, baseURI, systemId); return (src == null) ? null : src.getByteStream(); } catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); } }
1
              
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); }
12
              
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected NamedList<Object> readNamedList( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } StringBuilder builder = new StringBuilder(); NamedList<Object> nl = new SimpleOrderedMap<Object>(); KnownType type = null; String name = null; // just eat up the events... int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); // reset the text type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } /** The name in a NamedList can actually be null if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } **/ if( !type.isLeaf ) { switch( type ) { case LST: nl.add( name, readNamedList( parser ) ); depth--; continue; case ARR: nl.add( name, readArray( parser ) ); depth--; continue; case RESULT: nl.add( name, readDocuments( parser ) ); depth--; continue; case DOC: nl.add( name, readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return nl; } //System.out.println( "NL:ELEM:"+type+"::"+name+"::"+builder ); nl.add( name, type.read( builder.toString().trim() ) ); break; case XMLStreamConstants.SPACE: // TODO? should this be trimmed? make sure it only gets one/two space? case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected List<Object> readArray( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"arr".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'arr', not: "+parser.getLocalName() ); } StringBuilder builder = new StringBuilder(); KnownType type = null; List<Object> vals = new ArrayList<Object>(); int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; KnownType t = KnownType.get( parser.getLocalName() ); if( t == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } if( type == null ) { type = t; } /*** actually, there is no rule that arrays need the same type else if( type != t && !(t == KnownType.NULL || type == KnownType.NULL)) { throw new RuntimeException( "arrays must have the same type! ("+type+"!="+t+") "+parser.getLocalName() ); } ***/ type = t; builder.setLength( 0 ); // reset the text if( !type.isLeaf ) { switch( type ) { case LST: vals.add( readNamedList( parser ) ); depth--; continue; case ARR: vals.add( readArray( parser ) ); depth--; continue; case RESULT: vals.add( readDocuments( parser ) ); depth--; continue; case DOC: vals.add( readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return vals; // the last element is itself } //System.out.println( "ARR:"+type+"::"+builder ); Object val = type.read( builder.toString().trim() ); if( val == null && type != KnownType.NULL) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } vals.add( val ); break; case XMLStreamConstants.SPACE: // TODO? should this be trimmed? make sure it only gets one/two space? case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocumentList readDocuments( XMLStreamReader parser ) throws XMLStreamException { SolrDocumentList docs = new SolrDocumentList(); // Parse the attributes for( int i=0; i<parser.getAttributeCount(); i++ ) { String n = parser.getAttributeLocalName( i ); String v = parser.getAttributeValue( i ); if( "numFound".equals( n ) ) { docs.setNumFound( Long.parseLong( v ) ); } else if( "start".equals( n ) ) { docs.setStart( Long.parseLong( v ) ); } else if( "maxScore".equals( n ) ) { docs.setMaxScore( Float.parseFloat( v ) ); } } // Read through each document int event; while( true ) { event = parser.next(); if( XMLStreamConstants.START_ELEMENT == event ) { if( !"doc".equals( parser.getLocalName() ) ) { throw new RuntimeException( "should be doc! "+parser.getLocalName() + " :: " + parser.getLocation() ); } docs.add( readDocument( parser ) ); } else if ( XMLStreamConstants.END_ELEMENT == event ) { return docs; // only happens once } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocument readDocument( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"doc".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'lst', not: "+parser.getLocalName() ); } SolrDocument doc = new SolrDocument(); StringBuilder builder = new StringBuilder(); KnownType type = null; String name = null; // just eat up the events... int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); // reset the text type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } // Handle multi-valued fields if( type == KnownType.ARR ) { for( Object val : readArray( parser ) ) { doc.addField( name, val ); } depth--; // the array reading clears out the 'endElement' } else if( type == KnownType.LST ) { doc.addField( name, readNamedList( parser ) ); depth--; } else if( !type.isLeaf ) { System.out.println("nbot leaf!:" + type); throw new XMLStreamException( "must be value or array", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return doc; } //System.out.println( "FIELD:"+type+"::"+name+"::"+builder ); Object val = type.read( builder.toString().trim() ); if( val == null ) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } doc.addField( name, val ); break; case XMLStreamConstants.SPACE: // TODO? should this be trimmed? make sure it only gets one/two space? case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void parse(XMLStreamReader parser, Handler handler, Map<String, Object> values, Stack<Set<String>> stack, // lists of values to purge boolean recordStarted ) throws IOException, XMLStreamException { Set<String> valuesAddedinThisFrame = null; if (isRecord) { // This Node is a match for an XPATH from a forEach attribute, // prepare for the clean up that will occurr when the record // is emitted after its END_ELEMENT is matched recordStarted = true; valuesAddedinThisFrame = new HashSet<String>(); stack.push(valuesAddedinThisFrame); } else if (recordStarted) { // This node is a child of some parent which matched against forEach // attribute. Continue to add values to an existing record. valuesAddedinThisFrame = stack.peek(); } try { /* The input stream has deposited us at this Node in our tree of * intresting nodes. Depending on how this node is of interest, * process further tokens from the input stream and decide what * we do next */ if (attributes != null) { // we interested in storing attributes from the input stream for (Node node : attributes) { String value = parser.getAttributeValue(null, node.name); if (value != null || (recordStarted && !isRecord)) { putText(values, value, node.fieldName, node.multiValued); valuesAddedinThisFrame.add(node.fieldName); } } } Set<Node> childrenFound = new HashSet<Node>(); int event = -1; int flattenedStarts=0; // our tag depth when flattening elements StringBuilder text = new StringBuilder(); while (true) { event = parser.next(); if (event == END_ELEMENT) { if (flattenedStarts > 0) flattenedStarts--; else { if (hasText && valuesAddedinThisFrame != null) { valuesAddedinThisFrame.add(fieldName); putText(values, text.toString(), fieldName, multiValued); } if (isRecord) handler.handle(getDeepCopy(values), forEachPath); if (childNodes != null && recordStarted && !isRecord && !childrenFound.containsAll(childNodes)) { // nonReccord nodes where we have not collected text for ALL // the child nodes. for (Node n : childNodes) { // For the multivalue child nodes where we could have, but // didnt, collect text. Push a null string into values. if (!childrenFound.contains(n)) n.putNulls(values); } } return; } } else if (hasText && (event==CDATA || event==CHARACTERS || event==SPACE)) { text.append(parser.getText()); } else if (event == START_ELEMENT) { if ( flatten ) flattenedStarts++; else handleStartElement(parser, childrenFound, handler, values, stack, recordStarted); } // END_DOCUMENT is least likely to appear and should be // last in if-then-else skip chain else if (event == END_DOCUMENT) return; } }finally { if ((isRecord || !recordStarted) && !stack.empty()) { Set<String> cleanThis = stack.pop(); if (cleanThis != null) { for (String fld : cleanThis) values.remove(fld); } } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void handleStartElement(XMLStreamReader parser, Set<Node> childrenFound, Handler handler, Map<String, Object> values, Stack<Set<String>> stack, boolean recordStarted) throws IOException, XMLStreamException { Node n = getMatchingNode(parser,childNodes); Map<String, Object> decends=new HashMap<String, Object>(); if (n != null) { childrenFound.add(n); n.parse(parser, handler, values, stack, recordStarted); return; } // The stream has diverged from the tree of interesting elements, but // are there any wildCardNodes ... anywhere in our path from the root? Node dn = this; // checking our Node first! do { if (dn.wildCardNodes != null) { // Check to see if the streams tag matches one of the "//" all // decendents type expressions for this node. n = getMatchingNode(parser, dn.wildCardNodes); if (n != null) { childrenFound.add(n); n.parse(parser, handler, values, stack, recordStarted); break; } // add the list of this nodes wild decendents to the cache for (Node nn : dn.wildCardNodes) decends.put(nn.name, nn); } dn = dn.wildAncestor; // leap back along the tree toward root } while (dn != null) ; if (n == null) { // we have a START_ELEMENT which is not within the tree of // interesting nodes. Skip over the contents of this element // but recursivly repeat the above for any START_ELEMENTs // found within this element. int count = 1; // we have had our first START_ELEMENT while (count != 0) { int token = parser.next(); if (token == START_ELEMENT) { Node nn = (Node) decends.get(parser.getLocalName()); if (nn != null) { // We have a //Node which matches the stream's parser.localName childrenFound.add(nn); // Parse the contents of this stream element nn.parse(parser, handler, values, stack, recordStarted); } else count++; } else if (token == END_ELEMENT) count--; } } }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
DocumentAnalysisRequest resolveAnalysisRequest(SolrQueryRequest req) throws IOException, XMLStreamException { DocumentAnalysisRequest request = new DocumentAnalysisRequest(); SolrParams params = req.getParams(); String query = params.get(AnalysisParams.QUERY, params.get(CommonParams.Q, null)); request.setQuery(query); boolean showMatch = params.getBool(AnalysisParams.SHOW_MATCH, false); request.setShowMatch(showMatch); ContentStream stream = extractSingleContentStream(req); InputStream is = null; XMLStreamReader parser = null; try { is = stream.getStream(); final String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); parser = (charset == null) ? inputFactory.createXMLStreamReader(is) : inputFactory.createXMLStreamReader(is, charset); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.END_DOCUMENT: { parser.close(); return request; } case XMLStreamConstants.START_ELEMENT: { String currTag = parser.getLocalName(); if ("doc".equals(currTag)) { log.trace("Reading doc..."); SolrInputDocument document = readDocument(parser, req.getSchema()); request.addDocument(document); } break; } } } } finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); } }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
SolrInputDocument readDocument(XMLStreamReader reader, IndexSchema schema) throws XMLStreamException { SolrInputDocument doc = new SolrInputDocument(); String uniqueKeyField = schema.getUniqueKeyField().getName(); StringBuilder text = new StringBuilder(); String fieldName = null; boolean hasId = false; while (true) { int event = reader.next(); switch (event) { // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(reader.getText()); break; case XMLStreamConstants.END_ELEMENT: if ("doc".equals(reader.getLocalName())) { if (!hasId) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "All documents must contain a unique key value: '" + doc.toString() + "'"); } return doc; } else if ("field".equals(reader.getLocalName())) { doc.addField(fieldName, text.toString(), DEFAULT_BOOST); if (uniqueKeyField.equals(fieldName)) { hasId = true; } } break; case XMLStreamConstants.START_ELEMENT: text.setLength(0); String localName = reader.getLocalName(); if (!"field".equals(localName)) { log.warn("unexpected XML tag doc/" + localName); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag doc/" + localName); } for (int i = 0; i < reader.getAttributeCount(); i++) { String attrName = reader.getAttributeLocalName(i); if ("name".equals(attrName)) { fieldName = reader.getAttributeValue(i); } } break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processUpdate(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException, FactoryConfigurationError, InstantiationException, IllegalAccessException, TransformerConfigurationException { AddUpdateCommand addCmd = null; SolrParams params = req.getParams(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.END_DOCUMENT: parser.close(); return; case XMLStreamConstants.START_ELEMENT: String currTag = parser.getLocalName(); if (currTag.equals(UpdateRequestHandler.ADD)) { log.trace("SolrCore.update(add)"); addCmd = new AddUpdateCommand(req); // First look for commitWithin parameter on the request, will be overwritten for individual <add>'s addCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); addCmd.overwrite = params.getBool(UpdateParams.OVERWRITE, true); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.OVERWRITE.equals(attrName)) { addCmd.overwrite = StrUtils.parseBoolean(attrVal); } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { addCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("Unknown attribute id in add:" + attrName); } } } else if ("doc".equals(currTag)) { if(addCmd != null) { log.trace("adding doc..."); addCmd.clear(); addCmd.solrDoc = readDoc(parser); processor.processAdd(addCmd); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unexpected <doc> tag without an <add> tag surrounding it."); } } else if (UpdateRequestHandler.COMMIT.equals(currTag) || UpdateRequestHandler.OPTIMIZE.equals(currTag)) { log.trace("parsing " + currTag); CommitUpdateCommand cmd = new CommitUpdateCommand(req, UpdateRequestHandler.OPTIMIZE.equals(currTag)); ModifiableSolrParams mp = new ModifiableSolrParams(); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); mp.set(attrName, attrVal); } RequestHandlerUtils.validateCommitParams(mp); SolrParams p = SolrParams.wrapDefaults(mp, req.getParams()); // default to the normal request params for commit options RequestHandlerUtils.updateCommit(cmd, p); processor.processCommit(cmd); } // end commit else if (UpdateRequestHandler.ROLLBACK.equals(currTag)) { log.trace("parsing " + currTag); RollbackUpdateCommand cmd = new RollbackUpdateCommand(req); processor.processRollback(cmd); } // end rollback else if (UpdateRequestHandler.DELETE.equals(currTag)) { log.trace("parsing delete"); processDelete(req, processor, parser); } // end delete break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processDelete(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException { // Parse the command DeleteUpdateCommand deleteCmd = new DeleteUpdateCommand(req); // First look for commitWithin parameter on the request, will be overwritten for individual <delete>'s SolrParams params = req.getParams(); deleteCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if ("fromPending".equals(attrName)) { // deprecated } else if ("fromCommitted".equals(attrName)) { // deprecated } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { deleteCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("unexpected attribute delete/@" + attrName); } } StringBuilder text = new StringBuilder(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.START_ELEMENT: String mode = parser.getLocalName(); if (!("id".equals(mode) || "query".equals(mode))) { log.warn("unexpected XML tag /delete/" + mode); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + mode); } text.setLength(0); if ("id".equals(mode)) { for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.VERSION.equals(attrName)) { deleteCmd.setVersion(Long.parseLong(attrVal)); } } } break; case XMLStreamConstants.END_ELEMENT: String currTag = parser.getLocalName(); if ("id".equals(currTag)) { deleteCmd.setId(text.toString()); } else if ("query".equals(currTag)) { deleteCmd.setQuery(text.toString()); } else if ("delete".equals(currTag)) { return; } else { log.warn("unexpected XML tag /delete/" + currTag); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + currTag); } processor.processDelete(deleteCmd); deleteCmd.clear(); break; // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(parser.getText()); break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
public SolrInputDocument readDoc(XMLStreamReader parser) throws XMLStreamException { SolrInputDocument doc = new SolrInputDocument(); String attrName = ""; for (int i = 0; i < parser.getAttributeCount(); i++) { attrName = parser.getAttributeLocalName(i); if ("boost".equals(attrName)) { doc.setDocumentBoost(Float.parseFloat(parser.getAttributeValue(i))); } else { log.warn("Unknown attribute doc/@" + attrName); } } StringBuilder text = new StringBuilder(); String name = null; float boost = 1.0f; boolean isNull = false; String update = null; while (true) { int event = parser.next(); switch (event) { // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(parser.getText()); break; case XMLStreamConstants.END_ELEMENT: if ("doc".equals(parser.getLocalName())) { return doc; } else if ("field".equals(parser.getLocalName())) { Object v = isNull ? null : text.toString(); if (update != null) { Map<String,Object> extendedValue = new HashMap<String,Object>(1); extendedValue.put(update, v); v = extendedValue; } doc.addField(name, v, boost); boost = 1.0f; } break; case XMLStreamConstants.START_ELEMENT: text.setLength(0); String localName = parser.getLocalName(); if (!"field".equals(localName)) { log.warn("unexpected XML tag doc/" + localName); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag doc/" + localName); } boost = 1.0f; update = null; String attrVal = ""; for (int i = 0; i < parser.getAttributeCount(); i++) { attrName = parser.getAttributeLocalName(i); attrVal = parser.getAttributeValue(i); if ("name".equals(attrName)) { name = attrVal; } else if ("boost".equals(attrName)) { boost = Float.parseFloat(attrVal); } else if ("null".equals(attrName)) { isNull = StrUtils.parseBoolean(attrVal); } else if ("update".equals(attrName)) { update = attrVal; } else { log.warn("Unknown attribute doc/field/@" + attrName); } } break; } } }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public XMLResolver asXMLResolver() { return new XMLResolver() { public Object resolveEntity(String publicId, String systemId, String baseURI, String namespace) throws XMLStreamException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, publicId, baseURI, systemId); return (src == null) ? null : src.getByteStream(); } catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); } } }; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public Object resolveEntity(String publicId, String systemId, String baseURI, String namespace) throws XMLStreamException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, publicId, baseURI, systemId); return (src == null) ? null : src.getByteStream(); } catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); } }
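Both SystemIdResolver methods above use the same translation idiom: the checked IOException that entity resolution can raise is wrapped into the XMLStreamException that the StAX XMLResolver contract allows, with the original kept as the cause. A minimal, self-contained sketch of the idiom (class and method names here are hypothetical):

    import java.io.IOException;
    import java.io.InputStream;
    import javax.xml.stream.XMLStreamException;

    class ResolverSketch {
      // Translate the callee's checked exception (IOException) into the only
      // checked exception the caller's contract permits (XMLStreamException).
      InputStream resolve(String systemId) throws XMLStreamException {
        try {
          return open(systemId);
        } catch (IOException ioe) {
          // Keep the original failure as the cause so diagnostics survive.
          throw new XMLStreamException("Cannot resolve entity", ioe);
        }
      }

      private InputStream open(String systemId) throws IOException {
        throw new IOException("unreachable resource: " + systemId); // stand-in body
      }
    }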
(Lib) NullPointerException 6
              
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
Deprecated public EmbeddedSolrServer( SolrCore core ) { if ( core == null ) { throw new NullPointerException("SolrCore instance required"); } CoreDescriptor dcore = core.getCoreDescriptor(); if (dcore == null) throw new NullPointerException("CoreDescriptor required"); CoreContainer cores = dcore.getCoreContainer(); if (cores == null) throw new NullPointerException("CoreContainer required"); coreName = dcore.getName(); coreContainer = cores; _parser = new SolrRequestParsers( null ); }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
Override public void release(Directory directory) throws IOException { if (directory == null) { throw new NullPointerException(); } close(directory); }
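Both snippets use NullPointerException as a fail-fast precondition check: a missing argument is rejected immediately, with a message naming it, rather than surfacing later as an anonymous NPE at first dereference. A sketch of the pattern (the Config type is a hypothetical stand-in):

    class Server {
      private final Config config;

      Server(Config config) {
        // Reject the bad argument at construction time, not at first use.
        if (config == null) {
          throw new NullPointerException("Config instance required");
        }
        this.config = config;
      }
    }

    class Config {}

Since Java 7, java.util.Objects.requireNonNull(config, "Config instance required") expresses the same check in one call.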
(Lib) Exception 5
              
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
private NamedList<Object> processResponse(XMLStreamReader parser) { try { NamedList<Object> response = null; for (int event = parser.next(); event != XMLStreamConstants.END_DOCUMENT; event = parser.next()) { switch (event) { case XMLStreamConstants.START_ELEMENT: if( response != null ) { throw new Exception( "already read the response!" ); } // only top-level element is "response" String name = parser.getLocalName(); if( name.equals( "response" ) || name.equals( "result" ) ) { response = readNamedList( parser ); } else if( name.equals( "solr" ) ) { return new SimpleOrderedMap<Object>(); } else { throw new Exception( "really needs to be response or result. " + "not:"+parser.getLocalName() ); } break; } } return response; } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); } finally { try { parser.close(); } catch( Exception ex ){} } }
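processResponse concentrates three decisions in one place: any parsing failure, whatever its concrete type, is rethrown as a domain SolrException with SERVER_ERROR; the parser is always closed in the finally block; and a failure of close itself is deliberately swallowed. A reduced sketch of that shape (Parser and DomainException are hypothetical stand-ins):

    class ParseSketch {
      Object parse(Parser parser) {
        try {
          return parser.read();                 // may fail in many concrete ways
        } catch (Exception ex) {
          // Collapse every low-level failure into one domain exception type.
          throw new DomainException("parsing error", ex);
        } finally {
          try { parser.close(); } catch (Exception ignored) {} // best-effort close
        }
      }
    }

    interface Parser {
      Object read() throws Exception;
      void close() throws Exception;
    }

    class DomainException extends RuntimeException {
      DomainException(String msg, Throwable cause) { super(msg, cause); }
    }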
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
Override protected void init(IndexSchema schema, Map<String, String> args) { super.init(schema, args); String implName = args.get(PARSER_IMPL); if (implName == null) { parser = new JsonPreAnalyzedParser(); } else { try { Class<?> implClazz = Class.forName(implName); if (!PreAnalyzedParser.class.isAssignableFrom(implClazz)) { throw new Exception("must implement " + PreAnalyzedParser.class.getName()); } Constructor<?> c = implClazz.getConstructor(new Class<?>[0]); parser = (PreAnalyzedParser) c.newInstance(new Object[0]); } catch (Exception e) { LOG.warn("Can't use the configured PreAnalyzedParser class '" + implName + "' (" + e.getMessage() + "), using default " + DEFAULT_IMPL); parser = new JsonPreAnalyzedParser(); } } }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
void initHandlersFromConfig(SolrConfig config ){ // use link map so we iterate in the same order Map<PluginInfo,SolrRequestHandler> handlers = new LinkedHashMap<PluginInfo,SolrRequestHandler>(); for (PluginInfo info : config.getPluginInfos(SolrRequestHandler.class.getName())) { try { SolrRequestHandler requestHandler; String startup = info.attributes.get("startup") ; if( startup != null ) { if( "lazy".equals(startup) ) { log.info("adding lazy requestHandler: " + info.className); requestHandler = new LazyRequestHandlerWrapper( core, info.className, info.initArgs ); } else { throw new Exception( "Unknown startup value: '"+startup+"' for: "+info.className ); } } else { requestHandler = core.createRequestHandler(info.className); } handlers.put(info,requestHandler); SolrRequestHandler old = register(info.name, requestHandler); if(old != null) { log.warn("Multiple requestHandler registered to the same name: " + info.name + " ignoring: " + old.getClass().getName()); } if(info.isDefault()){ old = register("",requestHandler); if(old != null) log.warn("Multiple default requestHandler registered" + " ignoring: " + old.getClass().getName()); } log.info("created "+info.name+": " + info.className); } catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); } } // we've now registered all handlers, time to init them in the same order for (Map.Entry<PluginInfo,SolrRequestHandler> entry : handlers.entrySet()) { PluginInfo info = entry.getKey(); SolrRequestHandler requestHandler = entry.getValue(); if (requestHandler instanceof PluginInfoInitialized) { ((PluginInfoInitialized) requestHandler).init(info); } else{ requestHandler.init(info.initArgs); } } if(get("") == null) register("", get("/select"));//defacto default handler if(get("") == null) register("", get("standard"));//old default handler name; TODO remove? if(get("") == null) log.warn("no default request handler is registered (either '/select' or 'standard')"); }
// in core/src/java/org/apache/solr/core/SolrCore.java
private void initWriters() { // use link map so we iterate in the same order Map<PluginInfo,QueryResponseWriter> writers = new LinkedHashMap<PluginInfo,QueryResponseWriter>(); for (PluginInfo info : solrConfig.getPluginInfos(QueryResponseWriter.class.getName())) { try { QueryResponseWriter writer; String startup = info.attributes.get("startup") ; if( startup != null ) { if( "lazy".equals(startup) ) { log.info("adding lazy queryResponseWriter: " + info.className); writer = new LazyQueryResponseWriterWrapper(this, info.className, info.initArgs ); } else { throw new Exception( "Unknown startup value: '"+startup+"' for: "+info.className ); } } else { writer = createQueryResponseWriter(info.className); } writers.put(info,writer); QueryResponseWriter old = registerResponseWriter(info.name, writer); if(old != null) { log.warn("Multiple queryResponseWriter registered to the same name: " + info.name + " ignoring: " + old.getClass().getName()); } if(info.isDefault()){ if(defaultResponseWriter != null) log.warn("Multiple default queryResponseWriter registered, using: " + info.name); defaultResponseWriter = writer; } log.info("created "+info.name+": " + info.className); } catch (Exception ex) { SolrException e = new SolrException (SolrException.ErrorCode.SERVER_ERROR, "QueryResponseWriter init failure", ex); SolrException.log(log,null,e); throw e; } } // we've now registered all handlers, time to init them in the same order for (Map.Entry<PluginInfo,QueryResponseWriter> entry : writers.entrySet()) { PluginInfo info = entry.getKey(); QueryResponseWriter writer = entry.getValue(); responseWriters.put(info.name, writer); if (writer instanceof PluginInfoInitialized) { ((PluginInfoInitialized) writer).init(info); } else{ writer.init(info.initArgs); } } NamedList emptyList = new NamedList(); for (Map.Entry<String, QueryResponseWriter> entry : DEFAULT_RESPONSE_WRITERS.entrySet()) { if(responseWriters.get(entry.getKey()) == null) { responseWriters.put(entry.getKey(), entry.getValue()); // call init so any logic in the default writers gets invoked entry.getValue().init(emptyList); } } // configure the default response writer; this one should never be null if (defaultResponseWriter == null) { defaultResponseWriter = responseWriters.get("standard"); } }
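Note the difference between the two init loops: RequestHandlers.initHandlersFromConfig wraps and rethrows in one statement, while SolrCore.initWriters builds the SolrException, logs it through SolrException.log, and only then throws the same instance, so the failure is recorded even if a caller swallows it. A sketch of the log-then-throw variant, using hypothetical names and plain java.util.logging:

    import java.util.logging.Logger;

    class InitSketch {
      private static final Logger log = Logger.getLogger(InitSketch.class.getName());

      void init() {
        try {
          load();
        } catch (Exception ex) {
          // Build the domain exception once, log it, then throw that same
          // instance: the logged stack trace matches what callers will see.
          RuntimeException e = new RuntimeException("init failure", ex);
          log.severe(e.toString());
          throw e;
        }
      }

      private void load() throws Exception { throw new Exception("boom"); }
    }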
              
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { Parser parser = null; String streamType = req.getParams().get(ExtractingParams.STREAM_TYPE, null); if (streamType != null) { //Cache? Parsers are lightweight to construct and thread-safe, so I'm told MediaType mt = MediaType.parse(streamType.trim().toLowerCase(Locale.ENGLISH)); parser = new DefaultParser(config.getMediaTypeRegistry()).getParsers().get(mt); } else { parser = autoDetectParser; } if (parser != null) { Metadata metadata = new Metadata(); // If you specify the resource name (the filename, roughly) with this parameter, // then Tika can make use of it in guessing the appropriate MIME type: String resourceName = req.getParams().get(ExtractingParams.RESOURCE_NAME, null); if (resourceName != null) { metadata.add(TikaMetadataKeys.RESOURCE_NAME_KEY, resourceName); } // Provide stream's content type as hint for auto detection if(stream.getContentType() != null) { metadata.add(HttpHeaders.CONTENT_TYPE, stream.getContentType()); } InputStream inputStream = null; try { inputStream = stream.getStream(); metadata.add(ExtractingMetadataConstants.STREAM_NAME, stream.getName()); metadata.add(ExtractingMetadataConstants.STREAM_SOURCE_INFO, stream.getSourceInfo()); metadata.add(ExtractingMetadataConstants.STREAM_SIZE, String.valueOf(stream.getSize())); metadata.add(ExtractingMetadataConstants.STREAM_CONTENT_TYPE, stream.getContentType()); // HtmlParser and TXTParser regard Metadata.CONTENT_ENCODING in metadata String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); if(charset != null){ metadata.add(HttpHeaders.CONTENT_ENCODING, charset); } String xpathExpr = params.get(ExtractingParams.XPATH_EXPRESSION); boolean extractOnly = params.getBool(ExtractingParams.EXTRACT_ONLY, false); SolrContentHandler handler = factory.createSolrContentHandler(metadata, params, schema); ContentHandler parsingHandler = handler; StringWriter writer = null; BaseMarkupSerializer serializer = null; if (extractOnly == true) { String extractFormat = params.get(ExtractingParams.EXTRACT_FORMAT, "xml"); writer = new StringWriter(); if (extractFormat.equals(TEXT_FORMAT)) { serializer = new TextSerializer(); serializer.setOutputCharStream(writer); serializer.setOutputFormat(new OutputFormat("Text", "UTF-8", true)); } else { serializer = new XMLSerializer(writer, new OutputFormat("XML", "UTF-8", true)); } if (xpathExpr != null) { Matcher matcher = PARSER.parse(xpathExpr); serializer.startDocument();//The MatchingContentHandler does not invoke startDocument. See http://tika.markmail.org/message/kknu3hw7argwiqin parsingHandler = new MatchingContentHandler(serializer, matcher); } else { parsingHandler = serializer; } } else if (xpathExpr != null) { Matcher matcher = PARSER.parse(xpathExpr); parsingHandler = new MatchingContentHandler(handler, matcher); } //else leave it as is try{ //potentially use a wrapper handler for parsing, but we still need the SolrContentHandler for getting the document. ParseContext context = new ParseContext();//TODO: should we design a way to pass in parse context? parser.parse(inputStream, parsingHandler, metadata, context); } catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } if (extractOnly == false) { addDoc(handler); } else { //serializer is not null, so we need to call endDoc on it if using xpath if (xpathExpr != null){ serializer.endDocument(); } rsp.add(stream.getName(), writer.toString()); writer.close(); String[] names = metadata.names(); NamedList metadataNL = new NamedList(); for (int i = 0; i < names.length; i++) { String[] vals = metadata.getValues(names[i]); metadataNL.add(names[i], vals); } rsp.add(stream.getName() + "_metadata", metadataNL); } } catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } finally { IOUtils.closeQuietly(inputStream); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Stream type of " + streamType + " didn't match any known parsers. Please supply the " + ExtractingParams.STREAM_TYPE + " parameter."); } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
public void addPartToDocument(Part part, Map<String, Object> row, boolean outerMost) throws Exception { if (part instanceof Message) { addEnvelopToDocument(part, row); } String ct = part.getContentType(); ContentType ctype = new ContentType(ct); if (part.isMimeType("multipart/*")) { Multipart mp = (Multipart) part.getContent(); int count = mp.getCount(); if (part.isMimeType("multipart/alternative")) count = 1; for (int i = 0; i < count; i++) addPartToDocument(mp.getBodyPart(i), row, false); } else if (part.isMimeType("message/rfc822")) { addPartToDocument((Part) part.getContent(), row, false); } else { String disp = part.getDisposition(); if (!processAttachment || (disp != null && disp.equalsIgnoreCase(Part.ATTACHMENT))) return; InputStream is = part.getInputStream(); String fileName = part.getFileName(); Metadata md = new Metadata(); md.set(HttpHeaders.CONTENT_TYPE, ctype.getBaseType().toLowerCase(Locale.ENGLISH)); md.set(TikaMetadataKeys.RESOURCE_NAME_KEY, fileName); String content = tika.parseToString(is, md); if (disp != null && disp.equalsIgnoreCase(Part.ATTACHMENT)) { if (row.get(ATTACHMENT) == null) row.put(ATTACHMENT, new ArrayList<String>()); List<String> contents = (List<String>) row.get(ATTACHMENT); contents.add(content); row.put(ATTACHMENT, contents); if (row.get(ATTACHMENT_NAMES) == null) row.put(ATTACHMENT_NAMES, new ArrayList<String>()); List<String> names = (List<String>) row.get(ATTACHMENT_NAMES); names.add(fileName); row.put(ATTACHMENT_NAMES, names); } else { if (row.get(CONTENT) == null) row.put(CONTENT, new ArrayList<String>()); List<String> contents = (List<String>) row.get(CONTENT); contents.add(content); row.put(CONTENT, contents); } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
protected Callable<Connection> createConnectionFactory(final Context context, final Properties initProps) { // final VariableResolver resolver = context.getVariableResolver(); resolveVariables(context, initProps); final String jndiName = initProps.getProperty(JNDI_NAME); final String url = initProps.getProperty(URL); final String driver = initProps.getProperty(DRIVER); if (url == null && jndiName == null) throw new DataImportHandlerException(SEVERE, "JDBC URL or JNDI name has to be specified"); if (driver != null) { try { DocBuilder.loadClass(driver, context.getSolrCore()); } catch (ClassNotFoundException e) { wrapAndThrow(SEVERE, e, "Could not load driver: " + driver); } } else { if(jndiName == null){ throw new DataImportHandlerException(SEVERE, "One of driver or jndiName must be specified in the data source"); } } String s = initProps.getProperty("maxRows"); if (s != null) { maxRows = Integer.parseInt(s); } return factory = new Callable<Connection>() { public Connection call() throws Exception { LOG.info("Creating a connection for entity " + context.getEntityAttribute(DataImporter.NAME) + " with URL: " + url); long start = System.currentTimeMillis(); Connection c = null; try { if(url != null){ c = DriverManager.getConnection(url, initProps); } else if(jndiName != null){ InitialContext ctx = new InitialContext(); Object jndival = ctx.lookup(jndiName); if (jndival instanceof javax.sql.DataSource) { javax.sql.DataSource dataSource = (javax.sql.DataSource) jndival; String user = (String) initProps.get("user"); String pass = (String) initProps.get("password"); if(user == null || user.trim().equals("")){ c = dataSource.getConnection(); } else { c = dataSource.getConnection(user, pass); } } else { throw new DataImportHandlerException(SEVERE, "the jndi name : '"+jndiName +"' is not a valid javax.sql.DataSource"); } } } catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); } if (c != null) { if (Boolean.parseBoolean(initProps.getProperty("readOnly"))) { c.setReadOnly(true); // Add other sane defaults c.setAutoCommit(true); c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } if (!Boolean.parseBoolean(initProps.getProperty("autoCommit"))) { c.setAutoCommit(false); } String transactionIsolation = initProps.getProperty("transactionIsolation"); if ("TRANSACTION_READ_UNCOMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); } else if ("TRANSACTION_READ_COMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED); } else if ("TRANSACTION_REPEATABLE_READ".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_REPEATABLE_READ); } else if ("TRANSACTION_SERIALIZABLE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE); } else if ("TRANSACTION_NONE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_NONE); } String holdability = initProps.getProperty("holdability"); if ("CLOSE_CURSORS_AT_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } else if ("HOLD_CURSORS_OVER_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.HOLD_CURSORS_OVER_COMMIT); } } LOG.info("Time taken for getConnection(): " + (System.currentTimeMillis() - start)); return c; } }; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
public Connection call() throws Exception { LOG.info("Creating a connection for entity " + context.getEntityAttribute(DataImporter.NAME) + " with URL: " + url); long start = System.currentTimeMillis(); Connection c = null; try { if(url != null){ c = DriverManager.getConnection(url, initProps); } else if(jndiName != null){ InitialContext ctx = new InitialContext(); Object jndival = ctx.lookup(jndiName); if (jndival instanceof javax.sql.DataSource) { javax.sql.DataSource dataSource = (javax.sql.DataSource) jndival; String user = (String) initProps.get("user"); String pass = (String) initProps.get("password"); if(user == null || user.trim().equals("")){ c = dataSource.getConnection(); } else { c = dataSource.getConnection(user, pass); } } else { throw new DataImportHandlerException(SEVERE, "the jndi name : '"+jndiName +"' is not a valid javax.sql.DataSource"); } } } catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); } if (c != null) { if (Boolean.parseBoolean(initProps.getProperty("readOnly"))) { c.setReadOnly(true); // Add other sane defaults c.setAutoCommit(true); c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } if (!Boolean.parseBoolean(initProps.getProperty("autoCommit"))) { c.setAutoCommit(false); } String transactionIsolation = initProps.getProperty("transactionIsolation"); if ("TRANSACTION_READ_UNCOMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); } else if ("TRANSACTION_READ_COMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED); } else if ("TRANSACTION_REPEATABLE_READ".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_REPEATABLE_READ); } else if ("TRANSACTION_SERIALIZABLE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE); } else if ("TRANSACTION_NONE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_NONE); } String holdability = initProps.getProperty("holdability"); if ("CLOSE_CURSORS_AT_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } else if ("HOLD_CURSORS_OVER_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.HOLD_CURSORS_OVER_COMMIT); } } LOG.info("Time taken for getConnection(): " + (System.currentTimeMillis() - start)); return c; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
private Connection getConnection() throws Exception { long currTime = System.currentTimeMillis(); if (currTime - connLastUsed > CONN_TIME_OUT) { synchronized (this) { Connection tmpConn = factory.call(); closeConnection(); connLastUsed = System.currentTimeMillis(); return conn = tmpConn; } } else { connLastUsed = currTime; return conn; } }
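getConnection layers a time-based cache over the Callable factory built above: a connection idle longer than CONN_TIME_OUT is replaced by invoking the factory again, otherwise the cached one is reused and its timestamp refreshed. A minimal sketch of that reuse policy (generic names; the real code also closes the stale connection before swapping):

    import java.util.concurrent.Callable;

    class TimedCache<T> {
      private final Callable<T> factory;
      private final long timeoutMs;
      private T cached;
      private long lastUsed;

      TimedCache(Callable<T> factory, long timeoutMs) {
        this.factory = factory;
        this.timeoutMs = timeoutMs;
      }

      synchronized T get() throws Exception {
        long now = System.currentTimeMillis();
        if (cached == null || now - lastUsed > timeoutMs) {
          cached = factory.call();   // absent or stale: rebuild via the factory
        }
        lastUsed = now;              // refresh the timestamp on every access
        return cached;
      }
    }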
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
Override protected NamedList doAnalysis(SolrQueryRequest req) throws Exception { DocumentAnalysisRequest analysisRequest = resolveAnalysisRequest(req); return handleAnalysisRequest(analysisRequest, req.getSchema()); }
// in core/src/java/org/apache/solr/handler/UpdateRequestHandler.java
Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { String type = req.getParams().get(UpdateParams.ASSUME_CONTENT_TYPE); if(type == null) { type = stream.getContentType(); } if( type == null ) { // Normal requests will not get here. throw new SolrException(ErrorCode.BAD_REQUEST, "Missing ContentType"); } int idx = type.indexOf(';'); if(idx>0) { type = type.substring(0,idx); } ContentStreamLoader loader = loaders.get(type); if(loader==null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unsupported ContentType: " +type+ " Not in: "+loaders.keySet()); } if(loader.getDefaultWT()!=null) { setDefaultWT(req,loader); } loader.load(req, rsp, stream, processor); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { rsp.add("analysis", doAnalysis(req)); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void terminateAndWaitFsyncService() throws Exception { if (fsyncService.isTerminated()) return; fsyncService.shutdown(); // give a long wait say 1 hr fsyncService.awaitTermination(3600, TimeUnit.SECONDS); // if any fsync failed, throw that exception back Exception fsyncExceptionCopy = fsyncException; if (fsyncExceptionCopy != null) throw fsyncExceptionCopy; }
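terminateAndWaitFsyncService shows how a failure on a background fsync task reaches the coordinating thread: the task records the exception in a field (fetchFile below sets fsyncException from inside its Runnable), and after awaitTermination the coordinator rethrows the recorded copy. A sketch of the handoff with hypothetical names:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    class HandoffSketch {
      private final ExecutorService service = Executors.newSingleThreadExecutor();
      private volatile Exception backgroundFailure;  // written by the task

      void runTask() {
        service.submit(() -> {
          try {
            work();
          } catch (Exception e) {
            backgroundFailure = e;  // cannot throw across threads; record instead
          }
        });
      }

      void awaitAndRethrow() throws Exception {
        service.shutdown();
        service.awaitTermination(1, TimeUnit.HOURS);
        Exception copy = backgroundFailure;          // read the field once
        if (copy != null) throw copy;
      }

      private void work() throws Exception {}
    }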
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void downloadConfFiles(List<Map<String, Object>> confFilesToDownload, long latestGeneration) throws Exception { LOG.info("Starting download of configuration files from master: " + confFilesToDownload); confFilesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>()); File tmpconfDir = new File(solrCore.getResourceLoader().getConfigDir(), "conf." + getDateAsStr(new Date())); try { boolean status = tmpconfDir.mkdirs(); if (!status) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to create temporary config folder: " + tmpconfDir.getName()); } for (Map<String, Object> file : confFilesToDownload) { String saveAs = (String) (file.get(ALIAS) == null ? file.get(NAME) : file.get(ALIAS)); fileFetcher = new FileFetcher(tmpconfDir, file, saveAs, true, latestGeneration); currentFile = file; fileFetcher.fetchFile(); confFilesDownloaded.add(new HashMap<String, Object>(file)); } // this is called before copying the files to the original conf dir // so that if there is an exception avoid corrupting the original files. terminateAndWaitFsyncService(); copyTmpConfFiles2Conf(tmpconfDir); } finally { delTree(tmpconfDir); } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void downloadIndexFiles(boolean downloadCompleteIndex, File tmpIdxDir, long latestGeneration) throws Exception { for (Map<String, Object> file : filesToDownload) { File localIndexFile = new File(solrCore.getIndexDir(), (String) file.get(NAME)); if (!localIndexFile.exists() || downloadCompleteIndex) { fileFetcher = new FileFetcher(tmpIdxDir, file, (String) file.get(NAME), false, latestGeneration); currentFile = file; fileFetcher.fetchFile(); filesDownloaded.add(new HashMap<String, Object>(file)); } else { LOG.info("Skipping download for " + localIndexFile); } } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
void fetchFile() throws Exception { try { while (true) { final FastInputStream is = getStream(); int result; try { //fetch packets one by one in a single request result = fetchPackets(is); if (result == 0 || result == NO_CONTENT) { // if the file is downloaded properly set the // timestamp same as that in the server if (file.exists() && lastmodified > 0) file.setLastModified(lastmodified); return; } //if there is an error continue. But continue from the point where it got broken } finally { IOUtils.closeQuietly(is); } } } finally { cleanup(); //if cleanup succeeds, the file is downloaded fully; do an fsync fsyncService.submit(new Runnable(){ public void run() { try { FileUtils.sync(file); } catch (IOException e) { fsyncException = e; } } }); } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private int fetchPackets(FastInputStream fis) throws Exception { byte[] intbytes = new byte[4]; byte[] longbytes = new byte[8]; try { while (true) { if (stop) { stop = false; aborted = true; throw new ReplicationHandlerException("User aborted replication"); } long checkSumServer = -1; fis.readFully(intbytes); //read the size of the packet int packetSize = readInt(intbytes); if (packetSize <= 0) { LOG.warn("No content received for file: " + currentFile); return NO_CONTENT; } if (buf.length < packetSize) buf = new byte[packetSize]; if (checksum != null) { //read the checksum fis.readFully(longbytes); checkSumServer = readLong(longbytes); } //then read the packet of bytes fis.readFully(buf, 0, packetSize); //compare the checksum as sent from the master if (includeChecksum) { checksum.reset(); checksum.update(buf, 0, packetSize); long checkSumClient = checksum.getValue(); if (checkSumClient != checkSumServer) { LOG.error("Checksum not matched between client and server for: " + currentFile); //if checksum is wrong it is a problem return for retry return 1; } } //if everything is fine, write down the packet to the file fileChannel.write(ByteBuffer.wrap(buf, 0, packetSize)); bytesDownloaded += packetSize; if (bytesDownloaded >= size) return 0; //errorCount is always set to zero after a successful packet errorCount = 0; } } catch (ReplicationHandlerException e) { throw e; } catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure, increment the error count errorCount++; //if it fails for the same packet for MAX_RETRIES, fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; } }
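fetchPackets combines two idioms: a domain exception used for control flow (ReplicationHandlerException, signalling a user abort) is caught and rethrown untouched so the generic handler cannot swallow it, while every other failure increments an error counter and only escalates to a SolrException once MAX_RETRIES is exceeded. A reduced sketch (AbortException is a hypothetical stand-in):

    class RetrySketch {
      private static final int MAX_RETRIES = 5;
      private int errorCount;

      // Returns 0 on success, -1 to ask the caller to retry.
      int step() {
        try {
          doWork();
          errorCount = 0;            // any success resets the retry budget
          return 0;
        } catch (AbortException e) {
          throw e;                   // deliberate aborts pass through unchanged
        } catch (Exception e) {
          if (++errorCount > MAX_RETRIES) {
            throw new RuntimeException("giving up after " + MAX_RETRIES + " retries", e);
          }
          return -1;                 // transient failure: let the caller retry
        }
      }

      private void doWork() throws Exception {}
    }

    class AbortException extends RuntimeException {
      AbortException(String msg) { super(msg); }
    }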
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { final String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); InputStream is = null; XMLStreamReader parser = null; String tr = req.getParams().get(CommonParams.TR,null); if(tr!=null) { Transformer t = getTransformer(tr,req); final DOMResult result = new DOMResult(); // first step: read XML and build DOM using Transformer (this is no overhead, as XSL always produces // an internal result DOM tree, we just access it directly as input for StAX): try { is = stream.getStream(); final InputSource isrc = new InputSource(is); isrc.setEncoding(charset); final SAXSource source = new SAXSource(isrc); t.transform(source, result); } catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); } finally { IOUtils.closeQuietly(is); } // second step feed the intermediate DOM tree into StAX parser: try { parser = inputFactory.createXMLStreamReader(new DOMSource(result.getNode())); this.processUpdate(req, processor, parser); } catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); } finally { if (parser != null) parser.close(); } } // Normal XML Loader else { try { is = stream.getStream(); if (UpdateRequestHandler.log.isTraceEnabled()) { final byte[] body = IOUtils.toByteArray(is); // TODO: The charset may be wrong, as the real charset is later // determined by the XML parser, the content-type is only used as a hint! UpdateRequestHandler.log.trace("body", new String(body, (charset == null) ? ContentStreamBase.DEFAULT_CHARSET : charset)); IOUtils.closeQuietly(is); is = new ByteArrayInputStream(body); } parser = (charset == null) ? inputFactory.createXMLStreamReader(is) : inputFactory.createXMLStreamReader(is, charset); this.processUpdate(req, processor, parser); } catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); } finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); } } }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { InputStream is = null; try { is = stream.getStream(); parseAndLoadDocs(req, rsp, is, processor); } finally { if(is != null) { is.close(); } } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { new SingleThreadedJsonLoader(req,processor).load(req, rsp, stream, processor); }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { Reader reader = null; try { reader = stream.getReader(); if (log.isTraceEnabled()) { String body = IOUtils.toString(reader); log.trace("body", body); reader = new StringReader(body); } parser = new JSONParser(reader); this.processUpdate(); } finally { IOUtils.closeQuietly(reader); } }
// in core/src/java/org/apache/solr/handler/loader/CSVLoader.java
Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { new SingleThreadedCSVLoader(req,processor).load(req, rsp, stream, processor); }
// in core/src/java/org/apache/solr/handler/FieldAnalysisRequestHandler.java
Override protected NamedList doAnalysis(SolrQueryRequest req) throws Exception { FieldAnalysisRequest analysisRequest = resolveAnalysisRequest(req); IndexSchema indexSchema = req.getCore().getSchema(); return handleAnalysisRequest(analysisRequest, indexSchema); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { rsp.setHttpCaching(false); final SolrParams solrParams = req.getParams(); String command = solrParams.get(COMMAND); if (command == null) { rsp.add(STATUS, OK_STATUS); rsp.add("message", "No command"); return; } // This command does not give the current index version of the master // It gives the current 'replicateable' index version if (command.equals(CMD_INDEX_VERSION)) { IndexCommit commitPoint = indexCommitPoint; // make a copy so it won't change if (commitPoint == null) { // if this handler is 'lazy', we may not have tracked the last commit // because our commit listener is registered on inform commitPoint = core.getDeletionPolicy().getLatestCommit(); } if (commitPoint != null && replicationEnabled.get()) { // // There is a race condition here. The commit point may be changed / deleted by the time // we get around to reserving it. This is a very small window though, and should not result // in a catastrophic failure, but will result in the client getting an empty file list for // the CMD_GET_FILE_LIST command. // core.getDeletionPolicy().setReserveDuration(commitPoint.getGeneration(), reserveCommitDuration); rsp.add(CMD_INDEX_VERSION, IndexDeletionPolicyWrapper.getCommitTimestamp(commitPoint)); rsp.add(GENERATION, commitPoint.getGeneration()); } else { // This happens when replication is not configured to happen after startup and no commit/optimize // has happened yet. rsp.add(CMD_INDEX_VERSION, 0L); rsp.add(GENERATION, 0L); } } else if (command.equals(CMD_GET_FILE)) { getFileStream(solrParams, rsp); } else if (command.equals(CMD_GET_FILE_LIST)) { getFileList(solrParams, rsp); } else if (command.equalsIgnoreCase(CMD_BACKUP)) { doSnapShoot(new ModifiableSolrParams(solrParams), rsp,req); rsp.add(STATUS, OK_STATUS); } else if (command.equalsIgnoreCase(CMD_FETCH_INDEX)) { String masterUrl = solrParams.get(MASTER_URL); if (!isSlave && masterUrl == null) { rsp.add(STATUS,ERR_STATUS); rsp.add("message","No slave configured or no 'masterUrl' Specified"); return; } final SolrParams paramsCopy = new ModifiableSolrParams(solrParams); new Thread() { @Override public void run() { doFetch(paramsCopy, false); } }.start(); rsp.add(STATUS, OK_STATUS); } else if (command.equalsIgnoreCase(CMD_DISABLE_POLL)) { if (snapPuller != null){ snapPuller.disablePoll(); rsp.add(STATUS, OK_STATUS); } else { rsp.add(STATUS, ERR_STATUS); rsp.add("message","No slave configured"); } } else if (command.equalsIgnoreCase(CMD_ENABLE_POLL)) { if (snapPuller != null){ snapPuller.enablePoll(); rsp.add(STATUS, OK_STATUS); }else { rsp.add(STATUS,ERR_STATUS); rsp.add("message","No slave configured"); } } else if (command.equalsIgnoreCase(CMD_ABORT_FETCH)) { SnapPuller temp = tempSnapPuller; if (temp != null){ temp.abortPull(); rsp.add(STATUS, OK_STATUS); } else { rsp.add(STATUS,ERR_STATUS); rsp.add("message","No slave configured"); } } else if (command.equals(CMD_FILE_CHECKSUM)) { // this command is not used by anyone getFileChecksum(solrParams, rsp); } else if (command.equals(CMD_SHOW_COMMITS)) { rsp.add(CMD_SHOW_COMMITS, getCommits()); } else if (command.equals(CMD_DETAILS)) { rsp.add(CMD_DETAILS, getReplicationDetails(solrParams.getBool("slave",true))); RequestHandlerUtils.addExperimentalFormatWarning(rsp); } else if (CMD_ENABLE_REPL.equalsIgnoreCase(command)) { replicationEnabled.set(true); rsp.add(STATUS, OK_STATUS); } else if (CMD_DISABLE_REPL.equalsIgnoreCase(command)) { replicationEnabled.set(false); rsp.add(STATUS, OK_STATUS); } }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); // Set field flags ReturnFields returnFields = new ReturnFields( req ); rsp.setReturnFields( returnFields ); int flags = 0; if (returnFields.wantsScore()) { flags |= SolrIndexSearcher.GET_SCORES; } String defType = params.get(QueryParsing.DEFTYPE, QParserPlugin.DEFAULT_QTYPE); String q = params.get( CommonParams.Q ); Query query = null; SortSpec sortSpec = null; List<Query> filters = null; try { if (q != null) { QParser parser = QParser.getParser(q, defType, req); query = parser.getQuery(); sortSpec = parser.getSort(true); } String[] fqs = req.getParams().getParams(CommonParams.FQ); if (fqs!=null && fqs.length!=0) { filters = new ArrayList<Query>(); for (String fq : fqs) { if (fq != null && fq.trim().length()!=0) { QParser fqp = QParser.getParser(fq, null, req); filters.add(fqp.getQuery()); } } } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } SolrIndexSearcher searcher = req.getSearcher(); MoreLikeThisHelper mlt = new MoreLikeThisHelper( params, searcher ); // Hold on to the interesting terms if relevant TermStyle termStyle = TermStyle.get( params.get( MoreLikeThisParams.INTERESTING_TERMS ) ); List<InterestingTerm> interesting = (termStyle == TermStyle.NONE ) ? null : new ArrayList<InterestingTerm>( mlt.mlt.getMaxQueryTerms() ); DocListAndSet mltDocs = null; // Parse Required Params // This will either have a single Reader or valid query Reader reader = null; try { if (q == null || q.trim().length() < 1) { Iterable<ContentStream> streams = req.getContentStreams(); if (streams != null) { Iterator<ContentStream> iter = streams.iterator(); if (iter.hasNext()) { reader = iter.next().getReader(); } if (iter.hasNext()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "MoreLikeThis does not support multiple ContentStreams"); } } } int start = params.getInt(CommonParams.START, 0); int rows = params.getInt(CommonParams.ROWS, 10); // Find documents MoreLikeThis - either with a reader or a query // -------------------------------------------------------------------------------- if (reader != null) { mltDocs = mlt.getMoreLikeThis(reader, start, rows, filters, interesting, flags); } else if (q != null) { // Matching options boolean includeMatch = params.getBool(MoreLikeThisParams.MATCH_INCLUDE, true); int matchOffset = params.getInt(MoreLikeThisParams.MATCH_OFFSET, 0); // Find the base match DocList match = searcher.getDocList(query, null, null, matchOffset, 1, flags); // only get the first one... if (includeMatch) { rsp.add("match", match); } // This is an iterator, but we only handle the first match DocIterator iterator = match.iterator(); if (iterator.hasNext()) { // do a MoreLikeThis query for each document in results int id = iterator.nextDoc(); mltDocs = mlt.getMoreLikeThis(id, start, rows, filters, interesting, flags); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "MoreLikeThis requires either a query (?q=) or text to find similar documents."); } } finally { if (reader != null) { reader.close(); } } if( mltDocs == null ) { mltDocs = new DocListAndSet(); // avoid NPE } rsp.add( "response", mltDocs.docList ); if( interesting != null ) { if( termStyle == TermStyle.DETAILS ) { NamedList<Float> it = new NamedList<Float>(); for( InterestingTerm t : interesting ) { it.add( t.term.toString(), t.boost ); } rsp.add( "interestingTerms", it ); } else { List<String> it = new ArrayList<String>( interesting.size() ); for( InterestingTerm t : interesting ) { it.add( t.term.text()); } rsp.add( "interestingTerms", it ); } } // maybe facet the results if (params.getBool(FacetParams.FACET,false)) { if( mltDocs.docSet == null ) { rsp.add( "facet_counts", null ); } else { SimpleFacets f = new SimpleFacets(req, mltDocs.docSet, params ); rsp.add( "facet_counts", f.getFacetCounts() ); } } boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false); boolean dbgQuery = false, dbgResults = false; if (dbg == false){//if it's true, we are doing everything anyway. String[] dbgParams = req.getParams().getParams(CommonParams.DEBUG); if (dbgParams != null) { for (int i = 0; i < dbgParams.length; i++) { if (dbgParams[i].equals(CommonParams.QUERY)){ dbgQuery = true; } else if (dbgParams[i].equals(CommonParams.RESULTS)){ dbgResults = true; } } } } else { dbgQuery = true; dbgResults = true; } // Copied from StandardRequestHandler... perhaps it should be added to doStandardDebug? if (dbg == true) { try { NamedList<Object> dbgInfo = SolrPluginUtils.doStandardDebug(req, q, mlt.getRawMLTQuery(), mltDocs.docList, dbgQuery, dbgResults); if (null != dbgInfo) { if (null != filters) { dbgInfo.add("filter_queries",req.getParams().getParams(CommonParams.FQ)); List<String> fqs = new ArrayList<String>(filters.size()); for (Query fq : filters) { fqs.add(QueryParsing.toString(fq, req.getSchema())); } dbgInfo.add("parsed_filter_queries",fqs); } rsp.add("debug", dbgInfo); } } catch (Exception e) { SolrException.log(SolrCore.log, "Exception during debug", e); rsp.add("exception_during_debug", SolrException.toStr(e)); } } }
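The tail of this handler is a deliberate soft failure: debug output is optional, so an exception while assembling it is logged and surfaced inside the response (under "exception_during_debug") rather than failing a request whose results were already computed. A sketch of the idea (names hypothetical):

    import java.util.HashMap;
    import java.util.Map;

    class SoftFailSketch {
      Map<String, Object> respond(boolean debug) {
        Map<String, Object> rsp = new HashMap<>();
        rsp.put("response", computeResults());    // the essential part
        if (debug) {
          try {
            rsp.put("debug", computeDebugInfo()); // the optional part
          } catch (Exception e) {
            // Do not fail the whole request over optional output;
            // report the problem inside the response instead.
            rsp.put("exception_during_debug", e.toString());
          }
        }
        return rsp;
      }

      private Object computeResults() { return "docs"; }
      private Object computeDebugInfo() throws Exception { throw new Exception("debug failed"); }
    }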
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
public void submit(final ShardRequest sreq, final String shard, final ModifiableSolrParams params) { // do this outside of the callable for thread safety reasons final List<String> urls = getURLs(shard); Callable<ShardResponse> task = new Callable<ShardResponse>() { public ShardResponse call() throws Exception { ShardResponse srsp = new ShardResponse(); srsp.setShardRequest(sreq); srsp.setShard(shard); SimpleSolrResponse ssr = new SimpleSolrResponse(); srsp.setSolrResponse(ssr); long startTime = System.currentTimeMillis(); try { params.remove(CommonParams.WT); // use default (currently javabin) params.remove(CommonParams.VERSION); // SolrRequest req = new QueryRequest(SolrRequest.METHOD.POST, "/select"); // use generic request to avoid extra processing of queries QueryRequest req = new QueryRequest(params); req.setMethod(SolrRequest.METHOD.POST); // no need to set the response parser as binary is the default // req.setResponseParser(new BinaryResponseParser()); // if there are no shards available for a slice, urls.size()==0 if (urls.size()==0) { // TODO: what's the right error code here? We should use the same thing when // all of the servers for a shard are down. throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "no servers hosting shard: " + shard); } if (urls.size() <= 1) { String url = urls.get(0); srsp.setShardAddress(url); SolrServer server = new HttpSolrServer(url, httpClient); ssr.nl = server.request(req); } else { LBHttpSolrServer.Rsp rsp = httpShardHandlerFactory.loadbalancer.request(new LBHttpSolrServer.Req(req, urls)); ssr.nl = rsp.getResponse(); srsp.setShardAddress(rsp.getServer()); } } catch( ConnectException cex ) { srsp.setException(cex); //???? } catch (Throwable th) { srsp.setException(th); if (th instanceof SolrException) { srsp.setResponseCode(((SolrException)th).code()); } else { srsp.setResponseCode(-1); } } ssr.elapsedTime = System.currentTimeMillis() - startTime; return srsp; } }; pending.add( completionService.submit(task) ); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
public ShardResponse call() throws Exception { ShardResponse srsp = new ShardResponse(); srsp.setShardRequest(sreq); srsp.setShard(shard); SimpleSolrResponse ssr = new SimpleSolrResponse(); srsp.setSolrResponse(ssr); long startTime = System.currentTimeMillis(); try { params.remove(CommonParams.WT); // use default (currently javabin) params.remove(CommonParams.VERSION); // SolrRequest req = new QueryRequest(SolrRequest.METHOD.POST, "/select"); // use generic request to avoid extra processing of queries QueryRequest req = new QueryRequest(params); req.setMethod(SolrRequest.METHOD.POST); // no need to set the response parser as binary is the default // req.setResponseParser(new BinaryResponseParser()); // if there are no shards available for a slice, urls.size()==0 if (urls.size()==0) { // TODO: what's the right error code here? We should use the same thing when // all of the servers for a shard are down. throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "no servers hosting shard: " + shard); } if (urls.size() <= 1) { String url = urls.get(0); srsp.setShardAddress(url); SolrServer server = new HttpSolrServer(url, httpClient); ssr.nl = server.request(req); } else { LBHttpSolrServer.Rsp rsp = httpShardHandlerFactory.loadbalancer.request(new LBHttpSolrServer.Req(req, urls)); ssr.nl = rsp.getResponse(); srsp.setShardAddress(rsp.getServer()); } } catch( ConnectException cex ) { srsp.setException(cex); //???? } catch (Throwable th) { srsp.setException(th); if (th instanceof SolrException) { srsp.setResponseCode(((SolrException)th).code()); } else { srsp.setResponseCode(-1); } } ssr.elapsedTime = System.currentTimeMillis() - startTime; return srsp; }
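Both versions of the shard callable let no exception escape: ConnectException and any other Throwable are stored on the ShardResponse, with a response code derived from the exception type, so the coordinating loop in SearchHandler (below) can decide per shard whether to rethrow or tolerate. A compact sketch of the result-carries-its-failure shape:

    import java.util.concurrent.Callable;

    class ShardCall implements Callable<ShardCall.Result> {
      static class Result {
        Object value;
        Throwable failure;   // null means success
        int code = -1;
      }

      @Override
      public Result call() { // note: narrows Callable's "throws Exception" away
        Result r = new Result();
        try {
          r.value = query();
        } catch (Throwable th) {
          r.failure = th;    // never propagate; the collector decides later
        }
        return r;
      }

      private Object query() throws Exception { return "response"; }
    }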
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
Map<String, ElevationObj> getElevationMap(IndexReader reader, SolrCore core) throws Exception { synchronized (elevationCache) { Map<String, ElevationObj> map = elevationCache.get(null); if (map != null) return map; map = elevationCache.get(reader); if (map == null) { String f = initArgs.get(CONFIG_FILE); if (f == null) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "QueryElevationComponent must specify argument: " + CONFIG_FILE); } log.info("Loading QueryElevation from data dir: " + f); Config cfg; ZkController zkController = core.getCoreDescriptor().getCoreContainer().getZkController(); if (zkController != null) { cfg = new Config(core.getResourceLoader(), f, null, null); } else { InputStream is = VersionedFile.getLatestFile(core.getDataDir(), f); cfg = new Config(core.getResourceLoader(), f, new InputSource(is), null); } map = loadElevationMap(cfg); elevationCache.put(reader, map); } return map; } }
// in core/src/java/org/apache/solr/handler/component/SearchHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception, ParseException, InstantiationException, IllegalAccessException { // int sleep = req.getParams().getInt("sleep",0); // if (sleep > 0) {log.error("SLEEPING for " + sleep); Thread.sleep(sleep);} ResponseBuilder rb = new ResponseBuilder(req, rsp, components); if (rb.requestInfo != null) { rb.requestInfo.setResponseBuilder(rb); } boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false); rb.setDebug(dbg); if (dbg == false){//if it's true, we are doing everything anyway. SolrPluginUtils.getDebugInterests(req.getParams().getParams(CommonParams.DEBUG), rb); } final RTimer timer = rb.isDebug() ? new RTimer() : null; ShardHandler shardHandler1 = shardHandlerFactory.getShardHandler(); shardHandler1.checkDistributed(rb); if (timer == null) { // non-debugging prepare phase for( SearchComponent c : components ) { c.prepare(rb); } } else { // debugging prepare phase RTimer subt = timer.sub( "prepare" ); for( SearchComponent c : components ) { rb.setTimer( subt.sub( c.getName() ) ); c.prepare(rb); rb.getTimer().stop(); } subt.stop(); } if (!rb.isDistrib) { // a normal non-distributed request // The semantics of debugging vs not debugging are different enough that // it makes sense to have two control loops if(!rb.isDebug()) { // Process for( SearchComponent c : components ) { c.process(rb); } } else { // Process RTimer subt = timer.sub( "process" ); for( SearchComponent c : components ) { rb.setTimer( subt.sub( c.getName() ) ); c.process(rb); rb.getTimer().stop(); } subt.stop(); timer.stop(); // add the timing info if (rb.isDebugTimings()) { rb.addDebugInfo("timing", timer.asNamedList() ); } } } else { // a distributed request if (rb.outgoing == null) { rb.outgoing = new LinkedList<ShardRequest>(); } rb.finished = new ArrayList<ShardRequest>(); int nextStage = 0; do { rb.stage = nextStage; nextStage = ResponseBuilder.STAGE_DONE; // call all components for( SearchComponent c : components ) { // the next stage is the minimum of what all components report nextStage = Math.min(nextStage, c.distributedProcess(rb)); } // check the outgoing queue and send requests while (rb.outgoing.size() > 0) { // submit all current request tasks at once while (rb.outgoing.size() > 0) { ShardRequest sreq = rb.outgoing.remove(0); sreq.actualShards = sreq.shards; if (sreq.actualShards==ShardRequest.ALL_SHARDS) { sreq.actualShards = rb.shards; } sreq.responses = new ArrayList<ShardResponse>(); // TODO: map from shard to address[] for (String shard : sreq.actualShards) { ModifiableSolrParams params = new ModifiableSolrParams(sreq.params); params.remove(ShardParams.SHARDS); // not a top-level request params.set("distrib", "false"); // not a top-level request params.remove("indent"); params.remove(CommonParams.HEADER_ECHO_PARAMS); params.set(ShardParams.IS_SHARD, true); // a sub (shard) request params.set(ShardParams.SHARD_URL, shard); // so the shard knows what was asked if (rb.requestInfo != null) { // we could try and detect when this is needed, but it could be tricky params.set("NOW", Long.toString(rb.requestInfo.getNOW().getTime())); } String shardQt = params.get(ShardParams.SHARDS_QT); if (shardQt == null) { params.remove(CommonParams.QT); } else { params.set(CommonParams.QT, shardQt); } shardHandler1.submit(sreq, shard, params); } } // now wait for replies, but if anyone puts more requests on // the outgoing queue, send them out immediately (by exiting // this loop) boolean tolerant = rb.req.getParams().getBool(ShardParams.SHARDS_TOLERANT, false); while (rb.outgoing.size() == 0) { ShardResponse srsp = tolerant ? shardHandler1.takeCompletedIncludingErrors(): shardHandler1.takeCompletedOrError(); if (srsp == null) break; // no more requests to wait for // Was there an exception? if (srsp.getException() != null) { // If things are not tolerant, abort everything and rethrow if(!tolerant) { shardHandler1.cancelAll(); if (srsp.getException() instanceof SolrException) { throw (SolrException)srsp.getException(); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, srsp.getException()); } } } rb.finished.add(srsp.getShardRequest()); // let the components see the responses to the request for(SearchComponent c : components) { c.handleResponses(rb, srsp.getShardRequest()); } } } for(SearchComponent c : components) { c.finishStage(rb); } // we are done when the next stage is MAX_VALUE } while (nextStage != Integer.MAX_VALUE); } }
// in core/src/java/org/apache/solr/handler/ContentStreamHandlerBase.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); UpdateRequestProcessorChain processorChain = req.getCore().getUpdateProcessingChain(params.get(UpdateParams.UPDATE_CHAIN)); UpdateRequestProcessor processor = processorChain.createProcessor(req, rsp); try { ContentStreamLoader documentLoader = newLoader(req, processor); Iterable<ContentStream> streams = req.getContentStreams(); if (streams == null) { if (!RequestHandlerUtils.handleCommit(req, processor, params, false) && !RequestHandlerUtils.handleRollback(req, processor, params, false)) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "missing content stream"); } } else { for (ContentStream stream : streams) { documentLoader.load(req, rsp, stream, processor); } // Perhaps commit from the parameters RequestHandlerUtils.handleCommit(req, processor, params, false); RequestHandlerUtils.handleRollback(req, processor, params, false); } } finally { // finish the request processor.finish(); } }
// in core/src/java/org/apache/solr/handler/admin/PluginInfoHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); boolean stats = params.getBool( "stats", false ); rsp.add( "plugins", getSolrInfoBeans( req.getCore(), stats ) ); rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { IndexSchema schema = req.getSchema(); SolrIndexSearcher searcher = req.getSearcher(); DirectoryReader reader = searcher.getIndexReader(); SolrParams params = req.getParams(); ShowStyle style = ShowStyle.get(params.get("show")); // If no doc is given, show all fields and top terms rsp.add("index", getIndexInfo(reader)); if(ShowStyle.INDEX==style) { return; // that's all we need } Integer docId = params.getInt( DOC_ID ); if( docId == null && params.get( ID ) != null ) { // Look for something with a given solr ID SchemaField uniqueKey = schema.getUniqueKeyField(); String v = uniqueKey.getType().toInternal( params.get(ID) ); Term t = new Term( uniqueKey.getName(), v ); docId = searcher.getFirstMatch( t ); if( docId < 0 ) { throw new SolrException( SolrException.ErrorCode.NOT_FOUND, "Can't find document: "+params.get( ID ) ); } } // Read the document from the index if( docId != null ) { if( style != null && style != ShowStyle.DOC ) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing doc param for doc style"); } Document doc = null; try { doc = reader.document( docId ); } catch( Exception ex ) {} if( doc == null ) { throw new SolrException( SolrException.ErrorCode.NOT_FOUND, "Can't find document: "+docId ); } SimpleOrderedMap<Object> info = getDocumentFieldsInfo( doc, docId, reader, schema ); SimpleOrderedMap<Object> docinfo = new SimpleOrderedMap<Object>(); docinfo.add( "docId", docId ); docinfo.add( "lucene", info ); docinfo.add( "solr", doc ); rsp.add( "doc", docinfo ); } else if ( ShowStyle.SCHEMA == style ) { rsp.add( "schema", getSchemaInfo( req.getSchema() ) ); } else { rsp.add( "fields", getIndexedFieldsInfo(req) ) ; } // Add some generally helpful information NamedList<Object> info = new SimpleOrderedMap<Object>(); info.add( "key", getFieldFlagsKey() ); info.add( "NOTE", "Document Frequency (df) is not updated when a document is marked for deletion. df values include deleted documents." ); rsp.add( "info", info ); rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
private static SimpleOrderedMap<Object> getIndexedFieldsInfo(SolrQueryRequest req) throws Exception { SolrIndexSearcher searcher = req.getSearcher(); SolrParams params = req.getParams(); Set<String> fields = null; String fl = params.get(CommonParams.FL); if (fl != null) { fields = new TreeSet<String>(Arrays.asList(fl.split( "[,\\s]+" ))); } AtomicReader reader = searcher.getAtomicReader(); IndexSchema schema = searcher.getSchema(); // Don't be tempted to put this in the loop below, the whole point here is to alphabetize the fields! Set<String> fieldNames = new TreeSet<String>(); for(FieldInfo fieldInfo : reader.getFieldInfos()) { fieldNames.add(fieldInfo.name); } // Walk the term enum and keep a priority queue for each map in our set SimpleOrderedMap<Object> finfo = new SimpleOrderedMap<Object>(); for (String fieldName : fieldNames) { if (fields != null && ! fields.contains(fieldName) && ! fields.contains("*")) { continue; //we're not interested in this field Still an issue here } SimpleOrderedMap<Object> fieldMap = new SimpleOrderedMap<Object>(); SchemaField sfield = schema.getFieldOrNull( fieldName ); FieldType ftype = (sfield==null)?null:sfield.getType(); fieldMap.add( "type", (ftype==null)?null:ftype.getTypeName() ); fieldMap.add("schema", getFieldFlags(sfield)); if (sfield != null && schema.isDynamicField(sfield.getName()) && schema.getDynamicPattern(sfield.getName()) != null) { fieldMap.add("dynamicBase", schema.getDynamicPattern(sfield.getName())); } Terms terms = reader.fields().terms(fieldName); if (terms == null) { // Not indexed, so we need to report what we can (it made it through the fl param if specified) finfo.add( fieldName, fieldMap ); continue; } if(sfield != null && sfield.indexed() ) { // In the pre-4.0 days, this did a veeeery expensive range query. But we can be much faster now, // so just do this all the time. Document doc = getFirstLiveDoc(reader, fieldName, terms); if( doc != null ) { // Found a document with this field try { IndexableField fld = doc.getField( fieldName ); if( fld != null ) { fieldMap.add("index", getFieldFlags(fld)); } else { // it is a non-stored field... fieldMap.add("index", "(unstored field)"); } } catch( Exception ex ) { log.warn( "error reading field: "+fieldName ); } } fieldMap.add("docs", terms.getDocCount()); } if (fields != null && (fields.contains(fieldName) || fields.contains("*"))) { getDetailedFieldInfo(req, fieldName, fieldMap); } // Add the field finfo.add( fieldName, fieldMap ); } return finfo; }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { NamedList<NamedList<NamedList<Object>>> cats = getMBeanInfo(req); if(req.getParams().getBool("diff", false)) { ContentStream body = null; try { body = req.getContentStreams().iterator().next(); } catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); } String content = IOUtils.toString(body.getReader()); NamedList<NamedList<NamedList<Object>>> ref = fromXML(content); // Normalize the output SolrQueryResponse wrap = new SolrQueryResponse(); wrap.add("solr-mbeans", cats); cats = (NamedList<NamedList<NamedList<Object>>>) BinaryResponseWriter.getParsedResponse(req, wrap).get("solr-mbeans"); // Get rid of irrelevant things ref = normalize(ref); cats = normalize(cats); // Only the changes boolean showAll = req.getParams().getBool("all", false); rsp.add("solr-mbeans", getDiff(ref,cats, showAll)); } else { rsp.add("solr-mbeans", cats); } rsp.setHttpCaching(false); // never cache, no matter what init config looks like }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { // Make sure the cores is enabled CoreContainer cores = getCoreContainer(); if (cores == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core container instance missing"); } boolean doPersist = false; // Pick the action SolrParams params = req.getParams(); CoreAdminAction action = CoreAdminAction.STATUS; String a = params.get(CoreAdminParams.ACTION); if (a != null) { action = CoreAdminAction.get(a); if (action == null) { doPersist = this.handleCustomAction(req, rsp); } } if (action != null) { switch (action) { case CREATE: { doPersist = this.handleCreateAction(req, rsp); break; } case RENAME: { doPersist = this.handleRenameAction(req, rsp); break; } case UNLOAD: { doPersist = this.handleUnloadAction(req, rsp); break; } case STATUS: { doPersist = this.handleStatusAction(req, rsp); break; } case PERSIST: { doPersist = this.handlePersistAction(req, rsp); break; } case RELOAD: { doPersist = this.handleReloadAction(req, rsp); break; } case SWAP: { doPersist = this.handleSwapAction(req, rsp); break; } case MERGEINDEXES: { doPersist = this.handleMergeAction(req, rsp); break; } case PREPRECOVERY: { this.handleWaitForStateAction(req, rsp); break; } case REQUESTRECOVERY: { this.handleRequestRecoveryAction(req, rsp); break; } case DISTRIBURL: { this.handleDistribUrlAction(req, rsp); break; } default: { doPersist = this.handleCustomAction(req, rsp); break; } case LOAD: break; } } // Should we persist the changes? if (doPersist) { cores.persist(); rsp.add("saved", cores.getConfigFile().getAbsolutePath()); } rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { rsp.add( "core", getCoreInfo( req.getCore() ) ); rsp.add( "lucene", getLuceneInfo() ); rsp.add( "jvm", getJvmInfo() ); rsp.add( "system", getSystemInfo() ); rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
private SimpleOrderedMap<Object> getCoreInfo( SolrCore core ) throws Exception { SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>(); IndexSchema schema = core.getSchema(); info.add( "schema", schema != null ? schema.getSchemaName():"no schema!" ); // Host info.add( "host", hostname ); // Now info.add( "now", new Date() ); // Start Time info.add( "start", new Date(core.getStartTime()) ); // Solr Home SimpleOrderedMap<Object> dirs = new SimpleOrderedMap<Object>(); dirs.add( "cwd" , new File( System.getProperty("user.dir")).getAbsolutePath() ); dirs.add( "instance", new File( core.getResourceLoader().getInstanceDir() ).getAbsolutePath() ); dirs.add( "data", new File( core.getDataDir() ).getAbsolutePath() ); dirs.add( "index", new File( core.getIndexDir() ).getAbsolutePath() ); info.add( "directory", dirs ); return info; }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
public static SimpleOrderedMap<Object> getSystemInfo() throws Exception { SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>(); OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean(); info.add( "name", os.getName() ); info.add( "version", os.getVersion() ); info.add( "arch", os.getArch() ); info.add( "systemLoadAverage", os.getSystemLoadAverage()); // com.sun.management.OperatingSystemMXBean addGetterIfAvaliable( os, "committedVirtualMemorySize", info); addGetterIfAvaliable( os, "freePhysicalMemorySize", info); addGetterIfAvaliable( os, "freeSwapSpaceSize", info); addGetterIfAvaliable( os, "processCpuTime", info); addGetterIfAvaliable( os, "totalPhysicalMemorySize", info); addGetterIfAvaliable( os, "totalSwapSpaceSize", info); // com.sun.management.UnixOperatingSystemMXBean addGetterIfAvaliable( os, "openFileDescriptorCount", info ); addGetterIfAvaliable( os, "maxFileDescriptorCount", info ); try { if( !os.getName().toLowerCase(Locale.ENGLISH).startsWith( "windows" ) ) { // Try some command line things info.add( "uname", execute( "uname -a" ) ); info.add( "uptime", execute( "uptime" ) ); } } catch( Throwable ex ) { ex.printStackTrace(); } return info; }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
private static SimpleOrderedMap<Object> getLuceneInfo() throws Exception { SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>(); Package p = SolrCore.class.getPackage(); info.add( "solr-spec-version", p.getSpecificationVersion() ); info.add( "solr-impl-version", p.getImplementationVersion() ); p = LucenePackage.class.getPackage(); info.add( "lucene-spec-version", p.getSpecificationVersion() ); info.add( "lucene-impl-version", p.getImplementationVersion() ); return info; }
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { // Don't do anything if the framework is unknown if(watcher==null) { rsp.add("error", "Logging Not Initalized"); return; } rsp.add("watcher", watcher.getName()); SolrParams params = req.getParams(); if(params.get("threshold")!=null) { watcher.setThreshold(params.get("threshold")); } // Write something at each level if(params.get("test")!=null) { log.trace("trace message"); log.debug( "debug message"); log.info("info (with exception)", new RuntimeException("test") ); log.warn("warn (with exception)", new RuntimeException("test") ); log.error("error (with exception)", new RuntimeException("test") ); } String[] set = params.getParams("set"); if (set != null) { for (String pair : set) { String[] split = pair.split(":"); if (split.length != 2) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Invalid format, expected level:value, got " + pair); } String category = split[0]; String level = split[1]; watcher.setLogLevel(category, level); } } String since = req.getParams().get("since"); if(since != null) { long time = -1; try { time = Long.parseLong(since); } catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); } AtomicBoolean found = new AtomicBoolean(false); SolrDocumentList docs = watcher.getHistory(time, found); if(docs==null) { rsp.add("error", "History not enabled"); return; } else { SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>(); if(time>0) { info.add("since", time); info.add("found", found); } else { info.add("levels", watcher.getAllLevels()); // show for the first request } info.add("last", watcher.getLastEvent()); info.add("buffer", watcher.getHistorySize()); info.add("threshold", watcher.getThreshold()); rsp.add("info", info); rsp.add("history", docs); } } else { rsp.add("levels", watcher.getAllLevels()); List<LoggerInfo> loggers = new ArrayList<LoggerInfo>(watcher.getAllLoggers()); Collections.sort(loggers); List<SimpleOrderedMap<?>> info = new ArrayList<SimpleOrderedMap<?>>(); for(LoggerInfo wrap:loggers) { info.add(wrap.getInfo()); } rsp.add("loggers", info); } rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); // in this case, we want to default distrib to false so // we only ping the single node Boolean distrib = params.getBool("distrib"); if (distrib == null) { ModifiableSolrParams mparams = new ModifiableSolrParams(params); mparams.set("distrib", false); req.setParams(mparams); } String actionParam = params.get("action"); ACTIONS action = null; if (actionParam == null){ action = ACTIONS.PING; } else { try { action = ACTIONS.valueOf(actionParam.toUpperCase(Locale.ENGLISH)); } catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); } } switch(action){ case PING: if( isPingDisabled() ) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "Service disabled"); } handlePing(req, rsp); break; case ENABLE: handleEnable(true); break; case DISABLE: handleEnable(false); break; case STATUS: if( healthcheck == null ){ SolrException e = new SolrException (SolrException.ErrorCode.SERVICE_UNAVAILABLE, "healthcheck not configured"); rsp.setException(e); } else { rsp.add( "status", isPingDisabled() ? "disabled" : "enabled" ); } } }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
protected void handlePing(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); SolrCore core = req.getCore(); // Get the RequestHandler String qt = params.get( CommonParams.QT );//optional; you get the default otherwise SolrRequestHandler handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown RequestHandler (qt): "+qt ); } if( handler instanceof PingRequestHandler ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cannot execute the PingRequestHandler recursively" ); } // Execute the ping query and catch any possible exception Throwable ex = null; try { SolrQueryResponse pingrsp = new SolrQueryResponse(); core.execute(handler, req, pingrsp ); ex = pingrsp.getException(); } catch( Throwable th ) { ex = th; } // Send an error or an 'OK' message (response code will be 200) if( ex != null ) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Ping query caused exception: "+ex.getMessage(), ex ); } rsp.add( "status", "OK" ); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public void start() throws Exception { start(true); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public void start(boolean waitForSolr) throws Exception { // if started before, make a new server if (startedBefore) { waitOnSolr = false; init(solrHome, context, lastPort, stopAtShutdown); } else { startedBefore = true; } if( dataDir != null) { System.setProperty("solr.data.dir", dataDir); } if(shards != null) { System.setProperty("shard", shards); } if (!server.isRunning()) { server.start(); } synchronized (JettySolrRunner.this) { int cnt = 0; while (!waitOnSolr) { this.wait(100); if (cnt++ == 5) { throw new RuntimeException("Jetty/Solr unresponsive"); } } } System.clearProperty("shard"); System.clearProperty("solr.data.dir"); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public void stop() throws Exception { if (!server.isStopped() && !server.isStopping()) { server.stop(); } server.join(); }
// in core/src/java/org/apache/solr/SolrLogFormatter.java
public static void main(String[] args) throws Exception { Handler[] handlers = Logger.getLogger("").getHandlers(); boolean foundConsoleHandler = false; for (int index = 0; index < handlers.length; index++) { // set console handler to SEVERE if (handlers[index] instanceof ConsoleHandler) { handlers[index].setLevel(Level.ALL); handlers[index].setFormatter(new SolrLogFormatter()); foundConsoleHandler = true; } } if (!foundConsoleHandler) { // no console handler found System.err.println("No consoleHandler found, adding one."); ConsoleHandler consoleHandler = new ConsoleHandler(); consoleHandler.setLevel(Level.ALL); consoleHandler.setFormatter(new SolrLogFormatter()); Logger.getLogger("").addHandler(consoleHandler); } final org.slf4j.Logger log = LoggerFactory.getLogger(SolrLogFormatter.class); log.error("HELLO"); ThreadGroup tg = new MyThreadGroup("YCS"); Thread th = new Thread(tg, "NEW_THREAD") { @Override public void run() { try { go(); } catch (Throwable e) { e.printStackTrace(); } } }; th.start(); th.join(); }
// in core/src/java/org/apache/solr/SolrLogFormatter.java
public static void go() throws Exception { final org.slf4j.Logger log = LoggerFactory.getLogger(SolrLogFormatter.class); Thread thread1 = new Thread() { @Override public void run() { threadLocal.set("from thread1"); log.error("[] webapp=/solr path=/select params={hello} wow"); } }; Thread thread2 = new Thread() { @Override public void run() { threadLocal.set("from thread2"); log.error("InThread2"); } }; thread1.start(); thread2.start(); thread1.join(); thread2.join(); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public Object getValue(SchemaField sf, IndexableField f) throws Exception { FieldType ft = null; if(sf != null) ft =sf.getType(); if (ft == null) { // handle fields not in the schema BytesRef bytesRef = f.binaryValue(); if (bytesRef != null) { if (bytesRef.offset == 0 && bytesRef.length == bytesRef.bytes.length) { return bytesRef.bytes; } else { final byte[] bytes = new byte[bytesRef.length]; System.arraycopy(bytesRef.bytes, bytesRef.offset, bytes, 0, bytesRef.length); return bytes; } } else return f.stringValue(); } else { if (useFieldObjects && KNOWN_TYPES.contains(ft.getClass())) { return ft.toObject(f); } else { return ft.toExternal(f); } } }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
NamedList<Integer> getFacetCounts(Executor executor) throws IOException { CompletionService<SegFacet> completionService = new ExecutorCompletionService<SegFacet>(executor); // reuse the translation logic to go from top level set to per-segment set baseSet = docs.getTopFilter(); final AtomicReaderContext[] leaves = searcher.getTopReaderContext().leaves(); // The list of pending tasks that aren't immediately submitted // TODO: Is there a completion service, or a delegating executor that can // limit the number of concurrent tasks submitted to a bigger executor? LinkedList<Callable<SegFacet>> pending = new LinkedList<Callable<SegFacet>>(); int threads = nThreads <= 0 ? Integer.MAX_VALUE : nThreads; for (int i=0; i<leaves.length; i++) { final SegFacet segFacet = new SegFacet(leaves[i]); Callable<SegFacet> task = new Callable<SegFacet>() { public SegFacet call() throws Exception { segFacet.countTerms(); return segFacet; } }; // TODO: if limiting threads, submit by largest segment first? if (--threads >= 0) { completionService.submit(task); } else { pending.add(task); } } // now merge the per-segment results PriorityQueue<SegFacet> queue = new PriorityQueue<SegFacet>(leaves.length) { @Override protected boolean lessThan(SegFacet a, SegFacet b) { return a.tempBR.compareTo(b.tempBR) < 0; } }; boolean hasMissingCount=false; int missingCount=0; for (int i=0; i<leaves.length; i++) { SegFacet seg = null; try { Future<SegFacet> future = completionService.take(); seg = future.get(); if (!pending.isEmpty()) { completionService.submit(pending.removeFirst()); } } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } } if (seg.startTermIndex < seg.endTermIndex) { if (seg.startTermIndex==0) { hasMissingCount=true; missingCount += seg.counts[0]; seg.pos = 1; } else { seg.pos = seg.startTermIndex; } if (seg.pos < seg.endTermIndex) { seg.tenum = seg.si.getTermsEnum(); seg.tenum.seekExact(seg.pos); seg.tempBR = seg.tenum.term(); queue.add(seg); } } } FacetCollector collector; if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) { collector = new CountSortedFacetCollector(offset, limit, mincount); } else { collector = new IndexSortedFacetCollector(offset, limit, mincount); } BytesRef val = new BytesRef(); while (queue.size() > 0) { SegFacet seg = queue.top(); // make a shallow copy val.bytes = seg.tempBR.bytes; val.offset = seg.tempBR.offset; val.length = seg.tempBR.length; int count = 0; do { count += seg.counts[seg.pos - seg.startTermIndex]; // TODO: OPTIMIZATION... // if mincount>0 then seg.pos++ can skip ahead to the next non-zero entry. 
seg.pos++; if (seg.pos >= seg.endTermIndex) { queue.pop(); seg = queue.top(); } else { seg.tempBR = seg.tenum.next(); seg = queue.updateTop(); } } while (seg != null && val.compareTo(seg.tempBR) == 0); boolean stop = collector.collect(val, count); if (stop) break; } NamedList<Integer> res = collector.getFacetCounts(); // convert labels to readable form FieldType ft = searcher.getSchema().getFieldType(fieldName); int sz = res.size(); for (int i=0; i<sz; i++) { res.setName(i, ft.indexedToReadable(res.getName(i))); } if (missing) { if (!hasMissingCount) { missingCount = SimpleFacets.getFieldMissingCount(searcher,docs,fieldName); } res.add(null, missingCount); } return res; }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
public SegFacet call() throws Exception { segFacet.countTerms(); return segFacet; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrQueryRequest parse( SolrCore core, String path, HttpServletRequest req ) throws Exception { SolrRequestParser parser = standard; // TODO -- in the future, we could pick a different parser based on the request // Pick the parser from the request... ArrayList<ContentStream> streams = new ArrayList<ContentStream>(1); SolrParams params = parser.parseParamsAndFillStreams( req, streams ); SolrQueryRequest sreq = buildRequestFrom( core, params, streams ); // Handlers and login will want to know the path. If it contains a ':' // the handler could use it for RESTful URLs sreq.getContext().put( "path", path ); return sreq; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrQueryRequest buildRequestFrom( SolrCore core, SolrParams params, Collection<ContentStream> streams ) throws Exception { // The content type will be applied to all streaming content String contentType = params.get( CommonParams.STREAM_CONTENTTYPE ); // Handle anything with a remoteURL String[] strs = params.getParams( CommonParams.STREAM_URL ); if( strs != null ) { if( !enableRemoteStreams ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Remote Streaming is disabled." ); } for( final String url : strs ) { ContentStreamBase stream = new ContentStreamBase.URLStream( new URL(url) ); if( contentType != null ) { stream.setContentType( contentType ); } streams.add( stream ); } } // Handle streaming files strs = params.getParams( CommonParams.STREAM_FILE ); if( strs != null ) { if( !enableRemoteStreams ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Remote Streaming is disabled." ); } for( final String file : strs ) { ContentStreamBase stream = new ContentStreamBase.FileStream( new File(file) ); if( contentType != null ) { stream.setContentType( contentType ); } streams.add( stream ); } } // Check for streams in the request parameters strs = params.getParams( CommonParams.STREAM_BODY ); if( strs != null ) { for( final String body : strs ) { ContentStreamBase stream = new ContentStreamBase.StringStream( body ); if( contentType != null ) { stream.setContentType( contentType ); } streams.add( stream ); } } SolrQueryRequestBase q = new SolrQueryRequestBase( core, params ) { }; if( streams != null && streams.size() > 0 ) { q.setContentStreams( streams ); } return q; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams( final HttpServletRequest req, ArrayList<ContentStream> streams ) throws Exception { return new ServletSolrParams(req); }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams( final HttpServletRequest req, ArrayList<ContentStream> streams ) throws Exception { // The javadocs for HttpServletRequest are clear that req.getReader() should take // care of any character encoding issues. BUT, there are problems while running on // some servlet containers: including Tomcat 5 and resin. // // Rather than return req.getReader(), this uses the default ContentStreamBase method // that checks for charset definitions in the ContentType. streams.add( new HttpRequestContentStream( req ) ); return SolrRequestParsers.parseQueryString( req.getQueryString() ); }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams( final HttpServletRequest req, ArrayList<ContentStream> streams ) throws Exception { if( !ServletFileUpload.isMultipartContent(req) ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Not multipart content! "+req.getContentType() ); } MultiMapSolrParams params = SolrRequestParsers.parseQueryString( req.getQueryString() ); // Create a factory for disk-based file items DiskFileItemFactory factory = new DiskFileItemFactory(); // Set factory constraints // TODO - configure factory.setSizeThreshold(yourMaxMemorySize); // TODO - configure factory.setRepository(yourTempDirectory); // Create a new file upload handler ServletFileUpload upload = new ServletFileUpload(factory); upload.setSizeMax( uploadLimitKB*1024 ); // Parse the request List items = upload.parseRequest(req); Iterator iter = items.iterator(); while (iter.hasNext()) { FileItem item = (FileItem) iter.next(); // If its a form field, put it in our parameter map if (item.isFormField()) { MultiMapSolrParams.addParam( item.getFieldName(), item.getString(), params.getMap() ); } // Add the stream else { streams.add( new FileItemContentStream( item ) ); } } return params; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams( final HttpServletRequest req, ArrayList<ContentStream> streams ) throws Exception { String method = req.getMethod().toUpperCase(Locale.ENGLISH); if( "GET".equals( method ) || "HEAD".equals( method )) { return new ServletSolrParams(req); } if( "POST".equals( method ) ) { String contentType = req.getContentType(); if( contentType != null ) { int idx = contentType.indexOf( ';' ); if( idx > 0 ) { // remove the charset definition "; charset=utf-8" contentType = contentType.substring( 0, idx ); } if( "application/x-www-form-urlencoded".equals( contentType.toLowerCase(Locale.ENGLISH) ) ) { return new ServletSolrParams(req); // just get the params from parameterMap } if( ServletFileUpload.isMultipartContent(req) ) { return multipart.parseParamsAndFillStreams(req, streams); } } return raw.parseParamsAndFillStreams(req, streams); } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Unsupported method: "+method ); }
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
public String request( String pathAndParams, String body ) throws Exception { String path = null; SolrParams params = null; int idx = pathAndParams.indexOf( '?' ); if( idx > 0 ) { path = pathAndParams.substring( 0, idx ); params = SolrRequestParsers.parseQueryString( pathAndParams.substring(idx+1) ); } else { path= pathAndParams; params = new MapSolrParams( new HashMap<String, String>() ); } return request(path, params, body); }
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
public String request(String path, SolrParams params, String body) throws Exception { // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { if (params == null) params = new MapSolrParams( new HashMap<String, String>() ); String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } } if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } return request(handler, params, body); }
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
public String request(SolrRequestHandler handler, SolrParams params, String body) throws Exception { if (params == null) params = new MapSolrParams( new HashMap<String, String>() ); // Make a stream for the 'body' content List<ContentStream> streams = new ArrayList<ContentStream>( 1 ); if( body != null && body.length() > 0 ) { streams.add( new ContentStreamBase.StringStream( body ) ); } SolrQueryRequest req = null; try { req = parser.buildRequestFrom( core, params, streams ); SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); core.execute( handler, req, rsp ); if( rsp.getException() != null ) { throw rsp.getException(); } // Now write it out QueryResponseWriter responseWriter = core.getQueryResponseWriter(req); StringWriter out = new StringWriter(); responseWriter.write(out, req, rsp); return out.toString(); } finally { if (req != null) { req.close(); } SolrRequestInfo.clearRequestInfo(); } }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
Override protected FieldType create( ResourceLoader loader, String name, String className, Node node ) throws Exception { FieldType ft = loader.newInstance(className, FieldType.class); ft.setTypeName(name); String expression = "./analyzer[@type='query']"; Node anode = (Node)xpath.evaluate(expression, node, XPathConstants.NODE); Analyzer queryAnalyzer = readAnalyzer(anode); expression = "./analyzer[@type='multiterm']"; anode = (Node)xpath.evaluate(expression, node, XPathConstants.NODE); Analyzer multiAnalyzer = readAnalyzer(anode); // An analyzer without a type specified, or with type="index" expression = "./analyzer[not(@type)] | ./analyzer[@type='index']"; anode = (Node)xpath.evaluate(expression, node, XPathConstants.NODE); Analyzer analyzer = readAnalyzer(anode); // a custom similarity[Factory] expression = "./similarity"; anode = (Node)xpath.evaluate(expression, node, XPathConstants.NODE); SimilarityFactory simFactory = IndexSchema.readSimilarity(loader, anode); if (queryAnalyzer==null) queryAnalyzer=analyzer; if (analyzer==null) analyzer=queryAnalyzer; if (multiAnalyzer == null) { multiAnalyzer = constructMultiTermAnalyzer(queryAnalyzer); } if (analyzer!=null) { ft.setAnalyzer(analyzer); ft.setQueryAnalyzer(queryAnalyzer); if (ft instanceof TextField) ((TextField)ft).setMultiTermAnalyzer(multiAnalyzer); } if (simFactory!=null) { ft.setSimilarity(simFactory.getSimilarity()); } if (ft instanceof SchemaAware){ schemaAware.add((SchemaAware) ft); } return ft; }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
Override protected void init(FieldType plugin, Node node) throws Exception { Map<String,String> params = DOMUtil.toMapExcept( node.getAttributes(), "name","class" ); plugin.setArgs(schema, params ); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
Override protected FieldType register(String name, FieldType plugin) throws Exception { log.trace("fieldtype defined: " + plugin ); return fieldTypes.put( name, plugin ); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
private Analyzer readAnalyzer(Node node) throws XPathExpressionException { final SolrResourceLoader loader = schema.getResourceLoader(); // parent node used to be passed in as "fieldtype" // if (!fieldtype.hasChildNodes()) return null; // Node node = DOMUtil.getChild(fieldtype,"analyzer"); if (node == null) return null; NamedNodeMap attrs = node.getAttributes(); String analyzerName = DOMUtil.getAttr(attrs,"class"); if (analyzerName != null) { try { // No need to be core-aware as Analyzers are not in the core-aware list final Class<? extends Analyzer> clazz = loader.findClass(analyzerName, Analyzer.class); try { // first try to use a ctor with version parameter // (needed for many new Analyzers that have no default one anymore) Constructor<? extends Analyzer> cnstr = clazz.getConstructor(Version.class); final String matchVersionStr = DOMUtil.getAttr(attrs, LUCENE_MATCH_VERSION_PARAM); final Version luceneMatchVersion = (matchVersionStr == null) ? schema.getDefaultLuceneMatchVersion() : Config.parseLuceneVersionString(matchVersionStr); if (luceneMatchVersion == null) { throw new SolrException ( SolrException.ErrorCode.SERVER_ERROR, "Configuration Error: Analyzer '" + clazz.getName() + "' needs a 'luceneMatchVersion' parameter"); } return cnstr.newInstance(luceneMatchVersion); } catch (NoSuchMethodException nsme) { // otherwise use default ctor return clazz.newInstance(); } } catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); } } // Load the CharFilters final ArrayList<CharFilterFactory> charFilters = new ArrayList<CharFilterFactory>(); AbstractPluginLoader<CharFilterFactory> charFilterLoader = new AbstractPluginLoader<CharFilterFactory> ("[schema.xml] analyzer/charFilter", CharFilterFactory.class, false, false) { @Override protected void init(CharFilterFactory plugin, Node node) throws Exception { if( plugin != null ) { final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); charFilters.add( plugin ); } } @Override protected CharFilterFactory register(String name, CharFilterFactory plugin) { return null; // used for map registration } }; charFilterLoader.load( loader, (NodeList)xpath.evaluate("./charFilter", node, XPathConstants.NODESET) ); // Load the Tokenizer // Although an analyzer only allows a single Tokenizer, we load a list to make sure // the configuration is ok final ArrayList<TokenizerFactory> tokenizers = new ArrayList<TokenizerFactory>(1); AbstractPluginLoader<TokenizerFactory> tokenizerLoader = new AbstractPluginLoader<TokenizerFactory> ("[schema.xml] analyzer/tokenizer", TokenizerFactory.class, false, false) { @Override protected void init(TokenizerFactory plugin, Node node) throws Exception { if( !tokenizers.isEmpty() ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The schema defines multiple tokenizers for: "+node ); } final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); tokenizers.add( plugin ); } @Override protected TokenizerFactory register(String name, TokenizerFactory 
plugin) { return null; // used for map registration } }; tokenizerLoader.load( loader, (NodeList)xpath.evaluate("./tokenizer", node, XPathConstants.NODESET) ); // Make sure something was loaded if( tokenizers.isEmpty() ) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"analyzer without class or tokenizer & filter list"); } // Load the Filters final ArrayList<TokenFilterFactory> filters = new ArrayList<TokenFilterFactory>(); AbstractPluginLoader<TokenFilterFactory> filterLoader = new AbstractPluginLoader<TokenFilterFactory>("[schema.xml] analyzer/filter", TokenFilterFactory.class, false, false) { @Override protected void init(TokenFilterFactory plugin, Node node) throws Exception { if( plugin != null ) { final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); filters.add( plugin ); } } @Override protected TokenFilterFactory register(String name, TokenFilterFactory plugin) throws Exception { return null; // used for map registration } }; filterLoader.load( loader, (NodeList)xpath.evaluate("./filter", node, XPathConstants.NODESET) ); return new TokenizerChain(charFilters.toArray(new CharFilterFactory[charFilters.size()]), tokenizers.get(0), filters.toArray(new TokenFilterFactory[filters.size()])); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
Override protected void init(CharFilterFactory plugin, Node node) throws Exception { if( plugin != null ) { final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); charFilters.add( plugin ); } }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
Override protected void init(TokenizerFactory plugin, Node node) throws Exception { if( !tokenizers.isEmpty() ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The schema defines multiple tokenizers for: "+node ); } final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); tokenizers.add( plugin ); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
Override protected void init(TokenFilterFactory plugin, Node node) throws Exception { if( plugin != null ) { final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); filters.add( plugin ); } }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
Override protected TokenFilterFactory register(String name, TokenFilterFactory plugin) throws Exception { return null; // used for map registration }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
public IndexableField fromString(SchemaField field, String val, float boost) throws Exception { if (val == null || val.trim().length() == 0) { return null; } PreAnalyzedTokenizer parse = new PreAnalyzedTokenizer(new StringReader(val), parser); Field f = (Field)super.createField(field, val, boost); if (parse.getStringValue() != null) { f.setStringValue(parse.getStringValue()); } else if (parse.getBinaryValue() != null) { f.setBytesValue(parse.getBinaryValue()); } else { f.fieldType().setStored(false); } if (parse.hasTokenStream()) { f.fieldType().setIndexed(true); f.fieldType().setTokenized(true); f.setTokenStream(parse); } return f; }
// in core/src/java/org/apache/solr/internal/csv/writer/CSVWriter.java
protected String writeValue(CSVField field, String value) throws Exception { if (config.isFixedWidth()) { if (value.length() < field.getSize()) { int fillPattern = config.getFill(); if (field.overrideFill()) { fillPattern = field.getFill(); } StringBuffer sb = new StringBuffer(); int fillSize = (field.getSize() - value.length()); char[] fill = new char[fillSize]; Arrays.fill(fill, config.getFillChar()); if (fillPattern == CSVConfig.FILLLEFT) { sb.append(fill); sb.append(value); value = sb.toString(); } else { // defaults to fillpattern FILLRIGHT when fixedwidth is used sb.append(value); sb.append(fill); value = sb.toString(); } } else if (value.length() > field.getSize()) { // value to big.. value = value.substring(0, field.getSize()); } if (!config.isValueDelimiterIgnored()) { // add the value delimiter.. value = config.getValueDelimiter()+value+config.getValueDelimiter(); } } return value; }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { FileFloatSource.resetCache(); log.debug("readerCache has been reset."); UpdateRequestProcessor processor = req.getCore().getUpdateProcessingChain(null).createProcessor(req, rsp); try{ RequestHandlerUtils.handleCommit(req, processor, req.getParams(), true); } finally{ processor.finish(); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void main(String[] args) throws Exception { // start up a tmp zk server first String zkServerAddress = args[0]; String solrHome = args[1]; String solrPort = null; if (args.length > 2) { solrPort = args[2]; } SolrZkServer zkServer = null; if (solrPort != null) { zkServer = new SolrZkServer("true", null, solrHome + "/zoo_data", solrHome, solrPort); zkServer.parseConfig(); zkServer.start(); } SolrZkClient zkClient = new SolrZkClient(zkServerAddress, 15000, 5000, new OnReconnect() { @Override public void command() { }}); SolrResourceLoader loader = new SolrResourceLoader(solrHome); solrHome = loader.getInstanceDir(); InputSource cfgis = new InputSource(new File(solrHome, "solr.xml").toURI().toASCIIString()); Config cfg = new Config(loader, null, cfgis , null, false); bootstrapConf(zkClient, cfg, solrHome); if (solrPort != null) { zkServer.stop(); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String register(String coreName, final CoreDescriptor desc) throws Exception { return register(coreName, desc, false); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String register(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores) throws Exception { final String baseUrl = getBaseUrl(); final CloudDescriptor cloudDesc = desc.getCloudDescriptor(); final String collection = cloudDesc.getCollectionName(); final String coreZkNodeName = getNodeName() + "_" + coreName; String shardId = cloudDesc.getShardId(); Map<String,String> props = new HashMap<String,String>(); // we only put a subset of props into the leader node props.put(ZkStateReader.BASE_URL_PROP, baseUrl); props.put(ZkStateReader.CORE_NAME_PROP, coreName); props.put(ZkStateReader.NODE_NAME_PROP, getNodeName()); if (log.isInfoEnabled()) { log.info("Register shard - core:" + coreName + " address:" + baseUrl + " shardId:" + shardId); } ZkNodeProps leaderProps = new ZkNodeProps(props); try { joinElection(desc); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } // rather than look in the cluster state file, we go straight to the zknodes // here, because on cluster restart there could be stale leader info in the // cluster state node that won't be updated for a moment String leaderUrl = getLeaderProps(collection, cloudDesc.getShardId()).getCoreUrl(); // now wait until our currently cloud state contains the latest leader String cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000); int tries = 0; while (!leaderUrl.equals(cloudStateLeader)) { if (tries == 60) { throw new SolrException(ErrorCode.SERVER_ERROR, "There is conflicting information about the leader of shard: " + cloudDesc.getShardId()); } Thread.sleep(1000); tries++; cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000); } String ourUrl = ZkCoreNodeProps.getCoreUrl(baseUrl, coreName); log.info("We are " + ourUrl + " and leader is " + leaderUrl); boolean isLeader = leaderUrl.equals(ourUrl); SolrCore core = null; if (cc != null) { // CoreContainer only null in tests try { core = cc.getCore(desc.getName()); // recover from local transaction log and wait for it to complete before // going active // TODO: should this be moved to another thread? To recoveryStrat? // TODO: should this actually be done earlier, before (or as part of) // leader election perhaps? // TODO: if I'm the leader, ensure that a replica that is trying to recover waits until I'm // active (or don't make me the // leader until my local replay is done. UpdateLog ulog = core.getUpdateHandler().getUpdateLog(); if (!core.isReloaded() && ulog != null) { Future<UpdateLog.RecoveryInfo> recoveryFuture = core.getUpdateHandler() .getUpdateLog().recoverFromLog(); if (recoveryFuture != null) { recoveryFuture.get(); // NOTE: this could potentially block for // minutes or more! // TODO: public as recovering in the mean time? 
// TODO: in the future we could do peerync in parallel with recoverFromLog } else { log.info("No LogReplay needed for core="+core.getName() + " baseURL=" + baseUrl); } } boolean didRecovery = checkRecovery(coreName, desc, recoverReloadedCores, isLeader, cloudDesc, collection, coreZkNodeName, shardId, leaderProps, core, cc); if (!didRecovery) { publishAsActive(baseUrl, desc, coreZkNodeName, coreName); } } finally { if (core != null) { core.close(); } } } else { publishAsActive(baseUrl, desc, coreZkNodeName, coreName); } // make sure we have an update cluster state right away zkStateReader.updateCloudState(true); return shardId; }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
public void submit(final Request sreq) { if (completionService == null) { completionService = new ExecutorCompletionService<Request>(commExecutor); pending = new HashSet<Future<Request>>(); } final String url = sreq.node.getUrl(); Callable<Request> task = new Callable<Request>() { @Override public Request call() throws Exception { Request clonedRequest = new Request(); clonedRequest.node = sreq.node; clonedRequest.ureq = sreq.ureq; clonedRequest.retries = sreq.retries; try { String fullUrl; if (!url.startsWith("http://") && !url.startsWith("https://")) { fullUrl = "http://" + url; } else { fullUrl = url; } HttpSolrServer server = new HttpSolrServer(fullUrl, client); clonedRequest.ursp = server.request(clonedRequest.ureq); // currently no way to get the request body. } catch (Exception e) { clonedRequest.exception = e; if (e instanceof SolrException) { clonedRequest.rspCode = ((SolrException) e).code(); } else { clonedRequest.rspCode = -1; } } return clonedRequest; } }; pending.add(completionService.submit(task)); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
Override public Request call() throws Exception { Request clonedRequest = new Request(); clonedRequest.node = sreq.node; clonedRequest.ureq = sreq.ureq; clonedRequest.retries = sreq.retries; try { String fullUrl; if (!url.startsWith("http://") && !url.startsWith("https://")) { fullUrl = "http://" + url; } else { fullUrl = url; } HttpSolrServer server = new HttpSolrServer(fullUrl, client); clonedRequest.ursp = server.request(clonedRequest.ureq); // currently no way to get the request body. } catch (Exception e) { clonedRequest.exception = e; if (e instanceof SolrException) { clonedRequest.rspCode = ((SolrException) e).code(); } else { clonedRequest.rspCode = -1; } } return clonedRequest; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception { latch.await(); return null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> getSearcher(boolean forceNew, boolean returnSearcher, final Future[] waitSearcher, boolean updateHandlerReopens) throws IOException { // it may take some time to open an index.... we may need to make // sure that two threads aren't trying to open one at the same time // if it isn't necessary. synchronized (searcherLock) { // see if we can return the current searcher if (_searcher!=null && !forceNew) { if (returnSearcher) { _searcher.incref(); return _searcher; } else { return null; } } // check to see if we can wait for someone else's searcher to be set if (onDeckSearchers>0 && !forceNew && _searcher==null) { try { searcherLock.wait(); } catch (InterruptedException e) { log.info(SolrException.toStr(e)); } } // check again: see if we can return right now if (_searcher!=null && !forceNew) { if (returnSearcher) { _searcher.incref(); return _searcher; } else { return null; } } // At this point, we know we need to open a new searcher... // first: increment count to signal other threads that we are // opening a new searcher. onDeckSearchers++; if (onDeckSearchers < 1) { // should never happen... just a sanity check log.error(logid+"ERROR!!! onDeckSearchers is " + onDeckSearchers); onDeckSearchers=1; // reset } else if (onDeckSearchers > maxWarmingSearchers) { onDeckSearchers--; String msg="Error opening new searcher. exceeded limit of maxWarmingSearchers="+maxWarmingSearchers + ", try again later."; log.warn(logid+""+ msg); // HTTP 503==service unavailable, or 409==Conflict throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE,msg); } else if (onDeckSearchers > 1) { log.warn(logid+"PERFORMANCE WARNING: Overlapping onDeckSearchers=" + onDeckSearchers); } } // a signal to decrement onDeckSearchers if something goes wrong. final boolean[] decrementOnDeckCount=new boolean[]{true}; RefCounted<SolrIndexSearcher> currSearcherHolder = null; // searcher we are autowarming from RefCounted<SolrIndexSearcher> searchHolder = null; boolean success = false; openSearcherLock.lock(); try { searchHolder = openNewSearcher(updateHandlerReopens, false); // the searchHolder will be incremented once already (and it will eventually be assigned to _searcher when registered) // increment it again if we are going to return it to the caller. if (returnSearcher) { searchHolder.incref(); } final RefCounted<SolrIndexSearcher> newSearchHolder = searchHolder; final SolrIndexSearcher newSearcher = newSearchHolder.get(); boolean alreadyRegistered = false; synchronized (searcherLock) { if (_searcher == null) { // if there isn't a current searcher then we may // want to register this one before warming is complete instead of waiting. if (solrConfig.useColdSearcher) { registerSearcher(newSearchHolder); decrementOnDeckCount[0]=false; alreadyRegistered=true; } } else { // get a reference to the current searcher for purposes of autowarming. currSearcherHolder=_searcher; currSearcherHolder.incref(); } } final SolrIndexSearcher currSearcher = currSearcherHolder==null ? null : currSearcherHolder.get(); Future future=null; // warm the new searcher based on the current searcher. // should this go before the other event handlers or after? 
if (currSearcher != null) { future = searcherExecutor.submit( new Callable() { public Object call() throws Exception { try { newSearcher.warm(currSearcher); } catch (Throwable e) { SolrException.log(log,e); } return null; } } ); } if (currSearcher==null && firstSearcherListeners.size() > 0) { future = searcherExecutor.submit( new Callable() { public Object call() throws Exception { try { for (SolrEventListener listener : firstSearcherListeners) { listener.newSearcher(newSearcher,null); } } catch (Throwable e) { SolrException.log(log,null,e); } return null; } } ); }
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception { try { newSearcher.warm(currSearcher); } catch (Throwable e) { SolrException.log(log,e); } return null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception { try { for (SolrEventListener listener : firstSearcherListeners) { listener.newSearcher(newSearcher,null); } } catch (Throwable e) { SolrException.log(log,null,e); } return null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception { try { for (SolrEventListener listener : newSearcherListeners) { listener.newSearcher(newSearcher, currSearcher); } } catch (Throwable e) { SolrException.log(log,null,e); } return null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception { try { // registerSearcher will decrement onDeckSearchers and // do a notify, even if it fails. registerSearcher(newSearchHolder); } catch (Throwable e) { SolrException.log(log, e); } finally { // we are all done with the old searcher we used // for warming... if (currSearcherHolderF!=null) currSearcherHolderF.decref(); } return null; }
// in core/src/java/org/apache/solr/util/plugin/MapPluginLoader.java
Override protected void init(T plugin, Node node) throws Exception { Map<String,String> params = DOMUtil.toMapExcept( node.getAttributes(), "name","class" ); plugin.init( params ); }
// in core/src/java/org/apache/solr/util/plugin/MapPluginLoader.java
Override protected T register(String name, T plugin) throws Exception { if( registry != null ) { return registry.put( name, plugin ); } return null; }
// in core/src/java/org/apache/solr/util/plugin/NamedListPluginLoader.java
Override protected void init(T plugin,Node node) throws Exception { plugin.init( DOMUtil.childNodesToNamedList(node) ); }
// in core/src/java/org/apache/solr/util/plugin/NamedListPluginLoader.java
Override protected T register(String name, T plugin) throws Exception { if( registry != null ) { return registry.put( name, plugin ); } return null; }
(Lib) ConfigException 4
              
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public static Properties getProperties(String path) throws ConfigException { File configFile = new File(path); LOG.info("Reading configuration from: " + configFile); try { if (!configFile.exists()) { throw new IllegalArgumentException(configFile.toString() + " file is missing"); } Properties cfg = new Properties(); FileInputStream in = new FileInputStream(configFile); try { cfg.load(in); } finally { in.close(); } return cfg; } catch (IOException e) { throw new ConfigException("Error processing " + path, e); } catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); } }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
@Override
public void parseProperties(Properties zkProp) throws IOException, ConfigException {
  for (Entry<Object, Object> entry : zkProp.entrySet()) {
    String key = entry.getKey().toString().trim();
    String value = entry.getValue().toString().trim();
    if (key.equals("dataDir")) {
      dataDir = value;
    } else if (key.equals("dataLogDir")) {
      dataLogDir = value;
    } else if (key.equals("clientPort")) {
      setClientPort(Integer.parseInt(value));
    } else if (key.equals("tickTime")) {
      tickTime = Integer.parseInt(value);
    } else if (key.equals("initLimit")) {
      initLimit = Integer.parseInt(value);
    } else if (key.equals("syncLimit")) {
      syncLimit = Integer.parseInt(value);
    } else if (key.equals("electionAlg")) {
      electionAlg = Integer.parseInt(value);
    } else if (key.equals("maxClientCnxns")) {
      maxClientCnxns = Integer.parseInt(value);
    } else if (key.startsWith("server.")) {
      int dot = key.indexOf('.');
      long sid = Long.parseLong(key.substring(dot + 1));
      String parts[] = value.split(":");
      if ((parts.length != 2) && (parts.length != 3)) {
        LOG.error(value + " does not have the form host:port or host:port:port");
      }
      InetSocketAddress addr = new InetSocketAddress(parts[0], Integer.parseInt(parts[1]));
      if (parts.length == 2) {
        servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr));
      } else if (parts.length == 3) {
        InetSocketAddress electionAddr = new InetSocketAddress(parts[0], Integer.parseInt(parts[2]));
        servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr, electionAddr));
      }
    } else if (key.startsWith("group")) {
      int dot = key.indexOf('.');
      long gid = Long.parseLong(key.substring(dot + 1));
      numGroups++;
      String parts[] = value.split(":");
      for (String s : parts) {
        long sid = Long.parseLong(s);
        if (serverGroup.containsKey(sid))
          throw new ConfigException("Server " + sid + " is in multiple groups");
        else
          serverGroup.put(sid, gid);
      }
    } else if (key.startsWith("weight")) {
      int dot = key.indexOf('.');
      long sid = Long.parseLong(key.substring(dot + 1));
      serverWeight.put(sid, Long.parseLong(value));
    } else {
      System.setProperty("zookeeper." + key, value);
    }
  }
  if (dataDir == null) {
    throw new IllegalArgumentException("dataDir is not set");
  }
  if (dataLogDir == null) {
    dataLogDir = dataDir;
  } else {
    if (!new File(dataLogDir).isDirectory()) {
      throw new IllegalArgumentException("dataLogDir " + dataLogDir + " is missing.");
    }
  }
  if (tickTime == 0) {
    throw new IllegalArgumentException("tickTime is not set");
  }
  if (servers.size() > 1) {
    if (initLimit == 0) {
      throw new IllegalArgumentException("initLimit is not set");
    }
    if (syncLimit == 0) {
      throw new IllegalArgumentException("syncLimit is not set");
    }
    /*
     * If using FLE, then every server requires a separate election port.
     */
    if (electionAlg != 0) {
      for (QuorumPeer.QuorumServer s : servers.values()) {
        if (s.electionAddr == null)
          throw new IllegalArgumentException("Missing election port for server: " + s.id);
      }
    }
    /*
     * Default of quorum config is majority
     */
    if (serverGroup.size() > 0) {
      if (servers.size() != serverGroup.size())
        throw new ConfigException("Every server must be in exactly one group");
      /*
       * The default weight of a server is 1
       */
      for (QuorumPeer.QuorumServer s : servers.values()) {
        if (!serverWeight.containsKey(s.id))
          serverWeight.put(s.id, (long) 1);
      }
      /*
       * Set the quorumVerifier to be QuorumHierarchical
       */
      quorumVerifier = new QuorumHierarchical(numGroups, serverWeight, serverGroup);
    } else {
      /*
       * The default QuorumVerifier is QuorumMaj
       */
      LOG.info("Defaulting to majority quorums");
      quorumVerifier = new QuorumMaj(servers.size());
    }
    File myIdFile = new File(dataDir, "myid");
    if (!myIdFile.exists()) {
      ///////////////// ADDED FOR SOLR //////
      Long myid = getMySeverId();
      if (myid != null) {
        serverId = myid;
        return;
      }
      if (zkRun == null) return;
      //////////////// END ADDED FOR SOLR //////
      throw new IllegalArgumentException(myIdFile.toString() + " file is missing");
    }
    BufferedReader br = new BufferedReader(new FileReader(myIdFile));
    String myIdString;
    try {
      myIdString = br.readLine();
    } finally {
      br.close();
    }
    try {
      serverId = Long.parseLong(myIdString);
    } catch (NumberFormatException e) {
      throw new IllegalArgumentException("serverid " + myIdString + " is not a number");
    }
  }
}
Thrown in Catch: 2
              
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); }
Declared in throws: 2
              
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public static Properties getProperties(String path) throws ConfigException { /* body shown above */ }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
@Override public void parseProperties(Properties zkProp) throws IOException, ConfigException { /* body shown above */ }
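All four ConfigException throw sites follow the wrap-and-rethrow idiom: a low-level exception (IOException, IllegalArgumentException) is chained into the domain-level ConfigException, so callers deal with a single failure type while the original cause is preserved for diagnostics. A minimal self-contained sketch of the idiom (ConfigLoadException and loadConfig are illustrative names, not Solr code):

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// Illustrative domain exception; plays the role ConfigException plays above.
class ConfigLoadException extends Exception {
  ConfigLoadException(String msg, Throwable cause) { super(msg, cause); }
}

class ConfigLoader {
  // Wrap-and-rethrow: every low-level failure surfaces as one domain type,
  // with the original exception kept as the cause.
  static Properties loadConfig(String path) throws ConfigLoadException {
    try (FileInputStream in = new FileInputStream(path)) {
      Properties cfg = new Properties();
      cfg.load(in);
      return cfg;
    } catch (IOException e) {
      throw new ConfigLoadException("Error processing " + path, e);
    }
  }
}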
(Lib) EOFException 3
              
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readUnsignedByte() throws IOException {
  if (pos >= end) {
    refill();
    if (pos >= end) {
      throw new EOFException();
    }
  }
  return buf[pos++] & 0xff;
}
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public void readFully(byte b[], int off, int len) throws IOException {
  while (len > 0) {
    int ret = read(b, off, len);
    if (ret == -1) {
      throw new EOFException();
    }
    off += ret;
    len -= ret;
  }
}
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public byte readByte() throws IOException {
  if (pos >= end) {
    refill();
    if (pos >= end) throw new EOFException();
  }
  return buf[pos++];
}
Thrown in Catch: 0
Declared in throws: 0
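All three FastInputStream methods share one contract: exhaust the buffer, attempt a refill, and throw EOFException only when no more bytes can be produced, so a read never silently returns partial data. The standard JDK DataInputStream.readFully has the same contract; a small caller-side sketch of why that matters:

import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class EofDemo {
  public static void main(String[] args) throws IOException {
    byte[] data = {1, 2, 3, 4, 5};  // 5 bytes: one full 4-byte record, then a short tail
    DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
    byte[] record = new byte[4];
    try {
      while (true) {
        // Like FastInputStream.readFully: either fills the buffer completely
        // or throws EOFException; it never returns a partial read silently.
        in.readFully(record);
        System.out.println("read record of " + record.length + " bytes");
      }
    } catch (EOFException expected) {
      System.out.println("end of stream (possibly mid-record)");
    }
  }
}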
(Lib) FileNotFoundException 3
              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
static File getFile(String basePath, String query) {
  try {
    File file0 = new File(query);
    File file = file0;
    if (!file.isAbsolute()) file = new File(basePath + query);
    if (file.isFile() && file.canRead()) {
      LOG.debug("Accessing File: " + file.toString());
      return file;
    } else if (file != file0)
      if (file0.isFile() && file0.canRead()) {
        LOG.debug("Accessing File0: " + file0.toString());
        return file0;
      }
    throw new FileNotFoundException("Could not find file: " + query);
  } catch (FileNotFoundException e) {
    throw new RuntimeException(e);
  }
}
// in core/src/java/org/apache/solr/handler/SnapShooter.java
public void copyFile(File source, File destination, boolean preserveFileDate) throws IOException {
  // check source exists
  if (!source.exists()) {
    String message = "File " + source + " does not exist";
    throw new FileNotFoundException(message);
  }
  // does destinations directory exist ?
  if (destination.getParentFile() != null && !destination.getParentFile().exists()) {
    destination.getParentFile().mkdirs();
  }
  // make sure we can write to destination
  if (destination.exists() && !destination.canWrite()) {
    String message = "Unable to open file " + destination + " for writing.";
    throw new IOException(message);
  }
  FileInputStream input = null;
  FileOutputStream output = null;
  try {
    input = new FileInputStream(source);
    output = new FileOutputStream(destination);
    int count = 0;
    int n = 0;
    int rcnt = 0;
    while (-1 != (n = input.read(buffer))) {
      output.write(buffer, 0, n);
      count += n;
      rcnt++;
      /***
      // reserve every 4.6875 MB
      if (rcnt == 150) {
        rcnt = 0;
        delPolicy.setReserveDuration(indexCommit.getVersion(), reserveTime);
      }
      ***/
    }
  } finally {
    try {
      IOUtils.closeQuietly(input);
    } finally {
      IOUtils.closeQuietly(output);
    }
  }
  if (source.length() != destination.length()) {
    String message = "Failed to copy full contents from " + source + " to " + destination;
    throw new IOException(message);
  }
  if (preserveFileDate) {
    // file copy should preserve file date
    destination.setLastModified(source.lastModified());
  }
}
// in core/src/java/org/apache/solr/util/FileUtils.java
public static void sync(File fullFile) throws IOException {
  if (fullFile == null || !fullFile.exists()) throw new FileNotFoundException("File does not exist " + fullFile);
  boolean success = false;
  int retryCount = 0;
  IOException exc = null;
  while (!success && retryCount < 5) {
    retryCount++;
    RandomAccessFile file = null;
    try {
      try {
        file = new RandomAccessFile(fullFile, "rw");
        file.getFD().sync();
        success = true;
      } finally {
        if (file != null) file.close();
      }
    } catch (IOException ioe) {
      if (exc == null) exc = ioe;
      try {
        // Pause 5 msec
        Thread.sleep(5);
      } catch (InterruptedException ie) {
        Thread.currentThread().interrupt();
      }
    }
  }
  if (!success)
    // Throw original exception
    throw exc;
}
Thrown in Catch: 0
Declared in throws: 2
              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
protected Reader openStream(File file) throws FileNotFoundException, UnsupportedEncodingException {
  if (encoding == null) {
    return new InputStreamReader(new FileInputStream(file));
  } else {
    return new InputStreamReader(new FileInputStream(file), encoding);
  }
}
// in core/src/java/org/apache/solr/util/VersionedFile.java
public static InputStream getLatestFile(String dirName, String fileName) throws FileNotFoundException {
  Collection<File> oldFiles = null;
  final String prefix = fileName + '.';
  File f = new File(dirName, fileName);
  InputStream is = null;
  // there can be a race between checking for a file and opening it...
  // the user may have just put a new version in and deleted an old version.
  // try multiple times in a row.
  for (int retry = 0; retry < 10 && is == null; retry++) {
    try {
      if (!f.exists()) {
        File dir = new File(dirName);
        String[] names = dir.list(new FilenameFilter() {
          public boolean accept(File dir, String name) {
            return name.startsWith(prefix);
          }
        });
        Arrays.sort(names);
        f = new File(dir, names[names.length - 1]);
        oldFiles = new ArrayList<File>();
        for (int i = 0; i < names.length - 1; i++) {
          oldFiles.add(new File(dir, names[i]));
        }
      }
      is = new FileInputStream(f);
    } catch (Exception e) {
      // swallow exception for now
    }
  }
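FileUtils.sync and VersionedFile.getLatestFile above handle transient file-system races the same way: retry a bounded number of times, remember the first exception, and rethrow it only if every attempt fails. A minimal sketch of that bounded-retry shape (the names and the retry/sleep values are illustrative, not Solr's):

import java.io.IOException;

public class BoundedRetry {
  interface IoAction { void run() throws IOException; }

  // Retry up to maxAttempts (assumed >= 1); if all attempts fail, rethrow
  // the *first* exception seen, as FileUtils.sync does with its saved 'exc'.
  static void withRetries(IoAction action, int maxAttempts) throws IOException {
    IOException first = null;
    for (int attempt = 0; attempt < maxAttempts; attempt++) {
      try {
        action.run();
        return;  // success
      } catch (IOException e) {
        if (first == null) first = e;
        try {
          Thread.sleep(5);  // brief pause between attempts
        } catch (InterruptedException ie) {
          Thread.currentThread().interrupt();  // preserve interrupt status
        }
      }
    }
    throw first;
  }
}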
(Lib) AssertionError 2
Thrown in Catch: 0
Declared in throws: 0
(Lib) ClassNotFoundException 2
              
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public Collection<Class> findExtends(Class<?> clazz) throws ClassNotFoundException {
  HashSet<Class> results = new HashSet<Class>();
  for (JarFile jarFile : jarFiles) {
    for (Enumeration<JarEntry> e = jarFile.entries(); e.hasMoreElements();) {
      String n = e.nextElement().getName();
      if (n.endsWith(".class")) {
        String cn = n.replace("/", ".").substring(0, n.length() - 6);
        Class<?> target;
        try {
          target = cl.loadClass(cn);
        } catch (NoClassDefFoundError e1) {
          throw new ClassNotFoundException("Can't load: " + cn, e1);
        }
        if (clazz.isAssignableFrom(target) && !target.isAnonymousClass()) {
          int mods = target.getModifiers();
          if (!(Modifier.isAbstract(mods) || Modifier.isInterface(mods))) {
            results.add(target);
          }
        }
      }
    }
  }
  return results;
}
Thrown in Catch: 2
              
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (NoClassDefFoundError e1) { throw new ClassNotFoundException ("Can't load: " + cn, e1); }
Declared in throws: 3
              
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { final File[] files = new File[args.length]; for (int i = 0; i < args.length; i++) { files[i] = new File(args[i]); } final FindClasses finder = new FindClasses(files); final ClassLoader cl = finder.getClassLoader(); final Class TOKENSTREAM = cl.loadClass("org.apache.lucene.analysis.TokenStream"); final Class TOKENIZER = cl.loadClass("org.apache.lucene.analysis.Tokenizer"); final Class TOKENFILTER = cl.loadClass("org.apache.lucene.analysis.TokenFilter"); final Class TOKENIZERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenizerFactory"); final Class TOKENFILTERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenFilterFactory"); final HashSet<Class> result = new HashSet<Class>(finder.findExtends(TOKENIZER)); result.addAll(finder.findExtends(TOKENFILTER)); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENIZERFACTORY), "create", Reader.class).values()); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENFILTERFACTORY), "create", TOKENSTREAM).values()); for (final Class c : result) { System.out.println(c.getName()); } }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { FindClasses finder = new FindClasses(new File(args[1])); ClassLoader cl = finder.getClassLoader(); Class clazz = cl.loadClass(args[0]); if (args.length == 2) { System.out.println("Finding all extenders of " + clazz.getName()); for (Class c : finder.findExtends(clazz)) { System.out.println(c.getName()); } } else { String methName = args[2]; System.out.println("Finding all extenders of " + clazz.getName() + " with method: " + methName); Class[] methArgs = new Class[args.length-3]; for (int i = 3; i < args.length; i++) { methArgs[i-3] = cl.loadClass(args[i]); } Map<Class,Class> map = finder.findMethodReturns (finder.findExtends(clazz),methName, methArgs); for (Class key : map.keySet()) { System.out.println(key.getName() + " => " + map.get(key).getName()); } } }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public Collection<Class> findExtends(Class<?> clazz) throws ClassNotFoundException { /* body shown above */ }
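findExtends above performs an unusual translation: a JVM linkage error (NoClassDefFoundError, typically raised when a class's own dependencies are missing from the scanned jars) is caught and rethrown as a checked ClassNotFoundException, so one unloadable class can be handled like any ordinary load failure instead of aborting the scan. The idiom in isolation (SafeLoader and loadOrExplain are illustrative names):

class SafeLoader {
  // Translate a linkage Error into a checked exception so callers can
  // recover from one unloadable class instead of crashing the whole scan.
  static Class<?> loadOrExplain(ClassLoader cl, String className) throws ClassNotFoundException {
    try {
      return cl.loadClass(className);
    } catch (NoClassDefFoundError e) {
      // same idiom as SuggestMissingFactories.findExtends
      throw new ClassNotFoundException("Can't load: " + className, e);
    }
  }
}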
(Lib) InterruptedException 2
              
// in core/src/java/org/apache/solr/handler/SnapPuller.java
boolean fetchLatestIndex(SolrCore core, boolean force) throws IOException, InterruptedException {
  successfulInstall = false;
  replicationStartTime = System.currentTimeMillis();
  try {
    // get the current 'replicateable' index version in the master
    NamedList response = null;
    try {
      response = getLatestVersion();
    } catch (Exception e) {
      LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage());
      return false;
    }
    long latestVersion = (Long) response.get(CMD_INDEX_VERSION);
    long latestGeneration = (Long) response.get(GENERATION);
    IndexCommit commit;
    RefCounted<SolrIndexSearcher> searcherRefCounted = null;
    try {
      searcherRefCounted = core.getNewestSearcher(false);
      if (searcherRefCounted == null) {
        SolrException.log(LOG, "No open searcher found - fetch aborted");
        return false;
      }
      commit = searcherRefCounted.get().getIndexReader().getIndexCommit();
    } finally {
      if (searcherRefCounted != null) searcherRefCounted.decref();
    }
    if (latestVersion == 0L) {
      if (force && commit.getGeneration() != 0) {
        // since we won't get the files for an empty index,
        // we just clear ours and commit
        core.getUpdateHandler().getSolrCoreState().getIndexWriter(core).deleteAll();
        SolrQueryRequest req = new LocalSolrQueryRequest(core, new ModifiableSolrParams());
        core.getUpdateHandler().commit(new CommitUpdateCommand(req, false));
      }
      // there is nothing to be replicated
      successfulInstall = true;
      return true;
    }
    if (!force && IndexDeletionPolicyWrapper.getCommitTimestamp(commit) == latestVersion) {
      // master and slave are already in sync just return
      LOG.info("Slave in sync with master.");
      successfulInstall = true;
      return true;
    }
    LOG.info("Master's generation: " + latestGeneration);
    LOG.info("Slave's generation: " + commit.getGeneration());
    LOG.info("Starting replication process");
    // get the list of files first
    fetchFileList(latestGeneration);
    // this can happen if the commit point is deleted before we fetch the file list.
    if (filesToDownload.isEmpty()) return false;
    LOG.info("Number of files in latest index in master: " + filesToDownload.size());
    // Create the sync service
    fsyncService = Executors.newSingleThreadExecutor();
    // use a synchronized list because the list is read by other threads (to show details)
    filesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>());
    // if the generation of the master is older than that of the slave, the two are not
    // compatible for copying; create a new index directory and copy all the files over
    boolean isFullCopyNeeded = IndexDeletionPolicyWrapper.getCommitTimestamp(commit) >= latestVersion || force;
    File tmpIndexDir = createTempindexDir(core);
    if (isIndexStale()) isFullCopyNeeded = true;
    successfulInstall = false;
    boolean deleteTmpIdxDir = true;
    File indexDir = null;
    try {
      indexDir = new File(core.getIndexDir());
      downloadIndexFiles(isFullCopyNeeded, tmpIndexDir, latestGeneration);
      LOG.info("Total time taken for download : " + ((System.currentTimeMillis() - replicationStartTime) / 1000) + " secs");
      Collection<Map<String, Object>> modifiedConfFiles = getModifiedConfFiles(confFilesToDownload);
      if (!modifiedConfFiles.isEmpty()) {
        downloadConfFiles(confFilesToDownload, latestGeneration);
        if (isFullCopyNeeded) {
          successfulInstall = modifyIndexProps(tmpIndexDir.getName());
          deleteTmpIdxDir = false;
        } else {
          successfulInstall = copyIndexFiles(tmpIndexDir, indexDir);
        }
        if (successfulInstall) {
          LOG.info("Configuration files are modified, core will be reloaded");
          logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall); // write to a file time of replication and conf files.
          reloadCore();
        }
      } else {
        terminateAndWaitFsyncService();
        if (isFullCopyNeeded) {
          successfulInstall = modifyIndexProps(tmpIndexDir.getName());
          deleteTmpIdxDir = false;
        } else {
          successfulInstall = copyIndexFiles(tmpIndexDir, indexDir);
        }
        if (successfulInstall) {
          logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall);
          doCommit();
        }
      }
      replicationStartTime = 0;
      return successfulInstall;
    } catch (ReplicationHandlerException e) {
      LOG.error("User aborted Replication");
      return false;
    } catch (SolrException e) {
      throw e;
    } catch (InterruptedException e) {
      throw new InterruptedException("Index fetch interrupted");
    } catch (Exception e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e);
    } finally {
      if (deleteTmpIdxDir) delTree(tmpIndexDir);
      else delTree(indexDir);
    }
  } finally {
    if (!successfulInstall) {
      logReplicationTimeAndConfFiles(null, successfulInstall);
    }
    filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null;
    replicationStartTime = 0;
    fileFetcher = null;
    if (fsyncService != null && !fsyncService.isShutdown()) fsyncService.shutdownNow();
    fsyncService = null;
    stop = false;
    fsyncException = null;
  }
}
Thrown in Catch: 2
              
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); }
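The SnapPuller rethrow above necessarily loses the original stack trace: InterruptedException has no (String, Throwable) constructor, so chaining requires an explicit initCause call. A sketch of the two conventional options when a catch block sees an interrupt (InterruptIdioms is an illustrative name):

class InterruptIdioms {
  // Option 1: propagate, preserving the original exception as the cause.
  static void propagate(InterruptedException e) throws InterruptedException {
    InterruptedException rethrown = new InterruptedException("Index fetch interrupted");
    rethrown.initCause(e);  // InterruptedException has no cause constructor
    throw rethrown;
  }

  // Option 2: swallow, but restore the thread's interrupt status so
  // callers further up can still observe it (cf. FileUtils.sync above).
  static void swallowButFlag(InterruptedException e) {
    Thread.currentThread().interrupt();
  }
}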
Declared in throws: 111
              
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
@Override public void connect(String serverAddress, int timeout, Watcher watcher, ZkUpdate updater) throws IOException, InterruptedException, TimeoutException { updater.update(new SolrZooKeeper(serverAddress, timeout, watcher)); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void delete(final String path, final int version, boolean retryOnConnLoss) throws InterruptedException, KeeperException { if (retryOnConnLoss) { zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Stat execute() throws KeeperException, InterruptedException { keeper.delete(path, version); return null; } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Stat execute() throws KeeperException, InterruptedException { keeper.delete(path, version); return null; }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public Stat exists(final String path, final Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Stat execute() throws KeeperException, InterruptedException { return keeper.exists(path, watcher); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Stat execute() throws KeeperException, InterruptedException { return keeper.exists(path, watcher); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public Boolean exists(final String path, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Boolean execute() throws KeeperException, InterruptedException { return keeper.exists(path, null) != null; } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Boolean execute() throws KeeperException, InterruptedException { return keeper.exists(path, null) != null; }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public String create(final String path, final byte data[], final List<ACL> acl, final CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, acl, createMode); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, acl, createMode); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public List<String> getChildren(final String path, final Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public List<String> execute() throws KeeperException, InterruptedException { return keeper.getChildren(path, watcher); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public List<String> execute() throws KeeperException, InterruptedException { return keeper.getChildren(path, watcher); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public byte[] getData(final String path, final Watcher watcher, final Stat stat, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public byte[] execute() throws KeeperException, InterruptedException { return keeper.getData(path, watcher, stat); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public byte[] execute() throws KeeperException, InterruptedException { return keeper.getData(path, watcher, stat); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public Stat setData(final String path, final byte data[], final int version, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Stat execute() throws KeeperException, InterruptedException { return keeper.setData(path, data, version); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Stat execute() throws KeeperException, InterruptedException { return keeper.setData(path, data, version); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public String create(final String path, final byte[] data, final CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, createMode); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, createMode); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, null, CreateMode.PERSISTENT, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, boolean failOnExists, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, null, CreateMode.PERSISTENT, null, failOnExists, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean failOnExists, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), CreateMode.PERSISTENT, null, failOnExists, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, null, createMode, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, data, CreateMode.PERSISTENT, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, data, createMode, null, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, CreateMode createMode, Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, data, createMode, watcher, true, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, CreateMode createMode, Watcher watcher, boolean failOnExists, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (log.isInfoEnabled()) { log.info("makePath: " + path); } boolean retry = true; if (path.startsWith("/")) { path = path.substring(1, path.length()); } String[] paths = path.split("/"); StringBuilder sbPath = new StringBuilder(); for (int i = 0; i < paths.length; i++) { byte[] bytes = null; String pathPiece = paths[i]; sbPath.append("/" + pathPiece); final String currentPath = sbPath.toString(); Object exists = exists(currentPath, watcher, retryOnConnLoss); if (exists == null || ((i == paths.length -1) && failOnExists)) { CreateMode mode = CreateMode.PERSISTENT; if (i == paths.length - 1) { mode = createMode; bytes = data; if (!retryOnConnLoss) retry = false; } try { if (retry) { final CreateMode finalMode = mode; final byte[] finalBytes = bytes; zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Object execute() throws KeeperException, InterruptedException { keeper.create(currentPath, finalBytes, ZooDefs.Ids.OPEN_ACL_UNSAFE, finalMode); return null; } }); } else { keeper.create(currentPath, bytes, ZooDefs.Ids.OPEN_ACL_UNSAFE, mode); } } catch (NodeExistsException e) { if (!failOnExists) { // TODO: version ? for now, don't worry about race setData(currentPath, data, -1, retryOnConnLoss); // set new watch exists(currentPath, watcher, retryOnConnLoss); return; } // ignore unless it's the last node in the path if (i == paths.length - 1) { throw e; } } if(i == paths.length -1) { // set new watch exists(currentPath, watcher, retryOnConnLoss); } } else if (i == paths.length - 1) { // TODO: version ? for now, don't worry about race setData(currentPath, data, -1, retryOnConnLoss); // set new watch exists(currentPath, watcher, retryOnConnLoss); } }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Object execute() throws KeeperException, InterruptedException { keeper.create(currentPath, finalBytes, ZooDefs.Ids.OPEN_ACL_UNSAFE, finalMode); return null; }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String zkPath, CreateMode createMode, Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(zkPath, null, createMode, watcher, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void setData(String path, byte[] data, boolean retryOnConnLoss) throws KeeperException, InterruptedException { setData(path, data, -1, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void setData(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { if (log.isInfoEnabled()) { log.info("Write to ZooKeeper " + file.getAbsolutePath() + " to " + path); } String data = FileUtils.readFileToString(file); setData(path, data.getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void printLayout(String path, int indent, StringBuilder string) throws KeeperException, InterruptedException { byte[] data = getData(path, null, null, true); List<String> children = getChildren(path, null, true); StringBuilder dent = new StringBuilder(); for (int i = 0; i < indent; i++) { dent.append(" "); } string.append(dent + path + " (" + children.size() + ")" + NEWL); if (data != null) { try { String dataString = new String(data, "UTF-8"); if ((!path.endsWith(".txt") && !path.endsWith(".xml")) || path.endsWith(ZkStateReader.CLUSTER_STATE)) { if (path.endsWith(".xml")) { // this is the cluster state in xml format - let's pretty print dataString = prettyPrint(dataString); } string.append(dent + "DATA:\n" + dent + " " + dataString.replaceAll("\n", "\n" + dent + " ") + NEWL); } else { string.append(dent + "DATA: ...suppressed..." + NEWL); } } catch (UnsupportedEncodingException e) { // can't happen - UTF-8 throw new RuntimeException(e); } } for (String child : children) { if (!child.equals("quota")) { try { printLayout(path + (path.equals("/") ? "" : "/") + child, indent + 1, string); } catch (NoNodeException e) { // must have gone away } } } }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void printLayoutToStdOut() throws KeeperException, InterruptedException { StringBuilder sb = new StringBuilder(); printLayout("/", 0, sb); System.out.println(sb.toString()); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void close() throws InterruptedException { if (isClosed) return; // it's okay if we over close - same as solrcore isClosed = true; keeper.close(); numCloses.incrementAndGet(); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
void updateKeeper(SolrZooKeeper keeper) throws InterruptedException { SolrZooKeeper oldKeeper = this.keeper; this.keeper = keeper; if (oldKeeper != null) { oldKeeper.close(); } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void process(WatchedEvent event) { if (log.isInfoEnabled()) { log.info("Watcher " + this + " name:" + name + " got event " + event + " path:" + event.getPath() + " type:" + event.getType()); } state = event.getState(); if (state == KeeperState.SyncConnected) { connected = true; clientConnected.countDown(); } else if (state == KeeperState.Expired) { connected = false; log.info("Attempting to reconnect to recover relationship with ZooKeeper..."); try { connectionStrategy.reconnect(zkServerAddress, zkClientTimeout, this, new ZkClientConnectionStrategy.ZkUpdate() { @Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } } }); } catch (Exception e) { SolrException.log(log, "", e); } log.info("Connected:" + connected); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
@Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForConnected(long waitForConnection) throws InterruptedException, TimeoutException, IOException { long expire = System.currentTimeMillis() + waitForConnection; long left = waitForConnection; while (!connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (!connected) { throw new TimeoutException("Could not connect to ZooKeeper " + zkServerAddress + " within " + waitForConnection + " ms"); } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForDisconnected(long timeout) throws InterruptedException, TimeoutException { long expire = System.currentTimeMillis() + timeout; long left = timeout; while (connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (connected) { throw new TimeoutException("Did not disconnect"); } }
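waitForConnected and waitForDisconnected are the classic deadline-based wait loop: the remaining budget is recomputed after every wakeup, so spurious wakeups and notifications that do not change the condition cannot stretch the timeout. The same shape in isolation (Latch and its members are illustrative names):

class Latch {
  private boolean raised;

  synchronized void await(long timeoutMs) throws InterruptedException {
    long deadline = System.currentTimeMillis() + timeoutMs;
    long left = timeoutMs;
    while (!raised && left > 0) {
      wait(left);                                   // may wake spuriously or on unrelated notify
      left = deadline - System.currentTimeMillis(); // recompute the remaining budget
    }
    if (!raised) {
      throw new IllegalStateException("condition not met within " + timeoutMs + " ms");
    }
  }

  synchronized void raise() {
    raised = true;
    notifyAll();
  }
}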
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void updateCloudState(boolean immediate) throws KeeperException, InterruptedException { updateCloudState(immediate, false); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void updateLiveNodes() throws KeeperException, InterruptedException { updateCloudState(true, true); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public synchronized void createClusterStateWatchersAndUpdate() throws KeeperException, InterruptedException { // We need to fetch the current cluster state and the set of live nodes synchronized (getUpdateLock()) { cmdExecutor.ensureExists(CLUSTER_STATE, zkClient); log.info("Updating cluster state from ZooKeeper... "); zkClient.exists(CLUSTER_STATE, new Watcher() { @Override public void process(WatchedEvent event) { log.info("A cluster state change has occurred"); try { // delayed approach // ZkStateReader.this.updateCloudState(false, false); synchronized (ZkStateReader.this.getUpdateLock()) { // remake watch final Watcher thisWatch = this; byte[] data = zkClient.getData(CLUSTER_STATE, thisWatch, null, true); CloudState clusterState = CloudState.load(data, ZkStateReader.this.cloudState.getLiveNodes()); // update volatile cloudState = clusterState; } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); return; } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
private synchronized void updateCloudState(boolean immediate, final boolean onlyLiveNodes) throws KeeperException, InterruptedException { log.info("Manual update of cluster state initiated"); // build immutable CloudInfo if (immediate) { CloudState clusterState; synchronized (getUpdateLock()) { List<String> liveNodes = zkClient.getChildren(LIVE_NODES_ZKNODE, null, true); Set<String> liveNodesSet = new HashSet<String>(); liveNodesSet.addAll(liveNodes); if (!onlyLiveNodes) { log.info("Updating cloud state from ZooKeeper... "); clusterState = CloudState.load(zkClient, liveNodesSet); } else { log.info("Updating live nodes from ZooKeeper... "); clusterState = new CloudState(liveNodesSet, ZkStateReader.this.cloudState.getCollectionStates()); } } this.cloudState = clusterState; }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public String getLeaderUrl(String collection, String shard) throws InterruptedException, KeeperException { return getLeaderUrl(collection, shard, 1000); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public String getLeaderUrl(String collection, String shard, int timeout) throws InterruptedException, KeeperException { ZkCoreNodeProps props = new ZkCoreNodeProps(getLeaderProps(collection, shard, timeout)); return props.getCoreUrl(); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public ZkNodeProps getLeaderProps(String collection, String shard) throws InterruptedException { return getLeaderProps(collection, shard, 1000); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public ZkNodeProps getLeaderProps(String collection, String shard, int timeout) throws InterruptedException { long timeoutAt = System.currentTimeMillis() + timeout; while (System.currentTimeMillis() < timeoutAt) { if (cloudState != null) { final CloudState currentState = cloudState; final ZkNodeProps nodeProps = currentState.getLeader(collection, shard); if (nodeProps != null) { return nodeProps; } } Thread.sleep(50); } throw new RuntimeException("No registered leader was found, collection:" + collection + " slice:" + shard); }
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
public static CloudState load(SolrZkClient zkClient, Set<String> liveNodes) throws KeeperException, InterruptedException { byte[] state = zkClient.getData(ZkStateReader.CLUSTER_STATE, null, null, true); return load(state, liveNodes); }
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
public static CloudState load(byte[] bytes, Set<String> liveNodes) throws KeeperException, InterruptedException { if (bytes == null || bytes.length == 0) { return new CloudState(liveNodes, Collections.<String, Map<String,Slice>>emptyMap()); } LinkedHashMap<String, Object> stateMap = (LinkedHashMap<String, Object>) ZkStateReader.fromJSON(bytes); HashMap<String,Map<String, Slice>> state = new HashMap<String,Map<String,Slice>>(); for(String collectionName: stateMap.keySet()){ Map<String, Object> collection = (Map<String, Object>)stateMap.get(collectionName); Map<String, Slice> slices = new LinkedHashMap<String,Slice>(); for(String sliceName: collection.keySet()) { Map<String, Map<String, String>> sliceMap = (Map<String, Map<String, String>>)collection.get(sliceName); Map<String, ZkNodeProps> shards = new LinkedHashMap<String,ZkNodeProps>(); for(String shardName: sliceMap.keySet()) { shards.put(shardName, new ZkNodeProps(sliceMap.get(shardName))); } Slice slice = new Slice(sliceName, shards); slices.put(sliceName, slice); } state.put(collectionName, slices); } return new CloudState(liveNodes, state); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
public void ensureExists(String path, final SolrZkClient zkClient) throws KeeperException, InterruptedException { ensureExists(path, null, CreateMode.PERSISTENT, zkClient); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
public void ensureExists(final String path, final byte[] data, CreateMode createMode, final SolrZkClient zkClient) throws KeeperException, InterruptedException { if (zkClient.exists(path, true)) { return; } try { zkClient.makePath(path, data, true); } catch (NodeExistsException e) { // its okay if another beats us creating the node } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
protected void retryDelay(int attemptCount) throws InterruptedException { if (attemptCount > 0) { Thread.sleep(attemptCount * retryDelay); } }
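Nearly every SolrZkClient method above is a thin wrapper that hands a ZkOperation to ZkCmdExecutor.retryOperation, concentrating connection-loss handling in one place. retryOperation itself is not excerpted here; from retryDelay and the ConnectionLossException catch shown earlier, its shape can be sketched roughly as follows (RetrySketch, ZkOp and the retry constants are assumptions, not the real implementation):

import org.apache.zookeeper.KeeperException;

class RetrySketch {
  interface ZkOp<T> { T execute() throws KeeperException, InterruptedException; }  // stand-in for ZkOperation

  private final int retryCount = 3;      // assumed value
  private final long retryDelayMs = 500; // assumed value

  <T> T retryOperation(ZkOp<T> operation) throws KeeperException, InterruptedException {
    KeeperException.ConnectionLossException exception = null;
    for (int i = 0; i < retryCount; i++) {
      try {
        return operation.execute();  // delegate to the wrapped ZooKeeper call
      } catch (KeeperException.ConnectionLossException e) {
        if (exception == null) exception = e;   // keep the first failure for the final report
        if (i > 0) Thread.sleep(i * retryDelayMs);  // back off, like retryDelay(i) above
      }
    }
    throw exception;  // all retries exhausted
  }
}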
// in core/src/java/org/apache/solr/handler/SnapPuller.java
boolean fetchLatestIndex(SolrCore core, boolean force) throws IOException, InterruptedException { /* body shown above */ }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, KeeperException, InterruptedException { CoreContainer coreContainer = req.getCore().getCoreDescriptor().getCoreContainer(); if (coreContainer.isZooKeeperAware()) { showFromZooKeeper(req, rsp, coreContainer); } else { showFromFileSystem(req, rsp); } }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromZooKeeper(SolrQueryRequest req, SolrQueryResponse rsp, CoreContainer coreContainer) throws KeeperException, InterruptedException, UnsupportedEncodingException { String adminFile = null; SolrCore core = req.getCore(); SolrZkClient zkClient = coreContainer.getZkController().getZkClient(); final ZkSolrResourceLoader loader = (ZkSolrResourceLoader) core .getResourceLoader(); String confPath = loader.getCollectionZkPath(); String fname = req.getParams().get("file", null); if (fname == null) { adminFile = confPath; } else { fname = fname.replace('\\', '/'); // normalize slashes if (hiddenFiles.contains(fname.toUpperCase(Locale.ENGLISH))) { throw new SolrException(ErrorCode.FORBIDDEN, "Can not access: " + fname); } if (fname.indexOf("..") >= 0) { throw new SolrException(ErrorCode.FORBIDDEN, "Invalid path: " + fname); } adminFile = confPath + "/" + fname; } // Make sure the file exists, is readable and is not a hidden file if (!zkClient.exists(adminFile, true)) { throw new SolrException(ErrorCode.BAD_REQUEST, "Can not find: " + adminFile); } // Show a directory listing List<String> children = zkClient.getChildren(adminFile, null, true); if (children.size() > 0) { NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>(); for (String f : children) { if (hiddenFiles.contains(f.toUpperCase(Locale.ENGLISH))) { continue; // don't show 'hidden' files } if (f.startsWith(".")) { continue; // skip hidden system files... } SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>(); files.add(f, fileInfo); List<String> fchildren = zkClient.getChildren(adminFile, null, true); if (fchildren.size() > 0) { fileInfo.add("directory", true); } else { // TODO? content type fileInfo.add("size", f.length()); } // TODO: ? // fileInfo.add( "modified", new Date( f.lastModified() ) ); } rsp.add("files", files); } else { // Include the file contents // The file logic depends on RawResponseWriter, so force its use. ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(CommonParams.WT, "raw"); req.setParams(params); ContentStreamBase content = new ContentStreamBase.StringStream( new String(zkClient.getData(adminFile, null, null, true), "UTF-8")); content.setContentType(req.getParams().get(USE_CONTENT_TYPE)); rsp.add(RawResponseWriter.CONTENT, content); } rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleWaitForStateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException { final SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); if (cname == null) { cname = ""; } String nodeName = params.get("nodeName"); String coreNodeName = params.get("coreNodeName"); String waitForState = params.get("state"); Boolean checkLive = params.getBool("checkLive"); int pauseFor = params.getInt("pauseFor", 0); String state = null; boolean live = false; int retry = 0; while (true) { SolrCore core = null; try { core = coreContainer.getCore(cname); if (core == null && retry == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "core not found:" + cname); } if (core != null) { // wait until we are sure the recovering node is ready // to accept updates CloudDescriptor cloudDescriptor = core.getCoreDescriptor() .getCloudDescriptor(); CloudState cloudState = coreContainer.getZkController() .getCloudState(); String collection = cloudDescriptor.getCollectionName(); Slice slice = cloudState.getSlice(collection, cloudDescriptor.getShardId()); if (slice != null) { ZkNodeProps nodeProps = slice.getShards().get(coreNodeName); if (nodeProps != null) { state = nodeProps.get(ZkStateReader.STATE_PROP); live = cloudState.liveNodesContain(nodeName); if (nodeProps != null && state.equals(waitForState)) { if (checkLive == null) { break; } else if (checkLive && live) { break; } else if (!checkLive && !live) { break; } } } } } if (retry++ == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "I was asked to wait on state " + waitForState + " for " + nodeName + " but I still do not see the request state. I see state: " + state + " live:" + live); } } finally { if (core != null) { core.close(); } } Thread.sleep(1000); } // small safety net for any updates that started with state that // kept it from sending the update to be buffered - // pause for a while to let any outstanding updates finish // System.out.println("I saw state:" + state + " sleep for " + pauseFor + // " live:" + live); Thread.sleep(pauseFor); // solrcloud_debug // try {; // LocalSolrQueryRequest r = new LocalSolrQueryRequest(core, new // ModifiableSolrParams()); // CommitUpdateCommand commitCmd = new CommitUpdateCommand(r, false); // commitCmd.softCommit = true; // core.getUpdateHandler().commit(commitCmd); // RefCounted<SolrIndexSearcher> searchHolder = // core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() // + " to replicate " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " gen:" + // core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + // core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleDistribUrlAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException, SolrServerException { // TODO: finish this and tests SolrParams params = req.getParams(); final ModifiableSolrParams newParams = new ModifiableSolrParams(params); newParams.remove("action"); SolrParams required = params.required(); final String subAction = required.get("subAction"); String collection = required.get("collection"); newParams.set(CoreAdminParams.ACTION, subAction); SolrCore core = req.getCore(); ZkController zkController = core.getCoreDescriptor().getCoreContainer() .getZkController(); CloudState cloudState = zkController.getCloudState(); Map<String,Slice> slices = cloudState.getCollectionStates().get(collection); for (Map.Entry<String,Slice> entry : slices.entrySet()) { Slice slice = entry.getValue(); Map<String,ZkNodeProps> shards = slice.getShards(); Set<Map.Entry<String,ZkNodeProps>> shardEntries = shards.entrySet(); for (Map.Entry<String,ZkNodeProps> shardEntry : shardEntries) { final ZkNodeProps node = shardEntry.getValue(); if (cloudState.liveNodesContain(node.get(ZkStateReader.NODE_NAME_PROP))) { newParams.set(CoreAdminParams.CORE, node.get(ZkStateReader.CORE_NAME_PROP)); String replica = node.get(ZkStateReader.BASE_URL_PROP); ShardRequest sreq = new ShardRequest(); newParams.set("qt", "/admin/cores"); sreq.purpose = 1; // TODO: this sucks if (replica.startsWith("http://")) replica = replica.substring(7); sreq.shards = new String[]{replica}; sreq.actualShards = sreq.shards; sreq.params = newParams; shardHandler.submit(sreq, replica, sreq.params); } } } ShardResponse srsp; do { srsp = shardHandler.takeCompletedOrError(); if (srsp != null) { Throwable e = srsp.getException(); if (e != null) { log.error("Error talking to shard: " + srsp.getShard(), e); } } } while(srsp != null); }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
private void processLeaderChange() throws KeeperException, InterruptedException { if(closed) return; try { byte[] data = zkClient.getData(path, this, null, true); if (data != null) { final ZkCoreNodeProps leaderProps = new ZkCoreNodeProps(ZkNodeProps.load(data)); listener.announceLeader(collection, shard, leaderProps); } } catch (KeeperException ke) { //check if we lost connection or the node was gone if (ke.code() != Code.CONNECTIONLOSS && ke.code() != Code.SESSIONEXPIRED && ke.code() != Code.NONODE) { throw ke; } } }
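The catch block above is a recurring policy in these watchers: connection loss, session expiry and a missing node are treated as benign (the watched node is simply gone, or the watch will be re-established on reconnect), while every other KeeperException is rethrown. A minimal sketch of that policy as a reusable predicate (the class and method names are hypothetical):

    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.KeeperException.Code;

    final class ZkErrorPolicy {
      private ZkErrorPolicy() {}

      // True for exactly the codes that ShardLeaderWatcher chooses to ignore.
      static boolean isBenign(KeeperException ke) {
        Code c = ke.code();
        return c == Code.CONNECTIONLOSS
            || c == Code.SESSIONEXPIRED
            || c == Code.NONODE;
      }
    }

A caller would then write: catch (KeeperException ke) { if (!ZkErrorPolicy.isBenign(ke)) throw ke; }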
// in core/src/java/org/apache/solr/cloud/Overseer.java
private CloudState updateState(CloudState state, String nodeName, CoreState coreState) throws KeeperException, InterruptedException { String collection = coreState.getCollectionName(); String zkCoreNodeName = coreState.getCoreNodeName(); //collection does not yet exist, create placeholders if num shards is specified if (!state.getCollections().contains(coreState.getCollectionName()) && coreState.getNumShards() != null) { state = createCollection(state, collection, coreState.getNumShards()); } // use the provided non null shardId String shardId = coreState.getProperties().get(ZkStateReader.SHARD_ID_PROP); if(shardId==null) { //use shardId from CloudState shardId = getAssignedId(state, nodeName, coreState); } if(shardId==null) { //request new shardId shardId = AssignShard.assignShard(collection, state, coreState.getNumShards()); } Map<String,String> props = new HashMap<String,String>(); Map<String,String> coreProps = new HashMap<String,String>(coreState.getProperties().size()); coreProps.putAll(coreState.getProperties()); // we don't put num_shards in the clusterstate coreProps.remove("num_shards"); for (Entry<String,String> entry : coreProps.entrySet()) { props.put(entry.getKey(), entry.getValue()); } ZkNodeProps zkProps = new ZkNodeProps(props); Slice slice = state.getSlice(collection, shardId); Map<String,ZkNodeProps> shardProps; if (slice == null) { shardProps = new HashMap<String,ZkNodeProps>(); } else { shardProps = state.getSlice(collection, shardId).getShardsCopy(); } shardProps.put(zkCoreNodeName, zkProps); slice = new Slice(shardId, shardProps); CloudState newCloudState = updateSlice(state, collection, slice); return newCloudState; }
// in core/src/java/org/apache/solr/cloud/Overseer.java
public synchronized void createWatches() throws KeeperException, InterruptedException { addCollectionsWatch(); addLiveNodesWatch(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addCollectionsWatch() throws KeeperException, InterruptedException { zkCmdExecutor.ensureExists(ZkStateReader.COLLECTIONS_ZKNODE, zkClient); List<String> collections = zkClient.getChildren(ZkStateReader.COLLECTIONS_ZKNODE, new Watcher(){ @Override public void process(WatchedEvent event) { try { List<String> collections = zkClient.getChildren(ZkStateReader.COLLECTIONS_ZKNODE, this, true); collectionsChanged(collections); } catch (KeeperException e) { if (e.code() == Code.CONNECTIONLOSS || e.code() == Code.SESSIONEXPIRED) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); } } }, true); collectionsChanged(collections); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void collectionsChanged(Collection<String> collections) throws KeeperException, InterruptedException { synchronized (shardLeaderWatches) { for(String collection: collections) { if(!shardLeaderWatches.containsKey(collection)) { shardLeaderWatches.put(collection, new HashMap<String,ShardLeaderWatcher>()); addShardLeadersWatch(collection); } } //XXX not handling delete collections.. } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addShardLeadersWatch(final String collection) throws KeeperException, InterruptedException { zkCmdExecutor.ensureExists(ZkStateReader.getShardLeadersPath(collection, null), zkClient); final List<String> leaderNodes = zkClient.getChildren( ZkStateReader.getShardLeadersPath(collection, null), new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> leaderNodes = zkClient.getChildren( ZkStateReader.getShardLeadersPath(collection, null), this, true); processLeaderNodesChanged(collection, leaderNodes); } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); } } }, true); processLeaderNodesChanged(collection, leaderNodes); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addLiveNodesWatch() throws KeeperException, InterruptedException { List<String> liveNodes = zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Object execute() throws KeeperException, InterruptedException { return zkClient.getChildren( ZkStateReader.LIVE_NODES_ZKNODE, new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> liveNodes = zkClient.getChildren( ZkStateReader.LIVE_NODES_ZKNODE, this, true); synchronized (nodeStateWatches) { processLiveNodesChanged(nodeStateWatches.keySet(), liveNodes); } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); } } }, true); } }); processLiveNodesChanged(Collections.<String>emptySet(), liveNodes); }
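All of these anonymous Watcher callbacks end the same way: Watcher.process() cannot throw checked exceptions, so an InterruptedException is absorbed, but only after the thread's interrupt flag is restored. A tiny self-contained demonstration of why the restore matters (illustrative code, not from Solr):

    import java.util.concurrent.CountDownLatch;

    public class InterruptDemo {
      public static void main(String[] args) {
        Thread.currentThread().interrupt();     // simulate an interrupt arriving
        try {
          new CountDownLatch(1).await();        // throws immediately and CLEARS the flag
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt();   // restore it, as the watchers above do
        }
        // Without the restore this would print false and the owner of the
        // thread would never learn it was interrupted.
        System.out.println(Thread.currentThread().isInterrupted());  // true
      }
    }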
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public Object execute() throws KeeperException, InterruptedException { return zkClient.getChildren( ZkStateReader.LIVE_NODES_ZKNODE, new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> liveNodes = zkClient.getChildren( ZkStateReader.LIVE_NODES_ZKNODE, this, true); synchronized (nodeStateWatches) { processLiveNodesChanged(nodeStateWatches.keySet(), liveNodes); } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); } } }, true); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void processLiveNodesChanged(Collection<String> oldLiveNodes, Collection<String> liveNodes) throws InterruptedException, KeeperException { Set<String> upNodes = complement(liveNodes, oldLiveNodes); if (upNodes.size() > 0) { addNodeStateWatches(upNodes); } Set<String> downNodes = complement(oldLiveNodes, liveNodes); for(String node: downNodes) { synchronized (nodeStateWatches) { NodeStateWatcher watcher = nodeStateWatches.remove(node); } log.debug("Removed NodeStateWatcher for node:" + node); } }
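complement() itself is not shown in this sheet; from its use above ("up" nodes are liveNodes minus oldLiveNodes, "down" nodes the reverse) it is plainly a set difference. A plausible reading, under that assumption:

    import java.util.Collection;
    import java.util.HashSet;
    import java.util.Set;

    final class SetOps {
      private SetOps() {}

      // Assumed shape of Overseer.complement(): the elements of a not present in b.
      static Set<String> complement(Collection<String> a, Collection<String> b) {
        Set<String> result = new HashSet<String>(a);
        result.removeAll(b);
        return result;
      }
    }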
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addNodeStateWatches(Set<String> nodeNames) throws InterruptedException, KeeperException { for (String nodeName : nodeNames) { final String path = STATES_NODE + "/" + nodeName; synchronized (nodeStateWatches) { if (!nodeStateWatches.containsKey(nodeName)) { zkCmdExecutor.ensureExists(path, zkClient); nodeStateWatches.put(nodeName, new NodeStateWatcher(zkClient, nodeName, path, this)); log.debug("Added NodeStateWatcher for node " + nodeName); } else { log.debug("watch already added"); } } } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public void coreChanged(final String nodeName, final Set<CoreState> states) throws KeeperException, InterruptedException { log.info("Core change pooled: " + nodeName + " states:" + states); for (CoreState state : states) { fifo.add(new CloudStateUpdateRequest(Op.StateChange, nodeName, state)); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public void coreDeleted(String nodeName, Collection<CoreState> states) throws KeeperException, InterruptedException { for (CoreState state : states) { fifo.add(new CloudStateUpdateRequest(Op.CoreDeleted, state.getCollectionName(), state.getCoreNodeName())); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
public static void createClientNodes(SolrZkClient zkClient, String nodeName) throws KeeperException, InterruptedException { final String node = STATES_NODE + "/" + nodeName; if (log.isInfoEnabled()) { log.info("creating node:" + node); } ZkCmdExecutor zkCmdExecutor = new ZkCmdExecutor(); zkCmdExecutor.ensureExists(node, zkClient); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
public void cancelElection() throws InterruptedException, KeeperException { zkClient.delete(leaderSeqPath, -1, true); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { try { zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } }
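Several runLeaderProcess variants in this file use the same recovery for a stale leader node: if the ephemeral left by a previous session has not expired yet, delete it and create our own. The same idiom written against the raw ZooKeeper client rather than SolrZkClient.makePath (replaceEphemeral is an illustrative name):

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.KeeperException.NodeExistsException;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    final class LeaderNode {
      private LeaderNode() {}

      static void replaceEphemeral(ZooKeeper zk, String path, byte[] data)
          throws KeeperException, InterruptedException {
        try {
          zk.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.EPHEMERAL);
        } catch (NodeExistsException e) {
          // A previous session's ephemeral has not expired yet: remove and retry once.
          zk.delete(path, -1);   // version -1 matches any version
          zk.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.EPHEMERAL);
        }
      }
    }

Note the window between the delete and the second create: a competing candidate could win it, which is why leadership is decided by the election ordering, not by this node creation alone.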
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { if (cc != null) { String coreName = leaderProps.get(ZkStateReader.CORE_NAME_PROP); SolrCore core = null; try { // the first time we are run, we will get a startupCore - after // we will get null and must use cc.getCore core = cc.getCore(coreName); if (core == null) { cancelElection(); throw new SolrException(ErrorCode.SERVER_ERROR, "Fatal Error, SolrCore not found:" + coreName + " in " + cc.getCoreNames()); } // should I be leader? if (weAreReplacement && !shouldIBeLeader(leaderProps)) { // System.out.println("there is a better leader candidate it appears"); rejoinLeaderElection(leaderSeqPath, core); return; } if (weAreReplacement) { if (zkClient.exists(leaderPath, true)) { zkClient.delete(leaderPath, -1, true); } // System.out.println("I may be the new Leader:" + leaderPath // + " - I need to try and sync"); boolean success = syncStrategy.sync(zkController, core, leaderProps); if (!success && anyoneElseActive()) { rejoinLeaderElection(leaderSeqPath, core); return; } } // If I am going to be the leader I have to be active // System.out.println("I am leader go active"); core.getUpdateHandler().getSolrCoreState().cancelRecovery(); zkController.publish(core.getCoreDescriptor(), ZkStateReader.ACTIVE); } finally { if (core != null ) { core.close(); } } } super.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
private void rejoinLeaderElection(String leaderSeqPath, SolrCore core) throws InterruptedException, KeeperException, IOException { // remove our ephemeral and re join the election // System.out.println("sync failed, delete our election node:" // + leaderSeqPath); zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); cancelElection(); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, core.getName()); leaderElector.joinElection(this); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException { final String id = leaderSeqPath.substring(leaderSeqPath.lastIndexOf("/")+1); ZkNodeProps myProps = new ZkNodeProps("id", id); try { zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); } catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); } new Overseer(zkClient, stateReader, id); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
private void processStateChange() throws KeeperException, InterruptedException { byte[] data = zkClient.getData(path, this, null, true); if (data != null) { CoreState[] states = CoreState.load(data); List<CoreState> stateList = Arrays.asList(states); HashSet<CoreState> modifiedCores = new HashSet<CoreState>(); modifiedCores.addAll(stateList); modifiedCores.removeAll(currentState); HashSet<CoreState> newState = new HashSet<CoreState>(); newState.addAll(stateList); HashMap<String, CoreState> lookup = new HashMap<String, CoreState>(); for(CoreState state: states) { lookup.put(state.getCoreName(), state); } //check for status change for(CoreState state: currentState) { if(lookup.containsKey(state.getCoreName())) { if(!state.getProperties().equals(lookup.get(state.getCoreName()).getProperties())) { modifiedCores.add(lookup.get(state.getCoreName())); } } } HashMap<String, CoreState> deletedCores = new HashMap<String, CoreState>(); for(CoreState state: currentState) { deletedCores.put(state.getCoreNodeName(), state); } for(CoreState state: stateList) { deletedCores.remove(state.getCoreNodeName()); } if (deletedCores.size() > 0) { listener.coreDeleted(nodeName, deletedCores.values()); } currentState = Collections.unmodifiableSet(newState); if (modifiedCores.size() > 0) { try { listener.coreChanged(nodeName, Collections.unmodifiableSet(modifiedCores)); } catch (KeeperException e) { log.warn("Could not talk to ZK", e); } catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Could not talk to ZK", e); } } } else { // ignore null state } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private Future<RecoveryInfo> replay(UpdateLog ulog) throws InterruptedException, ExecutionException, TimeoutException { Future<RecoveryInfo> future = ulog.applyBufferedUpdates(); if (future == null) { // no replay needed\ log.info("No replay needed"); } else { log.info("Replaying buffered documents"); // wait for replay future.get(); } // solrcloud_debug // try { // RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replayed " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } return future; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public boolean configFileExists(String collection, String fileName) throws KeeperException, InterruptedException { Stat stat = zkClient.exists(CONFIGS_ZKNODE + "/" + collection + "/" + fileName, null, true); return stat != null; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public byte[] getConfigFileData(String zkConfigName, String fileName) throws KeeperException, InterruptedException { String zkPath = CONFIGS_ZKNODE + "/" + zkConfigName + "/" + fileName; byte[] bytes = zkClient.getData(zkPath, null, null, true); if (bytes == null) { log.error("Config file contains no data:" + zkPath); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Config file contains no data:" + zkPath); } return bytes; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void syncNodeState() throws KeeperException, InterruptedException { log.debug("Syncing internal state with zk. Current: " + coreStates); final String path = Overseer.STATES_NODE + "/" + getNodeName(); final byte[] data = zkClient.getData(path, null, null, true); if (data != null) { CoreState[] states = CoreState.load(data); synchronized (coreStates) { coreStates.clear(); // TODO: should we do this? for(CoreState coreState: states) { coreStates.put(coreState.getCoreName(), coreState); } } } log.debug("after sync: " + coreStates); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void createEphemeralLiveNode() throws KeeperException, InterruptedException { String nodeName = getNodeName(); String nodePath = ZkStateReader.LIVE_NODES_ZKNODE + "/" + nodeName; log.info("Register node as live in ZooKeeper:" + nodePath); try { boolean nodeDeleted = true; try { // we attempt a delete in the case of a quick server bounce - // if there was not a graceful shutdown, the node may exist // until expiration timeout - so a node won't be created here because // it exists, but eventually the node will be removed. So delete // in case it exists and create a new node. zkClient.delete(nodePath, -1, true); } catch (KeeperException.NoNodeException e) { // fine if there is nothing to delete // TODO: annoying that ZK logs a warning on us nodeDeleted = false; } if (nodeDeleted) { log .info("Found a previous node that still exists while trying to register a new live node " + nodePath + " - removing existing node to create another."); } zkClient.makePath(nodePath, CreateMode.EPHEMERAL, true); } catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public boolean pathExists(String path) throws KeeperException, InterruptedException { return zkClient.exists(path, true); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String readConfigName(String collection) throws KeeperException, InterruptedException, IOException { String configName = null; String path = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; if (log.isInfoEnabled()) { log.info("Load collection config from:" + path); } byte[] data = zkClient.getData(path, null, null, true); if(data != null) { ZkNodeProps props = ZkNodeProps.load(data); configName = props.get(CONFIGNAME_PROP); } if (configName != null && !zkClient.exists(CONFIGS_ZKNODE + "/" + configName, true)) { log.error("Specified config does not exist in ZooKeeper:" + configName); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Specified config does not exist in ZooKeeper:" + configName); } return configName; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private ZkCoreNodeProps getLeaderProps(final String collection, final String slice) throws KeeperException, InterruptedException { int iterCount = 60; while (iterCount-- > 0) try { byte[] data = zkClient.getData( ZkStateReader.getShardLeadersPath(collection, slice), null, null, true); ZkCoreNodeProps leaderProps = new ZkCoreNodeProps( ZkNodeProps.load(data)); return leaderProps; } catch (NoNodeException e) { Thread.sleep(500); } throw new RuntimeException("Could not get leader props"); }
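getLeaderProps is the simplest of several bounded polling loops in ZkController (doGetShardIdProcess and getConfName below follow the same shape). A generic sketch of that shape, flattening the swallowed NoNodeException into a null return (Retry and its parameters are illustrative names):

    import java.util.function.Supplier;

    final class Retry {
      private Retry() {}

      // Poll a lookup that returns null while the data is not there yet; give
      // up with an unchecked exception after a fixed number of attempts.
      static <T> T poll(Supplier<T> lookup, int attempts, long sleepMs)
          throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
          T value = lookup.get();
          if (value != null) return value;
          Thread.sleep(sleepMs);
        }
        throw new RuntimeException("gave up after " + attempts + " attempts");
      }
    }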
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void joinElection(CoreDescriptor cd) throws InterruptedException, KeeperException, IOException { String shardId = cd.getCloudDescriptor().getShardId(); Map<String,String> props = new HashMap<String,String>(); // we only put a subset of props into the leader node props.put(ZkStateReader.BASE_URL_PROP, getBaseUrl()); props.put(ZkStateReader.CORE_NAME_PROP, cd.getName()); props.put(ZkStateReader.NODE_NAME_PROP, getNodeName()); final String coreZkNodeName = getNodeName() + "_" + cd.getName(); ZkNodeProps ourProps = new ZkNodeProps(props); String collection = cd.getCloudDescriptor() .getCollectionName(); ElectionContext context = new ShardLeaderElectionContext(leaderElector, shardId, collection, coreZkNodeName, ourProps, this, cc); leaderElector.setup(context); electionContexts.put(coreZkNodeName, context); leaderElector.joinElection(context); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private boolean checkRecovery(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores, final boolean isLeader, final CloudDescriptor cloudDesc, final String collection, final String shardZkNodeName, String shardId, ZkNodeProps leaderProps, SolrCore core, CoreContainer cc) throws InterruptedException, KeeperException, IOException, ExecutionException { if (SKIP_AUTO_RECOVERY) { log.warn("Skipping recovery according to sys prop solrcloud.skip.autorecovery"); return false; } boolean doRecovery = true; if (!isLeader) { if (core.isReloaded() && !recoverReloadedCores) { doRecovery = false; } if (doRecovery) { log.info("Core needs to recover:" + core.getName()); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, coreName); return true; } } else { log.info("I am the leader, no recovery necessary"); } return false; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void unregister(String coreName, CloudDescriptor cloudDesc) throws InterruptedException, KeeperException { synchronized (coreStates) { coreStates.remove(coreName); } publishState(); final String zkNodeName = getNodeName() + "_" + coreName; ElectionContext context = electionContexts.remove(zkNodeName); if (context != null) { context.cancelElection(); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadToZK(File dir, String zkPath) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, zkPath); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadConfigDir(File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
void printLayoutToStdOut() throws KeeperException, InterruptedException { zkClient.printLayoutToStdOut(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void createCollectionZkNode(CloudDescriptor cd) throws KeeperException, InterruptedException, IOException { String collection = cd.getCollectionName(); log.info("Check for collection zkNode:" + collection); String collectionPath = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; try { if(!zkClient.exists(collectionPath, true)) { log.info("Creating collection in ZooKeeper:" + collection); SolrParams params = cd.getParams(); try { Map<String,String> collectionProps = new HashMap<String,String>(); // TODO: if collection.configName isn't set, and there isn't already a conf in zk, just use that? String defaultConfigName = System.getProperty(COLLECTION_PARAM_PREFIX+CONFIGNAME_PROP, collection); // params passed in - currently only done via core admin (create core commmand). if (params != null) { Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String paramName = iter.next(); if (paramName.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(paramName.substring(COLLECTION_PARAM_PREFIX.length()), params.get(paramName)); } } // if the config name wasn't passed in, use the default if (!collectionProps.containsKey(CONFIGNAME_PROP)) getConfName(collection, collectionPath, collectionProps); } else if(System.getProperty("bootstrap_confdir") != null) { // if we are bootstrapping a collection, default the config for // a new collection to the collection we are bootstrapping log.info("Setting config for collection:" + collection + " to " + defaultConfigName); Properties sysProps = System.getProperties(); for (String sprop : System.getProperties().stringPropertyNames()) { if (sprop.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(sprop.substring(COLLECTION_PARAM_PREFIX.length()), sysProps.getProperty(sprop)); } } // if the config name wasn't passed in, use the default if (!collectionProps.containsKey(CONFIGNAME_PROP)) collectionProps.put(CONFIGNAME_PROP, defaultConfigName); } else if (Boolean.getBoolean("bootstrap_conf")) { // the conf name should should be the collection name of this core collectionProps.put(CONFIGNAME_PROP, cd.getCollectionName()); } else { getConfName(collection, collectionPath, collectionProps); } ZkNodeProps zkProps = new ZkNodeProps(collectionProps); zkClient.makePath(collectionPath, ZkStateReader.toJSON(zkProps), CreateMode.PERSISTENT, null, true); // ping that there is a new collection zkClient.setData(ZkStateReader.COLLECTIONS_ZKNODE, (byte[])null, true); } catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } } else { log.info("Collection zkNode exists"); } } catch (KeeperException e) { // its okay if another beats us creating the node if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void getConfName(String collection, String collectionPath, Map<String,String> collectionProps) throws KeeperException, InterruptedException { // check for configName log.info("Looking for collection configName"); List<String> configNames = null; int retry = 1; int retryLimt = 6; for (; retry < retryLimt; retry++) { if (zkClient.exists(collectionPath, true)) { ZkNodeProps cProps = ZkNodeProps.load(zkClient.getData(collectionPath, null, null, true)); if (cProps.containsKey(CONFIGNAME_PROP)) { break; } } // if there is only one conf, use that try { configNames = zkClient.getChildren(CONFIGS_ZKNODE, null, true); } catch (NoNodeException e) { // just keep trying } if (configNames != null && configNames.size() == 1) { // no config set named, but there is only 1 - use it log.info("Only one config set found in zk - using it:" + configNames.get(0)); collectionProps.put(CONFIGNAME_PROP, configNames.get(0)); break; } if (configNames != null && configNames.contains(collection)) { log.info("Could not find explicit collection configName, but found config name matching collection name - using that set."); collectionProps.put(CONFIGNAME_PROP, collection); break; } log.info("Could not find collection configName - pausing for 3 seconds and trying again - try: " + retry); Thread.sleep(3000); } if (retry == retryLimt) { log.error("Could not find configName for collection " + collection); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "Could not find configName for collection " + collection + " found:" + configNames); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private String doGetShardIdProcess(String coreName, CloudDescriptor descriptor) throws InterruptedException { final String shardZkNodeName = getNodeName() + "_" + coreName; int retryCount = 120; while (retryCount-- > 0) { final String shardId = zkStateReader.getCloudState().getShardId( shardZkNodeName); if (shardId != null) { return shardId; } try { Thread.sleep(500); } catch (InterruptedException e) { Thread.currentThread().interrupt(); } } throw new SolrException(ErrorCode.SERVER_ERROR, "Could not get shard_id for core: " + coreName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadToZK(SolrZkClient zkClient, File dir, String zkPath) throws IOException, KeeperException, InterruptedException { File[] files = dir.listFiles(); if (files == null) { throw new IllegalArgumentException("Illegal directory: " + dir); } for(File file : files) { if (!file.getName().startsWith(".")) { if (!file.isDirectory()) { zkClient.makePath(zkPath + "/" + file.getName(), file, false, true); } else { uploadToZK(zkClient, file, zkPath + "/" + file.getName()); } } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadConfigDir(SolrZkClient zkClient, File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void bootstrapConf(SolrZkClient zkClient, Config cfg, String solrHome) throws IOException, KeeperException, InterruptedException { NodeList nodes = (NodeList)cfg.evaluate("solr/cores/core", XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); String rawName = DOMUtil.getAttr(node, "name", null); String instanceDir = DOMUtil.getAttr(node, "instanceDir", null); File idir = new File(instanceDir); if (!idir.isAbsolute()) { idir = new File(solrHome, instanceDir); } String confName = DOMUtil.getAttr(node, "collection", null); if (confName == null) { confName = rawName; } ZkController.uploadConfigDir(zkClient, new File(idir, "conf"), confName); } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
private void checkIfIamLeader(final int seq, final ElectionContext context, boolean replacement) throws KeeperException, InterruptedException, IOException { // get all other numbers... final String holdElectionPath = context.electionPath + ELECTION_NODE; List<String> seqs = zkClient.getChildren(holdElectionPath, null, true); sortSeqs(seqs); List<Integer> intSeqs = getSeqs(seqs); if (seq <= intSeqs.get(0)) { runIamLeaderProcess(context, replacement); } else { // I am not the leader - watch the node below me int i = 1; for (; i < intSeqs.size(); i++) { int s = intSeqs.get(i); if (seq < s) { // we found who we come before - watch the guy in front break; } } int index = i - 2; if (index < 0) { log.warn("Our node is no longer in line to be leader"); return; } try { zkClient.getData(holdElectionPath + "/" + seqs.get(index), new Watcher() { @Override public void process(WatchedEvent event) { // am I the next leader? try { checkIfIamLeader(seq, context, true); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); } catch (IOException e) { log.warn("", e); } catch (Exception e) { log.warn("", e); } } }, null, true); } catch (KeeperException.SessionExpiredException e) { throw e; } catch (KeeperException e) { // we couldn't set our watch - the node before us may already be down? // we need to check if we are the leader again checkIfIamLeader(seq, context, true); } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
protected void runIamLeaderProcess(final ElectionContext context, boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { context.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public int joinElection(ElectionContext context) throws KeeperException, InterruptedException, IOException { final String shardsElectZkPath = context.electionPath + LeaderElector.ELECTION_NODE; long sessionId = zkClient.getSolrZooKeeper().getSessionId(); String id = sessionId + "-" + context.id; String leaderSeqPath = null; boolean cont = true; int tries = 0; while (cont) { try { leaderSeqPath = zkClient.create(shardsElectZkPath + "/" + id + "-n_", null, CreateMode.EPHEMERAL_SEQUENTIAL, false); context.leaderSeqPath = leaderSeqPath; cont = false; } catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } } catch (KeeperException.NoNodeException e) { // we must have failed in creating the election node - someone else must // be working on it, lets try again if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); } } int seq = getSeq(leaderSeqPath); checkIfIamLeader(seq, context, false); return seq; }
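The ConnectionLossException branch above handles a genuinely ambiguous outcome: after a connection loss, a sequential-ephemeral create may or may not have succeeded on the server. The code resolves the ambiguity by listing the election children and looking for its own session-scoped id. That check, isolated (wasCreated is an illustrative name; the "<id>-n_<sequence>" layout matches the create call in the snippet, and this prefix test is equivalent to the getNodeId comparison under that assumption):

    import java.util.List;

    final class ElectionNodes {
      private ElectionNodes() {}

      // True if one of the children was created by us: the create() call used
      // the name prefix "<sessionId>-<contextId>-n_", to which ZooKeeper
      // appends a monotonically increasing sequence number.
      static boolean wasCreated(List<String> children, String id) {
        for (String child : children) {
          if (child.startsWith(id + "-n_")) return true;
        }
        return false;
      }
    }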
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public void setup(final ElectionContext context) throws InterruptedException, KeeperException { String electZKPath = context.electionPath + LeaderElector.ELECTION_NODE; zkCmdExecutor.ensureExists(electZKPath, zkClient); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public Object next() throws IOException, InterruptedException { long pos = fis.position(); synchronized (TransactionLog.this) { if (trace) { log.trace("Reading log record. pos="+pos+" currentSize="+fos.size()); } if (pos >= fos.size()) { return null; } fos.flushBuffer(); } if (pos == 0) { readHeader(fis); // shouldn't currently happen - header and first record are currently written at the same time synchronized (TransactionLog.this) { if (fis.position() >= fos.size()) { return null; } pos = fis.position(); } } Object o = codec.readVal(fis); // skip over record size int size = fis.readInt(); assert size == fis.position() - pos - 4; return o; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private SolrConfig getSolrConfigFromZk(String zkConfigName, String solrConfigFileName, SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException, KeeperException, InterruptedException { byte[] config = zkController.getConfigFileData(zkConfigName, solrConfigFileName); InputSource is = new InputSource(new ByteArrayInputStream(config)); is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(solrConfigFileName)); SolrConfig cfg = solrConfigFileName == null ? new SolrConfig( resourceLoader, SolrConfig.DEFAULT_CONF_FILE, is) : new SolrConfig( resourceLoader, solrConfigFileName, is); return cfg; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private IndexSchema getSchemaFromZk(String zkConfigName, String schemaName, SolrConfig config, SolrResourceLoader resourceLoader) throws KeeperException, InterruptedException { byte[] configBytes = zkController.getConfigFileData(zkConfigName, schemaName); InputSource is = new InputSource(new ByteArrayInputStream(configBytes)); is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(schemaName)); IndexSchema schema = new IndexSchema(config, schemaName, is); return schema; }
// in core/src/java/org/apache/solr/util/RTimer.java
public static void main(String []argv) throws InterruptedException { RTimer rt = new RTimer(), subt, st; Thread.sleep(100); subt = rt.sub("sub1"); Thread.sleep(50); st = subt.sub("sub1.1"); st.resume(); Thread.sleep(10); st.pause(); Thread.sleep(50); st.resume(); Thread.sleep(10); st.pause(); subt.stop(); rt.stop(); System.out.println( rt.toString()); }
(Lib) TimeoutException 2
              
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForConnected(long waitForConnection) throws InterruptedException, TimeoutException, IOException { long expire = System.currentTimeMillis() + waitForConnection; long left = waitForConnection; while (!connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (!connected) { throw new TimeoutException("Could not connect to ZooKeeper " + zkServerAddress + " within " + waitForConnection + " ms"); } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForDisconnected(long timeout) throws InterruptedException, TimeoutException { long expire = System.currentTimeMillis() + timeout; long left = timeout; while (connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (connected) { throw new TimeoutException("Did not disconnect"); } }
0 11
              
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
@Override public void connect(String serverAddress, int timeout, Watcher watcher, ZkUpdate updater) throws IOException, InterruptedException, TimeoutException { updater.update(new SolrZooKeeper(serverAddress, timeout, watcher)); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void process(WatchedEvent event) { if (log.isInfoEnabled()) { log.info("Watcher " + this + " name:" + name + " got event " + event + " path:" + event.getPath() + " type:" + event.getType()); } state = event.getState(); if (state == KeeperState.SyncConnected) { connected = true; clientConnected.countDown(); } else if (state == KeeperState.Expired) { connected = false; log.info("Attempting to reconnect to recover relationship with ZooKeeper..."); try { connectionStrategy.reconnect(zkServerAddress, zkClientTimeout, this, new ZkClientConnectionStrategy.ZkUpdate() { @Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } } }); } catch (Exception e) { SolrException.log(log, "", e); } log.info("Connected:" + connected); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
@Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForConnected(long waitForConnection) throws InterruptedException, TimeoutException, IOException { long expire = System.currentTimeMillis() + waitForConnection; long left = waitForConnection; while (!connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (!connected) { throw new TimeoutException("Could not connect to ZooKeeper " + zkServerAddress + " within " + waitForConnection + " ms"); } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForDisconnected(long timeout) throws InterruptedException, TimeoutException { long expire = System.currentTimeMillis() + timeout; long left = timeout; while (connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (connected) { throw new TimeoutException("Did not disconnect"); } }
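waitForConnected and waitForDisconnected are the same guarded timed wait with the condition inverted. The general form (TimedWait, done and markDone are illustrative names):

    import java.util.concurrent.TimeoutException;

    final class TimedWait {
      private boolean done;

      synchronized void markDone() {
        done = true;
        notifyAll();
      }

      // Loop to absorb spurious wakeups, re-arm wait() with the remaining
      // time, and turn a missed deadline into a TimeoutException -- exactly
      // the shape of the two ConnectionManager methods above.
      synchronized void await(long timeoutMs)
          throws InterruptedException, TimeoutException {
        long expire = System.currentTimeMillis() + timeoutMs;
        long left = timeoutMs;
        while (!done && left > 0) {
          wait(left);
          left = expire - System.currentTimeMillis();
        }
        if (!done) {
          throw new TimeoutException("condition not met within " + timeoutMs + " ms");
        }
      }
    }

System.currentTimeMillis() is used here to match the original; a wall-clock step (e.g. from NTP) can lengthen or shorten the wait, which a System.nanoTime()-based deadline would avoid.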
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private Future<RecoveryInfo> replay(UpdateLog ulog) throws InterruptedException, ExecutionException, TimeoutException { Future<RecoveryInfo> future = ulog.applyBufferedUpdates(); if (future == null) { // no replay needed\ log.info("No replay needed"); } else { log.info("Replaying buffered documents"); // wait for replay future.get(); } // solrcloud_debug // try { // RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replayed " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } return future; }
(Lib) AttributeNotFoundException 1
              
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public Object getAttribute(String attribute) throws AttributeNotFoundException, MBeanException, ReflectionException { Object val; if ("coreHashCode".equals(attribute)) { val = coreHashCode; } else if (staticStats.contains(attribute) && attribute != null && attribute.length() > 0) { try { String getter = "get" + attribute.substring(0, 1).toUpperCase(Locale.ENGLISH) + attribute.substring(1); Method meth = infoBean.getClass().getMethod(getter); val = meth.invoke(infoBean); } catch (Exception e) { throw new AttributeNotFoundException(attribute); } } else { NamedList list = infoBean.getStatistics(); val = list.get(attribute); } if (val != null) { // Its String or one of the simple types, just return it as JMX suggests direct support for such types for (String simpleTypeName : SimpleType.ALLOWED_CLASSNAMES_LIST) { if (val.getClass().getName().equals(simpleTypeName)) { return val; } } // Its an arbitrary object which could be something complex and odd, return its toString, assuming that is // a workable representation of the object return val.toString(); } return null; }
1
              
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new AttributeNotFoundException(attribute); }
2
              
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public Object getAttribute(String attribute) throws AttributeNotFoundException, MBeanException, ReflectionException { Object val; if ("coreHashCode".equals(attribute)) { val = coreHashCode; } else if (staticStats.contains(attribute) && attribute != null && attribute.length() > 0) { try { String getter = "get" + attribute.substring(0, 1).toUpperCase(Locale.ENGLISH) + attribute.substring(1); Method meth = infoBean.getClass().getMethod(getter); val = meth.invoke(infoBean); } catch (Exception e) { throw new AttributeNotFoundException(attribute); } } else { NamedList list = infoBean.getStatistics(); val = list.get(attribute); } if (val != null) { // Its String or one of the simple types, just return it as JMX suggests direct support for such types for (String simpleTypeName : SimpleType.ALLOWED_CLASSNAMES_LIST) { if (val.getClass().getName().equals(simpleTypeName)) { return val; } } // Its an arbitrary object which could be something complex and odd, return its toString, assuming that is // a workable representation of the object return val.toString(); } return null; }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public void setAttribute(Attribute attribute) throws AttributeNotFoundException, InvalidAttributeValueException, MBeanException, ReflectionException { throw new UnsupportedOperationException("Operation not Supported"); }
(Domain) FieldMappingException 1
              
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
public void map(String typeName, Map<String, MapField> featureFieldsmapping) throws FieldMappingException { try { Type type = cas.getTypeSystem().getType(typeName); for (FSIterator<FeatureStructure> iterator = cas.getFSIndexRepository().getAllIndexedFS(type); iterator .hasNext(); ) { FeatureStructure fs = iterator.next(); for (String featureName : featureFieldsmapping.keySet()) { MapField mapField = featureFieldsmapping.get(featureName); String fieldNameFeature = mapField.getFieldNameFeature(); String fieldNameFeatureValue = fieldNameFeature == null ? null : fs.getFeatureValueAsString(type.getFeatureByBaseName(fieldNameFeature)); String fieldName = mapField.getFieldName(fieldNameFeatureValue); log.info(new StringBuffer("mapping ").append(typeName).append("@").append(featureName) .append(" to ").append(fieldName).toString()); String featureValue = null; if (fs instanceof Annotation && "coveredText".equals(featureName)) { featureValue = ((Annotation) fs).getCoveredText(); } else { featureValue = fs.getFeatureValueAsString(type.getFeatureByBaseName(featureName)); } log.info(new StringBuffer("writing ").append(featureValue).append(" in ").append( fieldName).toString()); document.addField(fieldName, featureValue, 1.0f); } } } catch (Exception e) { throw new FieldMappingException(e); } }
1
              
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
catch (Exception e) { throw new FieldMappingException(e); }
1
              
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
public void map(String typeName, Map<String, MapField> featureFieldsmapping) throws FieldMappingException { try { Type type = cas.getTypeSystem().getType(typeName); for (FSIterator<FeatureStructure> iterator = cas.getFSIndexRepository().getAllIndexedFS(type); iterator .hasNext(); ) { FeatureStructure fs = iterator.next(); for (String featureName : featureFieldsmapping.keySet()) { MapField mapField = featureFieldsmapping.get(featureName); String fieldNameFeature = mapField.getFieldNameFeature(); String fieldNameFeatureValue = fieldNameFeature == null ? null : fs.getFeatureValueAsString(type.getFeatureByBaseName(fieldNameFeature)); String fieldName = mapField.getFieldName(fieldNameFeatureValue); log.info(new StringBuffer("mapping ").append(typeName).append("@").append(featureName) .append(" to ").append(fieldName).toString()); String featureValue = null; if (fs instanceof Annotation && "coveredText".equals(featureName)) { featureValue = ((Annotation) fs).getCoveredText(); } else { featureValue = fs.getFeatureValueAsString(type.getFeatureByBaseName(featureName)); } log.info(new StringBuffer("writing ").append(featureValue).append(" in ").append( fieldName).toString()); document.addField(fieldName, featureValue, 1.0f); } } } catch (Exception e) { throw new FieldMappingException(e); } }
(Lib) IndexOutOfBoundsException 1
              
// in core/src/java/org/apache/solr/logging/CircularList.java
private void checkIndex(int index) { if (index >= size || index < 0) throw new IndexOutOfBoundsException("Index: "+index+", Size: "+size); }
0 0
(Lib) LockObtainFailedException 1
              
// in core/src/java/org/apache/solr/core/SolrCore.java
void initIndex() { try { String indexDir = getNewIndexDir(); boolean indexExists = getDirectoryFactory().exists(indexDir); boolean firstTime; synchronized (SolrCore.class) { firstTime = dirs.add(new File(indexDir).getCanonicalPath()); } boolean removeLocks = solrConfig.unlockOnStartup; initIndexReaderFactory(); if (indexExists && firstTime) { // to remove locks, the directory must already exist... so we create it // if it didn't exist already... Directory dir = directoryFactory.get(indexDir, getSolrConfig().indexConfig.lockType); if (dir != null) { if (IndexWriter.isLocked(dir)) { if (removeLocks) { log.warn(logid + "WARNING: Solr index directory '{}' is locked. Unlocking...", indexDir); IndexWriter.unlock(dir); } else { log.error(logid + "Solr index directory '{}' is locked. Throwing exception", indexDir); throw new LockObtainFailedException("Index locked for write for core " + name); } } directoryFactory.release(dir); } } // Create the index if it doesn't exist. if(!indexExists) { log.warn(logid+"Solr index directory '" + new File(indexDir) + "' doesn't exist." + " Creating new index..."); SolrIndexWriter writer = new SolrIndexWriter("SolrCore.initIndex", indexDir, getDirectoryFactory(), true, schema, solrConfig.indexConfig, solrDelPolicy, codec, false); writer.close(); } } catch (IOException e) { throw new RuntimeException(e); } }
0 0
(Lib) NoSuchElementException 1
              
// in core/src/java/org/apache/solr/spelling/PossibilityIterator.java
private RankedSpellPossibility internalNext() { if (done) { throw new NoSuchElementException(); } List<SpellCheckCorrection> possibleCorrection = new ArrayList<SpellCheckCorrection>(); int rank = 0; for (int i = 0; i < correctionIndex.length; i++) { List<SpellCheckCorrection> singleWordPossibilities = possibilityList.get(i); SpellCheckCorrection singleWordPossibility = singleWordPossibilities.get(correctionIndex[i]); rank += correctionIndex[i]; if (i == correctionIndex.length - 1) { correctionIndex[i]++; if (correctionIndex[i] == singleWordPossibilities.size()) { correctionIndex[i] = 0; if (correctionIndex.length == 1) { done = true; } for (int ii = i - 1; ii >= 0; ii--) { correctionIndex[ii]++; if (correctionIndex[ii] >= possibilityList.get(ii).size() && ii > 0) { correctionIndex[ii] = 0; } else { break; } } } } possibleCorrection.add(singleWordPossibility); } if(correctionIndex[0] == possibilityList.get(0).size()) { done = true; } RankedSpellPossibility rsl = new RankedSpellPossibility(); rsl.setCorrections(possibleCorrection); rsl.setRank(rank); return rsl; }
0 0
(Lib) NumberFormatException 1
              
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected int unicodeEscapeLexer(int c) throws IOException { int ret = 0; // ignore 'u' (assume c==\ now) and read 4 hex digits c = in.read(); code.clear(); try { for (int i = 0; i < 4; i++) { c = in.read(); if (isEndOfFile(c) || isEndOfLine(c)) { throw new NumberFormatException("number too short"); } code.append((char) c); } ret = Integer.parseInt(code.toString(), 16); } catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); } return ret; }
0 0
(Domain) ReplicationHandlerException 1
              
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private int fetchPackets(FastInputStream fis) throws Exception { byte[] intbytes = new byte[4]; byte[] longbytes = new byte[8]; try { while (true) { if (stop) { stop = false; aborted = true; throw new ReplicationHandlerException("User aborted replication"); } long checkSumServer = -1; fis.readFully(intbytes); //read the size of the packet int packetSize = readInt(intbytes); if (packetSize <= 0) { LOG.warn("No content recieved for file: " + currentFile); return NO_CONTENT; } if (buf.length < packetSize) buf = new byte[packetSize]; if (checksum != null) { //read the checksum fis.readFully(longbytes); checkSumServer = readLong(longbytes); } //then read the packet of bytes fis.readFully(buf, 0, packetSize); //compare the checksum as sent from the master if (includeChecksum) { checksum.reset(); checksum.update(buf, 0, packetSize); long checkSumClient = checksum.getValue(); if (checkSumClient != checkSumServer) { LOG.error("Checksum not matched between client and server for: " + currentFile); //if checksum is wrong it is a problem return for retry return 1; } } //if everything is fine, write down the packet to the file fileChannel.write(ByteBuffer.wrap(buf, 0, packetSize)); bytesDownloaded += packetSize; if (bytesDownloaded >= size) return 0; //errorcount is always set to zero after a successful packet errorCount = 0; } } catch (ReplicationHandlerException e) { throw e; } catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; } }
0 0
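fetchPackets above layers two policies: a domain ReplicationHandlerException marks a deliberate user abort and is rethrown untouched, while transient failures feed an error counter that only escalates after MAX_RETRIES consecutive errors, since each successfully verified packet resets it. The counter in isolation (PacketErrorPolicy is an illustrative name, RuntimeException stands in for SolrException, and the retry limit is a placeholder for the constant in SnapPuller):

    final class PacketErrorPolicy {
      private final int maxRetries;
      private int errorCount;

      PacketErrorPolicy(int maxRetries) {
        this.maxRetries = maxRetries;
      }

      // Call after every packet that was read and checksum-verified.
      void onSuccess() {
        errorCount = 0;
      }

      // Call on a transient failure; throws once failures are consecutive enough.
      void onFailure(String fileName, Exception cause) {
        errorCount++;
        if (errorCount > maxRetries) {
          throw new RuntimeException("Fetch failed for file: " + fileName, cause);
        }
        // otherwise the caller retries the current packet
      }
    }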
(Lib) ServletException 1
              
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void init(ServletConfig config) throws ServletException { super.init(config); destination = config.getInitParameter("destination"); if(destination==null) { throw new ServletException("RedirectServlet missing destination configuration"); } if( "false".equals(config.getInitParameter("permanent") )) { code = HttpServletResponse.SC_MOVED_TEMPORARILY; } // Replace the context key if(destination.startsWith(CONTEXT_KEY)) { destination = config.getServletContext().getContextPath() +destination.substring(CONTEXT_KEY.length()); } }
0 10
              
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
public void init(FilterConfig config) throws ServletException { log.info("SolrDispatchFilter.init()"); CoreContainer.Initializer init = createInitializer(); try { // web.xml configuration this.pathPrefix = config.getInitParameter( "path-prefix" ); this.cores = init.initialize(); log.info("user.dir=" + System.getProperty("user.dir")); } catch( Throwable t ) { // catch this so our filter still works log.error( "Could not start Solr. Check solr/home property and the logs"); SolrCore.log( t ); } log.info("SolrDispatchFilter.init() done"); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException { if( abortErrorMessage != null ) { ((HttpServletResponse)response).sendError( 500, abortErrorMessage ); return; } if (this.cores == null) { ((HttpServletResponse)response).sendError( 403, "Server is shutting down" ); return; } CoreContainer cores = this.cores; SolrCore core = null; SolrQueryRequest solrReq = null; if( request instanceof HttpServletRequest) { HttpServletRequest req = (HttpServletRequest)request; HttpServletResponse resp = (HttpServletResponse)response; SolrRequestHandler handler = null; String corename = ""; try { // put the core container in request attribute req.setAttribute("org.apache.solr.CoreContainer", cores); String path = req.getServletPath(); if( req.getPathInfo() != null ) { // this lets you handle /update/commit when /update is a servlet path += req.getPathInfo(); } if( pathPrefix != null && path.startsWith( pathPrefix ) ) { path = path.substring( pathPrefix.length() ); } // check for management path String alternate = cores.getManagementPath(); if (alternate != null && path.startsWith(alternate)) { path = path.substring(0, alternate.length()); } // unused feature ? int idx = path.indexOf( ':' ); if( idx > 0 ) { // save the portion after the ':' for a 'handler' path parameter path = path.substring( 0, idx ); } // Check for the core admin page if( path.equals( cores.getAdminPath() ) ) { handler = cores.getMultiCoreHandler(); solrReq = adminRequestParser.parse(null,path, req); handleAdminRequest(req, response, handler, solrReq); return; } else { //otherwise, we should find a core from the path idx = path.indexOf( "/", 1 ); if( idx > 1 ) { // try to get the corename as a request parameter first corename = path.substring( 1, idx ); core = cores.getCore(corename); if (core != null) { path = path.substring( idx ); } } if (core == null) { if (!cores.isZooKeeperAware() ) { core = cores.getCore(""); } } } if (core == null && cores.isZooKeeperAware()) { // we couldn't find the core - lets make sure a collection was not specified instead core = getCoreByCollection(cores, corename, path); if (core != null) { // we found a core, update the path path = path.substring( idx ); } else { // try the default core core = cores.getCore(""); } // TODO: if we couldn't find it locally, look on other nodes } // With a valid core... 
if( core != null ) { final SolrConfig config = core.getSolrConfig(); // get or create/cache the parser for the core SolrRequestParsers parser = null; parser = parsers.get(config); if( parser == null ) { parser = new SolrRequestParsers(config); parsers.put(config, parser ); } // Determine the handler from the url path if not set // (we might already have selected the cores handler) if( handler == null && path.length() > 1 ) { // don't match "" or "/" as valid path handler = core.getRequestHandler( path ); // no handler yet but allowed to handle select; let's check if( handler == null && parser.isHandleSelect() ) { if( "/select".equals( path ) || "/select/".equals( path ) ) { solrReq = parser.parse( core, path, req ); String qt = solrReq.getParams().get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } if( qt != null && qt.startsWith("/") && (handler instanceof ContentStreamHandlerBase)) { //For security reasons it's a bad idea to allow a leading '/', ex: /select?qt=/update see SOLR-3161 //There was no restriction from Solr 1.4 thru 3.5 and it's not supported for update handlers. throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid query type. Do not use /select to access: "+qt); } } } } // With a valid handler and a valid core... if( handler != null ) { // if not a /select, create the request if( solrReq == null ) { solrReq = parser.parse( core, path, req ); } final Method reqMethod = Method.getMethod(req.getMethod()); HttpCacheHeaderUtil.setCacheControlHeader(config, resp, reqMethod); // unless we have been explicitly told not to, do cache validation // if we fail cache validation, execute the query if (config.getHttpCachingConfig().isNever304() || !HttpCacheHeaderUtil.doCacheHeaderValidation(solrReq, req, reqMethod, resp)) { SolrQueryResponse solrRsp = new SolrQueryResponse(); /* even for HEAD requests, we need to execute the handler to * ensure we don't get an error (and to make sure the correct * QueryResponseWriter is selected and we get the correct * Content-Type) */ SolrRequestInfo.setRequestInfo(new SolrRequestInfo(solrReq, solrRsp)); this.execute( req, handler, solrReq, solrRsp ); HttpCacheHeaderUtil.checkHttpCachingVeto(solrRsp, resp, reqMethod); // add info to http headers //TODO: See SOLR-232 and SOLR-267. /*try { NamedList solrRspHeader = solrRsp.getResponseHeader(); for (int i=0; i<solrRspHeader.size(); i++) { ((javax.servlet.http.HttpServletResponse) response).addHeader(("Solr-" + solrRspHeader.getName(i)), String.valueOf(solrRspHeader.getVal(i))); } } catch (ClassCastException cce) { log.log(Level.WARNING, "exception adding response header log information", cce); }*/ QueryResponseWriter responseWriter = core.getQueryResponseWriter(solrReq); writeResponse(solrRsp, response, responseWriter, solrReq, reqMethod); } return; // we are done with a valid handler } } log.debug("no handler or core retrieved for " + path + ", follow through..."); } catch (Throwable ex) { sendError( core, solrReq, request, (HttpServletResponse)response, ex ); return; } finally { if( solrReq != null ) { solrReq.close(); } if (core != null) { core.close(); } SolrRequestInfo.clearRequestInfo(); } } // Otherwise let the webapp handle the request chain.doFilter(request, response); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override public void init() throws ServletException { }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException { response.setCharacterEncoding("UTF-8"); response.setContentType("application/json"); // This attribute is set by the SolrDispatchFilter CoreContainer cores = (CoreContainer) request.getAttribute("org.apache.solr.CoreContainer"); String path = request.getParameter("path"); String addr = request.getParameter("addr"); if (addr != null && addr.length() == 0) { addr = null; } String detailS = request.getParameter("detail"); boolean detail = detailS != null && detailS.equals("true"); String dumpS = request.getParameter("dump"); boolean dump = dumpS != null && dumpS.equals("true"); PrintWriter out = response.getWriter(); ZKPrinter printer = new ZKPrinter(response, out, cores.getZkController(), addr); printer.detail = detail; printer.dump = dump; try { printer.print(path); } finally { printer.close(); } }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override public void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException { doGet(request, response); }
// in core/src/java/org/apache/solr/servlet/LoadAdminUiServlet.java
@Override public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException { response.setCharacterEncoding("UTF-8"); response.setContentType("text/html"); PrintWriter out = response.getWriter(); InputStream in = getServletContext().getResourceAsStream("/admin.html"); if(in != null) { try { // This attribute is set by the SolrDispatchFilter CoreContainer cores = (CoreContainer) request.getAttribute("org.apache.solr.CoreContainer"); String html = IOUtils.toString(in, "UTF-8"); String[] search = new String[] { "${contextPath}", "${adminPath}" }; String[] replace = new String[] { StringEscapeUtils.escapeJavaScript(request.getContextPath()), StringEscapeUtils.escapeJavaScript(cores.getAdminPath()) }; out.println( StringUtils.replaceEach(html, search, replace) ); } finally { IOUtils.closeQuietly(in); } } else { out.println("solr"); } }
// in core/src/java/org/apache/solr/servlet/LoadAdminUiServlet.java
@Override public void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException { doGet(request, response); }
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void init(ServletConfig config) throws ServletException { super.init(config); destination = config.getInitParameter("destination"); if(destination==null) { throw new ServletException("RedirectServlet missing destination configuration"); } if( "false".equals(config.getInitParameter("permanent") )) { code = HttpServletResponse.SC_MOVED_TEMPORARILY; } // Replace the context key if(destination.startsWith(CONTEXT_KEY)) { destination = config.getServletContext().getContextPath() +destination.substring(CONTEXT_KEY.length()); } }
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void doGet(HttpServletRequest req, HttpServletResponse res) throws ServletException,IOException { res.setStatus(code); res.setHeader("Location", destination); }
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void doPost(HttpServletRequest req, HttpServletResponse res) throws ServletException,IOException { doGet(req,res); }
(Lib) TransformerException — Thrown: 1
              
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public Source resolve(String href, String base) throws TransformerException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, null, base, href); return (src == null) ? null : new SAXSource(src); } catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); } }
Thrown from catch: 1
              
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); }
Declared in throws: 4
              
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
public void error(TransformerException e) throws TransformerException { throw e; }
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
public void fatalError(TransformerException e) throws TransformerException { throw e; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private Document copyDoc(Document document) throws TransformerException { TransformerFactory tfactory = TransformerFactory.newInstance(); Transformer tx = tfactory.newTransformer(); DOMSource source = new DOMSource(document); DOMResult result = new DOMResult(); tx.transform(source,result); return (Document)result.getNode(); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public URIResolver asURIResolver() { return new URIResolver() { public Source resolve(String href, String base) throws TransformerException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, null, base, href); return (src == null) ? null : new SAXSource(src); } catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); } } }; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public Source resolve(String href, String base) throws TransformerException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, null, base, href); return (src == null) ? null : new SAXSource(src); } catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); } }
Explicit thrown (throw new...): 1070/1155
Explicit thrown ratio: 92.6%
Builder thrown ratio: 2.2%
Variable thrown ratio: 5.3%
         Checked  Runtime  Total
Domain        20      669    689
Lib           30      266    296
Total         50      935    985
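
The three throw-style ratios above classify throw sites by the expression after the throw keyword: a new expression (explicit), a call to a method that builds and returns the exception (builder), or a previously created exception held in a variable. A minimal sketch of the three styles, using hypothetical names that do not come from Solr:

    // Explicit throw: the exception is constructed at the throw site (92.6% of throws).
    void explicitStyle(String q) {
      if (q == null) {
        throw new IllegalArgumentException("query must not be null");
      }
    }

    // Builder throw: a helper method builds and returns the exception (2.2% of throws).
    static RuntimeException badRequest(String msg) {
      return new RuntimeException(msg);
    }
    void builderStyle(String q) {
      if (q == null) {
        throw badRequest("query must not be null");
      }
    }

    // Variable throw: an exception already held in a variable is (re)thrown (5.3% of throws).
    void variableStyle(Runnable task) {
      try {
        task.run();
      } catch (RuntimeException e) {
        // ... logging or cleanup ...
        throw e; // the thrown expression is a variable, not a new expression
      }
    }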

Caught Exceptions Summary

A (Domain) exception is defined in the application; a (Lib) exception is defined in the JDK or in a library. An exception can be caught, and the catch block may itself contain a throw (e.g. to wrap a low-level exception in a higher-level one). The code snippets from the application code are listed below each count.
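
A minimal sketch of the wrap-and-rethrow pattern that the "Caught with Thrown" column counts; the class, method, and variable names below are illustrative (only SolrException and its ErrorCode are real Solr types):

    import org.apache.solr.common.SolrException;

    class WrapExample {
      int parsePort(String rawValue) {
        try {
          return Integer.parseInt(rawValue);   // may throw the JDK's NumberFormatException
        } catch (NumberFormatException ex) {   // counted under "Caught (directly)"
          // counted under "Caught with Thrown": the low-level cause is wrapped
          // into the domain exception and rethrown
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Invalid numeric parameter: " + rawValue, ex);
        }
      }
    }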

Type  Exception  Caught (directly)  Caught with Thrown
(Lib) Exception — Caught (directly): 254
            
// in solrj/src/java/org/apache/zookeeper/SolrZooKeeper.java
catch (Exception e) { }
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
catch (Exception e) { SolrException.log(log, "Reconnect to ZooKeeper failed", e); log.info("Reconnect to ZooKeeper failed"); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (Exception e) { throw new RuntimeException("Problem pretty printing XML", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
catch (Exception e) { SolrException.log(log, "", e); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
catch(Exception ex) {}
// in solrj/src/java/org/apache/solr/common/params/CoreAdminParams.java
catch( Exception ex ) {}
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (Exception e) { throw new SolrServerException("Error executing query", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (Exception ex) { }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (Exception ex) {}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { //Expected. The server is still down. zombieServer.failedPings++; // If the server doesn't belong in the standard set belonging to this load balancer // then simply drop it after a certain number of failed pings. if (!zombieServer.standard && zombieServer.failedPings >= NONSTANDARD_PING_LIMIT) { zombieServers.remove(zombieServer.getKey()); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ){}
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { ex.printStackTrace(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) {}
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Could not instantiate object of " + clazz, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception ex) { // no getter -- don't worry about it... if (type == Boolean.class) { gname = "is" + setter.getName().substring(3); try { getter = setter.getDeclaringClass().getMethod(gname, (Class[]) null); } catch(Exception ex2) { // no getter -- don't worry about it... } } }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch(Exception ex2) { // no getter -- don't worry about it... }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while setting value : " + v + " on " + (field != null ? field : setter), e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + field, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + getter, e); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
catch( Exception ex ){}
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/SolrContentHandler.java
catch (Exception e) { // Let the specific fieldType handle errors // throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid value: " + val + " for field: " + schFld, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
catch (Exception e) { throw new FieldMappingException(e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Exception e) { throw ExceptionUtils.wrapAsRuntimeException(e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2StemmerFactory.java
catch (Exception e) { logger.warn("Could not instantiate snowball stemmer" + " for language: " + language.name() + ". Quality of clustering may be degraded.", e); return IdentityStemmer.INSTANCE; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception ignored) { // If we get the exception, the resource loader implementation // probably does not support getConfigDir(). Not a big problem. }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); }
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
catch (Exception e) { throw new RuntimeException("Couldn't load profile data, will return empty languages always!", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
catch (Exception e) { wrapAndThrow (SEVERE, e,"Unable to load Tika Config"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to read content"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (Exception e) { return null; }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Custom filter could not be created", e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (Exception e) { // io error or invalid rules throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Exception occurred while initializing context", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Invalid type for data source: " + type); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Failed to initialize DataSource: " + key.getDataSourceName()); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Error initializing XSL ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { if (LOG.isDebugEnabled()) LOG.debug("Skipping url : " + s, e); wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { /* Ignore */ }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { isEnd.set(true); return; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if(throwExp.get()) exp.set(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/RegexTransformer.java
catch (Exception e) { LOG.warn("Parsing failed for field : " + columnName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.ENTITY_EXCEPTION, null, e); DataImportHandlerException de = new DataImportHandlerException( DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, e); DataImportHandlerException de = new DataImportHandlerException(DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (Exception e) { wrapAndThrow(SEVERE,e, "Error invoking script for entity " + context.getEntityAttribute("name")); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE,e,"Unable to open File : "+f.getAbsolutePath()); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldStreamDataSource.java
catch (Exception e) { LOG.info("Unable to get data from BLOB"); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (Exception e) { LOG.error( "The query failed '" + q + "'", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to persist Index Start Time", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { log.warn("Unable to read: " + persistFilename); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to encode expression: " + expression + " with value: " + s); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to instantiate evaluator: " + map.get(CLASS)); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to execute query: " + query); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (Exception e) { logError("Exception while closing result set", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (Exception e) { LOG.error("Ignoring Error when closing connection", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Cache implementation:" + cacheImplName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { SolrException.log(log, "getNextFromCache() failed for query '" + query + "'", e); wrapAndThrow(DataImportHandlerException.WARN, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Exception e) { log.warn("Error creating document : " + d, e); return false; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Exception e) { }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.error("Unable to load Transformer: " + aTransArr, e); wrapAndThrow(SEVERE, e,"Unable to load Transformer: " + trans); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("method invocation failed on transformer : " + trans, e); throw new DataImportHandlerException(WARN, e);
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("transformer threw error", e); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } // onError = continue }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { if(ABORT.equals(onError)){ wrapAndThrow(SEVERE, e); } else { //SKIP is not really possible. If this calls the nextRow() again the Entityprocessor would be in an inconisttent state SolrException.log(log, "Exception in entity : "+ entityName, e); return null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (Exception e) { log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); wrapAndThrow (SEVERE, e, "Exception in invoking url " + url); return null;//unreachable }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
catch (Exception e) { LOG.info("Unable to get data from CLOB"); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
catch (Exception e) { LOG.info("Unable to get data from BLOB"); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE, e,"Unable to get reader from clob"); return null;//unreachable }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorBase.java
catch (Exception e) { SolrException.log(log, "getNext() failed for query '" + query + "'", e); query = null; rowIterator = null; wrapAndThrow(DataImportHandlerException.WARN, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Writer implementation:" + writerClassStr, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { wrapAndThrow(SEVERE, e); // unreachable statement return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to load class : " + className); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch(Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { LOG.error("Could not write property file", e); statusMessages.put("error", "Could not write property file. Delta imports will not work. " + "Make sure your conf directory is writable"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { wrapAndThrow (SEVERE,e, "Unable to load EntityProcessor implementation for entity:" + entity.getName()); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/PlainTextEntityProcessor.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Exception reading url : " + url); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
catch (Exception e) { // ignore analysis exceptions since we are applying arbitrary text to all fields termsToMatch = EMPTY_BYTES_SET; }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
catch (Exception e) { // ignore analysis exceptions since we are applying arbitrary text to all fields }
// in core/src/java/org/apache/solr/handler/RequestHandlerBase.java
catch (Exception e) { if (e instanceof SolrException) { SolrException se = (SolrException)e; if (se.code() == SolrException.ErrorCode.CONFLICT.code) { // TODO: should we allow this to be counted as an error (numErrors++)? } else { SolrException.log(SolrCore.log,e); } } else { SolrException.log(SolrCore.log,e); if (e instanceof ParseException) { e = new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } rsp.setException(e); numErrors++; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Exception in fetching index", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage()); return false; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Exception while updating statistics", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Could not restart core ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Unable to load index.properties"); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write index.properties", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) {/* noop */ LOG.error("Error closing the file stream: "+ this.saveAs ,e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Error deleting file in cleanup" + e.getMessage()); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (Exception e) { log.error("Exception while processing update request", e); break; }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception in finding checksum of " + f, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { SolrException.log(LOG, "SnapPull failed ", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception during creating a snapshot", e); rsp.add("exception", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception while invoking 'details' method for replication on master ", e); slave.add(ERR_STATUS, "invalid_master"); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.error("Exception while writing replication details: ", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception while reading " + SnapPuller.REPLICATION_PROPERTIES); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.error("Exception while snapshooting", e); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (Exception e) { SolrException.log(SolrCore.log, "Exception during debug", e); rsp.add("exception_during_debug", SolrException.toStr(e)); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error initializing QueryElevationComponent.", ex); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex); }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
catch (Exception e) { log.error( "Exception in building spell check index for spellchecker: " + checker.getDictionaryName(), e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( Exception ex ) {}
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( Exception ex ) { log.warn( "error writing term vector", ex ); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( Exception ex ) { log.warn( "error reading field: "+fieldName ); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unable to read original XML", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'reload' action", ex); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch( Exception ex ) {}
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch( Exception ex ) { // ignore - log.warn("Error executing command", ex); return "(error executing: " + cmd + ")"; }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch (Exception e) { log.warn("Error getting JMX properties", e); }
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch (Exception e) { SnapPuller.delTree(snapShotDir); LOG.error("Exception while creating snapshot", e); details.add("snapShootException", e.getMessage()); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch(Exception e) { this.dir = null; this.timestamp = null; }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { throw new InitializationException("Encoder " + name + " / " + clazz + " does not support " + MAX_CODE_LENGTH, e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e; throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t); }
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading synonyms", e); }
// in core/src/java/org/apache/solr/analysis/TrimFilterFactory.java
catch( Exception ex ) { throw new InitializationException("Error reading updateOffsets value. Must be true or false.", ex); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
catch (Exception e) { // TODO: getHyphenationTree really shouldn't throw "Exception" throw new InitializationException("Exception thrown while loading dictionary and hyphenation file", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (Exception e) { throw new InitializationException("Error instantiating stemmer for language " + language + "from class " + stemClass, e); }
// in core/src/java/org/apache/solr/analysis/JapaneseTokenizerFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( Exception ex ) { throw new InitializationException("invalid group argument: " + g); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( Exception ex ) { throw new SolrServerException( ex ); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
catch (Exception ex) { ex.printStackTrace(); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(Exception e) { // TODO should our parent interface throw (IO)Exception? throw new RuntimeException("getTransformer fails in getContentType",e); }
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unable to parse "+type+"="+val, ex ); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unknown Explain Style: "+str ); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception e) { LOG.warn("Error reading a field : " + o, e); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception e) { // There is a chance of the underlying field not really matching the // actual field type . So ,it can throw exception LOG.warn("Error reading a field from document : " + solrDoc, e); //if it happens log it and continue continue; }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Exception e) { //unlikely throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,e); }
// in core/src/java/org/apache/solr/servlet/cache/Method.java
catch (Exception e) { return OTHER; }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { ratesJsonStream = resourceLoader.openResource(ratesFileLocation); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch(Exception e) { // unexpected exception... throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Schema Parsing Failed: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (Exception e) { return DateUtil.parseDate(s); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error instansiating exhange rate provider "+exchangeRateProviderClass+". Please check your FieldType configuration", e); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (Exception e) { LOG.warn("Can't use the configured PreAnalyzedParser class '" + implName + "' (" + e.getMessage() + "), using default " + DEFAULT_IMPL); parser = new JsonPreAnalyzedParser(); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (Exception e) { e.printStackTrace(); return null; }
// in core/src/java/org/apache/solr/internal/csv/writer/CSVConfigGuesser.java
catch(Exception e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/internal/csv/writer/CSVConfigGuesser.java
catch(Exception e) { // ignore exception. }
// in core/src/java/org/apache/solr/internal/csv/writer/CSVWriter.java
catch(Exception e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (Exception e) { LOG.warn("Couldn't parse maxBasicQueries value " + mbqparam +", using default of 1000"); this.maxBasicQueries = DEFMAXBASICQUERIES; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
catch (Exception e) { // ignore failure and reparse later after escaping reserved chars up.exceptions = false; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
catch (Exception e) { // an exception here is due to the field query not being compatible with the input text // for example, passing a string to a numeric field. return null; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { // hang onto this in case the string isn't a full field name either qParserException = e; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { out.append("EXCEPTION(val="); out.append(val); out.append(")"); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { out.append("EXCEPTION(val="); out.append(val.utf8ToString()); out.append(")"); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch (Exception e) { if (++otherErrors<=10) { SolrCore.log.error( "Error loading external value source + fileName + " + e + (otherErrors<10 ? "" : "\tSkipping future errors for this file.") ); } continue; // go to next line in file.. leave values as default. }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch(Exception e){}
// in core/src/java/org/apache/solr/search/SolrCacheBase.java
catch (Exception e) { throw new RuntimeException("Can't parse autoWarm value: " + configValue, e); }
// in core/src/java/org/apache/solr/search/CacheConfig.java
catch (Exception e) { SolrException.log(SolrCache.log,"Error instantiating cache",e); // we can carry on without a cache... but should we? // in some cases (like an OOM) we probably should try to continue. return null; }
// in core/src/java/org/apache/solr/spelling/SpellCheckCollator.java
catch (Exception e) { LOG.warn("Exception trying to re-query to check if a spell check possibility would return any hits.", e); }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
catch (Exception e) { LOG.error("Error while building or storing Suggester data", e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + file, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { SolrException.log(log, "Sync Failed", e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { SolrException.log(log, "Sync Failed", e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { SolrException.log(log, "Error syncing replica to leader", e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { log.info("Could not tell a replica to recover", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (Exception e) { log.warn("Error processing state change", e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Exception e) { log.error("STARTING ZOOKEEPER", e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Exception e) { log.error("", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (Exception e) { log.warn("", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception e) { SolrException.log(log, "Failure to open existing log file (non fatal) " + f, e); f.delete(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception ex) { log.warn("Exception reverse reading log", ex); break; }
// in core/src/java/org/apache/solr/update/CommitTracker.java
catch (Exception e) { SolrException.log(log, "auto commit error...", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (Exception e) { log.info("Could not tell a replica to recover", e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (Exception e) { throw new SolrException(SERVER_ERROR, "Can't resolve typeClass: " + t, e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (Exception e) { clonedRequest.exception = e; if (e instanceof SolrException) { clonedRequest.rspCode = ((SolrException) e).code(); } else { clonedRequest.rspCode = -1; } }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,finish()", e); return false; }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrConfig.java
catch (Exception e) { log.warn( "Unrecognized value for lastModFrom: " + s, e); return BOGUS; }
// in core/src/java/org/apache/solr/core/SolrConfig.java
catch (Exception e) { log.warn( "Ignoring exception while attempting to " + "extract max-age from cacheControl config: " + cacheControlHeader, e); }
// in core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
catch (Exception e) { log.warn("Unmap not supported on this JVM, continuing on without setting unmap", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " +cast.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " + UpdateHandler.class.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error opening new searcher", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { if (e instanceof SolrException) throw (SolrException)e; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception ex) { SolrException e = new SolrException (SolrException.ErrorCode.SERVER_ERROR, "QueryResponseWriter init failure", ex); SolrException.log(log,null,e); throw e; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) { // if register fails, this is really bad - close the zkController to // minimize any damage we can cause zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
catch (Exception e) { sb.append(e); }
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
catch (Exception e) { log.warn("Exception while checking commit point's age for deletion", e); }
// in core/src/java/org/apache/solr/core/QuerySenderListener.java
catch (Exception e) { // do nothing... we want to continue with the other requests. // the failure should have already been logged. }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { // Release the reference server = null; throw new RuntimeException("Could not start JMX monitoring ", e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn( "Failed to register info bean: " + key, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to unregister info bean: " + key, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn("Could not getStatistics on info bean {}", infoBean.getName(), e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new AttributeNotFoundException(attribute); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn("Could not get attibute " + attribute); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (Exception e) { return null; }
// in core/src/java/org/apache/solr/util/VersionedFile.java
catch (Exception e) { // swallow exception for now }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type + (null != name ? (" \"" + name + "\"") : "") + ": " + ex.getMessage(), ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch( Exception ex ) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin Initializing failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
Caught with Thrown: 119
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (Exception e) { throw new RuntimeException("Problem pretty printing XML", e); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (Exception e) { throw new SolrServerException("Error executing query", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Could not instantiate object of " + clazz, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while setting value : " + v + " on " + (field != null ? field : setter), e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + field, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + getter, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
catch (Exception e) { throw new FieldMappingException(e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Exception e) { throw ExceptionUtils.wrapAsRuntimeException(e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); }
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
catch (Exception e) { throw new RuntimeException("Couldn't load profile data, will return empty languages always!", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Custom filter could not be created", e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (Exception e) { /* io error or invalid rules */ throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Exception occurred while initializing context", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Error initializing XSL ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.ENTITY_EXCEPTION, null, e); DataImportHandlerException de = new DataImportHandlerException( DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, e); DataImportHandlerException de = new DataImportHandlerException(DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (Exception e) { LOG.error( "The query failed '" + q + "'", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to persist Index Start Time", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Cache implementation:" + cacheImplName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("method invocation failed on transformer : " + trans, e); throw new DataImportHandlerException(WARN, e);
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Writer implementation:" + writerClassStr, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch(Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write index.properties", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error initializing QueryElevationComponent.", ex); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unable to read original XML", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'reload' action", ex); }
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { throw new InitializationException("Encoder " + name + " / " + clazz + " does not support " + MAX_CODE_LENGTH, e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e; throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t); }
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading synonyms", e); }
// in core/src/java/org/apache/solr/analysis/TrimFilterFactory.java
catch( Exception ex ) { throw new InitializationException("Error reading updateOffsets value. Must be true or false.", ex); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
catch (Exception e) { /* TODO: getHyphenationTree really shouldn't throw "Exception" */ throw new InitializationException("Exception thrown while loading dictionary and hyphenation file", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (Exception e) { throw new InitializationException("Error instantiating stemmer for language " + language + "from class " + stemClass, e); }
// in core/src/java/org/apache/solr/analysis/JapaneseTokenizerFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( Exception ex ) { throw new InitializationException("invalid group argument: " + g); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( Exception ex ) { throw new SolrServerException( ex ); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(Exception e) { /* TODO should our parent interface throw (IO)Exception? */ throw new RuntimeException("getTransformer fails in getContentType",e); }
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unable to parse "+type+"="+val, ex ); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unknown Explain Style: "+str ); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Exception e) { /* unlikely */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch(Exception e) { /* unexpected exception... */ throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Schema Parsing Failed: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error instansiating exhange rate provider "+exchangeRateProviderClass+". Please check your FieldType configuration", e); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/search/SolrCacheBase.java
catch (Exception e) { throw new RuntimeException("Can't parse autoWarm value: " + configValue, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + file, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (Exception e) { throw new SolrException(SERVER_ERROR, "Can't resolve typeClass: " + t, e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " +cast.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " + UpdateHandler.class.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error opening new searcher", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { if (e instanceof SolrException) throw (SolrException)e; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception ex) { SolrException e = new SolrException (SolrException.ErrorCode.SERVER_ERROR, "QueryResponseWriter init failure", ex); SolrException.log(log,null,e); throw e; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) { /* if register fails, this is really bad - close the zkController to minimize any damage we can cause */ zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { /* Release the reference */ server = null; throw new RuntimeException("Could not start JMX monitoring ", e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to unregister info bean: " + key, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new AttributeNotFoundException(attribute); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type + (null != name ? (" \"" + name + "\"") : "") + ": " + ex.getMessage(), ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch( Exception ex ) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin Initializing failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
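Nearly every rethrowing catch in the list above follows a single template: catch Exception, wrap it in a SolrException carrying an ErrorCode plus a context message, and rethrow. A minimal sketch of that template, assuming a hypothetical doParse helper; the ErrorCode values and the three-argument constructor are the ones that appear throughout this report.

    import org.apache.solr.common.SolrException;

    public class WrapTemplate {
      // Hypothetical parsing helper standing in for the many wrapped call sites.
      static int doParse(String val) { return Integer.parseInt(val); }

      public static int parseOrBadRequest(String val) {
        try {
          return doParse(val);
        } catch (Exception ex) {
          // BAD_REQUEST when the caller supplied bad input;
          // the list above uses SERVER_ERROR for internal failures.
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Unable to parse " + val, ex);
        }
      }
    }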
(Lib) IOException 170
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (IOException e) { throw new RuntimeException(e); /* should never happen w/o using real IO */ }
// in solrj/src/java/org/apache/solr/common/params/MapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/MultiMapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/ModifiableSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
catch (IOException e) { throw new RuntimeException("Unable to write xml into a stream", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException ex) { throw new SolrServerException("error reading streams", ex); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException e) { throw new SolrServerException( "IOException occured when talking to server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { log.warn("Could not read Solr resource " + resourceName); return new IResource[] {}; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/StempelPolishStemFilterFactory.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ClobTransformer.java
catch (IOException e) { DataImportHandlerException.wrapAndThrow(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
catch (IOException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Problem reading from input", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (IOException e) { propOutput = null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (IOException e) { propInput = null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinContentStreamDataSource.java
catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinContentStreamDataSource.java
catch (IOException e) { /*no op*/ }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to call finish() on UpdateRequestProcessor", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { log.error("Exception while deleteing: " + id, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { log.error("Exception while deleting by query: " + query, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in full dump while deleting all documents.", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/HTMLStripTransformer.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Failed stripping HTML for column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContentStreamDataSource.java
catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContentStreamDataSource.java
catch (IOException e) { }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/PlainTextEntityProcessor.java
catch (IOException e) { IOUtils.closeQuietly(r); wrapAndThrow(SEVERE, e, "Exception reading url : " + url); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (IOException e) { LOG.error("Unable to copy index file from: " + indexFileInTmpDir + " to: " + indexFileInIndex , e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (IOException e) { fsyncException = e; }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback"); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,e); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { /* Catch the exception and rethrow it with more line information */ input_err("can't read line: " + line, null, line, e); }
// in core/src/java/org/apache/solr/handler/FieldAnalysisRequestHandler.java
catch (IOException e) { /* do nothing, leave value set to the request parameter */ }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Exception while reading files for commit " + c, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { rsp.add("status", "unable to get file names for given index generation"); rsp.add("exception", e); LOG.warn("Unable to get file names for indexCommit generation: " + gen, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Unable to get index version : ", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Unable to get IndexCommit on startup", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Exception while writing response for params: " + params, e); }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
catch (IOException e) { log.error( "Exception in reloading spell check index for spellchecker: " + checker.getDictionaryName(), e); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch (IOException e) { LOG.error("Unable to release snapshoot lock: " + directoryName + ".lock"); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/ElisionFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading articles", e); }
// in core/src/java/org/apache/solr/analysis/StemmerOverrideFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/KeywordMarkerFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/KeepWordFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading words", e); }
// in core/src/java/org/apache/solr/analysis/JapanesePartOfSpeechStopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading tags", e); }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
catch( IOException e ){ throw new InitializationException("IOException thrown while loading mappings", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( IOException ex ) { throw new InitializationException("IOException thrown creating PatternTokenizer instance", ex); }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException while loading types", e); }
// in core/src/java/org/apache/solr/analysis/TypeTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading types", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsQueryFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/analysis/StopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading stopwords", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( IOException iox ) { throw iox; }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "exception at docid " + docid + " for valuesource " + valueSource, e); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch (IOException e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IOException e) { throw new RuntimeException("failed to open field cache for: " + f, e); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (IOException e) { /* we're pretty freaking screwed if this happens */ throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (IOException e) { return null; }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { /* ignore */ }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { /* ignore */ }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { /* safe to ignore, because we know the number of tokens */ }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { /* safe to ignore, because we know the number of tokens */ }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { /* safe to ignore, because we know the number of tokens */ }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { /* safe to ignore, because we know the number of tokens */ }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { /* io error */ throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
catch (IOException e) { /* should not happen with StringWriter */ }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
catch (IOException e) { /* TODO: remove this if ConstantScoreQuery.createWeight adds IOException */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException ioe) { throw ioe; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/TopGroupsShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/SearchGroupShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch (IOException e) { /* log, use defaults */ SolrCore.log.error("Error opening external value source file: " +e); return vals; }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch (IOException e) { /* log, use defaults */ SolrCore.log.error("Error loading external value source: " +e); }
// in core/src/java/org/apache/solr/spelling/SpellingQueryConverter.java
catch (IOException e) { /* TODO: shouldn't we log something? */ }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
catch (IOException e) { LOG.warn("Loading stored lookup data failed", e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { log.error( "Unable to load spellings", e); }
// in core/src/java/org/apache/solr/spelling/SuggestQueryConverter.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (IOException e) { log.warn("", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { log.error("Error inspecting tlog " + ll); ll.decref(); continue; }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { /* failure to read a log record isn't fatal */ log.error("Exception reading versions from log",e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { SolrException.log(log,"Error attempting to roll back log", e); return false; }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException ex) { recoveryInfo.errors++; loglog.warn("REYPLAY_ERR: IOException reading log", ex); // could be caused by an incomplete flush if recovering from log }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: final commit.", ex); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: finish()", ex); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/HTMLStripFieldUpdateProcessorFactory.java
catch (IOException e) { /* we tried and failed */ return s; }
// in core/src/java/org/apache/solr/update/VersionInfo.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error reading version from index", e); }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (IOException e) { /* TODO: should this be handled separately as a problem with us? I guess it probably already will by causing replication to be kicked off. */ sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { /* TODO: reset our file pointer back to "pos", the start of this record. */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error logging add", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/RunExecutableListener.java
catch (IOException e) { /* don't throw exception, just log it... */ SolrException.log(log,e); ret = INVALID_PROCESS_RETURN_CODE; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { /*no op*/ }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { SolrException.log(log,null,e); return null; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) { log.warn("Error loading properties ",e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.IOException xio) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xio); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) { xforward = xio; }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) {}
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) {}
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) {}
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) {}
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("Can't open/read file: " + file); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("IOException while closing file: "+ e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("An error occured posting data to "+url+". Please check that Solr is running."); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("Connection error (is Solr running at " + solrUrl + " ?): " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("IOException while posting data: " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException x) { /*NOOP*/ }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("IOException while reading response: " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException x) { /*NOOP*/ }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException e) {}
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException e) {}
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException ioe) { if (exc == null) exc = ioe; try { /* Pause 5 msec */ Thread.sleep(5); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); } }
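For checked IOExceptions, the dominant move in the list above is conversion to an unchecked exception (RuntimeException, or SolrException with SERVER_ERROR) so that the checked type does not spread through method signatures. A minimal sketch of that conversion, assuming a hypothetical readAll helper:

    import java.io.IOException;
    import java.io.Reader;

    public class UncheckedIo {
      // Hypothetical helper; the point is the checked-to-unchecked conversion.
      public static String readAll(Reader r) {
        try {
          StringBuilder sb = new StringBuilder();
          int c;
          while ((c = r.read()) != -1) sb.append((char) c);
          return sb.toString();
        } catch (IOException e) {
          // Convert checked IOException to unchecked so callers need no throws clause.
          throw new RuntimeException(e);
        }
      }
    }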
103 catch blocks with throw:
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (IOException e) { throw new RuntimeException(e); /* should never happen w/o using real IO */ }
// in solrj/src/java/org/apache/solr/common/params/MapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/MultiMapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/ModifiableSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
catch (IOException e) { throw new RuntimeException("Unable to write xml into a stream", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException ex) { throw new SolrServerException("error reading streams", ex); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException e) { throw new SolrServerException( "IOException occured when talking to server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/StempelPolishStemFilterFactory.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
catch (IOException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Problem reading from input", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to call finish() on UpdateRequestProcessor", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in full dump while deleting all documents.", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/HTMLStripTransformer.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Failed stripping HTML for column: " + column, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback"); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,e); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/ElisionFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading articles", e); }
// in core/src/java/org/apache/solr/analysis/StemmerOverrideFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/KeywordMarkerFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/KeepWordFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading words", e); }
// in core/src/java/org/apache/solr/analysis/JapanesePartOfSpeechStopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading tags", e); }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
catch( IOException e ){ throw new InitializationException("IOException thrown while loading mappings", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( IOException ex ) { throw new InitializationException("IOException thrown creating PatternTokenizer instance", ex); }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException while loading types", e); }
// in core/src/java/org/apache/solr/analysis/TypeTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading types", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsQueryFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/analysis/StopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading stopwords", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( IOException iox ) { throw iox; }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "exception at docid " + docid + " for valuesource " + valueSource, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IOException e) { throw new RuntimeException("failed to open field cache for: " + f, e); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (IOException e) { /* we're pretty freaking screwed if this happens */ throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { /* io error */ throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
catch (IOException e) { /* TODO: remove this if ConstantScoreQuery.createWeight adds IOException */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException ioe) { throw ioe; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/TopGroupsShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/SearchGroupShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/SuggestQueryConverter.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/VersionInfo.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error reading version from index", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { /* TODO: reset our file pointer back to "pos", the start of this record. */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error logging add", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.IOException xio) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xio); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); }
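The dominant idiom in the IOException list above is wrap-and-rethrow: the checked IOException is converted into an unchecked exception, either a plain RuntimeException or a SolrException carrying an ErrorCode, with the original exception kept as the cause. A minimal sketch of that idiom, assuming the SolrException(ErrorCode, String, Throwable) constructor used in the snippets; the TransactionLogWriter class and writeRecord method are hypothetical:

    import java.io.IOException;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    public class TransactionLogWriter {
      // Hypothetical write path: the checked IOException never escapes as such.
      public void append(byte[] record) {
        try {
          writeRecord(record); // may throw IOException
        } catch (IOException e) {
          // Wrap-and-rethrow: keep the cause so the original stack trace survives.
          throw new SolrException(ErrorCode.SERVER_ERROR, "Error logging add", e);
        }
      }

      private void writeRecord(byte[] record) throws IOException {
        // ... actual I/O elided ...
      }
    }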
(Lib) InterruptedException 70
            
// in solrj/src/java/org/apache/zookeeper/SolrZooKeeper.java
catch (InterruptedException e) {}
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); return; }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); return; }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { e.printStackTrace(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (InterruptedException e) { return; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (InterruptedException e) { LOG.debug("Caught InterruptedException while waiting for row. Aborting."); isEnd.set(true); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { SolrException.log(LOG,e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); writeError(503, "Could not connect to zookeeper at '" + addr + "'\""); zkClient = null; return; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { /* ignore exception on close */ }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("InterruptedException", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (InterruptedException e) { Thread.interrupted(); logger.warn("Shard leader watch triggered but Solr cannot talk to zk."); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); return; }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); return; }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Could not talk to ZK", e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Interrupted"); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e1) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e1) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (InterruptedException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (InterruptedException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); break; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); return false; }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (InterruptedException e) { }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (InterruptedException e) { }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "interrupted waiting for shard update response", e); }
// in core/src/java/org/apache/solr/core/RunExecutableListener.java
catch (InterruptedException e) { SolrException.log(log,e); ret = INVALID_PROCESS_RETURN_CODE; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { searcherExecutor.shutdownNow(); try { if (!searcherExecutor.awaitTermination(30, TimeUnit.SECONDS)) { log.error("Timeout waiting for searchExecutor to terminate"); } } catch (InterruptedException e2) { SolrException.log(log, e2); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e2) { SolrException.log(log, e2); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { log.info(SolrException.toStr(e)); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/util/ConcurrentLRUCache.java
catch (InterruptedException e) {}
// in core/src/java/org/apache/solr/util/ConcurrentLFUCache.java
catch (InterruptedException e) { }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException ioe) { if (exc == null) exc = ioe; try { /* Pause 5 msec */ Thread.sleep(5); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); } }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (InterruptedException ie) { Thread.currentThread().interrupt(); }
Number of Catch with Throw: 26
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Interrupted"); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "interrupted waiting for shard update response", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
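Nearly every InterruptedException handler above follows the same two-step idiom: re-assert the thread's interrupt flag via Thread.currentThread().interrupt() (catching the exception clears it), then either bail out or wrap and rethrow. A self-contained sketch of that idiom using only the JDK; the class and method names are hypothetical:

    import java.util.concurrent.BlockingQueue;

    public final class InterruptIdiom {
      private InterruptIdiom() {}

      // Block until an element is available; on interrupt, restore the flag
      // and surface the failure as an unchecked exception, as the catch
      // blocks above do with ZooKeeperException and SolrException.
      public static <T> T takeOrFail(BlockingQueue<T> queue) {
        try {
          return queue.take();
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt(); // restore the interrupted status
          throw new RuntimeException("interrupted while waiting", e);
        }
      }
    }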
(Lib) Throwable 58
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (Throwable e) { handleError(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (Throwable t) {}
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Throwable e) { logger.warn("Could not instantiate Smart Chinese Analyzer, clustering quality " + "of Chinese content may be degraded. For best quality clusters, " + "make sure Lucene's Smart Chinese Analyzer JAR is in the classpath"); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Throwable e) { return new ExtendedWhitespaceTokenizer(); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2StemmerFactory.java
catch (Throwable e) { return IdentityStemmer.INSTANCE; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Throwable t) { SolrException.log(LOG, "Full Import failed", t); docBuilder.rollback(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Throwable t) { LOG.error("Delta Import Failed", t); docBuilder.rollback(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Throwable t) { log.error("Exception while solr commit.", t); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Throwable t) { log.error("Exception while solr rollback.", t); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (Throwable e) { log.warn( "Could not read DIH properties from " + path + " :" + e.getClass(), e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (Throwable e) { LOG.error( DataImporter.MSG.LOAD_EXP, e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, DataImporter.MSG.INVALID_CONFIG, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (Throwable th) { srsp.setException(th); if (th instanceof SolrException) { srsp.setResponseCode(((SolrException)th).code()); } else { srsp.setResponseCode(-1); } }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch( Throwable ex ) { ex.printStackTrace(); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch( Throwable th ) { ex = th; }
// in core/src/java/org/apache/solr/SolrLogFormatter.java
catch (Throwable th) { /* logging swallows exceptions, so if we hit an exception we need to convert it to a string to see it */ return "ERROR IN SolrLogFormatter! original message:" + record.getMessage() + "\n\tException: " + SolrException.toStr(th); }
// in core/src/java/org/apache/solr/SolrLogFormatter.java
catch (Throwable e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/request/SolrRequestInfo.java
catch (Throwable throwable) { SolrException.log(SolrCore.log, "Exception during close hook", throwable); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) { /* catch this so our filter still works */ log.error( "Could not start Solr. Check solr/home property and the logs"); SolrCore.log( t ); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Throwable ex) { sendError( core, solrReq, request, (HttpServletResponse)response, ex ); return; }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) { /* This error really does not matter */ SimpleOrderedMap info = new SimpleOrderedMap(); int code=getErrorInfo(ex, info); response.sendError( code, info.toString() ); }
// in core/src/java/org/apache/solr/search/LFUCache.java
catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); }
// in core/src/java/org/apache/solr/search/LRUCache.java
catch (Throwable e) { SolrException.log(log,"Error during auto-warming of key:" + keys[i], e); }
// in core/src/java/org/apache/solr/search/FastLRUCache.java
catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Throwable e) { log.error("ZooKeeper Server ERROR", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Throwable t) { log.error("Error while trying to recover", t); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Throwable t) { SolrException.log(log, "", t); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Throwable t) { log.error("Error while trying to recover.", t); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable e) { SolrException.log(log, "Error opening realtime searcher for deleteByQuery", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable e) { recoveryInfo.errors++; SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable ex) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: Exception replaying log", ex); /* something wrong with the request? */ }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (Throwable th) { log.error("Error in final commit", th); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (Throwable th) { log.error("Error closing log files", th); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (Throwable t) { log.error("Error during shutdown of writer.", t); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (Throwable t) { log.error("Error during shutdown of directory factory.", t); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (Throwable t) { log.error("Error cancelling recovery", t); }
// in core/src/java/org/apache/solr/core/Config.java
catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { latch.countDown(); /* release the latch, otherwise we block trying to do the close. This should be fine, since counting down on a latch of 0 is still fine */ /* close down the searcher and any other resources, if it exists, as this is not recoverable */ close(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, null, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,null,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,null,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { /* do not allow decref() operations to fail since they are typically called in finally blocks and throwing another exception would be very unexpected. */ SolrException.log(log, "Error closing searcher:", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { /* an exception in register() shouldn't be fatal. */ log(e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch(Throwable ex) { log.warn("Unable to read SLF4J version. LogWatcher will be disabled: "+ex); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable e) { log.warn("Unable to load LogWatcher", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable e) { SolrException.log(log,null,e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable ex) { SolrException.log(log,null,ex); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable t) { SolrException.log(log, "Error shutting down core", t); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable t) { SolrException.log(log, "Error canceling recovery for core", t); }
Number of Catch with Throw: 5
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (Throwable e) { LOG.error( DataImporter.MSG.LOAD_EXP, e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, DataImporter.MSG.INVALID_CONFIG, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Throwable e) { log.error("ZooKeeper Server ERROR", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { latch.countDown(); /* release the latch, otherwise we block trying to do the close. This should be fine, since counting down on a latch of 0 is still fine */ /* close down the searcher and any other resources, if it exists, as this is not recoverable */ close(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, null, e); }
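Almost all of the catch (Throwable) blocks above sit at loop or shutdown boundaries where one bad element must not abort the rest of the work, so they log and continue rather than rethrow. A minimal sketch of that log-and-continue pattern, using java.util.logging in place of Solr's SolrException.log helper; the CacheWarmer class is hypothetical:

    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class CacheWarmer {
      private static final Logger log = Logger.getLogger(CacheWarmer.class.getName());

      // Run every regenerator; a failure in one entry (even an Error) is
      // logged and must not stop the remaining entries from being warmed.
      public void warm(Iterable<Runnable> regenerators) {
        for (Runnable regen : regenerators) {
          try {
            regen.run();
          } catch (Throwable t) {
            log.log(Level.SEVERE, "Error during auto-warming", t);
          }
        }
      }
    }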
(Lib) KeeperException 34
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("Keeper Exception", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (KeeperException ke) { /* check if we lost connection or the node was gone */ if (ke.code() != Code.CONNECTIONLOSS && ke.code() != Code.SESSIONEXPIRED && ke.code() != Code.NONODE) { throw ke; } }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (KeeperException e) { logger.warn("Shard leader watch triggered but Solr cannot talk to zk."); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == Code.CONNECTIONLOSS || e.code() == Code.SESSIONEXPIRED) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (KeeperException e) { log.warn("Could not talk to ZK", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException.NoNodeException e) { /* fine if there is nothing to delete */ /* TODO: annoying that ZK logs a warning on us */ nodeDeleted = false; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { /* its okay if the node already exists */ if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { /* its okay if the node already exists */ if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { /* its okay if another beats us creating the node */ if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException e) { /* we couldn't set our watch - the node before us may already be down? we need to check if we are the leader again */ checkIfIamLeader(seq, context, true); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) { /* we must have failed in creating the election node - someone else must be working on it, lets try again */ if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
Number of Catch with Throw: 25
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (KeeperException ke) { /* check if we lost connection or the node was gone */ if (ke.code() != Code.CONNECTIONLOSS && ke.code() != Code.SESSIONEXPIRED && ke.code() != Code.NONODE) { throw ke; } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) {
  // its okay if the node already exists
  if (e.code() != KeeperException.Code.NODEEXISTS) {
    throw e;
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) {
  // its okay if the node already exists
  if (e.code() != KeeperException.Code.NODEEXISTS) {
    throw e;
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) {
  // its okay if another beats us creating the node
  if (e.code() != KeeperException.Code.NODEEXISTS) {
    throw e;
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) {
  // we must have failed in creating the election node - someone else must
  // be working on it, lets try again
  if (tries++ > 9) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
  cont = true;
  Thread.sleep(50);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
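The recurring shape of these KeeperException handlers can be condensed as follows. This is an illustrative sketch, not code taken from Solr; the ZkRead interface is a hypothetical stand-in for whatever ZooKeeper read the watcher performs. Session expiry and connection loss are tolerated (a fresh session will re-register the watch), while anything else is surfaced as the domain ZooKeeperException with SERVER_ERROR:

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.cloud.ZooKeeperException;
    import org.apache.zookeeper.KeeperException;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    class ZkWatchSketch {
      private static final Logger log = LoggerFactory.getLogger(ZkWatchSketch.class);

      // ZkRead is a hypothetical stand-in for the actual ZooKeeper read.
      interface ZkRead { void run() throws KeeperException; }

      void process(ZkRead read) {
        try {
          read.run();
        } catch (KeeperException e) {
          if (e.code() == KeeperException.Code.SESSIONEXPIRED
              || e.code() == KeeperException.Code.CONNECTIONLOSS) {
            // transient: a new session will re-register the watch
            log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
            return;
          }
          // unexpected: surface as a domain exception with a 500 code
          throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
        }
      }
    }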
(Lib) NumberFormatException 31
            
// in solrj/src/java/org/apache/solr/client/solrj/response/QueryResponse.java
catch (NumberFormatException e) {
  // Ignore for non-number responses which are already handled above
}
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (NumberFormatException e) {
  // do nothing
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (NumberFormatException e) { if (vr.resolve(ss[i]) == null) { wrapAndThrow( SEVERE, e, "Invalid number :" + ss[i] + "in parameters " + expression); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid batch size: " + bsz); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid connection timeout: " + cTimeout); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid read timeout: " + rTimeout); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid connection timeout: " + cTimeout); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid read timeout: " + rTimeout); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (NumberFormatException e) {/*no op*/ }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
catch (NumberFormatException e) { return MISMATCH; }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (NumberFormatException e) { log.warn("Couldn't parse maxChars attribute for copyField from " + source + " to " + dest + " as integer. The whole field will be copied."); }
// in core/src/java/org/apache/solr/schema/DoubleField.java
catch (NumberFormatException e) {
  // can't parse - write out the contents as a string so nothing is lost and
  // clients don't get a parse error.
  writer.writeStr(name, s, true);
}
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/IntField.java
catch (NumberFormatException e) {
  // can't parse - write out the contents as a string so nothing is lost and
  // clients don't get a parse error.
  writer.writeStr(name, s, true);
}
// in core/src/java/org/apache/solr/schema/FloatField.java
catch (NumberFormatException e) {
  // can't parse - write out the contents as a string so nothing is lost and
  // clients don't get a parse error.
  writer.writeStr(name, s, true);
}
// in core/src/java/org/apache/solr/schema/ByteField.java
catch (NumberFormatException e) {
  // can't parse - write out the contents as a string so nothing is lost and
  // clients don't get a parse error.
  writer.writeStr(name, s, true);
}
// in core/src/java/org/apache/solr/schema/LongField.java
catch (NumberFormatException e) {
  // can't parse - write out the contents as a string so nothing is lost and
  // clients don't get a parse error.
  writer.writeStr(name, s, true);
}
// in core/src/java/org/apache/solr/schema/ShortField.java
catch (NumberFormatException e) {
  // can't parse - write out the contents as a string so nothing is lost and
  // clients don't get a parse error.
  writer.writeStr(name, s, true);
}
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + OFFSET_START_KEY + " attribute, skipped: '" + obj + "'"); hasOffsetStart = false; }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + OFFSET_END_KEY + " attribute, skipped: '" + obj + "'"); hasOffsetEnd = false; }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + POSINCR_KEY + " attribute, skipped: '" + obj + "'"); }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + FLAGS_KEY + " attribute, skipped: '" + e.getValue() + "'"); }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (NumberFormatException e) { throw new RuntimeException( "Unparseable accuracy given for dictionary: " + name, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); }
// in core/src/java/org/apache/solr/util/DOMUtil.java
catch (NumberFormatException nfe) { throw new SolrException (SolrException.ErrorCode.SERVER_ERROR, "Value " + (null != name ? ("of '" +name+ "' ") : "") + "can not be parsed as '" +type+ "': \"" + textValue + "\"", nfe); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); }
Catch with Throw: 10
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (NumberFormatException e) { throw new RuntimeException( "Unparseable accuracy given for dictionary: " + name, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); }
// in core/src/java/org/apache/solr/util/DOMUtil.java
catch (NumberFormatException nfe) { throw new SolrException (SolrException.ErrorCode.SERVER_ERROR, "Value " + (null != name ? ("of '" +name+ "' ") : "") + "can not be parsed as '" +type+ "': \"" + textValue + "\"", nfe); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); }
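Two patterns dominate the NumberFormatException list above: leniently logging or ignoring the bad number, and remapping it to a client-facing error. The remapping case reduces to the following sketch (illustrative only; parseRows and the "rows" parameter are hypothetical names, not Solr code):

    import org.apache.solr.common.SolrException;

    class NumericParamSketch {
      static int parseRows(String raw) {
        try {
          return Integer.parseInt(raw);
        } catch (NumberFormatException e) {
          // the client sent the bad value, so report 400 rather than 500
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Invalid number: " + raw, e);
        }
      }
    }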
(Domain) ParseException 28
            
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
catch (ParseException pe) {
  // ignore this exception, we will try the next format
}
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (ParseException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid value for fetchMailSince: " + s, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (ParseException exp) { wrapAndThrow(SEVERE, exp, "Invalid expression for date"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (ParseException e) { wrapAndThrow(SEVERE, e, "Invalid expression for date"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DateFormatTransformer.java
catch (ParseException e) { LOG.warn("Could not parse a Date field ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (ParseException e) { throw new SolrException(ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch( ParseException ex ) { throw new RuntimeException( ex ); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (ParseException e) {
  // invalid rules
  throw new RuntimeException(e);
}
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) {
  // try again, simple rules for a field name with no whitespace
  sp.pos = start;
  field = sp.getSimpleString();
  if (req.getSchema().getFieldOrNull(field) != null) {
    // OK, it was an oddly named field
    fields.add(field);
    if (key != null) {
      rename.add(field, key);
    }
  } else {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e);
  }
}
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname", e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); }
Catch with Throw: 24
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (ParseException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid value for fetchMailSince: " + s, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (ParseException e) { throw new SolrException(ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch( ParseException ex ) { throw new RuntimeException( ex ); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (ParseException e) {
  // invalid rules
  throw new RuntimeException(e);
}
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) {
  // try again, simple rules for a field name with no whitespace
  sp.pos = start;
  field = sp.getSimpleString();
  if (req.getSchema().getFieldOrNull(field) != null) {
    // OK, it was an oddly named field
    fields.add(field);
    if (key != null) {
      rename.add(field, key);
    }
  } else {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e);
  }
}
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname", e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); }
(Lib) IllegalArgumentException 20
            
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of for range 'include' information",e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (IllegalArgumentException ex) {
  // Other implementations will likely throw this exception since "reuse-instance"
  // is implementation specific.
  log.debug("Unable to set the 'reuse-instance' property for the input factory: " + factory);
}
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
catch (IllegalArgumentException ex) {
  // Other implementations will likely throw this exception since "reuse-instance"
  // is implementation specific.
  log.debug("Unable to set the 'reuse-instance' property for the input factory: " + inputFactory);
}
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (IllegalArgumentException ex) {
  // Other implementations will likely throw this exception since "reuse-instance"
  // is implementation specific.
  log.debug("Unable to set the 'reuse-instance' property for the input chain: " + inputFactory);
}
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + flagParam + "'"); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (IllegalArgumentException e) {
  // path doesn't exist (must have been removed)
  writeKeyValue(json, "warning", "(path gone)", false);
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (IllegalArgumentException e) {
  // path doesn't exist (must have been removed)
  json.writeString("(children gone)");
}
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (IllegalArgumentException iae) {
  // one of our date headers was not formated properly, ignore it
  /* NOOP */
}
// in core/src/java/org/apache/solr/schema/TrieField.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
catch (IllegalArgumentException e) {
  // No problem. But we can't use TermOffsets optimization.
}
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Illegal value for " + DISTRIB_UPDATE_PARAM + ": " + param, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
Catch with Throw: 13
            
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of for range 'include' information",e); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); }
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + flagParam + "'"); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); }
// in core/src/java/org/apache/solr/schema/TrieField.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Illegal value for " + DISTRIB_UPDATE_PARAM + ": " + param, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
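Several of the catches above (TermsComponent, PingRequestHandler, Config) guard an Enum.valueOf-style lookup: Java signals an unknown constant with IllegalArgumentException, and the handler remaps it to a 400. A condensed sketch (illustrative; the Action enum and parseAction are hypothetical stand-ins for a handler's action set):

    import java.util.Locale;
    import org.apache.solr.common.SolrException;

    class EnumParamSketch {
      enum Action { STATUS, PING }

      Action parseAction(String actionParam) {
        try {
          return Action.valueOf(actionParam.toUpperCase(Locale.ROOT));
        } catch (IllegalArgumentException iae) {
          // Enum.valueOf reports an unknown constant this way; map it to a 400
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Unknown action: " + actionParam);
        }
      }
    }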
(Domain) SolrException 18
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrException s){ throw s; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // we retry on 404 or 403 or 503 - you can see this on solr shutdown
  if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
    ex = addZombie(server, e);
  } else {
    // Server is alive but the request was likely malformed or invalid
    throw e;
  }
  // TODO: consider using below above - currently does cause a problem with distrib updates:
  // seems to match up against a failed forward to leader exception as well...
  // || e.getMessage().contains("java.net.SocketException")
  // || e.getMessage().contains("java.net.ConnectException")
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // we retry on 404 or 403 or 503 - you can see this on solr shutdown
  if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
    ex = e; // already a zombie, no need to re-add
  } else {
    // Server is alive but the request was malformed or invalid
    zombieServers.remove(wrapper.getKey());
    throw e;
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // Server is alive but the request was malformed or invalid
  throw e;
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // Server is alive but the request was malformed or invalid
  throw e;
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
catch (SolrException e) { log.warn("Exception reading log for updates", e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( SolrException sx ) { throw sx; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( SolrException ex ) { throw ex; }
// in core/src/java/org/apache/solr/core/Config.java
catch( SolrException e ){ SolrException.log(log,"Error in "+name,e); throw e; }
// in core/src/java/org/apache/solr/core/Config.java
catch (SolrException e) { throw(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (SolrException e) { sortE = e; }
Catch with Throw: 16
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrException s){ throw s; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // we retry on 404 or 403 or 503 - you can see this on solr shutdown
  if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
    ex = addZombie(server, e);
  } else {
    // Server is alive but the request was likely malformed or invalid
    throw e;
  }
  // TODO: consider using below above - currently does cause a problem with distrib updates:
  // seems to match up against a failed forward to leader exception as well...
  // || e.getMessage().contains("java.net.SocketException")
  // || e.getMessage().contains("java.net.ConnectException")
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // we retry on 404 or 403 or 503 - you can see this on solr shutdown
  if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
    ex = e; // already a zombie, no need to re-add
  } else {
    // Server is alive but the request was malformed or invalid
    zombieServers.remove(wrapper.getKey());
    throw e;
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // Server is alive but the request was malformed or invalid
  throw e;
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // Server is alive but the request was malformed or invalid
  throw e;
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( SolrException sx ) { throw sx; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( SolrException ex ) { throw ex; }
// in core/src/java/org/apache/solr/core/Config.java
catch( SolrException e ){ SolrException.log(log,"Error in "+name,e); throw e; }
// in core/src/java/org/apache/solr/core/Config.java
catch (SolrException e) { throw(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
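Most of the entries above are bare catch-and-rethrows. The point of the pattern shows when a broader handler sits underneath: SolrException already carries an ErrorCode and message, so it must pass through untouched while other failures get wrapped. A minimal sketch (illustrative; execute and body are hypothetical names, not Solr's API):

    import org.apache.solr.common.SolrException;

    class PassThroughSketch {
      void execute(Runnable body) {
        try {
          body.run();
        } catch (SolrException e) {
          // already carries an ErrorCode; re-wrapping would lose the 4xx/5xx code
          throw e;
        } catch (Exception e) {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
        }
      }
    }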
(Lib) MalformedURLException 13
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (MalformedURLException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in core/src/java/org/apache/solr/handler/StandardRequestHandler.java
catch( MalformedURLException ex ) { return null; }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch( MalformedURLException ex ) { return null; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (MalformedURLException e) {
  // should be impossible since we're not passing any URLs here
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
}
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( MalformedURLException ex ) { return null; }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (MalformedURLException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (MalformedURLException e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (MalformedURLException e) { SolrException.log(log, "Can't add element to classloader: " + files[j], e); }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (MalformedURLException e) { throw new RuntimeException ("WTF, how can JarFile.toURL() be malformed?", e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (MalformedURLException e) { fatal("System Property 'url' is not a valid URL: " + u); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (MalformedURLException e) { fatal("The specified URL "+url+" is not a valid URL. Please check"); }
Catch with Throw: 5
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (MalformedURLException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (MalformedURLException e) {
  // should be impossible since we're not passing any URLs here
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
}
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (MalformedURLException e) { throw new RuntimeException ("WTF, how can JarFile.toURL() be malformed?", e); }
(Lib) InvalidShapeException 11
            
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); }
Catch with Throw: 11
            
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); }
(Lib) RuntimeException 10
            
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (RuntimeException e) { log.debug("Resource not found in Solr's config: " + resourceName + ". Using the default " + resource + " from Carrot JAR."); return new IResource[] {}; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (RuntimeException e) { throw new DataImportHandlerException(SEVERE, "Exception while reading xpaths for fields", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (RuntimeException e) { rsp.add("exception", DebugLogger.getStacktraceString(e)); importer = null; return; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (RuntimeException e) { LOG.error( "Exception while adding: " + document, e); return false; }
// in core/src/java/org/apache/solr/schema/FieldType.java
catch (RuntimeException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error while creating field '" + field + "' from value '" + value + "'", e); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch (RuntimeException e) {
  // getField() throws a SolrException, but it arrives as a RuntimeException
  log.warn("Field \"" + fieldName + "\" found in index, but not defined in schema.");
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch( RuntimeException ex ) { log.warn("Odd RuntimeException while testing for JNDI: " + ex.getMessage()); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (RuntimeException e) { LOG.warn( "Failed to unregister info bean: " + key, e); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (RuntimeException re) {
  // unfortunately XInclude fallback only works with IOException, but openResource() never throws that one
  throw (IOException) (new IOException(re.getMessage()).initCause(re));
}
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch(RuntimeException e) { e.printStackTrace(); fatal("RuntimeException " + e); }
Catch with Throw: 3
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (RuntimeException e) { throw new DataImportHandlerException(SEVERE, "Exception while reading xpaths for fields", e); }
// in core/src/java/org/apache/solr/schema/FieldType.java
catch (RuntimeException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error while creating field '" + field + "' from value '" + value + "'", e); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (RuntimeException re) {
  // unfortunately XInclude fallback only works with IOException, but openResource() never throws that one
  throw (IOException) (new IOException(re.getMessage()).initCause(re));
}
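The SystemIdResolver case above is the inverse of the usual direction: an unchecked exception is downgraded to a checked IOException because the XInclude machinery only reacts to IOException. A condensed sketch (illustrative; ResourceOpener is a hypothetical stand-in for SolrResourceLoader.openResource):

    import java.io.IOException;
    import java.io.InputStream;

    class XIncludeSketch {
      interface ResourceOpener { InputStream openResource(String path); }

      InputStream open(ResourceOpener loader, String path) throws IOException {
        try {
          return loader.openResource(path);
        } catch (RuntimeException re) {
          // XInclude fallback only reacts to IOException, so repackage but keep the cause
          throw (IOException) new IOException(re.getMessage()).initCause(re);
        }
      }
    }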
(Lib) ClassCastException 7
            
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (ClassCastException e) {
  // known edge case where QueryResponse's extraction assumes "response" is a SolrDocumentList
  // (AnalysisRequestHandler emits a "response")
  e.printStackTrace();
  rsp = new SolrResponseBase();
  rsp.setResponse(parsedResponse);
}
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
catch (ClassCastException e) { log.warn("Exception reading log for updates", e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (ClassCastException cl) {
  // would be caused by a corrupt transaction log
  log.warn("Unexpected log entry or corrupt log. Entry=" + o, cl);
}
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (ClassCastException cl) {
  // would be caused by a corrupt transaction log
  recoveryInfo.errors++;
  loglog.warn("REPLAY_ERR: Unexpected log entry or corrupt log. Entry=" + o, cl);
}
// in core/src/java/org/apache/solr/update/processor/MinFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/MaxFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
Catch with Throw: 3
            
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); }
// in core/src/java/org/apache/solr/update/processor/MinFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/MaxFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
(Lib) ClassNotFoundException 7
            
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2StemmerFactory.java
catch (ClassNotFoundException e) { logger .warn( "Could not instantiate Lucene stemmer for Arabic, clustering quality " + "of Arabic content may be degraded. For best quality clusters, " + "make sure Lucene's Arabic analyzer JAR is in the classpath", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (ClassNotFoundException e) { wrapAndThrow(SEVERE, e, "Could not load driver: " + driver); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (ClassNotFoundException e) { throw new InitializationException("Can't find class for stemmer language " + language, e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) {
  // this is unlikely
  log.error("Unable to load cached class-name : " + c + " for shortname : " + cname + e);
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) {
  String newName = cname;
  if (newName.startsWith(project)) {
    newName = cname.substring(project.length() + 1);
  }
  for (String subpackage : subpackages) {
    try {
      String name = base + '.' + subpackage + newName;
      log.trace("Trying class name " + name);
      return clazz = Class.forName(name, true, classLoader).asSubclass(expectedType);
    } catch (ClassNotFoundException e1) {
      // ignore... assume first exception is best.
    }
  }
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e);
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e1) {
  // ignore... assume first exception is best.
}
Catch with Throw: 3
            
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (ClassNotFoundException e) { throw new InitializationException("Can't find class for stemmer language " + language, e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) {
  String newName = cname;
  if (newName.startsWith(project)) {
    newName = cname.substring(project.length() + 1);
  }
  for (String subpackage : subpackages) {
    try {
      String name = base + '.' + subpackage + newName;
      log.trace("Trying class name " + name);
      return clazz = Class.forName(name, true, classLoader).asSubclass(expectedType);
    } catch (ClassNotFoundException e1) {
      // ignore... assume first exception is best.
    }
  }
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e);
}
(Lib) UnsupportedEncodingException 7
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (UnsupportedEncodingException e) {
  // can't happen - UTF-8
  throw new RuntimeException(e);
}
// in contrib/velocity/src/java/org/apache/solr/response/SolrParamResourceLoader.java
catch (UnsupportedEncodingException e) {
  throw new RuntimeException(e); // may not happen
}
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
catch( UnsupportedEncodingException uex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, uex ); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (UnsupportedEncodingException e) {
  throw new RuntimeException(e); // may not happen
}
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
catch (UnsupportedEncodingException e) {
  // should not happen
  LOG.error("should not happen", e);
}
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (UnsupportedEncodingException e) {
  // won't happen
  log.error("UTF-8 not supported", e);
  throw new RuntimeException(e);
}
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (UnsupportedEncodingException e) { fatal("Shouldn't happen: UTF-8 not supported?!?!?!"); }
Catch with Throw: 5
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (UnsupportedEncodingException e) {
  // can't happen - UTF-8
  throw new RuntimeException(e);
}
// in contrib/velocity/src/java/org/apache/solr/response/SolrParamResourceLoader.java
catch (UnsupportedEncodingException e) {
  throw new RuntimeException(e); // may not happen
}
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
catch( UnsupportedEncodingException uex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, uex ); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (UnsupportedEncodingException e) {
  throw new RuntimeException(e); // may not happen
}
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (UnsupportedEncodingException e) {
  // won't happen
  log.error("UTF-8 not supported", e);
  throw new RuntimeException(e);
}
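All of these handlers deal with a "cannot happen" checked exception: UTF-8 support is mandated by the Java platform, so getBytes("UTF-8") can never actually fail, and the checked exception is converted to an unchecked one. A minimal sketch of that convention (illustrative; Utf8Sketch is not a Solr class):

    import java.io.UnsupportedEncodingException;

    class Utf8Sketch {
      static byte[] toUtf8(String s) {
        try {
          return s.getBytes("UTF-8");
        } catch (UnsupportedEncodingException e) {
          // UTF-8 is always present, so this branch is unreachable in practice
          throw new RuntimeException(e);
        }
      }
    }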
(Lib) MessagingException 6
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Connection failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) {
  //throw new DataImportHandlerException(DataImportHandlerException.SEVERE,
  //    "Folder open failed", e);
}
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) {
  // skip bad ones unless its the last one and still no good folder
  if (folders.size() == 0 && i == topLevelFolders.size() - 1)
    throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed");
}
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
Catch with Throw: 5
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Connection failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) {
  // skip bad ones unless its the last one and still no good folder
  if (folders.size() == 0 && i == topLevelFolders.size() - 1)
    throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed");
}
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
(Domain) SolrServerException 6
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrServerException e){ throw e; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) {
  Throwable rootCause = e.getRootCause();
  if (rootCause instanceof IOException) {
    ex = e; // already a zombie, no need to re-add
  } else {
    throw e;
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; // still dead } else { throw e; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (SolrServerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP_ROW, e); } }
Throws in catch: 5
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrServerException e){ throw e; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; // already a zombie, no need to re-add } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; // still dead } else { throw e; } }
(Lib) TimeExceededException 6
            
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/Grouping.java
catch (TimeLimitingCollector.TimeExceededException x) { logger.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
catch (TimeLimitingCollector.TimeExceededException x) { partialResults = true; logger.warn( "Query: " + query + "; " + x.getMessage() ); }
Throws in catch: 0
(Domain) DataImportHandlerException 5
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.ENTITY_EXCEPTION, null, de); throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, de); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) {
  if (verboseDebug) {
    getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e);
  }
  if (e.getErrCode() == DataImportHandlerException.SKIP_ROW) {
    continue;
  }
  if (isRoot) {
    if (e.getErrCode() == DataImportHandlerException.SKIP) {
      importStatistics.skipDocCount.getAndIncrement();
      doc = null;
    } else {
      SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e);
    }
    if (e.getErrCode() == DataImportHandlerException.SEVERE)
      throw e;
  } else
    throw e;
}
Throws in catch: 6
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.ENTITY_EXCEPTION, null, de); throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, de); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) {
  if (verboseDebug) {
    getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e);
  }
  if (e.getErrCode() == DataImportHandlerException.SKIP_ROW) {
    continue;
  }
  if (isRoot) {
    if (e.getErrCode() == DataImportHandlerException.SKIP) {
      importStatistics.skipDocCount.getAndIncrement();
      doc = null;
    } else {
      SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e);
    }
    if (e.getErrCode() == DataImportHandlerException.SEVERE)
      throw e;
  } else
    throw e;
}
(Lib) ExecutionException 5
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ExecutionException e) { SolrException.log(LOG,e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (ExecutionException e) {
  // should be impossible... the problem with catching the exception
  // at this level is we don't know what ShardRequest it applied to
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Impossible Exception", e);
}
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ExecutionException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (ExecutionException e) {
  // shouldn't happen since we catch exceptions ourselves
  SolrException.log(SolrCore.log, "error sending update request to shard", e);
}
Throws in catch: 3
            
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (ExecutionException e) {
  // should be impossible... the problem with catching the exception
  // at this level is we don't know what ShardRequest it applied to
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Impossible Exception", e);
}
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } }
(Lib) NoNodeException 5
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (NoNodeException e) {
  // must have gone away
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException.NoNodeException e) {
  // fine if there is nothing to delete
  // TODO: annoying that ZK logs a warning on us
  nodeDeleted = false;
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (NoNodeException e) { Thread.sleep(500); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (NoNodeException e) {
  // just keep trying
}
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) {
  // we must have failed in creating the election node - someone else must
  // be working on it, lets try again
  if (tries++ > 9) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
  cont = true;
  Thread.sleep(50);
}
Throws in catch: 1
            
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) {
  // we must have failed in creating the election node - someone else must
  // be working on it, lets try again
  if (tries++ > 9) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
  cont = true;
  Thread.sleep(50);
}
(Lib) NodeExistsException 5
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (NodeExistsException e) {
  if (!failOnExists) {
    // TODO: version ? for now, don't worry about race
    setData(currentPath, data, -1, retryOnConnLoss);
    // set new watch
    exists(currentPath, watcher, retryOnConnLoss);
    return;
  }
  // ignore unless it's the last node in the path
  if (i == paths.length - 1) {
    throw e;
  }
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (NodeExistsException e) {
  // its okay if another beats us creating the node
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (NodeExistsException e) {}
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) {
  // if a previous leader ephemeral still exists for some reason, try and
  // remove it
  zkClient.delete(leaderPath, -1, true);
  zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true);
}
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) {
  // if a previous leader ephemeral still exists for some reason, try and
  // remove it
  zkClient.delete(leaderPath, -1, true);
  zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true);
}
Throws in catch: 1
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (NodeExistsException e) {
  if (!failOnExists) {
    // TODO: version ? for now, don't worry about race
    setData(currentPath, data, -1, retryOnConnLoss);
    // set new watch
    exists(currentPath, watcher, retryOnConnLoss);
    return;
  }
  // ignore unless it's the last node in the path
  if (i == paths.length - 1) {
    throw e;
  }
}
(Lib) TransformerException 4
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (TransformerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, "Exception in applying XSL Transformeation"); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(TransformerException te) { final IOException ioe = new IOException("XSLT transformation error"); ioe.initCause(te); throw ioe; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
Throws in catch: 3
            
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(TransformerException te) { final IOException ioe = new IOException("XSLT transformation error"); ioe.initCause(te); throw ioe; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
(Lib) URISyntaxException 4
            
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (URISyntaxException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { log.warn("An URI systax problem occurred during resolving SystemId, falling back to default resolver", use); return null; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { throw new IllegalArgumentException("Invalid syntax of Solr Resource URI", use); }
Throws in catch: 2
            
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { throw new IllegalArgumentException("Invalid syntax of Solr Resource URI", use); }
(Lib) UnsupportedOperationException 4
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch( UnsupportedOperationException e ) { LOG.warn( "XML parser doesn't support XInclude option" ); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); }
// in core/src/java/org/apache/solr/spelling/SolrSpellChecker.java
catch(UnsupportedOperationException uoe) { //just use .5 as a default }
// in core/src/java/org/apache/solr/core/Config.java
catch(UnsupportedOperationException e) { log.warn(name + " XML parser doesn't support XInclude option"); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); }
(Lib) XMLStreamException 4
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
Throws in catch: 4
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
(Lib) XPathExpressionException 4
            
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "query requires '<doc .../>' child"); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + path +" for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); }
Throws in catch: 4
            
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "query requires '<doc .../>' child"); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + path +" for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); }
(Lib) CloneNotSupportedException 3
            
// in core/src/java/org/apache/solr/internal/csv/CSVStrategy.java
catch (CloneNotSupportedException e) { throw new RuntimeException(e); // impossible }
// in core/src/java/org/apache/solr/search/DocSlice.java
catch (CloneNotSupportedException e) {}
// in core/src/java/org/apache/solr/update/UpdateCommand.java
catch (CloneNotSupportedException e) { return null; }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/internal/csv/CSVStrategy.java
catch (CloneNotSupportedException e) { throw new RuntimeException(e); // impossible }
(Lib) FileNotFoundException 3
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinFileDataSource.java
catch (FileNotFoundException e) { wrapAndThrow(SEVERE,e,"Unable to open file "+f.getAbsolutePath()); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (FileNotFoundException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.FileNotFoundException xnf) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xnf); }
Throws in catch: 2
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (FileNotFoundException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.FileNotFoundException xnf) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xnf); }
(Lib) NoSuchMethodException 3
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (NoSuchMethodException nsme){ String msg = "Transformer :" + trans + "does not implement Transformer interface or does not have a transformRow(Map<String.Object> m)method"; log.error(msg); wrapAndThrow(SEVERE, nsme,msg); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (NoSuchMethodException nsme) {
  // otherwise use default ctor
  return clazz.newInstance();
}
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (NoSuchMethodException e) { /* :NOOP: we expect this and skip clazz */ }
Throws in catch: 0
(Lib) SAXException 3
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (SAXException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
Throws in catch: 3
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (SAXException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
(Lib) SQLException 3
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) {
  // DriverManager does not allow you to use a driver which is not loaded through
  // the class loader of the class which is trying to make the connection.
  // This is a workaround for cases where the user puts the driver jar in the
  // solr.home/lib or solr.home/core/lib directories.
  Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance();
  c = d.connect(url, initProps);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { logError("Error reading data ", e); wrapAndThrow(SEVERE, e, "Error reading data from database"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { close(); wrapAndThrow(SEVERE,e); return false; }
Throws in catch: 0
(Lib) SocketTimeoutException 3
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (SocketTimeoutException e) { throw new SolrServerException( "Timeout occured while waiting response from server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketTimeoutException e) { ex = addZombie(server, e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketTimeoutException e) { ex = e; }
Throws in catch: 1
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (SocketTimeoutException e) { throw new SolrServerException( "Timeout occured while waiting response from server at: " + getBaseURL(), e); }
(Lib) TimeoutException 3
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (TimeoutException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (TimeoutException e) { writeError(503, "Could not connect to zookeeper at '" + addr + "'\""); zkClient = null; return; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TimeoutException e) { log.error("Could not connect to ZooKeeper", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
Throws in catch: 2
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (TimeoutException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TimeoutException e) { log.error("Could not connect to ZooKeeper", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
(Lib) ConnectException 2
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (ConnectException e) { throw new SolrServerException("Server refused connection at: " + getBaseURL(), e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch( ConnectException cex ) {
  srsp.setException(cex); //????
}
Throws in catch: 1
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (ConnectException e) { throw new SolrServerException("Server refused connection at: " + getBaseURL(), e); }
(Lib) ConnectionLossException 2
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) {
  if (exception == null) {
    exception = e;
  }
  if (Thread.currentThread().isInterrupted()) {
    Thread.currentThread().interrupt();
    throw new InterruptedException();
  }
  if (Thread.currentThread() instanceof SafeStopThread) {
    if (((SafeStopThread) Thread.currentThread()).isClosed()) {
      throw new RuntimeException("Interrupted");
    }
  }
  retryDelay(i);
}
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (ConnectionLossException e) {
  // we don't know if we made our node or not...
  List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true);
  boolean foundId = false;
  for (String entry : entries) {
    String nodeId = getNodeId(entry);
    if (id.equals(nodeId)) {
      // we did create our node...
      foundId = true;
      break;
    }
  }
  if (!foundId) {
    throw e;
  }
}
Throws in catch: 3
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) {
  if (exception == null) {
    exception = e;
  }
  if (Thread.currentThread().isInterrupted()) {
    Thread.currentThread().interrupt();
    throw new InterruptedException();
  }
  if (Thread.currentThread() instanceof SafeStopThread) {
    if (((SafeStopThread) Thread.currentThread()).isClosed()) {
      throw new RuntimeException("Interrupted");
    }
  }
  retryDelay(i);
}
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (ConnectionLossException e) {
  // we don't know if we made our node or not...
  List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true);
  boolean foundId = false;
  for (String entry : entries) {
    String nodeId = getNodeId(entry);
    if (id.equals(nodeId)) {
      // we did create our node...
      foundId = true;
      break;
    }
  }
  if (!foundId) {
    throw e;
  }
}
(Lib) ParserConfigurationException 2
            
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (ParserConfigurationException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
Throws in catch: 2
            
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (ParserConfigurationException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
(Lib) PatternSyntaxException 2
            
// in core/src/java/org/apache/solr/update/processor/RegexReplaceProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid regex: " + patternParam, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException (SERVER_ERROR, "Invalid 'fieldRegex' pattern: " + s, e); }
Throws in catch: 2
            
// in core/src/java/org/apache/solr/update/processor/RegexReplaceProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid regex: " + patternParam, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException (SERVER_ERROR, "Invalid 'fieldRegex' pattern: " + s, e); }
(Domain) ReplicationHandlerException 2
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ReplicationHandlerException e) { LOG.error("User aborted Replication"); return false; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ReplicationHandlerException e) { throw e; }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ReplicationHandlerException e) { throw e; }
(Lib) SocketException 2
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketException e) { ex = addZombie(server, e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketException e) { ex = e; }
Throws in catch: 0
(Lib) TransformerConfigurationException 2
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
catch (TransformerConfigurationException e) { wrapAndThrow(SEVERE, e, "Unable to create content handler"); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; }
(Lib) UnknownHostException 2
            
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch (UnknownHostException e) {
  //default to null
}
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (UnknownHostException e) { throw new RuntimeException(e); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (UnknownHostException e) { throw new RuntimeException(e); }
(Lib) CharacterCodingException 1
            
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (CharacterCodingException ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading resource (wrong encoding?): " + resource, ex); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (CharacterCodingException ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading resource (wrong encoding?): " + resource, ex); }
(Lib) ConfigException 1
            
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (QuorumPeerConfig.ConfigException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (QuorumPeerConfig.ConfigException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
(Lib) EOFException 1
            
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (EOFException e) {
  break; // this is expected
}
Throws in catch: 0
(Lib) IllegalAccessException 1
            
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
(Lib) IllegalStateException 1
            
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IllegalStateException ise) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, ise.getMessage()); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IllegalStateException ise) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, ise.getMessage()); }
(Lib) InvalidTokenOffsetsException 1
            
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
(Lib) InvocationTargetException 1
            
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
(Lib) LangDetectException 1
            
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessor.java
catch (LangDetectException e) { log.debug("Could not determine language, returning empty list: ", e); return Collections.emptyList(); }
Throws in catch: 0
(Lib) MimeTypeException 1
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (MimeTypeException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
Throws in catch: 1
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (MimeTypeException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
(Lib) NamingException 1
            
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (NamingException e) { log.info("No /"+project+"/home in JNDI"); }
Throws in catch: 0
(Lib) NoClassDefFoundError 1
            
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (NoClassDefFoundError e1) { throw new ClassNotFoundException ("Can't load: " + cn, e1); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (NoClassDefFoundError e1) { throw new ClassNotFoundException ("Can't load: " + cn, e1); }
(Lib) NoHttpResponseException 1
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch( NoHttpResponseException r ) {
  method = null;
  if(is != null) {
    is.close();
  }
  // If out of tries then just rethrow (as normal error).
  if (tries < 1) {
    throw r;
  }
}
Throws in catch: 1
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch( NoHttpResponseException r ) {
  method = null;
  if(is != null) {
    is.close();
  }
  // If out of tries then just rethrow (as normal error).
  if (tries < 1) {
    throw r;
  }
}
(Lib) NoInitialContextException 1
            
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (NoInitialContextException e) { log.info("JNDI not configured for "+project+" (NoInitialContextEx)"); }
Throws in catch: 0
(Lib) NoSuchAlgorithmException 1
            
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (NoSuchAlgorithmException e) { throw new RuntimeException(e); }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (NoSuchAlgorithmException e) { throw new RuntimeException(e); }
(Lib) ProtocolException 1
            
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (ProtocolException e) { fatal("Shouldn't happen: HttpURLConnection doesn't support POST??"+e); }
Throws in catch: 0
(Lib) ScriptException 1
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (ScriptException e) { wrapAndThrow(SEVERE, e, "'eval' failed with language: " + scriptLang + " and script: \n" + scriptText); }
Throws in catch: 0
(Lib) SecurityException 1
            
// in core/src/java/org/apache/solr/util/VersionedFile.java
catch (SecurityException e) { if (!df.exists()) { deleted.add(df); } }
Throws in catch: 0
(Lib) SessionExpiredException 1
            
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
Throws in catch: 1
            
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
(Lib) TikaException 1
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
Throws in catch: 1
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }

Exception Recast Summary

It is common practice to throw an exception from within a catch block, e.g. to wrap a low-level exception in a higher-level one. The following table summarizes how often this happens in the application: for each pair of exceptions, the last column gives the number of times the caught type is recast into the thrown type. The graph below the table renders the same information graphically. For a given node, the color indicates its origin (blue for a library exception, orange for a domain exception); the left-hand number is the number of times the exception is thrown, the right-hand number the number of times it is caught.

Catch: (Lib) Exception
Throw: (Lib) RuntimeException, (Domain) SolrException, (Domain) SolrServerException, (Domain) BindingException, (Domain) FieldMappingException, (Domain) DataImportHandlerException, (Lib) ClassNotFoundException, (Lib) IOException, (Lib) InitializationException, (Domain) ZooKeeperException, (Lib) AttributeNotFoundException, Unknown

Each catch/throw pair is detailed below, headed by the number of times it occurs.
Catch (Lib) Exception -> Throw (Lib) RuntimeException: 17
                    
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (Exception e) { throw new RuntimeException("Problem pretty printing XML", e); }
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
catch (Exception e) { throw new RuntimeException("Couldn't load profile data, will return empty languages always!", e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (Exception e) {
  // io error or invalid rules
  throw new RuntimeException(e);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch(Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(Exception e) {
  // TODO should our parent interface throw (IO)Exception?
  throw new RuntimeException("getTransformer fails in getContentType", e);
}
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/search/SolrCacheBase.java
catch (Exception e) { throw new RuntimeException("Can't parse autoWarm value: " + configValue, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + file, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) {
  // Release the reference
  server = null;
  throw new RuntimeException("Could not start JMX monitoring ", e);
}
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new RuntimeException(e); }
Catch (Lib) Exception -> Throw (Domain) SolrException: 57
                    
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) {
  String logField = solrUIMAConfiguration.getLogField();
  if(logField == null){
    SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField();
    if(uniqueKeyField != null){
      logField = uniqueKeyField.getName();
    }
  }
  String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=")
      .append((String)cmd.getSolrInputDocument().getField(logField).getValue())
      .append(", ").toString();
  int len = Math.min(text.length(), 100);
  if (solrUIMAConfiguration.isIgnoreErrors()) {
    log.warn(new StringBuilder("skip the text processing due to ")
        .append(e.getLocalizedMessage()).append(optionalFieldInfo)
        .append(" text=\"").append(text.substring(0, len)).append("...\"").toString());
  } else {
    throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ")
        .append(e.getLocalizedMessage()).append(optionalFieldInfo)
        .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e);
  }
}
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write index.properties", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error initializing QueryElevationComponent.", ex); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unable to read original XML", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'reload' action", ex); }
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); }
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unable to parse "+type+"="+val, ex ); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unknown Explain Style: "+str ); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Exception e) {
  //unlikely
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
}
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch(Exception e) {
  // unexpected exception...
  throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Schema Parsing Failed: " + e.getMessage(), e);
}
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error instansiating exhange rate provider "+exchangeRateProviderClass+". Please check your FieldType configuration", e); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (Exception e) { throw new SolrException(SERVER_ERROR, "Can't resolve typeClass: " + t, e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " +cast.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " + UpdateHandler.class.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error opening new searcher", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { if (e instanceof SolrException) throw (SolrException)e; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to unregister info bean: " + key, e); }
Catch (Lib) Exception -> Throw (Domain) SolrServerException: 6
                    
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (Exception e) { throw new SolrServerException("Error executing query", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( Exception ex ) { throw new SolrServerException( ex ); }
Catch (Lib) Exception -> Throw (Domain) BindingException: 4
                    
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Could not instantiate object of " + clazz, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while setting value : " + v + " on " + (field != null ? field : setter), e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + field, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + getter, e); }
Catch (Lib) Exception -> Throw (Domain) FieldMappingException: 1
                    
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
catch (Exception e) { throw new FieldMappingException(e); }
Catch (Lib) Exception -> Throw (Domain) DataImportHandlerException: 9
                    
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Custom filter could not be created", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Exception occurred while initializing context", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Error initializing XSL ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (Exception e) { LOG.error( "The query failed '" + q + "'", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to persist Index Start Time", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Cache implementation:" + cacheImplName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("method invocation failed on transformer : " + trans, e); throw new DataImportHandlerException(WARN, e);
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Writer implementation:" + writerClassStr, e); }
1
                    
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); }
2
                    
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
9
                    
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { throw new InitializationException("Encoder " + name + " / " + clazz + " does not support " + MAX_CODE_LENGTH, e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e; throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t); }
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading synonyms", e); }
// in core/src/java/org/apache/solr/analysis/TrimFilterFactory.java
catch( Exception ex ) { throw new InitializationException("Error reading updateOffsets value. Must be true or false.", ex); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
catch (Exception e) { /* TODO: getHyphenationTree really shouldn't throw "Exception" */ throw new InitializationException("Exception thrown while loading dictionary and hyphenation file", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (Exception e) { throw new InitializationException("Error instantiating stemmer for language " + language + "from class " + stemClass, e); }
// in core/src/java/org/apache/solr/analysis/JapaneseTokenizerFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( Exception ex ) { throw new InitializationException("invalid group argument: " + g); }
2
                    
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) { /* if register fails, this is really bad - close the zkController to minimize any damage we can cause */ zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
1
                    
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new AttributeNotFoundException(attribute); }
10
                    
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Exception e) { throw ExceptionUtils.wrapAsRuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.ENTITY_EXCEPTION, null, e); DataImportHandlerException de = new DataImportHandlerException( DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, e); DataImportHandlerException de = new DataImportHandlerException(DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { if (e instanceof SolrException) throw (SolrException)e; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception ex) { SolrException e = new SolrException (SolrException.ErrorCode.SERVER_ERROR, "QueryResponseWriter init failure", ex); SolrException.log(log,null,e); throw e; }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type + (null != name ? (" \"" + name + "\"") : "") + ": " + ex.getMessage(), ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch( Exception ex ) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin Initializing failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
(Domain) SolrServerException
Unknown
5
                    
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrServerException e){ throw e; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; /* already a zombie, no need to re-add */ } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; /* still dead */ } else { throw e; } }
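All five SolrServerException catch sites above apply the same root-cause filter: if the root cause is an IOException the server is treated as a transport-level failure (a "zombie") and the request can be retried elsewhere, otherwise the exception is rethrown because the request itself was bad. A minimal sketch of the idiom, where markAsZombie() is a hypothetical placeholder rather than the actual LBHttpSolrServer method:

    import java.io.IOException;
    import org.apache.solr.client.solrj.SolrServerException;

    class RootCauseFilterSketch {
      // Retry only on transport failures; rethrow everything else, as the
      // LBHttpSolrServer catch blocks above do.
      void handle(SolrServerException e) throws SolrServerException {
        Throwable rootCause = e.getRootCause();
        if (rootCause instanceof IOException) {
          markAsZombie(); // hypothetical: record the dead server, try the next one
        } else {
          throw e; // the request was malformed or invalid; retrying will not help
        }
      }
      private void markAsZombie() { /* hypothetical bookkeeping */ }
    }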
(Lib) InterruptedException
(Domain) ZooKeeperException
(Lib) IOException
(Lib) InterruptedException
(Domain) SolrException
18
                    
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
1
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); }
1
                    
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); }
6
                    
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Interrupted"); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "interrupted waiting for shard update response", e); }
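Nearly all of the ZooKeeperException translations above follow the same three-step idiom: re-assert the interrupt flag with Thread.currentThread().interrupt(), optionally log, then wrap in the domain exception so callers higher up the stack can still observe the interruption. A sketch of the idiom, assuming ZooKeeperException lives in org.apache.solr.common.cloud as the file paths above suggest:

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.cloud.ZooKeeperException;

    class InterruptTranslationSketch {
      // Wait on a monitor; on interruption, restore the flag before wrapping.
      void awaitNotification(Object monitor) {
        synchronized (monitor) {
          try {
            monitor.wait();
          } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the interrupted status
            throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
          }
        }
      }
    }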
(Domain) ReplicationHandlerException
Unknown
1
                    
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ReplicationHandlerException e) { throw e; }
(Lib) UnsupportedOperationException
(Domain) SolrException
1
                    
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); }
(Lib) IllegalArgumentException
(Domain) SolrException
(Lib) IllegalArgumentException
(Lib) ConfigException
(Domain) ParseException
9
                    
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of for range 'include' information",e); }
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + flagParam + "'"); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); }
// in core/src/java/org/apache/solr/schema/TrieField.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Illegal value for " + DISTRIB_UPDATE_PARAM + ": " + param, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); }
1
                    
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); }
1
                    
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); }
2
                    
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
(Lib) NodeExistsException
Unknown
1
                    
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (NodeExistsException e) { if (!failOnExists) { /* TODO: version ? for now, don't worry about race */ setData(currentPath, data, -1, retryOnConnLoss); /* set new watch */ exists(currentPath, watcher, retryOnConnLoss); return; } /* ignore unless it's the last node in the path */ if (i == paths.length - 1) { throw e; } }
(Lib) UnsupportedEncodingException
(Lib) RuntimeException
(Domain) SolrException
4
                    
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (UnsupportedEncodingException e) { /* can't happen - UTF-8 */ throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/SolrParamResourceLoader.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (UnsupportedEncodingException e) { /* won't happen */ log.error("UTF-8 not supported", e); throw new RuntimeException(e); }
1
                    
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
catch( UnsupportedEncodingException uex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, uex ); }
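The RuntimeException wrappers above all guard a failure that cannot occur: UTF-8 is a charset every JVM must support, so the checked UnsupportedEncodingException is dead code. A sketch of the idiom next to the StandardCharsets overload (Java 7+) that avoids the checked exception entirely:

    import java.io.UnsupportedEncodingException;
    import java.nio.charset.StandardCharsets;

    class Utf8Sketch {
      byte[] legacy(String s) {
        try {
          return s.getBytes("UTF-8");
        } catch (UnsupportedEncodingException e) {
          throw new RuntimeException(e); // can't happen - UTF-8 is mandatory
        }
      }
      byte[] modern(String s) {
        return s.getBytes(StandardCharsets.UTF_8); // no checked exception to wrap
      }
    }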
(Lib) RuntimeException
(Domain) DataImportHandlerException
(Domain) SolrException
Unknown
1
                    
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (RuntimeException e) { throw new DataImportHandlerException(SEVERE, "Exception while reading xpaths for fields", e); }
1
                    
// in core/src/java/org/apache/solr/schema/FieldType.java
catch (RuntimeException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error while creating field '" + field + "' from value '" + value + "'", e); }
1
                    
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (RuntimeException re) { /* unfortunately XInclude fallback only works with IOException, but openResource() never throws that one */ throw (IOException) (new IOException(re.getMessage()).initCause(re)); }
(Domain) SolrException
(Domain) SolrException
Unknown
3
                    
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); }
13
                    
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrException s){ throw s; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) { /* we retry on 404 or 403 or 503 - you can see this on solr shutdown */ if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = addZombie(server, e); } else { /* Server is alive but the request was likely malformed or invalid */ throw e; } /* TODO: consider using below above - currently does cause a problem with distrib updates: seems to match up against a failed forward to leader exception as well... || e.getMessage().contains("java.net.SocketException") || e.getMessage().contains("java.net.ConnectException") */ }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) { /* we retry on 404 or 403 or 503 - you can see this on solr shutdown */ if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = e; /* already a zombie, no need to re-add */ } else { /* Server is alive but the request was malformed or invalid */ zombieServers.remove(wrapper.getKey()); throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) { /* Server is alive but the request was malformed or invalid */ throw e; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) { /* Server is alive but the request was malformed or invalid */ throw e; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( SolrException sx ) { throw sx; }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( SolrException ex ) { throw ex; }
// in core/src/java/org/apache/solr/core/Config.java
catch( SolrException e ){ SolrException.log(log,"Error in "+name,e); throw e; }
// in core/src/java/org/apache/solr/core/Config.java
catch (SolrException e) { throw(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
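Most of the entries above are identity rethrows: a narrow catch (SolrException e) { throw e; } placed ahead of a broader catch so that errors already carrying an ErrorCode pass through untouched while everything else gets wrapped (the SolrCore entry folds both cases into one block with an instanceof test). A minimal sketch of that layering:

    import org.apache.solr.common.SolrException;

    class PassThroughSketch {
      void execute(Runnable body) {
        try {
          body.run();
        } catch (SolrException e) {
          throw e; // already classified; keep its ErrorCode
        } catch (Exception e) {
          // unclassified failure: wrap as a server error
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
        }
      }
    }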
(Domain) ParseException
(Domain) DataImportHandlerException
(Domain) SolrException
(Lib) RuntimeException
(Domain) ParseException
5
                    
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (ParseException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid value for fetchMailSince: " + s, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
16
                    
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (ParseException e) { throw new SolrException(ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { /* try again, simple rules for a field name with no whitespace */ sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { /* OK, it was an oddly named field */ fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname", e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); }
2
                    
// in core/src/java/org/apache/solr/schema/DateField.java
catch( ParseException ex ) { throw new RuntimeException( ex ); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (ParseException e) { /* invalid rules */ throw new RuntimeException(e); }
1
                    
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); }
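The dominant translation for ParseException is SolrException with ErrorCode.BAD_REQUEST: a parse failure on user-supplied input (a date, a sort spec, a field list) becomes an HTTP 400 instead of a 500. A sketch of the pattern, where parseSort() is a hypothetical stand-in for the parsers quoted above:

    import java.text.ParseException;
    import org.apache.solr.common.SolrException;

    class BadRequestSketch {
      void validate(String sortSpec) {
        try {
          parseSort(sortSpec); // hypothetical parser of user input
        } catch (ParseException e) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "error in sort: " + sortSpec, e);
        }
      }
      private void parseSort(String spec) throws ParseException {
        if (spec == null || spec.isEmpty()) throw new ParseException("empty sort spec", 0);
      }
    }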
(Domain) DataImportHandlerException
Unknown
6
                    
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.ENTITY_EXCEPTION, null, de); throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, de); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e); } if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){ continue; } if (isRoot) { if (e.getErrCode() == DataImportHandlerException.SKIP) { importStatistics.skipDocCount.getAndIncrement(); doc = null; } else { SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e); } if (e.getErrCode() == DataImportHandlerException.SEVERE) throw e; } else throw e; }
(Lib) NoNodeException
(Domain) ZooKeeperException
1
                    
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) { /* we must have failed in creating the election node - someone else must be working on it, lets try again */ if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); }
(Lib) TimeoutException
(Domain) ZooKeeperException
2
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (TimeoutException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TimeoutException e) { log.error("Could not connect to ZooKeeper", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
(Lib) IOException
(Lib) RuntimeException
(Domain) SolrException
(Domain) ZooKeeperException
(Domain) SolrServerException
(Domain) DataImportHandlerException
(Lib) InitializationException
(Lib) ConfigException
(Lib) TransformerException
(Lib) XMLStreamException
Unknown
31
                    
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (IOException e) { throw new RuntimeException(e); /* should never happen w/o using real IO */ }
// in solrj/src/java/org/apache/solr/common/params/MapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/MultiMapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/ModifiableSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
catch (IOException e) { throw new RuntimeException("Unable to write xml into a stream", e); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IOException e) { throw new RuntimeException("failed to open field cache for: " + f, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { /* io error */ throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/SuggestQueryConverter.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { throw new RuntimeException(e); }
43
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/StempelPolishStemFilterFactory.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback"); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,e); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "exception at docid " + docid + " for valuesource " + valueSource, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (IOException e) { /* we're pretty freaking screwed if this happens */ throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
catch (IOException e) { /* TODO: remove this if ConstantScoreQuery.createWeight adds IOException */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/TopGroupsShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/SearchGroupShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/VersionInfo.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error reading version from index", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { /* TODO: reset our file pointer back to "pos", the start of this record. */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error logging add", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.IOException xio) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xio); }
3
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
2
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException ex) { throw new SolrServerException("error reading streams", ex); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException e) { throw new SolrServerException( "IOException occured when talking to server at: " + getBaseURL(), e); }
4
                    
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
catch (IOException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Problem reading from input", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to call finish() on UpdateRequestProcessor", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in full dump while deleting all documents.", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/HTMLStripTransformer.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Failed stripping HTML for column: " + column, e); }
15
                    
// in core/src/java/org/apache/solr/analysis/ElisionFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading articles", e); }
// in core/src/java/org/apache/solr/analysis/StemmerOverrideFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/KeywordMarkerFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/KeepWordFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading words", e); }
// in core/src/java/org/apache/solr/analysis/JapanesePartOfSpeechStopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading tags", e); }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
catch( IOException e ){ throw new InitializationException("IOException thrown while loading mappings", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( IOException ex ) { throw new InitializationException("IOException thrown creating PatternTokenizer instance", ex); }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException while loading types", e); }
// in core/src/java/org/apache/solr/analysis/TypeTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading types", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsQueryFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/analysis/StopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading stopwords", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
1
                    
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { throw new ConfigException("Error processing " + path, e); }
1
                    
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); }
1
                    
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); }
2
                    
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( IOException iox ) { throw iox; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException ioe) { throw ioe; }
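Two translations dominate for IOException: analysis-time code (TextField, CollationField, the token-stream handlers) wraps it in RuntimeException, since the TokenStream API declares IOException even when the input is an in-memory string, while request- and storage-path code wraps it in SolrException with a SERVER_ERROR or BAD_REQUEST code. A sketch of the analysis-time case (a plausible consumer, not Solr's exact code):

    import java.io.IOException;
    import java.io.StringReader;
    import org.apache.lucene.analysis.Analyzer;
    import org.apache.lucene.analysis.TokenStream;

    class AnalysisWrapSketch {
      void consume(Analyzer analyzer, String field, String text) {
        try (TokenStream ts = analyzer.tokenStream(field, new StringReader(text))) {
          ts.reset();
          while (ts.incrementToken()) {
            // inspect token attributes here
          }
          ts.end();
        } catch (IOException e) {
          // no real I/O is involved; surface it as an unchecked failure
          throw new RuntimeException("error analyzing: " + text, e);
        }
      }
    }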
(Lib) KeeperException
(Domain) ZooKeeperException
(Lib) InterruptedException
(Lib) RuntimeException
(Domain) SolrException
Unknown
17
                    
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) { /* we must have failed in creating the election node - someone else must be working on it, lets try again */ if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
1
                    
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
1
                    
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
1
                    
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
5
                    
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (KeeperException ke) { /* check if we lost connection or the node was gone */ if (ke.code() != Code.CONNECTIONLOSS && ke.code() != Code.SESSIONEXPIRED && ke.code() != Code.NONODE) { throw ke; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { /* its okay if the node already exists */ if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { /* its okay if the node already exists */ if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { /* its okay if another beats us creating the node */ if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
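The last five entries show code-based filtering on KeeperException: benign ZooKeeper outcomes (a node that already exists, a lost connection being handled elsewhere) are swallowed and everything else is rethrown. A sketch of the NODEEXISTS case, where ensureNodeExists() is a hypothetical wrapper around a raw ZooKeeper create call:

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    class NodeExistsFilterSketch {
      void ensureNodeExists(ZooKeeper zk, String path)
          throws KeeperException, InterruptedException {
        try {
          zk.create(path, new byte[0], ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);
        } catch (KeeperException e) {
          if (e.code() != KeeperException.Code.NODEEXISTS) {
            throw e; // anything other than "already exists" is a real failure
          }
          // it's okay if the node already exists
        }
      }
    }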
(Lib) IllegalStateException
(Domain) SolrException
1
                    
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IllegalStateException ise) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, ise.getMessage()); }
(Lib) ConnectionLossException
(Lib) InterruptedException
(Lib) RuntimeException
Unknown
1
                    
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
1
                    
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
1
                    
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (ConnectionLossException e) {
  // we don't know if we made our node or not...
  List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true);
  boolean foundId = false;
  for (String entry : entries) {
    String nodeId = getNodeId(entry);
    if (id.equals(nodeId)) {
      // we did create our node...
      foundId = true;
      break;
    }
  }
  if (!foundId) {
    throw e;
  }
}
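The ZkCmdExecutor snippets above implement retry-on-connection-loss. A hedged sketch of the same loop; ZkOperation, RETRY_COUNT and the fixed back-off are illustrative stand-ins for the actual Solr types and its retryDelay(i):

import org.apache.zookeeper.KeeperException;

class RetrySketch {
  interface ZkOperation<T> {
    T execute() throws KeeperException, InterruptedException;
  }

  static <T> T retryOperation(ZkOperation<T> op)
      throws KeeperException, InterruptedException {
    final int RETRY_COUNT = 5;
    KeeperException first = null;            // remember the original failure
    for (int i = 0; i < RETRY_COUNT; i++) {
      try {
        return op.execute();
      } catch (KeeperException.ConnectionLossException e) {
        if (first == null) first = e;
        if (Thread.currentThread().isInterrupted()) {
          Thread.currentThread().interrupt(); // re-assert and give up
          throw new InterruptedException();
        }
        Thread.sleep(1000L * (i + 1));        // stand-in for retryDelay(i)
      }
    }
    throw first; // retries exhausted: surface the first connection loss
  }
}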
(Lib) NumberFormatException
(Domain) SolrException
(Lib) IOException
(Lib) RuntimeException
(Lib) IllegalArgumentException
(Domain) ParseException
6
                    
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/util/DOMUtil.java
catch (NumberFormatException nfe) { throw new SolrException (SolrException.ErrorCode.SERVER_ERROR, "Value " + (null != name ? ("of '" +name+ "' ") : "") + "can not be parsed as '" +type+ "': \"" + textValue + "\"", nfe); }
1
                    
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); }
1
                    
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (NumberFormatException e) { throw new RuntimeException( "Unparseable accuracy given for dictionary: " + name, e); }
1
                    
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); }
1
                    
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); }
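The dominant translation above turns a low-level NumberFormatException into a SolrException carrying an HTTP-style error code, so the client sees a 400 rather than a stack trace. A minimal sketch; parseRows and the parameter name are hypothetical:

import org.apache.solr.common.SolrException;

class ParamParsing {
  // BAD_REQUEST (400): a malformed client value is the caller's fault,
  // not a server error. The cause is chained so the log keeps the NFE.
  static int parseRows(String rowsParam) {
    try {
      return Integer.parseInt(rowsParam);
    } catch (NumberFormatException e) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Invalid Number: " + rowsParam, e);
    }
  }
}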
(Lib) Throwable
(Domain) SolrException
(Domain) DataImportHandlerException
4
                    
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (Throwable e) { LOG.error( DataImporter.MSG.LOAD_EXP, e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, DataImporter.MSG.INVALID_CONFIG, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Throwable e) { log.error("ZooKeeper Server ERROR", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) {
  // release the latch, otherwise we block trying to do the close. This should
  // be fine, since counting down on a latch of 0 is still fine
  latch.countDown();
  // close down the searcher and any other resources, if it exists, as this is not recoverable
  close();
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, null, e);
}
1
                    
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
(Lib) NoHttpResponseException
Unknown
1
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch( NoHttpResponseException r ) {
  method = null;
  if (is != null) {
    is.close();
  }
  // If out of tries then just rethrow (as normal error).
  if (tries < 1) {
    throw r;
  }
}
(Lib) ConnectException
(Domain) SolrServerException
1
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (ConnectException e) { throw new SolrServerException("Server refused connection at: " + getBaseURL(), e); }
(Lib) SocketTimeoutException
(Domain) SolrServerException
1
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (SocketTimeoutException e) { throw new SolrServerException( "Timeout occured while waiting response from server at: " + getBaseURL(), e); }
(Lib) MalformedURLException
(Lib) RuntimeException
(Domain) DataImportHandlerException
(Domain) SolrException
3
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (MalformedURLException e) { throw new RuntimeException ("WTF, how can JarFile.toURL() be malformed?", e); }
1
                    
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (MalformedURLException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
1
                    
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (MalformedURLException e) {
  // should be impossible since we're not passing any URLs here
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
}
(Lib) XMLStreamException
(Domain) SolrException
4
                    
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
(Lib) TikaException
(Domain) SolrException
1
                    
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
(Lib) SAXException
(Domain) SolrException
Unknown
2
                    
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
1
                    
// in core/src/java/org/apache/solr/core/Config.java
catch (SAXException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
(Lib) MimeTypeException
(Domain) SolrException
1
                    
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (MimeTypeException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
(Lib) ClassNotFoundException
(Lib) InitializationException
(Domain) SolrException
2
                    
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (ClassNotFoundException e) { throw new InitializationException("Can't find class for stemmer language " + language, e); }
1
                    
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) {
  String newName = cname;
  if (newName.startsWith(project)) {
    newName = cname.substring(project.length() + 1);
  }
  for (String subpackage : subpackages) {
    try {
      String name = base + '.' + subpackage + newName;
      log.trace("Trying class name " + name);
      return clazz = Class.forName(name, true, classLoader).asSubclass(expectedType);
    } catch (ClassNotFoundException e1) {
      // ignore... assume first exception is best.
    }
  }
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e);
}
(Lib) TransformerConfigurationException
Unknown
1
                    
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; }
(Lib) MessagingException
(Domain) DataImportHandlerException
5
                    
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Connection failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) {
  // skip bad ones unless its the last one and still no good folder
  if (folders.size() == 0 && i == topLevelFolders.size() - 1)
    throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed");
}
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
(Lib) TransformerException
(Domain) SolrException
Unknown
2
                    
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
1
                    
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(TransformerException te) { final IOException ioe = new IOException("XSLT transformation error"); ioe.initCause(te); throw ioe; }
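Both IOException wrappers above use initCause rather than a constructor argument: IOException only gained a (String, Throwable) constructor in Java 6, so this code attaches the cause after construction. A sketch of the idiom, with transform() as a hypothetical stand-in for the XSLT work:

import java.io.IOException;
import javax.xml.transform.TransformerException;

class InitCauseSketch {
  void writeTransformed() throws IOException {
    try {
      transform();
    } catch (TransformerException te) {
      final IOException ioe = new IOException("XSLT transformation error");
      ioe.initCause(te); // preserve the root cause for stack traces
      throw ioe;
    }
  }

  void transform() throws TransformerException {
    // placeholder for the actual XSLT work
  }
}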
(Lib) FileNotFoundException
(Lib) RuntimeException
(Domain) SolrException
1
                    
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (FileNotFoundException e) { throw new RuntimeException(e); }
1
                    
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.FileNotFoundException xnf) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xnf); }
(Lib) ClassCastException
(Lib) InitializationException
(Domain) SolrException
1
                    
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); }
2
                    
// in core/src/java/org/apache/solr/update/processor/MinFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/MaxFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
(Lib) ExecutionException
(Domain) SolrException
Unknown
2
                    
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (ExecutionException e) {
  // should be impossible... the problem with catching the exception
  // at this level is we don't know what ShardRequest it applied to
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Impossible Exception", e);
}
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } }
1
                    
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } }
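The PerSegmentSingleValuedFaceting snippet shows the standard way to handle ExecutionException from a Future: unwrap the cause, rethrow it directly if it is already unchecked, and wrap anything else as a server error. A hedged sketch; waitFor and its message parameter are illustrative, not Solr API:

import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import org.apache.solr.common.SolrException;

class FutureSketch {
  static <T> T waitFor(Future<T> future, String what) {
    try {
      return future.get();
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt(); // restore interrupt status
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, what, e);
    } catch (ExecutionException e) {
      Throwable cause = e.getCause();
      if (cause instanceof RuntimeException) {
        throw (RuntimeException) cause; // the task's own failure, as-is
      }
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, what, cause);
    }
  }
}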
(Lib) XPathExpressionException
(Domain) SolrException
4
                    
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "query requires '<doc .../>' child"); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + path +" for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); }
(Lib) URISyntaxException
(Domain) SolrException
(Lib) IllegalArgumentException
1
                    
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); }
1
                    
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { throw new IllegalArgumentException("Invalid syntax of Solr Resource URI", use); }
(Lib) UnknownHostException
(Lib) RuntimeException
1
                    
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (UnknownHostException e) { throw new RuntimeException(e); }
(Lib) ParserConfigurationException
(Domain) SolrException
Unknown
1
                    
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
1
                    
// in core/src/java/org/apache/solr/core/Config.java
catch (ParserConfigurationException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
(Lib) InvalidShapeException
(Domain) SolrException
(Domain) ParseException
10
                    
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
1
                    
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); }
(Lib) CloneNotSupportedException
(Lib) RuntimeException
1
                    
// in core/src/java/org/apache/solr/internal/csv/CSVStrategy.java
catch (CloneNotSupportedException e) {
  throw new RuntimeException(e); // impossible
}
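The CSVStrategy catch above is the classic "impossible exception" idiom: a class that implements Cloneable must still handle the checked CloneNotSupportedException declared by Object.clone(), so it converts it to an unchecked wrapper that can never actually fire. A minimal sketch:

class Strategy implements Cloneable {
  @Override
  public Strategy clone() {
    try {
      return (Strategy) super.clone();
    } catch (CloneNotSupportedException e) {
      throw new RuntimeException(e); // impossible: we implement Cloneable
    }
  }
}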
(Lib) InvalidTokenOffsetsException
(Domain) SolrException
1
                    
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
(Lib) ConfigException
(Domain) SolrException
1
                    
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (QuorumPeerConfig.ConfigException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
(Lib) SessionExpiredException
Unknown
1
                    
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
(Lib) NoSuchAlgorithmException
(Lib) RuntimeException
1
                    
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (NoSuchAlgorithmException e) { throw new RuntimeException(e); }
(Lib) PatternSyntaxException
(Domain) SolrException
2
                    
// in core/src/java/org/apache/solr/update/processor/RegexReplaceProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid regex: " + patternParam, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException (SERVER_ERROR, "Invalid 'fieldRegex' pattern: " + s, e); }
(Lib) CharacterCodingException
(Domain) SolrException
1
                    
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (CharacterCodingException ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading resource (wrong encoding?): " + resource, ex); }
(Lib) InvocationTargetException
(Lib) RuntimeException
1
                    
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
(Lib) IllegalAccessException
(Lib) RuntimeException
1
                    
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
(Lib) NoClassDefFoundError
(Lib) ClassNotFoundException
1
                    
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (NoClassDefFoundError e1) { throw new ClassNotFoundException ("Can't load: " + cn, e1); }

Caught / Thrown Exception

Not all exceptions are thrown AND caught in the same project. The following table classifies the exception types accordingly. The lower left-hand cell lists all exceptions thrown but not caught (prevalent for libraries); the upper right-hand cell lists all exceptions caught but not thrown (usually coming from external dependencies).

Thrown Not Thrown
Caught
Type Name (caught and thrown)
(Lib) Exception
(Domain) SolrServerException
(Lib) InterruptedException
(Domain) ReplicationHandlerException
(Lib) UnsupportedOperationException
(Lib) IllegalArgumentException
(Lib) RuntimeException
(Domain) SolrException
(Domain) ParseException
(Domain) DataImportHandlerException
(Lib) TimeoutException
(Lib) IOException
(Lib) IllegalStateException
(Lib) EOFException
(Lib) NumberFormatException
(Lib) XMLStreamException
(Lib) ClassNotFoundException
(Lib) TransformerException
(Lib) FileNotFoundException
(Lib) ConfigException
Type Name (caught but not thrown)
(Lib) NodeExistsException
(Lib) UnsupportedEncodingException
(Lib) NoNodeException
(Lib) KeeperException
(Lib) ConnectionLossException
(Lib) Throwable
(Lib) NoHttpResponseException
(Lib) ConnectException
(Lib) SocketTimeoutException
(Lib) SocketException
(Lib) MalformedURLException
(Lib) TikaException
(Lib) SAXException
(Lib) MimeTypeException
(Lib) LangDetectException
(Lib) TransformerConfigurationException
(Lib) MessagingException
(Lib) ScriptException
(Lib) SQLException
(Lib) NoSuchMethodException
(Lib) ClassCastException
(Lib) ExecutionException
(Lib) XPathExpressionException
(Lib) URISyntaxException
(Lib) UnknownHostException
(Lib) ParserConfigurationException
(Lib) InvalidShapeException
(Lib) CloneNotSupportedException
(Lib) TimeExceededException
(Lib) InvalidTokenOffsetsException
(Lib) SessionExpiredException
(Lib) NoSuchAlgorithmException
(Lib) PatternSyntaxException
(Lib) CharacterCodingException
(Lib) NoInitialContextException
(Lib) NamingException
(Lib) InvocationTargetException
(Lib) IllegalAccessException
(Lib) SecurityException
(Lib) NoClassDefFoundError
(Lib) ProtocolException
Not caught
Type Name (thrown but not caught)
(Domain) ZooKeeperException
(Lib) AssertionError
(Domain) BindingException
(Domain) FieldMappingException
(Lib) InitializationException
(Lib) NullPointerException
(Lib) ServletException
(Lib) IndexOutOfBoundsException
(Lib) NoSuchElementException
(Lib) LockObtainFailedException
(Lib) AttributeNotFoundException

Methods called in Catch and Finally Blocks

The following shows the methods that are called inside catch blocks (first column) and finally blocks (second column). For each method, we give the number of times it is called in a catch block (second sub-column) and the total number of calls (third sub-column). A method name in red means the method is only called from catch/finally blocks. Hovering over a number shows code snippets from the application code.
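By far the most common method in the catch column below is SolrException.log, Solr's static helper for uniform error logging. A hedged usage sketch of the log-and-continue pattern seen throughout the snippets; doWork is hypothetical, and the three-argument overload matches the calls listed below:

import org.apache.solr.common.SolrException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

class LogHelperSketch {
  private static final Logger log =
      LoggerFactory.getLogger(LogHelperSketch.class);

  void run() {
    try {
      doWork();
    } catch (Exception e) {
      // log-and-continue: the error is recorded but not propagated
      SolrException.log(log, "doWork failed", e);
    }
  }

  void doWork() throws Exception {
    // placeholder
  }
}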

Catch Finally
Method Nbr Nbr total
log 212
                  
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
catch (Exception e) { SolrException.log(log, "Reconnect to ZooKeeper failed", e); log.info("Reconnect to ZooKeeper failed"); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
catch (Exception e) { SolrException.log(log, "", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Throwable t) { SolrException.log(LOG, "Full Import failed", t); docBuilder.rollback(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.ENTITY_EXCEPTION, null, de); throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.ENTITY_EXCEPTION, null, e); DataImportHandlerException de = new DataImportHandlerException( DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, de); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, e); DataImportHandlerException de = new DataImportHandlerException(DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { SolrException.log(log, "getNextFromCache() failed for query '" + query + "'", e); wrapAndThrow(DataImportHandlerException.WARN, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) {
  if (ABORT.equals(onError)) {
    wrapAndThrow(SEVERE, e);
  } else {
    // SKIP is not really possible. If this calls the nextRow() again the Entityprocessor would be in an inconsistent state
    SolrException.log(log, "Exception in entity : " + entityName, e);
    return null;
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorBase.java
catch (Exception e) { SolrException.log(log, "getNext() failed for query '" + query + "'", e); query = null; rowIterator = null; wrapAndThrow(DataImportHandlerException.WARN, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e); } if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){ continue; } if (isRoot) { if (e.getErrCode() == DataImportHandlerException.SKIP) { importStatistics.skipDocCount.getAndIncrement(); doc = null; } else { SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e); } if (e.getErrCode() == DataImportHandlerException.SEVERE) throw e; } else throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
// in core/src/java/org/apache/solr/handler/RequestHandlerBase.java
catch (Exception e) {
  if (e instanceof SolrException) {
    SolrException se = (SolrException) e;
    if (se.code() == SolrException.ErrorCode.CONFLICT.code) {
      // TODO: should we allow this to be counted as an error (numErrors++)?
    } else {
      SolrException.log(SolrCore.log, e);
    }
  } else {
    SolrException.log(SolrCore.log, e);
    if (e instanceof ParseException) {
      e = new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
    }
  }
  rsp.setException(e);
  numErrors++;
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { SolrException.log(LOG,e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ExecutionException e) { SolrException.log(LOG,e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { SolrException.log(LOG, "SnapPull failed ", e); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (Exception e) { SolrException.log(SolrCore.log, "Exception during debug", e); rsp.add("exception_during_debug", SolrException.toStr(e)); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/request/SolrRequestInfo.java
catch (Throwable throwable) { SolrException.log(SolrCore.log, "Exception during close hook", throwable); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) {
  // catch this so our filter still works
  log.error("Could not start Solr. Check solr/home property and the logs");
  SolrCore.log(t);
}
// in core/src/java/org/apache/solr/search/LFUCache.java
catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); }
// in core/src/java/org/apache/solr/search/LRUCache.java
catch (Throwable e) { SolrException.log(log,"Error during auto-warming of key:" + keys[i], e); }
// in core/src/java/org/apache/solr/search/CacheConfig.java
catch (Exception e) {
  SolrException.log(SolrCache.log, "Error instantiating cache", e);
  // we can carry on without a cache... but should we?
  // in some cases (like an OOM) we probably should try to continue.
  return null;
}
// in core/src/java/org/apache/solr/search/FastLRUCache.java
catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { SolrException.log(log, "Sync Failed", e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { SolrException.log(log, "Sync Failed", e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { SolrException.log(log, "Error syncing replica to leader", e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Throwable t) { SolrException.log(log, "", t); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception e) { SolrException.log(log, "Failure to open existing log file (non fatal) " + f, e); f.delete(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable e) { SolrException.log(log, "Error opening realtime searcher for deleteByQuery", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { SolrException.log(log,"Error attempting to roll back log", e); return false; }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable e) { recoveryInfo.errors++; SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (InterruptedException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/CommitTracker.java
catch (Exception e) { SolrException.log(log, "auto commit error...", e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (InterruptedException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ExecutionException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (ExecutionException e) {
  // shouldn't happen since we catch exceptions ourselves
  SolrException.log(SolrCore.log, "error sending update request to shard", e);
}
// in core/src/java/org/apache/solr/core/Config.java
catch (ParserConfigurationException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
// in core/src/java/org/apache/solr/core/Config.java
catch (SAXException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
// in core/src/java/org/apache/solr/core/Config.java
catch( SolrException e ){ SolrException.log(log,"Error in "+name,e); throw e; }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); }
// in core/src/java/org/apache/solr/core/RunExecutableListener.java
catch (InterruptedException e) { SolrException.log(log,e); ret = INVALID_PROCESS_RETURN_CODE; }
// in core/src/java/org/apache/solr/core/RunExecutableListener.java
catch (IOException e) {
  // don't throw exception, just log it...
  SolrException.log(log, e);
  ret = INVALID_PROCESS_RETURN_CODE;
}
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { searcherExecutor.shutdownNow(); try { if (!searcherExecutor.awaitTermination(30, TimeUnit.SECONDS)) { log.error("Timeout waiting for searchExecutor to terminate"); } } catch (InterruptedException e2) { SolrException.log(log, e2); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e2) { SolrException.log(log, e2); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { SolrException.log(log,null,e); return null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,null,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,null,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) {
  // do not allow decref() operations to fail since they are typically called in finally blocks
  // and throwing another exception would be very unexpected.
  SolrException.log(log, "Error closing searcher:", e);
}
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) {
  // an exception in register() shouldn't be fatal.
  log(e);
}
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception ex) { SolrException e = new SolrException (SolrException.ErrorCode.SERVER_ERROR, "QueryResponseWriter init failure", ex); SolrException.log(log,null,e); throw e; }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (MalformedURLException e) { SolrException.log(log, "Can't add element to classloader: " + files[j], e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable e) { SolrException.log(log,null,e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable ex) { SolrException.log(log,null,ex); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable t) { SolrException.log(log, "Error shutting down core", t); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable t) { SolrException.log(log, "Error canceling recovery for core", t); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) {
  // if register fails, this is really bad - close the zkController to
  // minimize any damage we can cause
  zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN);
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
682
warn 104
                  
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.warn("", e);
  return;
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.warn("", e);
  return;
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Throwable e) { logger .warn("Could not instantiate Smart Chinese Analyzer, clustering quality " + "of Chinese content may be degraded. For best quality clusters, " + "make sure Lucene's Smart Chinese Analyzer JAR is in the classpath"); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2StemmerFactory.java
catch (Exception e) { logger.warn("Could not instantiate snowball stemmer" + " for language: " + language.name() + ". Quality of clustering may be degraded.", e); return IdentityStemmer.INSTANCE; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2StemmerFactory.java
catch (ClassNotFoundException e) { logger .warn( "Could not instantiate Lucene stemmer for Arabic, clustering quality " + "of Arabic content may be degraded. For best quality clusters, " + "make sure Lucene's Arabic analyzer JAR is in the classpath", e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { log.warn("Could not read Solr resource " + resourceName); return new IResource[] {}; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch( UnsupportedOperationException e ) { LOG.warn( "XML parser doesn't support XInclude option" ); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { if (LOG.isDebugEnabled()) LOG.debug("Skipping url : " + s, e); wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (TransformerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, "Exception in applying XSL Transformeation"); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/RegexTransformer.java
catch (Exception e) { LOG.warn("Parsing failed for field : " + columnName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { log.warn("Unable to read: " + persistFilename); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid batch size: " + bsz); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DateFormatTransformer.java
catch (ParseException e) { LOG.warn("Could not parse a Date field ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Exception e) { log.warn("Error creating document : " + d, e); return false; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("method invocation failed on transformer : " + trans, e); throw new DataImportHandlerException(WARN, e);
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("transformer threw error", e); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } // onError = continue }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (Exception e) { log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (Throwable e) { log.warn( "Could not read DIH properties from " + path + " :" + e.getClass(), e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid connection timeout: " + cTimeout); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid read timeout: " + rTimeout); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid connection timeout: " + cTimeout); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid read timeout: " + rTimeout); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Exception while updating statistics", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Exception while reading files for commit " + c, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception in finding checksum of " + f, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception during creating a snapshot", e); rsp.add("exception", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { rsp.add("status", "unable to get file names for given index generation"); rsp.add("exception", e); LOG.warn("Unable to get file names for indexCommit generation: " + gen, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Unable to get index version : ", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception while invoking 'details' method for replication on master ", e); slave.add(ERR_STATUS, "invalid_master"); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception while reading " + SnapPuller.REPLICATION_PROPERTIES); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Unable to get IndexCommit on startup", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Exception while writing response for params: " + params, e); }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
catch (SolrException e) { log.warn("Exception reading log for updates", e); }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
catch (ClassCastException e) { log.warn("Exception reading log for updates", e); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( Exception ex ) { log.warn( "error writing term vector", ex ); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( Exception ex ) { log.warn( "error reading field: "+fieldName ); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch (Exception e) { log.warn("Error getting JMX properties", e); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception e) { LOG.warn("Error reading a field : " + o, e); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception e) {
  // There is a chance of the underlying field not really matching the
  // actual field type. So, it can throw exception
  LOG.warn("Error reading a field from document : " + solrDoc, e);
  // if it happens log it and continue
  continue;
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("Keeper Exception", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("InterruptedException", e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (NumberFormatException e) { log.warn("Couldn't parse maxChars attribute for copyField from " + source + " to " + dest + " as integer. The whole field will be copied."); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (Exception e) { LOG.warn("Can't use the configured PreAnalyzedParser class '" + implName + "' (" + e.getMessage() + "), using default " + DEFAULT_IMPL); parser = new JsonPreAnalyzedParser(); }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + OFFSET_START_KEY + " attribute, skipped: '" + obj + "'"); hasOffsetStart = false; }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + OFFSET_END_KEY + " attribute, skipped: '" + obj + "'"); hasOffsetEnd = false; }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + POSINCR_KEY + " attribute, skipped: '" + obj + "'"); }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + FLAGS_KEY + " attribute, skipped: '" + e.getValue() + "'"); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch (RuntimeException e) {
  // getField() throws a SolrException, but it arrives as a RuntimeException
  log.warn("Field \"" + fieldName + "\" found in index, but not defined in schema.");
}
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (Exception e) { LOG.warn("Couldn't parse maxBasicQueries value " + mbqparam +", using default of 1000"); this.maxBasicQueries = DEFMAXBASICQUERIES; }
// in core/src/java/org/apache/solr/search/Grouping.java
catch (TimeLimitingCollector.TimeExceededException x) { logger.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
catch (TimeLimitingCollector.TimeExceededException x) { partialResults = true; logger.warn( "Query: " + query + "; " + x.getMessage() ); }
// in core/src/java/org/apache/solr/spelling/SpellCheckCollator.java
catch (Exception e) { LOG.warn("Exception trying to re-query to check if a spell check possibility would return any hits.", e); }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
catch (IOException e) { LOG.warn("Loading stored lookup data failed", e); }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (KeeperException e) { logger.warn("Shard leader watch triggered but Solr cannot talk to zk."); }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (InterruptedException e) { Thread.interrupted(); logger.warn("Shard leader watch triggered but Solr cannot talk to zk."); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == Code.CONNECTIONLOSS || e.code() == Code.SESSIONEXPIRED) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
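The Overseer handlers above (and the matching ZkStateReader ones later in this list) repeat one watcher idiom: SESSIONEXPIRED and CONNECTIONLOSS are routine, since the watch is re-established when the client reconnects, while any other KeeperException is logged and escalated as a server error. A sketch of that shape, with processWatchedEvent() standing in for the real watch body:

    import org.apache.zookeeper.KeeperException;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.cloud.ZooKeeperException;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    class WatchSketch {
      private static final Logger log = LoggerFactory.getLogger(WatchSketch.class);

      void onWatchTriggered() {
        try {
          processWatchedEvent(); // hypothetical stand-in for the real watch body
        } catch (KeeperException e) {
          // Connection-level trouble is routine: the watch is re-registered
          // once the session recovers, so a warning is enough.
          if (e.code() == KeeperException.Code.SESSIONEXPIRED
              || e.code() == KeeperException.Code.CONNECTIONLOSS) {
            log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
            return;
          }
          // Anything else is unexpected: record it and escalate as a 500.
          SolrException.log(log, "", e);
          throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
        }
      }

      void processWatchedEvent() throws KeeperException { /* hypothetical */ }
    }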
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (Exception e) { log.warn("Error processing state change", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (KeeperException e) { log.warn("Could not talk to ZK", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Could not talk to ZK", e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (IOException e) { log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (Exception e) { log.warn("", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (ClassCastException cl) { log.warn("Unexpected log entry or corrupt log. Entry=" + o, cl); /* would be caused by a corrupt transaction log */ }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception ex) { log.warn("Exception reverse reading log", ex); break; }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException ex) { recoveryInfo.errors++; loglog.warn("REYPLAY_ERR: IOException reading log", ex); /* could be caused by an incomplete flush if recovering from log */ }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (ClassCastException cl) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: Unexpected log entry or corrupt log. Entry=" + o, cl); /* would be caused by a corrupt transaction log */ }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable ex) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: Exception replaying log", ex); /* something wrong with the request? */ }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (MalformedURLException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (URISyntaxException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); }
// in core/src/java/org/apache/solr/core/Config.java
catch(UnsupportedOperationException e) { log.warn(name + " XML parser doesn't support XInclude option"); }
// in core/src/java/org/apache/solr/core/SolrConfig.java
catch (Exception e) { log.warn( "Unrecognized value for lastModFrom: " + s, e); return BOGUS; }
// in core/src/java/org/apache/solr/core/SolrConfig.java
catch (Exception e) { log.warn( "Ignoring exception while attempting to " + "extract max-age from cacheControl config: " + cacheControlHeader, e); }
// in core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
catch (Exception e) { log.warn("Unmap not supported on this JVM, continuing on without setting unmap", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch( RuntimeException ex ) { log.warn("Odd RuntimeException while testing for JNDI: " + ex.getMessage()); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) { log.warn("Error loading properties ",e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch(Throwable ex) { log.warn("Unable to read SLF4J version. LogWatcher will be disabled: "+ex); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable e) { log.warn("Unable to load LogWatcher", e); }
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
catch (Exception e) { log.warn("Exception while checking commit point's age for deletion", e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn( "Failed to register info bean: " + key, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (RuntimeException e) { LOG.warn( "Failed to unregister info bean: " + key, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn("Could not getStatistics on info bean {}", infoBean.getName(), e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn("Could not get attibute " + attribute); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { log.warn("An URI systax problem occurred during resolving SystemId, falling back to default resolver", use); return null; }
Catch blocks calling "error": 86

// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Throwable t) { LOG.error("Delta Import Failed", t); docBuilder.rollback(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (Exception e) { LOG.error( "The query failed '" + q + "'", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (Exception e) { LOG.error("Ignoring Error when closing connection", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { log.error("Exception while deleteing: " + id, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { log.error("Exception while deleting by query: " + query, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Throwable t) { log.error("Exception while solr commit.", t); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Throwable t) { log.error("Exception while solr rollback.", t); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (NoSuchMethodException nsme){ String msg = "Transformer :" + trans + "does not implement Transformer interface or does not have a transformRow(Map<String.Object> m)method"; log.error(msg); wrapAndThrow(SEVERE, nsme,msg); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.error("Unable to load Transformer: " + aTransArr, e); wrapAndThrow(SEVERE, e,"Unable to load Transformer: " + trans); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); wrapAndThrow (SEVERE, e, "Exception in invoking url " + url); return null;//unreachable }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (Throwable e) { LOG.error( DataImporter.MSG.LOAD_EXP, e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, DataImporter.MSG.INVALID_CONFIG, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (RuntimeException e) { LOG.error( "Exception while adding: " + document, e); return false; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { LOG.error("Could not write property file", e); statusMessages.put("error", "Could not write property file. Delta imports will not work. " + "Make sure your conf directory is writable"); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Exception in fetching index", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage()); return false; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ReplicationHandlerException e) { LOG.error("User aborted Replication"); return false; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Could not restart core ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (IOException e) { LOG.error("Unable to copy index file from: " + indexFileInTmpDir + " to: " + indexFileInIndex , e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Unable to load index.properties"); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) {/* noop */ LOG.error("Error closing the file stream: "+ this.saveAs ,e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Error deleting file in cleanup" + e.getMessage()); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (Exception e) { log.error("Exception while processing update request", e); break; }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.error("Exception while writing replication details: ", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.error("Exception while snapshooting", e); }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
catch (IOException e) { log.error( "Exception in reloading spell check index for spellchecker: " + checker.getDictionaryName(), e); }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
catch (Exception e) { log.error( "Exception in building spell check index for spellchecker: " + checker.getDictionaryName(), e); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch (Exception e) { SnapPuller.delTree(snapShotDir); LOG.error("Exception while creating snapshot", e); details.add("snapShootException", e.getMessage()); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch (IOException e) { LOG.error("Unable to release snapshoot lock: " + directoryName + ".lock"); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) { /* catch this so our filter still works */ log.error( "Could not start Solr. Check solr/home property and the logs"); SolrCore.log( t ); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch (IOException e) { /* log, use defaults */ SolrCore.log.error("Error opening external value source file: " +e); return vals; }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch (Exception e) { if (++otherErrors<=10) { SolrCore.log.error( "Error loading external value source + fileName + " + e + (otherErrors<10 ? "" : "\tSkipping future errors for this file.") ); } continue; /* go to next line in file.. leave values as default. */ }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch (IOException e) { /* log, use defaults */ SolrCore.log.error("Error loading external value source: " +e); }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
catch (UnsupportedEncodingException e) { /* should not happen */ LOG.error("should not happen", e); }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
catch (Exception e) { LOG.error("Error while building or storing Suggester data", e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { log.error( "Unable to load spellings", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Throwable e) { log.error("ZooKeeper Server ERROR", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Exception e) { log.error("STARTING ZOOKEEPER", e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Throwable t) { log.error("Error while trying to recover", t); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Throwable t) { log.error("Error while trying to recover.", t); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Exception e) { log.error("", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { log.error("Error inspecting tlog " + ll); ll.decref(); continue; }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { /* failure to read a log record isn't fatal */ log.error("Exception reading versions from log",e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: final commit.", ex); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: finish()", ex); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (Throwable th) { log.error("Error in final commit", th); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (Throwable th) { log.error("Error closing log files", th); }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (UnsupportedEncodingException e) { /* won't happen */ log.error("UTF-8 not supported", e); throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (Throwable t) { log.error("Error during shutdown of writer.", t); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (Throwable t) { log.error("Error during shutdown of directory factory.", t); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (Throwable t) { log.error("Error cancelling recovery", t); }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (IOException e) { /* TODO: should this be handled separately as a problem with us? I guess it probably already will by causing replication to be kicked off. */ sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,finish()", e); return false; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { searcherExecutor.shutdownNow(); try { if (!searcherExecutor.awaitTermination(30, TimeUnit.SECONDS)) { log.error("Timeout waiting for searchExecutor to terminate"); } } catch (InterruptedException e2) { SolrException.log(log, e2); } }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { /* this is unlikely */ log.error("Unable to load cached class-name : "+ c +" for shortname : "+cname + e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TimeoutException e) { log.error("Could not connect to ZooKeeper", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; }
Catch blocks calling "currentThread": 48

// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); return; }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); return; }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
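The ZkCmdExecutor snippet above is the retry-side counterpart of the watcher idiom: ConnectionLossException triggers a delayed retry, but an interrupted (or deliberately closed) thread aborts immediately instead of looping. A sketch of the surrounding loop; ZkOperation, MAX_RETRIES and the backoff in retryDelay() are stand-ins inferred from the snippet, not the actual members:

    import org.apache.zookeeper.KeeperException;

    class RetrySketch {
      interface ZkOperation<T> { T execute() throws KeeperException, InterruptedException; }

      static final int MAX_RETRIES = 10; // stand-in; the real limit lives in ZkCmdExecutor

      static <T> T retryOperation(ZkOperation<T> operation)
          throws KeeperException, InterruptedException {
        KeeperException exception = null;
        for (int i = 0; i < MAX_RETRIES; i++) {
          try {
            return operation.execute();
          } catch (KeeperException.ConnectionLossException e) {
            if (exception == null) exception = e; // remember the first failure
            if (Thread.currentThread().isInterrupted()) {
              Thread.currentThread().interrupt(); // keep the flag visible to callers
              throw new InterruptedException();
            }
            retryDelay(i); // back off before the next attempt
          }
        }
        throw exception; // every attempt hit connection loss
      }

      static void retryDelay(int attempt) throws InterruptedException {
        Thread.sleep(1000L * (attempt + 1)); // stand-in backoff policy
      }
    }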
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); writeError(503, "Could not connect to zookeeper at '" + addr + "'\""); zkClient = null; return; }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); return; }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); return; }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Could not talk to ZK", e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e1) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e1) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); break; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); return false; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException ioe) { if (exc == null) exc = ioe; try { /* Pause 5 msec */ Thread.sleep(5); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); } }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (InterruptedException ie) { Thread.currentThread().interrupt(); }
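Nearly every snippet in this group obeys the same rule: catching InterruptedException clears the thread's interrupt flag, so a handler that does not itself stop the thread must call Thread.currentThread().interrupt() to re-assert it; otherwise callers further up the stack can no longer tell that a shutdown was requested. A minimal self-contained sketch:

    import java.util.concurrent.CountDownLatch;

    class AwaitSketch {
      // Catching InterruptedException clears the thread's interrupt flag;
      // re-asserting it keeps the shutdown request visible to callers.
      static void awaitQuietly(CountDownLatch latch) {
        try {
          latch.await(); // stands in for any interruptible blocking call
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt(); // restore the cleared flag
          // ...then log, return, or wrap and rethrow, as the snippets above do
        }
      }
    }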
Catch blocks calling "interrupt": 45

// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); return; }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); return; }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); writeError(503, "Could not connect to zookeeper at '" + addr + "'\""); zkClient = null; return; }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); return; }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); return; }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Could not talk to ZK", e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e1) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e1) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); break; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); return false; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException ioe) { if (exc == null) exc = ioe; try { /* Pause 5 msec */ Thread.sleep(5); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); } }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (InterruptedException ie) { Thread.currentThread().interrupt(); }
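One detail worth flagging across these two groups: the ShardLeaderWatcher snippet in the warn group calls Thread.interrupted(), a static method that tests and clears the flag, whereas every other handler uses Thread.currentThread().interrupt(), which sets it again. The difference is easy to demonstrate:

    public class InterruptFlagDemo {
      public static void main(String[] args) {
        Thread.currentThread().interrupt();                          // set the flag
        System.out.println(Thread.interrupted());                    // true, and clears it
        System.out.println(Thread.currentThread().isInterrupted());  // false: the flag is gone
      }
    }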
Catch blocks calling "getMessage": 44

// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage()); return false; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Error deleting file in cleanup" + e.getMessage()); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch (Exception e) { SnapPuller.delTree(snapShotDir); LOG.error("Exception while creating snapshot", e); details.add("snapShootException", e.getMessage()); }
// in core/src/java/org/apache/solr/SolrLogFormatter.java
catch (Throwable th) { /* logging swallows exceptions, so if we hit an exception we need to convert it to a string to see it */ return "ERROR IN SolrLogFormatter! original message:" + record.getMessage() + "\n\tException: " + SolrException.toStr(th); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IllegalStateException ise) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, ise.getMessage()); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch(Exception e) { /* unexpected exception... */ throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Schema Parsing Failed: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (Exception e) { LOG.warn("Can't use the configured PreAnalyzedParser class '" + implName + "' (" + e.getMessage() + "), using default " + DEFAULT_IMPL); parser = new JsonPreAnalyzedParser(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); }
// in core/src/java/org/apache/solr/search/Grouping.java
catch (TimeLimitingCollector.TimeExceededException x) { logger.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { /* try again, simple rules for a field name with no whitespace */ sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { /* OK, it was an oddly named field */ fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
catch (TimeLimitingCollector.TimeExceededException x) { partialResults = true; logger.warn( "Query: " + query + "; " + x.getMessage() ); }
// in core/src/java/org/apache/solr/update/processor/MinFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (MalformedURLException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (URISyntaxException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); }
// in core/src/java/org/apache/solr/update/processor/MaxFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch( RuntimeException ex ) { log.warn("Odd RuntimeException while testing for JNDI: " + ex.getMessage()); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (RuntimeException re) { /* unfortunately XInclude fallback only works with IOException, but openResource() never throws that one */ throw (IOException) (new IOException(re.getMessage()).initCause(re)); }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type + (null != name ? (" \"" + name + "\"") : "") + ": " + ex.getMessage(), ex); throw e; }
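The common thread in this group is translation: the low-level exception's message is carried into a SolrException whose ErrorCode maps onto an HTTP status, BAD_REQUEST (400) when the client sent something unparsable and SERVER_ERROR (500) for internal failures. A sketch mirroring the TermVectorComponent snippet (the method and parameter names are stand-ins):

    import org.apache.solr.common.SolrException;

    class ParamSketch {
      static int parseCount(String countParam) {
        try {
          return Integer.parseInt(countParam);
        } catch (NumberFormatException e) {
          // The client's fault: surface HTTP 400 with the original message and cause.
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e);
        }
      }
    }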
Catch blocks calling "wrapAndThrow": 42

// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
catch (Exception e) { wrapAndThrow (SEVERE, e,"Unable to load Tika Config"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
catch (TransformerConfigurationException e) { wrapAndThrow(SEVERE, e, "Unable to create content handler"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to read content"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Invalid type for data source: " + type); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Failed to initialize DataSource: " + key.getDataSourceName()); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { if (LOG.isDebugEnabled()) LOG.debug("Skipping url : " + s, e); wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (TransformerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, "Exception in applying XSL Transformeation"); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ClobTransformer.java
catch (IOException e) { DataImportHandlerException.wrapAndThrow(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinFileDataSource.java
catch (FileNotFoundException e) { wrapAndThrow(SEVERE,e,"Unable to open file "+f.getAbsolutePath()); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (Exception e) { wrapAndThrow(SEVERE,e, "Error invoking script for entity " + context.getEntityAttribute("name")); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (ScriptException e) { wrapAndThrow(SEVERE, e, "'eval' failed with language: " + scriptLang + " and script: \n" + scriptText); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE,e,"Unable to open File : "+f.getAbsolutePath()); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to encode expression: " + expression + " with value: " + s); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (ParseException exp) { wrapAndThrow(SEVERE, exp, "Invalid expression for date"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (ParseException e) { wrapAndThrow(SEVERE, e, "Invalid expression for date"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to instantiate evaluator: " + map.get(CLASS)); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (NumberFormatException e) { if (vr.resolve(ss[i]) == null) { wrapAndThrow( SEVERE, e, "Invalid number :" + ss[i] + "in parameters " + expression); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (ClassNotFoundException e) { wrapAndThrow(SEVERE, e, "Could not load driver: " + driver); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to execute query: " + query); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { logError("Error reading data ", e); wrapAndThrow(SEVERE, e, "Error reading data from database"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { close(); wrapAndThrow(SEVERE,e); return false; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { SolrException.log(log, "getNextFromCache() failed for query '" + query + "'", e); wrapAndThrow(DataImportHandlerException.WARN, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinContentStreamDataSource.java
catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (NoSuchMethodException nsme){ String msg = "Transformer :" + trans + "does not implement Transformer interface or does not have a transformRow(Map<String.Object> m)method"; log.error(msg); wrapAndThrow(SEVERE, nsme,msg); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.error("Unable to load Transformer: " + aTransArr, e); wrapAndThrow(SEVERE, e,"Unable to load Transformer: " + trans); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("transformer threw error", e); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } // onError = continue }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { if(ABORT.equals(onError)){ wrapAndThrow(SEVERE, e); } else { //SKIP is not really possible. If this calls the nextRow() again the Entityprocessor would be in an inconisttent state SolrException.log(log, "Exception in entity : "+ entityName, e); return null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); wrapAndThrow (SEVERE, e, "Exception in invoking url " + url); return null;//unreachable }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (SolrServerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP_ROW, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContentStreamDataSource.java
catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE, e,"Unable to get reader from clob"); return null;//unreachable }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorBase.java
catch (Exception e) { SolrException.log(log, "getNext() failed for query '" + query + "'", e); query = null; rowIterator = null; wrapAndThrow(DataImportHandlerException.WARN, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { wrapAndThrow(SEVERE, e); // unreachable statement return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to load class : " + className); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { wrapAndThrow (SEVERE,e, "Unable to load EntityProcessor implementation for entity:" + entity.getName()); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/PlainTextEntityProcessor.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Exception reading url : " + url); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/PlainTextEntityProcessor.java
catch (IOException e) { IOUtils.closeQuietly(r); wrapAndThrow(SEVERE, e, "Exception reading url : " + url); }
Total uses in the project: 44
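
Nearly every DataImportHandler snippet above funnels its failure through the static helper wrapAndThrow, which wraps any caught exception in the handler's own runtime exception together with an error code. As a reading aid, a minimal sketch of such a helper follows; the class body and the numeric code values are reconstructed from the call sites above, not copied from Solr, so the real DataImportHandlerException differs in detail. Because the helper always throws, several call sites follow it with a statement marked // unreachable.

    // Minimal sketch of a wrapAndThrow-style helper, modeled on the calls
    // shown above (hypothetical reconstruction, not the real Solr class).
    public class DataImportHandlerException extends RuntimeException {
      // Error codes are illustrative values.
      public static final int SEVERE = 500, WARN = 400, SKIP = 300, SKIP_ROW = 301;
      private final int errCode;

      public DataImportHandlerException(int errCode, String message, Throwable cause) {
        super(message, cause);
        this.errCode = errCode;
      }

      public int getErrCode() { return errCode; }

      // Wrap any exception, preserving it as the cause, and rethrow unconditionally.
      public static void wrapAndThrow(int errCode, Exception e) {
        throw new DataImportHandlerException(errCode, e.getMessage(), e);
      }

      public static void wrapAndThrow(int errCode, Exception e, String msg) {
        throw new DataImportHandlerException(errCode, msg, e);
      }
    }
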
code: 32 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = addZombie(server, e); } else { // Server is alive but the request was likely malformed or invalid throw e; } // TODO: consider using below above - currently does cause a problem with distrib updates: // seems to match up against a failed forward to leader exception as well... // || e.getMessage().contains("java.net.SocketException") // || e.getMessage().contains("java.net.ConnectException") }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = e; // already a zombie, no need to re-add } else { // Server is alive but the request was malformed or invalid zombieServers.remove(wrapper.getKey()); throw e; } }
// in core/src/java/org/apache/solr/handler/RequestHandlerBase.java
catch (Exception e) { if (e instanceof SolrException) { SolrException se = (SolrException)e; if (se.code() == SolrException.ErrorCode.CONFLICT.code) { // TODO: should we allow this to be counted as an error (numErrors++)? } else { SolrException.log(SolrCore.log,e); } } else { SolrException.log(SolrCore.log,e); if (e instanceof ParseException) { e = new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } rsp.setException(e); numErrors++; }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (Throwable th) { srsp.setException(th); if (th instanceof SolrException) { srsp.setResponseCode(((SolrException)th).code()); } else { srsp.setResponseCode(-1); } }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (KeeperException ke) { //check if we lost connection or the node was gone if (ke.code() != Code.CONNECTIONLOSS && ke.code() != Code.SESSIONEXPIRED && ke.code() != Code.NONODE) { throw ke; } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == Code.CONNECTIONLOSS || e.code() == Code.SESSIONEXPIRED) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { // its okay if another beats us creating the node if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (Exception e) { clonedRequest.exception = e; if (e instanceof SolrException) { clonedRequest.rspCode = ((SolrException) e).code(); } else { clonedRequest.rspCode = -1; } }
Total uses in the project: 39
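
The snippets above branch on a numeric error code rather than on the exception type alone: KeeperException.code() distinguishes expected ZooKeeper races and connection loss from real failures, and SolrException.code() carries the HTTP status. The ZooKeeper variant of the idiom, isolated into a compilable sketch against the standard org.apache.zookeeper client API:

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    public class EnsureNode {
      // Idempotent create, as in the ZkController snippets: losing the race to
      // another node raises NODEEXISTS, which is expected and swallowed; any
      // other KeeperException still propagates.
      static void ensureExists(ZooKeeper zk, String path)
          throws KeeperException, InterruptedException {
        try {
          zk.create(path, new byte[0], ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);
        } catch (KeeperException e) {
          if (e.code() != KeeperException.Code.NODEEXISTS) {
            throw e;
          }
        }
      }
    }
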
append: 24 uses in catch blocks
                  
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { out.append("EXCEPTION(val="); out.append(val); out.append(")"); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { out.append("EXCEPTION(val="); out.append(val.utf8ToString()); out.append(")"); }
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
catch (Exception e) { sb.append(e); }
Total uses in the project: 725

getName: 22 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception ex) { // no getter -- don't worry about it... if (type == Boolean.class) { gname = "is" + setter.getName().substring(3); try { getter = setter.getDeclaringClass().getMethod(gname, (Class[]) null); } catch(Exception ex2) { // no getter -- don't worry about it... } } }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e); } if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){ continue; } if (isRoot) { if (e.getErrCode() == DataImportHandlerException.SKIP) { importStatistics.skipDocCount.getAndIncrement(); doc = null; } else { SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e); } if (e.getErrCode() == DataImportHandlerException.SEVERE) throw e; } else throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { wrapAndThrow (SEVERE,e, "Unable to load EntityProcessor implementation for entity:" + entity.getName()); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " +cast.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " + UpdateHandler.class.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn("Could not getStatistics on info bean {}", infoBean.getName(), e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; }
Total uses in the project: 504
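
Most getName() calls above only put a class or field name into the message of a wrapping exception, so the log says exactly what was being processed when things failed. The instantiation variant of the idiom as a sketch; the method shape is illustrative, while SolrException is the application exception shown at the top of this sheet:

    import org.apache.solr.common.SolrException;

    public class Instantiator {
      // Rethrow-with-context: the failure names exactly which class could not
      // be instantiated, so the server log is actionable.
      static Object newInstance(Class<?> clazz) {
        try {
          return clazz.getConstructor().newInstance();
        } catch (Exception e) {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
              "Error instantiating class: '" + clazz.getName() + "'", e);
        }
      }
    }
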
toString: 16 uses in catch blocks
                  
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) { // This error really does not matter SimpleOrderedMap info = new SimpleOrderedMap(); int code=getErrorInfo(ex, info); response.sendError( code, info.toString() ); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("Keeper Exception", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("InterruptedException", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); }
// in core/src/java/org/apache/solr/core/Config.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); }
Total uses in the project: 584

equals: 13 uses in catch blocks
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { if (LOG.isDebugEnabled()) LOG.debug("Skipping url : " + s, e); wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (TransformerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, "Exception in applying XSL Transformeation"); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("transformer threw error", e); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } // onError = continue }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { if(ABORT.equals(onError)){ wrapAndThrow(SEVERE, e); } else { //SKIP is not really possible. If this calls the nextRow() again the Entityprocessor would be in an inconisttent state SolrException.log(log, "Exception in entity : "+ entityName, e); return null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (SolrServerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP_ROW, e); } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } }
Total uses in the project: 645
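
Nearly all equals() uses above belong to the DataImportHandler's onError dispatch. The constant-first form ABORT.equals(onError) is deliberate: it cannot throw a NullPointerException when onError was never configured. A compact sketch of the dispatch, reusing the DataImportHandlerException sketch from the wrapAndThrow section above; the policy string values here are assumptions:

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class OnErrorDispatch {
      // Assumed policy values for the onError attribute.
      static final String ABORT = "abort", SKIP = "skip", CONTINUE = "continue";
      static final Logger LOG = LoggerFactory.getLogger(OnErrorDispatch.class);

      static void handle(Exception e, String onError) {
        if (ABORT.equals(onError)) {
          // Fail the whole import.
          DataImportHandlerException.wrapAndThrow(DataImportHandlerException.SEVERE, e);
        } else if (SKIP.equals(onError)) {
          // Skip just the current document.
          DataImportHandlerException.wrapAndThrow(DataImportHandlerException.SKIP, e);
        } else { // CONTINUE (or unset): log and move on
          LOG.warn("Recoverable failure, continuing", e);
        }
      }
    }
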
printStackTrace: 13 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { e.printStackTrace(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { ex.printStackTrace(); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (ClassCastException e) { // known edge case where QueryResponse's extraction assumes "response" is a SolrDocumentList // (AnalysisRequestHandler emits a "response") e.printStackTrace(); rsp = new SolrResponseBase(); rsp.setResponse(parsedResponse); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch( Throwable ex ) { ex.printStackTrace(); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
catch (Exception ex) { ex.printStackTrace(); }
// in core/src/java/org/apache/solr/SolrLogFormatter.java
catch (Throwable e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch (IOException e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (Exception e) { e.printStackTrace(); return null; }
// in core/src/java/org/apache/solr/internal/csv/writer/CSVConfigGuesser.java
catch(Exception e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/internal/csv/writer/CSVWriter.java
catch(Exception e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (MalformedURLException e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch(RuntimeException e) { e.printStackTrace(); fatal("RuntimeException " + e); }
Total uses in the project: 17
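
printStackTrace() inside a catch block writes to stderr and bypasses the logging configuration, so most of the occurrences above amount to swallowed errors. A sketch of the conventional slf4j alternative, which Solr already uses elsewhere in these snippets:

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class LoggingExample {
      private static final Logger log = LoggerFactory.getLogger(LoggingExample.class);

      void load(Path file) {
        try (InputStream in = Files.newInputStream(file)) {
          // ... read the stream ...
        } catch (IOException e) {
          // Unlike e.printStackTrace(), this records the full stack trace in
          // the configured log, together with a message that gives context.
          log.error("Could not read " + file, e);
        }
      }
    }
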
add: 11 uses in catch blocks
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (RuntimeException e) { rsp.add("exception", DebugLogger.getStacktraceString(e)); importer = null; return; }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception during creating a snapshot", e); rsp.add("exception", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { rsp.add("status", "unable to get file names for given index generation"); rsp.add("exception", e); LOG.warn("Unable to get file names for indexCommit generation: " + gen, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception while invoking 'details' method for replication on master ", e); slave.add(ERR_STATUS, "invalid_master"); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (Exception e) { SolrException.log(SolrCore.log, "Exception during debug", e); rsp.add("exception_during_debug", SolrException.toStr(e)); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch (Exception e) { SnapPuller.delTree(snapShotDir); LOG.error("Exception while creating snapshot", e); details.add("snapShootException", e.getMessage()); }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { // try again, simple rules for a field name with no whitespace sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { // OK, it was an oddly named field fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } }
// in core/src/java/org/apache/solr/util/VersionedFile.java
catch (SecurityException e) { if (!df.exists()) { deleted.add(df); } }
Total uses in the project: 1630

fatal: 11 uses in catch blocks
                  
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (MalformedURLException e) { fatal("System Property 'url' is not a valid URL: " + u); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch(RuntimeException e) { e.printStackTrace(); fatal("RuntimeException " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("Can't open/read file: " + file); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("IOException while closing file: "+ e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (MalformedURLException e) { fatal("The specified URL "+url+" is not a valid URL. Please check"); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("An error occured posting data to "+url+". Please check that Solr is running."); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (ProtocolException e) { fatal("Shouldn't happen: HttpURLConnection doesn't support POST??"+e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("Connection error (is Solr running at " + solrUrl + " ?): " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("IOException while posting data: " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("IOException while reading response: " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (UnsupportedEncodingException e) { fatal("Shouldn't happen: UTF-8 not supported?!?!?!"); }
Total uses in the project: 14
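
SimplePostTool is a command-line client, so its catch blocks convert exceptions into a user-facing message and terminate instead of propagating. A minimal sketch of the idiom; the message prefix and exit code are illustrative, not copied from the tool:

    public class FatalExample {
      // CLI-style error handling: report and terminate instead of propagating.
      static void fatal(String msg) {
        System.err.println("FATAL: " + msg);
        System.exit(1); // exit code is illustrative
      }

      public static void main(String[] args) {
        try {
          new java.net.URL(args[0]); // validate the argument
        } catch (java.net.MalformedURLException e) {
          fatal("The specified URL " + args[0] + " is not a valid URL. Please check");
        }
      }
    }
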
info: 10 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
catch (Exception e) { SolrException.log(log, "Reconnect to ZooKeeper failed", e); log.info("Reconnect to ZooKeeper failed"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldStreamDataSource.java
catch (Exception e) { LOG.info("Unable to get data from BLOB"); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
catch (Exception e) { LOG.info("Unable to get data from CLOB"); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
catch (Exception e) { LOG.info("Unable to get data from BLOB"); return null; }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { log.info("Could not tell a replica to recover", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (Exception e) { log.info("Could not tell a replica to recover", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { log.info(SolrException.toStr(e)); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (NoInitialContextException e) { log.info("JNDI not configured for "+project+" (NoInitialContextEx)"); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (NamingException e) { log.info("No /"+project+"/home in JNDI"); }
Total uses in the project: 372

debug: 7 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( IllegalArgumentException ex ) { // Other implementations will likely throw this exception since "reuse-instance" // isimplementation specific. log.debug( "Unable to set the 'reuse-instance' property for the input factory: "+factory ); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (RuntimeException e) { log.debug("Resource not found in Solr's config: " + resourceName + ". Using the default " + resource + " from Carrot JAR."); return new IResource[] {}; }
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessor.java
catch (LangDetectException e) { log.debug("Could not determine language, returning empty list: ", e); return Collections.emptyList(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { if (LOG.isDebugEnabled()) LOG.debug("Skipping url : " + s, e); wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (InterruptedException e) { LOG.debug("Caught InterruptedException while waiting for row. Aborting."); isEnd.set(true); return null; }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
catch (IllegalArgumentException ex) { // Other implementations will likely throw this exception since "reuse-instance" // isimplementation specific. log.debug("Unable to set the 'reuse-instance' property for the input factory: " + inputFactory); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (IllegalArgumentException ex) { // Other implementations will likely throw this exception since "reuse-instance" // isimplementation specific. log.debug("Unable to set the 'reuse-instance' property for the input chain: " + inputFactory); }
Total uses in the project: 103

getClass: 7 uses in catch blocks
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (Exception e) { log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (Throwable e) { log.warn( "Could not read DIH properties from " + path + " :" + e.getClass(), e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; }
Total uses in the project: 109

writeError: 6 uses in catch blocks
                  
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (TimeoutException e) { writeError(503, "Could not connect to zookeeper at '" + addr + "'\""); zkClient = null; return; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); writeError(503, "Could not connect to zookeeper at '" + addr + "'\""); zkClient = null; return; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeError(500, e.toString()); return false; }
Total uses in the project: 7

writeStr: 6 uses in catch blocks
                  
// in core/src/java/org/apache/solr/schema/DoubleField.java
catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/IntField.java
catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/FloatField.java
catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/ByteField.java
catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/LongField.java
catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/ShortField.java
catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); }
Total uses in the project: 37
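
All six writeStr() occurrences are the same graceful-degradation idiom repeated across the numeric field types: when a stored value no longer parses as a number, it is written out as a string, so the response still succeeds and no data is hidden from the client. Extracted into a self-contained sketch, with a stand-in interface for Solr's response writer:

    public class NumericWriteExample {
      // Stand-in for Solr's TextResponseWriter for the purpose of this sketch.
      interface Writer {
        void writeInt(String name, int val);
        void writeStr(String name, String val, boolean escape);
      }

      static void write(Writer writer, String name, String s) {
        try {
          writer.writeInt(name, Integer.parseInt(s));
        } catch (NumberFormatException e) {
          // Can't parse: write the contents as a string so nothing is lost and
          // the client does not get a parse error for the whole response.
          writer.writeStr(name, s, true);
        }
      }
    }
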
get: 5 uses in catch blocks
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if(throwExp.get()) exp.set(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to instantiate evaluator: " + map.get(CLASS)); }
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); }
// in core/src/java/org/apache/solr/schema/TrieField.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); }
Total uses in the project: 1703

getKey: 5 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = e; // already a zombie, no need to re-add } else { // Server is alive but the request was malformed or invalid zombieServers.remove(wrapper.getKey()); throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { //Expected. The server is still down. zombieServer.failedPings++; // If the server doesn't belong in the standard set belonging to this load balancer // then simply drop it after a certain number of failed pings. if (!zombieServer.standard && zombieServer.failedPings >= NONSTANDARD_PING_LIMIT) { zombieServers.remove(zombieServer.getKey()); } }
// in core/src/java/org/apache/solr/search/LFUCache.java
catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); }
// in core/src/java/org/apache/solr/search/FastLRUCache.java
catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); }
Total uses in the project: 190

setPartialResults: 5 uses in catch blocks
                  
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/Grouping.java
catch (TimeLimitingCollector.TimeExceededException x) { logger.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
Total uses in the project: 6

size: 5 uses in catch blocks
                  
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { // skip bad ones unless its the last one and still no good folder if (folders.size() == 0 && i == topLevelFolders.size() - 1) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
Total uses in the project: 650

sleep: 5 uses in catch blocks
                  
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (NoNodeException e) { Thread.sleep(500); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) { // we must have failed in creating the election node - someone else must // be working on it, lets try again if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException ioe) { if (exc == null) exc = ioe; try { // Pause 5 msec Thread.sleep(5); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); } }
Total uses in the project: 23
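
The sleep() calls above implement bounded retry loops: pause, try again, and only surface the failure after the last attempt. Note that every InterruptedException handler restores the interrupt flag before giving up. The loop shape as a generic sketch:

    public class RetryExample {
      interface Op { void run() throws Exception; }

      // Bounded retry: sleep between attempts, rethrow after the last one.
      static void withRetries(Op op, int retries, long sleepMs) throws Exception {
        for (int i = 0; i < retries; i++) {
          try {
            op.run();
            return; // success
          } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the flag, as above
            throw e;
          } catch (Exception e) {
            if (i == retries - 1) throw e; // out of tries: propagate
            Thread.sleep(sleepMs); // back off before the next attempt
          }
        }
      }
    }
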
StringBuilder: 4 uses in catch blocks
                  
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
Total uses in the project: 126

addZombie: 4 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = addZombie(server, e); } else { // Server is alive but the request was likely malformed or invalid throw e; } // TODO: consider using below above - currently does cause a problem with distrib updates: // seems to match up against a failed forward to leader exception as well... // || e.getMessage().contains("java.net.SocketException") // || e.getMessage().contains("java.net.ConnectException") }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketException e) { ex = addZombie(server, e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketTimeoutException e) { ex = addZombie(server, e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } }
Total uses in the project: 4
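
addZombie() is LBHttpSolrServer's failover bookkeeping: a server failing with a connectivity-flavored error is parked on a zombie list and pinged periodically while the request moves on to the next live server. The status triage guarding it, isolated below; the retried status codes are taken from the snippets:

    public class ZombieTriage {
      // 404/403/503 occur during shutdown and restart, and 500 can mask a
      // failed forward to a leader; these mark the server as a zombie so the
      // load balancer retries elsewhere. Anything else means the request
      // itself was bad and must be rethrown to the caller.
      static boolean shouldRetryElsewhere(int httpStatus) {
        return httpStatus == 404 || httpStatus == 403
            || httpStatus == 503 || httpStatus == 500;
      }
    }
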
getLocalizedMessage: 4 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
Total uses in the project: 4

getRootCause: 4 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; // already a zombie, no need to re-add } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; // still dead } else { throw e; } }
Total uses in the project: 5
initCause: 4 uses in catch blocks
                  
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(TransformerException te) { final IOException ioe = new IOException("XSLT transformation error"); ioe.initCause(te); throw ioe; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (RuntimeException re) { // unfortunately XInclude fallback only works with IOException, but openResource() never throws that one throw (IOException) (new IOException(re.getMessage()).initCause(re)); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; }
Total uses in the project: 4
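
The initCause() two-step exists because IOException gained its (String, Throwable) constructor only in Java 6; before that, the cause had to be attached after construction. Both forms side by side:

    import java.io.IOException;
    import javax.xml.transform.TransformerException;

    public class ChainingExample {
      // Pre-Java-6 chaining, as in the snippets above.
      static IOException wrapLegacy(TransformerException te) {
        IOException ioe = new IOException("XSLT transformation error");
        ioe.initCause(te);
        return ioe;
      }

      // Equivalent since Java 6, when IOException gained a cause constructor.
      static IOException wrap(TransformerException te) {
        return new IOException("XSLT transformation error", te);
      }
    }
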
substring: 4 uses in catch blocks
                  
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception ex) { // no getter -- don't worry about it... if (type == Boolean.class) { gname = "is" + setter.getName().substring(3); try { getter = setter.getDeclaringClass().getMethod(gname, (Class[]) null); } catch(Exception ex2) { // no getter -- don't worry about it... } } }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { // ignore... assume first exception is best. } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }
138
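The DocumentObjectBinder snippet above is a deliberate empty catch: reflection probes for a getter, and a lookup failure simply means the bean has none. A hedged sketch of that probe (names invented; the catch is narrowed to NoSuchMethodException here, where Solr catches Exception broadly):

    import java.lang.reflect.Method;

    public class GetterProbe {
      // Derive a getter from a setter name, falling back from getX() to isX()
      // for booleans; a missing getter is expected and tolerated.
      static Method findGetter(Class<?> beanClass, Method setter, Class<?> type) {
        String prop = setter.getName().substring(3); // strip "set"
        try {
          return beanClass.getMethod("get" + prop);
        } catch (NoSuchMethodException e) {
          // no getter -- don't worry about it...
        }
        if (type == Boolean.class || type == boolean.class) {
          try {
            return beanClass.getMethod("is" + prop);
          } catch (NoSuchMethodException e) {
            // still none: the binder treats the property as write-only
          }
        }
        return null;
      }
    }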
close 3
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch( NoHttpResponseException r ) { method = null; if(is != null) { is.close(); } // If out of tries then just rethrow (as normal error). if (tries < 1) { throw r; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { close(); wrapAndThrow(SEVERE,e); return false; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { latch.countDown();//release the latch, otherwise we block trying to do the close. This should be fine, since counting down on a latch of 0 is still fine //close down the searcher and any other resources, if it exists, as this is not recoverable close(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, null, e); }
161
delete 3
                  
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception e) { SolrException.log(log, "Failure to open existing log file (non fatal) " + f, e); f.delete(); }
26
getBaseURL 3
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (ConnectException e) { throw new SolrServerException("Server refused connection at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (SocketTimeoutException e) { throw new SolrServerException( "Timeout occured while waiting response from server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException e) { throw new SolrServerException( "IOException occured when talking to server at: " + getBaseURL(), e); }
9
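All three HttpSolrServer catches follow the same translation pattern: a low-level transport exception becomes a domain SolrServerException whose message carries the server URL, so callers see where the failure happened without digging into the cause. A minimal sketch of the pattern, using a stand-in for SolrServerException:

    import java.io.IOException;
    import java.net.ConnectException;
    import java.net.SocketTimeoutException;

    public class TransportErrorTranslation {
      // Stand-in for SolrServerException (checked, carries a cause).
      static class ServerException extends Exception {
        ServerException(String msg, Throwable cause) { super(msg, cause); }
      }

      static String baseUrl = "http://localhost:8983/solr"; // illustrative URL

      static void request() throws ServerException {
        try {
          doHttp();
        } catch (ConnectException e) {
          throw new ServerException("Server refused connection at: " + baseUrl, e);
        } catch (SocketTimeoutException e) {
          throw new ServerException("Timeout waiting for response from: " + baseUrl, e);
        } catch (IOException e) {
          throw new ServerException("IOException talking to server at: " + baseUrl, e);
        }
      }

      static void doHttp() throws IOException { throw new ConnectException("refused"); }

      public static void main(String[] args) {
        try { request(); } catch (ServerException e) { System.err.println(e.getMessage()); }
      }
    }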
getEntity 3
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e); } if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){ continue; } if (isRoot) { if (e.getErrCode() == DataImportHandlerException.SKIP) { importStatistics.skipDocCount.getAndIncrement(); doc = null; } else { SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e); } if (e.getErrCode() == DataImportHandlerException.SEVERE) throw e; } else throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
56
getErrCode 3
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e); } if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){ continue; } if (isRoot) { if (e.getErrCode() == DataImportHandlerException.SKIP) { importStatistics.skipDocCount.getAndIncrement(); doc = null; } else { SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e); } if (e.getErrCode() == DataImportHandlerException.SEVERE) throw e; } else throw e; }
3
getValue 3
                  
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + FLAGS_KEY + " attribute, skipped: '" + e.getValue() + "'"); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
227
msg 3
                  
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (IOException e) { // TODO: should this be handled separately as a problem with us? // I guess it probably already will by causing replication to be kicked off. sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,finish()", e); return false; }
49
put 3
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { LOG.error("Could not write property file", e); statusMessages.put("error", "Could not write property file. Delta imports will not work. " + "Make sure your conf directory is writable"); }
670
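The XPathEntityProcessor snippet above is the clearest example of DIH's configurable error policy: the same catch either aborts (wrapAndThrow), marks the row as skipped, or just logs and continues, depending on the entity's onError attribute. A reduced sketch of that dispatch (the marker-map mechanics are simplified, and wrapAndThrow is replaced by a plain RuntimeException):

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class OnErrorPolicy {
      static final String ABORT = "abort", SKIP = "skip", CONTINUE = "continue";
      static final String SKIP_DOC = "$skipDoc";

      static void handleRowFailure(String onError, Exception e,
                                   List<Map<String, Object>> rows, String url) {
        String msg = "Parsing failed for xml, url:" + url + " rows processed:" + rows.size();
        if (ABORT.equals(onError)) {
          throw new RuntimeException(msg, e);  // stand-in for wrapAndThrow(SEVERE, ...)
        } else if (SKIP.equals(onError)) {
          System.err.println("WARN: " + msg);
          Map<String, Object> map = new HashMap<>();
          map.put(SKIP_DOC, Boolean.TRUE);     // downstream drops the marked row
          rows.add(map);
        } else if (CONTINUE.equals(onError)) {
          System.err.println("WARN: " + msg);  // log and keep going
        }
      }
    }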
set 3
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { isEnd.set(true); return; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if(throwExp.get()) exp.set(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (InterruptedException e) { LOG.debug("Caught InterruptedException while waiting for row. Aborting."); isEnd.set(true); return null; }
284
setException 3
                  
// in core/src/java/org/apache/solr/handler/RequestHandlerBase.java
catch (Exception e) { if (e instanceof SolrException) { SolrException se = (SolrException)e; if (se.code() == SolrException.ErrorCode.CONFLICT.code) { // TODO: should we allow this to be counted as an error (numErrors++)? } else { SolrException.log(SolrCore.log,e); } } else { SolrException.log(SolrCore.log,e); if (e instanceof ParseException) { e = new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } rsp.setException(e); numErrors++; }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch( ConnectException cex ) { srsp.setException(cex); //???? }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (Throwable th) { srsp.setException(th); if (th instanceof SolrException) { srsp.setResponseCode(((SolrException)th).code()); } else { srsp.setResponseCode(-1); } }
7
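In these handler catches the exception is not rethrown at all: it is logged, attached to the response via setException, and (for shard responses) translated into a numeric response code. A stand-in sketch of that capture-on-response pattern (Response and StatusException are invented placeholders for SolrQueryResponse/ShardResponse and SolrException):

    public class CaptureOnResponse {
      static class Response {
        Throwable exception;
        int code;
        void setException(Throwable th) { exception = th; }
        void setResponseCode(int c) { code = c; }
      }

      static class StatusException extends RuntimeException {
        final int code;
        StatusException(int code, String msg) { super(msg); this.code = code; }
      }

      static void handle(Runnable work, Response rsp) {
        try {
          work.run();
        } catch (Throwable th) {
          rsp.setException(th); // the caller inspects the response instead of catching
          rsp.setResponseCode(th instanceof StatusException ? ((StatusException) th).code : -1);
        }
      }
    }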
shutdownNow 3
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { searcherExecutor.shutdownNow(); try { if (!searcherExecutor.awaitTermination(30, TimeUnit.SECONDS)) { log.error("Timeout waiting for searchExecutor to terminate"); } } catch (InterruptedException e2) { SolrException.log(log, e2); } }
11
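The ConcurrentUpdateSolrServer snippets show the canonical InterruptedException response: stop the executor with shutdownNow() and restore the thread's interrupt flag, since catching InterruptedException clears it. A minimal runnable sketch of the idiom:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class InterruptIdiom {
      public static void main(String[] args) {
        ExecutorService scheduler = Executors.newSingleThreadExecutor();
        scheduler.submit(() -> { /* background work */ });
        scheduler.shutdown();
        try {
          scheduler.awaitTermination(30, TimeUnit.SECONDS);
        } catch (InterruptedException ie) {
          scheduler.shutdownNow();             // cancel outstanding work
          Thread.currentThread().interrupt();  // restore the cleared interrupt flag
        }
      }
    }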
toStr 3
                  
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (Exception e) { SolrException.log(SolrCore.log, "Exception during debug", e); rsp.add("exception_during_debug", SolrException.toStr(e)); }
// in core/src/java/org/apache/solr/SolrLogFormatter.java
catch (Throwable th) { // logging swallows exceptions, so if we hit an exception we need to convert it to a string to see it return "ERROR IN SolrLogFormatter! original message:" + record.getMessage() + "\n\tException: " + SolrException.toStr(th); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { log.info(SolrException.toStr(e)); }
7
writeKeyValue 3
                  
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (IllegalArgumentException e) { // path doesn't exist (must have been removed) writeKeyValue(json, "warning", "(path gone)", false); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("Keeper Exception", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("InterruptedException", e); }
20
exists 2
                  
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (NodeExistsException e) { if (!failOnExists) { // TODO: version ? for now, don't worry about race setData(currentPath, data, -1, retryOnConnLoss); // set new watch exists(currentPath, watcher, retryOnConnLoss); return; } // ignore unless it's the last node in the path if (i == paths.length - 1) { throw e; } }
// in core/src/java/org/apache/solr/util/VersionedFile.java
catch (SecurityException e) { if (!df.exists()) { deleted.add(df); } }
75
forName 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { // ignore... assume first exception is best. } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }
19
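Both snippets implement fallback lookup: try a short name against several candidate packages, ignore the intermediate ClassNotFoundExceptions, and if nothing matches, report the first failure (the name the user actually wrote) rather than the last candidate tried. A condensed sketch of that policy (Solr wraps the first exception in a SolrException; this sketch simply rethrows it):

    public class FallbackClassLoading {
      static Class<?> findClass(String cname, String... packages) throws ClassNotFoundException {
        try {
          return Class.forName(cname);
        } catch (ClassNotFoundException first) {
          for (String pkg : packages) {
            try {
              return Class.forName(pkg + "." + cname);
            } catch (ClassNotFoundException ignored) {
              // ignore... assume first exception is best (same policy as SolrResourceLoader)
            }
          }
          throw first; // surface the original, user-facing name
        }
      }

      public static void main(String[] args) throws Exception {
        // Resolves "ArrayList" by probing the java.util package.
        System.out.println(findClass("ArrayList", "java.util"));
      }
    }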
getAbsolutePath 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinFileDataSource.java
catch (FileNotFoundException e) { wrapAndThrow(SEVERE,e,"Unable to open file "+f.getAbsolutePath()); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE,e,"Unable to open File : "+f.getAbsolutePath()); return null; }
39
getCause 2
                  
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e; throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } }
6
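Both catches unwrap a wrapper exception (InvocationTargetException, ExecutionException) to reach the meaningful cause before deciding what to rethrow: runtime causes propagate unchanged, anything else is re-wrapped with context. A minimal sketch of that unwrap-then-rethrow decision, using a generic checked wrapper in place of SolrException:

    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class UnwrapCause {
      static int run(Callable<Integer> task) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
          return pool.submit(task).get();
        } catch (ExecutionException e) {
          Throwable cause = e.getCause();            // the task's real failure
          if (cause instanceof RuntimeException) {
            throw (RuntimeException) cause;          // propagate unchanged
          }
          throw new Exception("task failed", cause); // wrap checked/other causes
        } finally {
          pool.shutdown();
        }
      }

      public static void main(String[] args) throws Exception {
        System.out.println(run(() -> 6 * 7));
      }
    }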
getDebugLogger 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e); } if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){ continue; } if (isRoot) { if (e.getErrCode() == DataImportHandlerException.SKIP) { importStatistics.skipDocCount.getAndIncrement(); doc = null; } else { SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e); } if (e.getErrCode() == DataImportHandlerException.SEVERE) throw e; } else throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
13
getDictionaryName 2
                  
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
catch (IOException e) { log.error( "Exception in reloading spell check index for spellchecker: " + checker.getDictionaryName(), e); }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
catch (Exception e) { log.error( "Exception in building spell check index for spellchecker: " + checker.getDictionaryName(), e); }
9
getPackage 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); }
11
getSchema 2
                  
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { // try again, simple rules for a field name with no whitespace sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { // OK, it was an oddly named field fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } }
147
iterator 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { if (LOG.isDebugEnabled()) LOG.debug("Skipping url : " + s, e); wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (TransformerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, "Exception in applying XSL Transformeation"); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
135
keySet 2
                  
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); }
68
length 2
                  
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { // ignore... assume first exception is best. } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }
360
logError 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { logError("Error reading data ", e); wrapAndThrow(SEVERE, e, "Error reading data from database"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (Exception e) { logError("Exception while closing result set", e); }
2
makePath 2
                  
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); }
18
newInstance 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (NoSuchMethodException nsme) { // otherwise use default ctor return clazz.newInstance(); }
78
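The FieldTypePluginLoader snippet shows a reflection fallback: prefer a specialized constructor, and on NoSuchMethodException fall back to the no-arg constructor. A sketch of the probe (the snippet uses the historical Class.newInstance(); the modern equivalent, used below, is getDeclaredConstructor().newInstance()):

    import java.lang.reflect.Constructor;

    public class CtorFallback {
      static <T> T instantiate(Class<T> clazz, Object arg, Class<?> argType) throws Exception {
        try {
          Constructor<T> ctor = clazz.getConstructor(argType);
          return ctor.newInstance(arg);
        } catch (NoSuchMethodException nsme) {
          // otherwise use default ctor (same fallback as FieldTypePluginLoader)
          return clazz.getDeclaredConstructor().newInstance();
        }
      }

      public static void main(String[] args) throws Exception {
        // StringBuilder has both a (String) ctor and a no-arg ctor.
        System.out.println(instantiate(StringBuilder.class, "hi", String.class).append("!"));
      }
    }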
remove 2
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = e; // already a zombie, no need to re-add } else { // Server is alive but the request was malformed or invalid zombieServers.remove(wrapper.getKey()); throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { //Expected. The server is still down. zombieServer.failedPings++; // If the server doesn't belong in the standard set belonging to this load balancer // then simply drop it after a certain number of failed pings. if (!zombieServer.standard && zombieServer.failedPings >= NONSTANDARD_PING_LIMIT) { zombieServers.remove(zombieServer.getKey()); } }
215
rollback 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Throwable t) { SolrException.log(LOG, "Full Import failed", t); docBuilder.rollback(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Throwable t) { LOG.error("Delta Import Failed", t); docBuilder.rollback(); }
10
sendError 2
                  
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Throwable ex) { sendError( core, solrReq, request, (HttpServletResponse)response, ex ); return; }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) { // This error really does not matter SimpleOrderedMap info = new SimpleOrderedMap(); int code=getErrorInfo(ex, info); response.sendError( code, info.toString() ); }
5
setResponseCode 2
                  
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (Throwable th) { srsp.setException(th); if (th instanceof SolrException) { srsp.setResponseCode(((SolrException)th).code()); } else { srsp.setResponseCode(-1); } }
2
toJSON 2
                  
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); }
8
values 2
                  
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); }
// in core/src/java/org/apache/solr/core/Config.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); }
107
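Both catches turn a terse IllegalArgumentException from a constant lookup into an actionable message that enumerates the legal values via Arrays.toString(values()). A minimal sketch of the pattern using an invented local enum:

    import java.util.Arrays;

    public class EnumParseError {
      enum Mode { STRICT, LENIENT }

      static Mode parseMode(String s) {
        try {
          return Mode.valueOf(s.toUpperCase());
        } catch (IllegalArgumentException iae) {
          throw new IllegalArgumentException(
              "Invalid mode '" + s + "', valid values are: " + Arrays.toString(Mode.values()), iae);
        }
      }

      public static void main(String[] args) {
        System.out.println(parseMode("strict"));
        parseMode("bogus"); // throws with the full list of legal values
      }
    }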
<String, Object> 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
42
<String,ServerWrapper> 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
1
ExtendedWhitespaceTokenizer 1
                  
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Throwable e) { return new ExtendedWhitespaceTokenizer(); }
2
JsonPreAnalyzedParser 1
                  
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (Exception e) { LOG.warn("Can't use the configured PreAnalyzedParser class '" + implName + "' (" + e.getMessage() + "), using default " + DEFAULT_IMPL); parser = new JsonPreAnalyzedParser(); }
2
SimpleOrderedMap 1
                  
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) { // This error really does not matter SimpleOrderedMap info = new SimpleOrderedMap(); int code=getErrorInfo(ex, info); response.sendError( code, info.toString() ); }
148
SolrResponseBase 1
                  
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (ClassCastException e) { // known edge case where QueryResponse's extraction assumes "response" is a SolrDocumentList // (AnalysisRequestHandler emits a "response") e.printStackTrace(); rsp = new SolrResponseBase(); rsp.setResponse(parsedResponse); }
1
asSubclass 1
                  
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { // ignore... assume first exception is best. } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }
4
awaitTermination 1
                  
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { searcherExecutor.shutdownNow(); try { if (!searcherExecutor.awaitTermination(30, TimeUnit.SECONDS)) { log.error("Timeout waiting for searchExecutor to terminate"); } } catch (InterruptedException e2) { SolrException.log(log, e2); } }
6
checkIfIamLeader 1
                  
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException e) { // we couldn't set our watch - the node before us may already be down? // we need to check if we are the leader again checkIfIamLeader(seq, context, true); }
3
closeQuietly 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/PlainTextEntityProcessor.java
catch (IOException e) { IOUtils.closeQuietly(r); wrapAndThrow(SEVERE, e, "Exception reading url : " + url); }
36
connect 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); }
4
countDown 1
                  
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { latch.countDown();//release the latch, otherwise we block trying to do the close. This should be fine, since counting down on a latch of 0 is still fine //close down the searcher and any other resources, if it exists, as this is not recoverable close(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, null, e); }
4
decref 1
                  
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { log.error("Error inspecting tlog " + ll); ll.decref(); continue; }
38
delTree 1
                  
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch (Exception e) { SnapPuller.delTree(snapShotDir); LOG.error("Exception while creating snapshot", e); details.add("snapShootException", e.getMessage()); }
7
emptyList 1
                  
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessor.java
catch (LangDetectException e) { log.debug("Could not determine language, returning empty list: ", e); return Collections.emptyList(); }
23
findClass 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
12
format 1
                  
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); }
49
getAndIncrement 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e); } if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){ continue; } if (isRoot) { if (e.getErrCode() == DataImportHandlerException.SKIP) { importStatistics.skipDocCount.getAndIncrement(); doc = null; } else { SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e); } if (e.getErrCode() == DataImportHandlerException.SEVERE) throw e; } else throw e; }
3
getChildren 1
                  
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } }
37
getCoreDescriptor 1
                  
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) { // if register fails, this is really bad - close the zkController to // minimize any damage we can cause zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
52
getDataSourceName 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Failed to initialize DataSource: " + key.getDataSourceName()); }
2
getDeclaringClass 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception ex) { // no getter -- don't worry about it... if (type == Boolean.class) { gname = "is" + setter.getName().substring(3); try { getter = setter.getDeclaringClass().getMethod(gname, (Class[]) null); } catch(Exception ex2) { // no getter -- don't worry about it... } } }
2
getEntityAttribute 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (Exception e) { wrapAndThrow(SEVERE,e, "Error invoking script for entity " + context.getEntityAttribute("name")); }
47
getErrorInfo 1
                  
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) { // This error really does not matter SimpleOrderedMap info = new SimpleOrderedMap(); int code=getErrorInfo(ex, info); response.sendError( code, info.toString() ); }
2
getField 1
                  
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
79
getFieldOrNull 1
                  
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { // try again, simple rules for a field name with no whitespace sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { // OK, it was an oddly named field fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } }
37
getID 1
                  
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
6
getLineNumber 1
                  
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); }
9
getLogField 1
                  
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
1
getMethod 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception ex) { // no getter -- don't worry about it... if (type == Boolean.class) { gname = "is" + setter.getName().substring(3); try { getter = setter.getDeclaringClass().getMethod(gname, (Class[]) null); } catch(Exception ex2) { // no getter -- don't worry about it... } } }
15
getNodeId 1
                  
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } }
1
getResourceLoader 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
68
getSimpleString 1
                  
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { // try again, simple rules for a field name with no whitespace sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { // OK, it was an oddly named field fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } }
2
getSolrCore 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); }
8
getSolrInputDocument 1
                  
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
13
getStacktraceString 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (RuntimeException e) { rsp.add("exception", DebugLogger.getStacktraceString(e)); importer = null; return; }
3
getUniqueKeyField 1
                  
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
43
handleError 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (Throwable e) { handleError(e); }
2
input_err 1
                  
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { //Catch the exception and rethrow it with more line information input_err("can't read line: " + line, null, line, e); }
2
interrupted 1
                  
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (InterruptedException e) { Thread.interrupted(); logger.warn("Shard leader watch triggered but Solr cannot talk to zk."); }
1
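Note the subtle difference exercised here: Thread.interrupted() tests and clears the current thread's interrupt flag, whereas Thread.currentThread().interrupt() (used in the shutdownNow snippets above) sets it back. A short runnable illustration of the two:

    public class InterruptFlagDemo {
      public static void main(String[] args) {
        Thread.currentThread().interrupt();                          // set the flag
        System.out.println(Thread.interrupted());                    // true, and clears it
        System.out.println(Thread.currentThread().isInterrupted());  // false now
      }
    }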
isClosed 1
                  
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
4
isDebugEnabled 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { if (LOG.isDebugEnabled()) LOG.debug("Skipping url : " + s, e); wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
14
isIgnoreErrors 1
                  
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
1
isInterrupted 1
                  
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
2
loadClass 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); }
18
min 1
                  
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
50
moveAliveToDead 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
1
name 1
                  
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2StemmerFactory.java
catch (Exception e) { logger.warn("Could not instantiate snowball stemmer" + " for language: " + language.name() + ". Quality of clustering may be degraded.", e); return IdentityStemmer.INSTANCE; }
113
openResource 1
                  
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { ratesJsonStream = resourceLoader.openResource(ratesFileLocation); }
26
parseDate 1
                  
// in core/src/java/org/apache/solr/schema/DateField.java
catch (Exception e) { return DateUtil.parseDate(s); }
10
publish 1
                  
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) { // if register fails, this is really bad - close the zkController to // minimize any damage we can cause zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
5
resolve 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (NumberFormatException e) { if (vr.resolve(ss[i]) == null) { wrapAndThrow( SEVERE, e, "Invalid number :" + ss[i] + "in parameters " + expression); } }
22
retryDelay 1
                  
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
1
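The ZkCmdExecutor catch sits inside a retry loop: on ConnectionLossException it remembers the first failure, checks the interrupt flag so a cancelled thread is not retried, and delays before the next attempt. A simplified sketch of that loop shape (the ZooKeeper specifics are replaced by a generic checked operation; names are invented):

    import java.util.concurrent.Callable;

    public class RetryOnConnectionLoss {
      static class ConnectionLossException extends Exception {}

      // Assumes maxAttempts >= 1.
      static <T> T retryOperation(Callable<T> op, int maxAttempts) throws Exception {
        ConnectionLossException first = null;
        for (int i = 0; i < maxAttempts; i++) {
          try {
            return op.call();
          } catch (ConnectionLossException e) {
            if (first == null) first = e;        // report the original failure
            if (Thread.currentThread().isInterrupted()) {
              throw new InterruptedException();  // don't retry a cancelled thread
            }
            Thread.sleep(1000L * (i + 1));       // grow the delay between attempts
          }
        }
        throw first;
      }
    }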
setData 1
                  
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (NodeExistsException e) { if (!failOnExists) { // TODO: version ? for now, don't worry about race setData(currentPath, data, -1, retryOnConnLoss); // set new watch exists(currentPath, watcher, retryOnConnLoss); return; } // ignore unless it's the last node in the path if (i == paths.length - 1) { throw e; } }
10
setResponse 1
                  
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (ClassCastException e) { // known edge case where QueryResponse's extraction assumes "response" is a SolrDocumentList // (AnalysisRequestHandler emits a "response") e.printStackTrace(); rsp = new SolrResponseBase(); rsp.setResponse(parsedResponse); }
16
startsWith 1
                  
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { // ignore... assume first exception is best. } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }
84
trace 1
                  
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { // ignore... assume first exception is best. } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }
25
utf8ToString 1
                  
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { out.append("EXCEPTION(val="); out.append(val.utf8ToString()); out.append(")"); }
25
wrapAsRuntimeException 1
                  
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Exception e) { throw ExceptionUtils.wrapAsRuntimeException(e); }
1
writeString 1
                  
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (IllegalArgumentException e) { // path doesn't exist (must have been removed) json.writeString("(children gone)"); }
18
The table below lists methods invoked from within finally blocks.

Method | Nbr | Nbr total
close 77
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { try { if (response != null) { response.getEntity().getContent().close(); } } catch (Exception ex) { } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
finally { if (respBody != null) { try { respBody.close(); } catch (Throwable t) {} // ignore } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
finally { try { parser.close(); } catch( Exception ex ){} }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
finally { try { if (propOutput != null) propOutput.close(); } catch (IOException e) { propOutput = null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
finally { try { if (propInput != null) propInput.close(); } catch (IOException e) { propInput = null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
finally { try { in.close(); } catch (Exception e) { } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (writer != null) { writer.close(); } if (epwList != null) { closeEntityProcessorWrappers(epwList); } if(reqParams.isDebug()) { reqParams.getDebugInfo().debugVerboseOutput = getDebugLogger().output; } }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
finally { if (is != null) is.close(); }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); }
// in core/src/java/org/apache/solr/handler/DumpRequestHandler.java
finally { reader.close(); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { try { if (input != null) { input.close(); } } catch (Exception e) { } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { req.close(); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
finally { if (parser != null) parser.close(); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
finally { if(is != null) { is.close(); } }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
finally { if (reader != null) { reader.close(); } }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
finally { recentUpdates.close(); // cache this somehow? }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
finally { recentUpdates.close(); // cache this somehow? }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
finally { if (searchers != null) { for (RefCounted<SolrIndexSearcher> searcher : searchers) { if (searcher != null) searcher.decref(); } } if (sourceCores != null) { for (SolrCore solrCore : sourceCores) { if (solrCore != null) solrCore.close(); } } if (readersToBeClosed != null) IOUtils.closeWhileHandlingException(readersToBeClosed); if (dirsToBeReleased != null) { for (Directory dir : dirsToBeReleased) { DirectoryFactory dirFactory = core.getDirectoryFactory(); dirFactory.release(dir); } } if (wrappedReq != null) wrappedReq.close(); core.close(); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
finally { // no recoveryStrat close for now if (core != null) { core.close(); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
finally { if (core != null) { core.close(); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
finally { core.close(); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
finally { w.close(); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
finally { w.close(); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
finally { w.close(); }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
finally { w.close(); }
// in core/src/java/org/apache/solr/response/RubyResponseWriter.java
finally { w.close(); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
finally { xmlWriter.close(); }
// in core/src/java/org/apache/solr/response/XMLResponseWriter.java
finally { w.close(); }
// in core/src/java/org/apache/solr/response/RawResponseWriter.java
finally { reader.close(); }
// in core/src/java/org/apache/solr/response/RawResponseWriter.java
finally { in.close(); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
finally { w.close(); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
finally { if( solrReq != null ) { solrReq.close(); } if (core != null) { core.close(); } SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
finally { printer.close(); }
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
finally { if (req != null) { req.close(); } SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
finally { if (ratesJsonStream != null) try { ratesJsonStream.close(); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); } }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
finally { try { if (is != null) { is.close(); } } catch (IOException e) { e.printStackTrace(); } }
// in core/src/java/org/apache/solr/internal/csv/writer/CSVConfigGuesser.java
finally { if (in != null) { try { in.close(); } catch (Exception e) { /* ignore exception */ } } }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
finally { try { req.close(); } finally { SolrRequestInfo.clearRequestInfo(); } }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
finally { otherReq.close(); fromCore.close(); if (fromHolder != null) fromHolder.decref(); }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
finally { /* swallow exceptions on close so we don't override any exceptions that happened in the loop */ try { r.close(); } catch (Exception e) {} }
// in core/src/java/org/apache/solr/spelling/SpellCheckCollator.java
finally { checkResponse.req.close(); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
finally { if (core != null ) { core.close(); } }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
finally { in.close(); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
finally { br.close(); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
finally { close(); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
finally { if (core != null) core.close(); SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
finally { recentUpdates.close(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
finally { if (core != null) { core.close(); } }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { startingUpdates.close(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { if (reader != null) reader.close(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { if (tlogReader != null) tlogReader.close(); translog.decref(); }
// in core/src/java/org/apache/solr/update/CommitTracker.java
finally { /* log.info("###done committing"); */ req.close(); }
// in core/src/java/org/apache/solr/update/PeerSync.java
finally { recentUpdates.close(); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
finally { if (input != null) input.close(); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
finally { if(zkController != null) { zkController.close(); } if (zkServer != null) { zkServer.stop(); } if (shardHandlerFactory != null) { shardHandlerFactory.close(); } isShutDown = true; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
finally { if (core != null) { core.close(); } }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
finally { writer.close(); out.close(); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
finally { if (fis != null) try { fis.close(); fis = null; } catch (IOException xio) {} if (fos != null) try { fos.close(); fos = null; } catch (IOException xio) {} if (fcin != null && fcin.isOpen()) try { fcin.close(); fcin = null; } catch (IOException xio) {} if (fcout != null && fcout.isOpen()) try { fcout.close(); fcout = null; } catch (IOException xio) {} }
// in core/src/java/org/apache/solr/core/QuerySenderListener.java
finally { if (req != null) req.close(); SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
finally { try { if(is!=null) is.close(); } catch (IOException e) { fatal("IOException while closing file: "+ e); } }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
finally { try { if(out!=null) out.close(); } catch (IOException x) { /*NOOP*/ } }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
finally { try { if(in!=null) in.close(); } catch (IOException x) { /*NOOP*/ } }
// in core/src/java/org/apache/solr/util/FileUtils.java
finally { try { if (in != null) in.close(); } catch (IOException e) {} try { if (out != null) out.close(); } catch (IOException e) {} }
// in core/src/java/org/apache/solr/util/FileUtils.java
finally { if (file != null) file.close(); }
161
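The bulk of these finally blocks implement the guarded-close idiom: null-check the resource, close it, and (in many of the variants above) swallow the close exception so it cannot mask an exception already propagating from the try body. A minimal sketch of the idiom, using an illustrative InputStream rather than a Solr type:

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class GuardedClose {
      static int readFirstByte(String path) throws IOException {
        InputStream in = null;
        try {
          in = new FileInputStream(path);
          return in.read();
        } finally {
          // Null-check: the open itself may have failed.
          if (in != null) {
            // Swallow the close failure so it cannot hide an exception
            // already thrown from the try block.
            try { in.close(); } catch (IOException ignored) { }
          }
        }
      }
    }

On Java 7+, try-with-resources expresses the same intent more compactly and attaches a failed close as a suppressed exception instead of discarding it; the code shown in this report appears to predate that construct.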
closeQuietly 31
                  
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
finally { IOUtils.closeQuietly(inputStream); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
finally { if (resourceStream != null) Closeables.closeQuietly(resourceStream); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
finally { IOUtils.closeQuietly(input); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
finally { /* some XML parsers are broken and don't close the byte stream (but they should according to spec) */ IOUtils.closeQuietly(configFile.getByteStream()); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
finally { /* some XML parsers are broken and don't close the byte stream (but they should according to spec) */ IOUtils.closeQuietly(xsltSource.getInputStream()); }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { IOUtils.closeQuietly(outFile); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { IOUtils.closeQuietly(is); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { IOUtils.closeQuietly(os); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { IOUtils.closeQuietly(is); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
finally { IOUtils.closeQuietly(is); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
finally { IOUtils.closeQuietly(reader); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
finally{ if (reader != null) { IOUtils.closeQuietly(reader); } }
// in core/src/java/org/apache/solr/handler/FieldAnalysisRequestHandler.java
finally { IOUtils.closeQuietly(reader); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
finally { IOUtils.closeQuietly(fis); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
finally { IOUtils.closeQuietly(inFile); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
finally { IOUtils.closeQuietly(inputStream); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
finally { if (process != null) { IOUtils.closeQuietly( process.getOutputStream() ); IOUtils.closeQuietly( process.getInputStream() ); IOUtils.closeQuietly( process.getErrorStream() ); } }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
finally { try { IOUtils.closeQuietly(input); } finally { IOUtils.closeQuietly(output); } }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
finally { IOUtils.closeQuietly(output); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
finally { IOUtils.closeQuietly(stream); }
// in core/src/java/org/apache/solr/servlet/LoadAdminUiServlet.java
finally { IOUtils.closeQuietly(in); }
// in core/src/java/org/apache/solr/schema/CollationField.java
finally { IOUtils.closeQuietly(input); }
// in core/src/java/org/apache/solr/update/processor/HTMLStripFieldUpdateProcessorFactory.java
finally { IOUtils.closeQuietly(in); }
// in core/src/java/org/apache/solr/core/Config.java
finally { /* some XML parsers are broken and don't close the byte stream (but they should according to spec) */ IOUtils.closeQuietly(is.getByteStream()); }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { IOUtils.closeQuietly(is); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
finally{ IOUtils.closeQuietly(is); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
finally { /* some XML parsers are broken and don't close the byte stream (but they should according to spec) */ IOUtils.closeQuietly(src.getInputStream()); }
36
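These variants delegate the same null-check-and-swallow logic to a helper, typically Apache Commons IO's IOUtils.closeQuietly or Guava's Closeables.closeQuietly. A sketch of roughly what such a helper amounts to:

    import java.io.Closeable;
    import java.io.IOException;

    public final class QuietClose {
      private QuietClose() { }

      // Null-safe close that deliberately drops the IOException: intended
      // only for finally blocks, where a close failure must not mask the
      // primary exception from the try body.
      public static void closeQuietly(Closeable c) {
        if (c == null) return;
        try {
          c.close();
        } catch (IOException ignored) { }
      }
    }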
decref 18
                  
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { if (searcherRefCounted != null) searcherRefCounted.decref(); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
finally { searcher.decref(); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
finally { if (s!=null) s.decref(); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
finally { if (searchHolder != null) searchHolder.decref(); }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
finally { if (searcherHolder != null) { searcherHolder.decref(); } }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
finally { if (searcherHolder != null) { searcherHolder.decref(); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
finally { if (searchers != null) { for (RefCounted<SolrIndexSearcher> searcher : searchers) { if (searcher != null) searcher.decref(); } } if (sourceCores != null) { for (SolrCore solrCore : sourceCores) { if (solrCore != null) solrCore.close(); } } if (readersToBeClosed != null) IOUtils.closeWhileHandlingException(readersToBeClosed); if (dirsToBeReleased != null) { for (Directory dir : dirsToBeReleased) { DirectoryFactory dirFactory = core.getDirectoryFactory(); dirFactory.release(dir); } } if (wrappedReq != null) wrappedReq.close(); core.close(); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
finally { searcher.decref(); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
finally { otherReq.close(); fromCore.close(); if (fromHolder != null) fromHolder.decref(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { lookupLog.decref(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { currLog.decref(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { if (tlogReader != null) tlogReader.close(); translog.decref(); }
// in core/src/java/org/apache/solr/update/VersionInfo.java
finally { if (newestSearcher != null) { newestSearcher.decref(); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { openSearcherLock.unlock(); if (newestSearcher != null) { newestSearcher.decref(); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { /* we are all done with the old searcher we used for warming... */ if (currSearcherHolderF != null) currSearcherHolderF.decref(); }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { if (!success) { synchronized (searcherLock) { onDeckSearchers--; if (onDeckSearchers < 0) { /* sanity check... should never happen */ log.error(logid + "ERROR!!! onDeckSearchers after decrement=" + onDeckSearchers); onDeckSearchers = 0; /* try and recover */ } /* if we failed, we need to wake up at least one waiter to continue the process */ searcherLock.notify(); } if (currSearcherHolder != null) { currSearcherHolder.decref(); } if (searchHolder != null) { searchHolder.decref(); /* decrement 1 for _searcher (searchHolder will never become _searcher now) */ if (returnSearcher) { searchHolder.decref(); /* decrement 1 because we won't be returning the searcher to the user */ } } } /* we want to do this after we decrement onDeckSearchers so another thread doesn't increment first and throw a false warning */ openSearcherLock.unlock(); }
38
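The decref calls all follow the same discipline: a searcher (or transaction log) is obtained through a reference-counted holder, and the matching decref sits in finally so the count is balanced on every exit path. A stripped-down sketch of such a holder; Solr's actual RefCounted class differs in detail:

    import java.util.concurrent.atomic.AtomicInteger;

    // Stripped-down reference-counted holder (illustrative, not Solr's RefCounted).
    abstract class RefCountedSketch<T> {
      private final T resource;
      private final AtomicInteger refCount = new AtomicInteger(1);

      RefCountedSketch(T resource) { this.resource = resource; }

      T get() { return resource; }

      RefCountedSketch<T> incref() {
        refCount.incrementAndGet();
        return this;
      }

      void decref() {
        // The last holder to decref releases the underlying resource.
        if (refCount.decrementAndGet() == 0) {
          close();
        }
      }

      protected abstract void close();
    }

Callers then pair acquisition with release: holder = getSearcher(); try { use(holder.get()); } finally { holder.decref(); }.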
unlock 14
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { /* remove it from the list of running things unless we are the last runner and the queue is full... in which case, the next queue.put() would block and there would be no runners to handle it. This case has been further handled by using offer instead of put, and using a retry loop to avoid blocking forever (see request()). */ synchronized (runners) { if (runners.size() == 1 && queue.remainingCapacity() == 0) { /* keep this runner alive */ scheduler.execute(this); } else { runners.remove(this); } } log.info("finished: {}", this); runnerLock.unlock(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
finally { importLock.unlock(); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
finally { if (snapPuller != null) { tempSnapPuller = snapPuller; } snapPullLock.unlock(); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { if (!cmd.softCommit) { commitLock.unlock(); } addCommands.set(0); deleteByIdCommands.set(0); deleteByQueryCommands.set(0); if (error) numErrors.incrementAndGet(); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { commitLock.unlock(); if (clearRequestInfo) SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { openSearcherLock.unlock(); }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { openSearcherLock.unlock(); if (newestSearcher != null) { newestSearcher.decref(); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { if (!success) { synchronized (searcherLock) { onDeckSearchers--; if (onDeckSearchers < 0) { /* sanity check... should never happen */ log.error(logid + "ERROR!!! onDeckSearchers after decrement=" + onDeckSearchers); onDeckSearchers = 0; /* try and recover */ } /* if we failed, we need to wake up at least one waiter to continue the process */ searcherLock.notify(); } if (currSearcherHolder != null) { currSearcherHolder.decref(); } if (searchHolder != null) { searchHolder.decref(); /* decrement 1 for _searcher (searchHolder will never become _searcher now) */ if (returnSearcher) { searchHolder.decref(); /* decrement 1 because we won't be returning the searcher to the user */ } } } /* we want to do this after we decrement onDeckSearchers so another thread doesn't increment first and throw a false warning */ openSearcherLock.unlock(); }
// in core/src/java/org/apache/solr/util/ConcurrentLRUCache.java
finally { isCleaning = false; /* set before markAndSweep.unlock() for visibility */ markAndSweepLock.unlock(); }
// in core/src/java/org/apache/solr/util/ConcurrentLRUCache.java
finally { markAndSweepLock.unlock(); }
// in core/src/java/org/apache/solr/util/ConcurrentLRUCache.java
finally { markAndSweepLock.unlock(); }
// in core/src/java/org/apache/solr/util/ConcurrentLFUCache.java
finally { isCleaning = false; /* set before markAndSweep.unlock() for visibility */ markAndSweepLock.unlock(); }
// in core/src/java/org/apache/solr/util/ConcurrentLFUCache.java
finally { markAndSweepLock.unlock(); }
// in core/src/java/org/apache/solr/util/ConcurrentLFUCache.java
finally { markAndSweepLock.unlock(); }
18
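The unlock group is the canonical java.util.concurrent.locks idiom: acquire outside the try, release in finally, so an exception in the critical section can never leave the lock held. A sketch, with illustrative lock and method names:

    import java.util.concurrent.locks.ReentrantLock;

    public class SweepGuard {
      private final ReentrantLock markAndSweepLock = new ReentrantLock();

      void sweepIfIdle() {
        // Acquire before the try: if lock acquisition itself failed,
        // we must not attempt to unlock.
        if (!markAndSweepLock.tryLock()) {
          return; // another thread is already sweeping
        }
        try {
          // ... critical section: evict entries, update stats ...
        } finally {
          markAndSweepLock.unlock(); // every exit path releases the lock
        }
      }
    }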
incrementAndGet 9
                  
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { if (rc!=1) { numErrors.incrementAndGet(); numErrorsCumulative.incrementAndGet(); } else { numDocsPending.incrementAndGet(); } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { if (!madeIt) { numErrors.incrementAndGet(); numErrorsCumulative.incrementAndGet(); } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { if (error) numErrors.incrementAndGet(); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { if (!cmd.softCommit) { commitLock.unlock(); } addCommands.set(0); deleteByIdCommands.set(0); deleteByQueryCommands.set(0); if (error) numErrors.incrementAndGet(); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { addCommandsCumulative.set( addCommandsCumulative.get() - addCommands.getAndSet( 0 ) ); deleteByIdCommandsCumulative.set( deleteByIdCommandsCumulative.get() - deleteByIdCommands.getAndSet( 0 ) ); deleteByQueryCommandsCumulative.set( deleteByQueryCommandsCumulative.get() - deleteByQueryCommands.getAndSet( 0 ) ); if (error) numErrors.incrementAndGet(); }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
finally { isClosed = true; directoryFactory.release(directory); numCloses.incrementAndGet(); }
67
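Several of these counters are driven by a success flag rather than a catch block: the last statement of the try sets the flag, and the finally increments the error counter whenever the flag is still false. This counts every abnormal exit, checked or unchecked, without intercepting the exception. A minimal sketch with illustrative counter and flag names:

    import java.util.concurrent.atomic.AtomicLong;

    public class ErrorCounter {
      private final AtomicLong numErrors = new AtomicLong();

      void runCounted(Runnable work) {
        boolean madeIt = false;
        try {
          work.run();
          madeIt = true; // reached only if the body completed normally
        } finally {
          // No catch needed: any exception skips the flag assignment,
          // so the finally block sees madeIt == false and counts it.
          if (!madeIt) {
            numErrors.incrementAndGet();
          }
        }
      }
    }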
log 8
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
finally { log(DIHLogLevels.ENTITY_META, "time-taken", DocBuilder.getTimeElapsedSince(start)); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ROW_END, epw.getEntity().getName(), null); if (epw.getEntity().isDocRoot()) getDebugLogger().log(DIHLogLevels.END_DOC, null, null); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.END_ENTITY, null, null); } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
finally { if (!replayed) { try { ulog.dropBufferedUpdates(); } catch (Throwable t) { SolrException.log(log, "", t); } } }
682
set 8
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
finally { setStatus(Status.IDLE); DocBuilder.INSTANCE.set(null); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
finally { setStatus(Status.IDLE); DocBuilder.INSTANCE.set(null); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { if (!cmd.softCommit) { commitLock.unlock(); } addCommands.set(0); deleteByIdCommands.set(0); deleteByQueryCommands.set(0); if (error) numErrors.incrementAndGet(); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { addCommandsCumulative.set( addCommandsCumulative.get() - addCommands.getAndSet( 0 ) ); deleteByIdCommandsCumulative.set( deleteByIdCommandsCumulative.get() - deleteByIdCommands.getAndSet( 0 ) ); deleteByQueryCommandsCumulative.set( deleteByQueryCommandsCumulative.get() - deleteByQueryCommands.getAndSet( 0 ) ); if (error) numErrors.incrementAndGet(); }
284
clearRequestInfo 7
                  
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
finally { if( solrReq != null ) { solrReq.close(); } if (core != null) { core.close(); } SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
finally { if (req != null) { req.close(); } SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
finally { try { req.close(); } finally { SolrRequestInfo.clearRequestInfo(); } }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
finally { SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
finally { if (core != null) core.close(); SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { commitLock.unlock(); if (clearRequestInfo) SolrRequestInfo.clearRequestInfo(); }
// in core/src/java/org/apache/solr/core/QuerySenderListener.java
finally { if (req != null) req.close(); SolrRequestInfo.clearRequestInfo(); }
8
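SolrRequestInfo is per-thread state, so every site that sets it also clears it in finally: on a pooled servlet thread, a leaked value would bleed into the next request served by that thread. A sketch of the set-in-try, clear-in-finally pattern around a generic ThreadLocal (the real SolrRequestInfo API differs):

    public final class RequestScope {
      // Illustrative stand-in for per-thread request state.
      private static final ThreadLocal<Object> CURRENT = new ThreadLocal<Object>();

      static void withRequest(Object requestInfo, Runnable body) {
        CURRENT.set(requestInfo);
        try {
          body.run();
        } finally {
          // Always clear: worker threads are reused, so stale state would
          // otherwise be visible to whatever runs on this thread next.
          CURRENT.remove();
        }
      }
    }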
finalize 7
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
finally { super.finalize(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
finally { super.finalize(); }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
finally { super.finalize(); }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { super.finalize(); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
finally { super.finalize(); }
// in core/src/java/org/apache/solr/util/ConcurrentLRUCache.java
finally { super.finalize(); }
// in core/src/java/org/apache/solr/util/ConcurrentLFUCache.java
finally { super.finalize(); }
7
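All seven finalize overrides chain to the superclass the same way: the subclass's own cleanup (often just a leaked-resource warning) runs in the try, and super.finalize() sits in finally so it executes even if that cleanup throws. Sketch:

    public class FinalizeChain {
      @Override
      protected void finalize() throws Throwable {
        try {
          // ... subclass cleanup, e.g. warn if close() was never called ...
        } finally {
          super.finalize(); // guarantee the superclass finalizer still runs
        }
      }
    }

Finalizers served as a last-resort leak backstop in code of this vintage; current Java deprecates them in favor of explicit close() or java.lang.ref.Cleaner.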
remove 6
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { /* remove it from the list of running things unless we are the last runner and the queue is full... in which case, the next queue.put() would block and there would be no runners to handle it. This case has been further handled by using offer instead of put, and using a retry loop to avoid blocking forever (see request()). */ synchronized (runners) { if (runners.size() == 1 && queue.remainingCapacity() == 0) { /* keep this runner alive */ scheduler.execute(this); } else { runners.remove(this); } } log.info("finished: {}", this); runnerLock.unlock(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/VariableResolverImpl.java
finally { CURRENT_VARIABLE_RESOLVER.remove(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
finally { if ((isRecord || !recordStarted) && !stack.empty()) { Set<String> cleanThis = stack.pop(); if (cleanThis != null) { for (String fld : cleanThis) values.remove(fld); } } }
// in core/src/java/org/apache/solr/request/SolrRequestInfo.java
finally { threadLocal.remove(); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
finally { /* removing config params custom to us */ params.remove(PARAM_RATES_FILE_LOCATION); params.remove(PARAM_REFRESH_INTERVAL); }
215
unblockUpdates 6
                  
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { versionInfo.unblockUpdates(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { versionInfo.unblockUpdates(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { versionInfo.unblockUpdates(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { versionInfo.unblockUpdates(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
finally { /* change the state while updates are still blocked to prevent races */ state = State.ACTIVE; if (finishing) { versionInfo.unblockUpdates(); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
finally { vinfo.unblockUpdates(); }
6
get 4
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
finally { closeIt(data); if (!isEnd.get()) { offer(END_MARKER); } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { addCommandsCumulative.set( addCommandsCumulative.get() - addCommands.getAndSet( 0 ) ); deleteByIdCommandsCumulative.set( deleteByIdCommandsCumulative.get() - deleteByIdCommands.getAndSet( 0 ) ); deleteByQueryCommandsCumulative.set( deleteByQueryCommandsCumulative.get() - deleteByQueryCommands.getAndSet( 0 ) ); if (error) numErrors.incrementAndGet(); }
1703
getDebugLogger 4
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (writer != null) { writer.close(); } if (epwList != null) { closeEntityProcessorWrappers(epwList); } if(reqParams.isDebug()) { reqParams.getDebugInfo().debugVerboseOutput = getDebugLogger().output; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ROW_END, epw.getEntity().getName(), null); if (epw.getEntity().isDocRoot()) getDebugLogger().log(DIHLogLevels.END_DOC, null, null); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.END_ENTITY, null, null); } }
13
delTree 3
                  
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { if (deleteTmpIdxDir) delTree(tmpIndexDir); else delTree(indexDir); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { delTree(tmpconfDir); }
7
error 3
                  
// in core/src/java/org/apache/solr/handler/SnapShooter.java
finally { replicationHandler.core.getDeletionPolicy().releaseCommitPoint(indexCommit.getGeneration()); replicationHandler.snapShootDetails = details; if (lock != null) { try { lock.release(); } catch (IOException e) { LOG.error("Unable to release snapshoot lock: " + directoryName + ".lock"); } } }
// in core/src/java/org/apache/solr/update/PeerSync.java
finally { try { proc.finish(); } catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,finish()", e); return false; } }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { if (!success) { synchronized (searcherLock) { onDeckSearchers--; if (onDeckSearchers < 0) { /* sanity check... should never happen */ log.error(logid + "ERROR!!! onDeckSearchers after decrement=" + onDeckSearchers); onDeckSearchers = 0; /* try and recover */ } /* if we failed, we need to wake up at least one waiter to continue the process */ searcherLock.notify(); } if (currSearcherHolder != null) { currSearcherHolder.decref(); } if (searchHolder != null) { searchHolder.decref(); /* decrement 1 for _searcher (searchHolder will never become _searcher now) */ if (returnSearcher) { searchHolder.decref(); /* decrement 1 because we won't be returning the searcher to the user */ } } } /* we want to do this after we decrement onDeckSearchers so another thread doesn't increment first and throw a false warning */ openSearcherLock.unlock(); }
145
finish 3
                  
// in core/src/java/org/apache/solr/handler/ContentStreamHandlerBase.java
finally { /* finish the request */ processor.finish(); }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
finally{ processor.finish(); }
// in core/src/java/org/apache/solr/update/PeerSync.java
finally { try { proc.finish(); } catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,finish()", e); return false; } }
18
getAndSet 3
                  
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
finally { addCommandsCumulative.set( addCommandsCumulative.get() - addCommands.getAndSet( 0 ) ); deleteByIdCommandsCumulative.set( deleteByIdCommandsCumulative.get() - deleteByIdCommands.getAndSet( 0 ) ); deleteByQueryCommandsCumulative.set( deleteByQueryCommandsCumulative.get() - deleteByQueryCommands.getAndSet( 0 ) ); if (error) numErrors.incrementAndGet(); }
4
getEntity 3
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { try { if (response != null) { response.getEntity().getContent().close(); } } catch (Exception ex) { } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ROW_END, epw.getEntity().getName(), null); if (epw.getEntity().isDocRoot()) getDebugLogger().log(DIHLogLevels.END_DOC, null, null); } }
56
getInputStream 3
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
finally { /* some XML parsers are broken and don't close the byte stream (but they should according to spec) */ IOUtils.closeQuietly(xsltSource.getInputStream()); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
finally { if (process != null) { IOUtils.closeQuietly( process.getOutputStream() ); IOUtils.closeQuietly( process.getInputStream() ); IOUtils.closeQuietly( process.getErrorStream() ); } }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
finally { /* some XML parsers are broken and don't close the byte stream (but they should according to spec) */ IOUtils.closeQuietly(src.getInputStream()); }
12
getName 3
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ROW_END, epw.getEntity().getName(), null); if (epw.getEntity().isDocRoot()) getDebugLogger().log(DIHLogLevels.END_DOC, null, null); } }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
finally { /* cache the shortname vs FQN if it is loaded by the webapp classloader and it is loaded using a shortname */ if (clazz != null && clazz.getClassLoader() == SolrResourceLoader.class.getClassLoader() && !cname.equals(clazz.getName()) && (subpackages.length == 0 || subpackages == packages)) { /* store in the cache */ classNameCache.put(cname, clazz.getName()); } }
504
release 3
                  
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
finally { if (searchers != null) { for (RefCounted<SolrIndexSearcher> searcher : searchers) { if (searcher != null) searcher.decref(); } } if (sourceCores != null) { for (SolrCore solrCore : sourceCores) { if (solrCore != null) solrCore.close(); } } if (readersToBeClosed != null) IOUtils.closeWhileHandlingException(readersToBeClosed); if (dirsToBeReleased != null) { for (Directory dir : dirsToBeReleased) { DirectoryFactory dirFactory = core.getDirectoryFactory(); dirFactory.release(dir); } } if (wrappedReq != null) wrappedReq.close(); core.close(); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
finally { replicationHandler.core.getDeletionPolicy().releaseCommitPoint(indexCommit.getGeneration()); replicationHandler.snapShootDetails = details; if (lock != null) { try { lock.release(); } catch (IOException e) { LOG.error("Unable to release snapshoot lock: " + directoryName + ".lock"); } } }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
finally { isClosed = true; directoryFactory.release(directory); numCloses.incrementAndGet(); }
5
unlockForUpdate 3
                  
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
finally { vinfo.unlockForUpdate(); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
finally { vinfo.unlockForUpdate(); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
finally { if (vinfo != null) { vinfo.unlockForUpdate(); } }
3
SolrException 2
                  
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
finally { if (ratesJsonStream != null) try { ratesJsonStream.close(); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); } }
600
closeIt 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
finally { if (!streamRows) { closeIt(data); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
finally { closeIt(data); if (!isEnd.get()) { offer(END_MARKER); } }
2
countDown 2
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { lock.countDown(); lock = null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { /* allow firstSearcher events to fire and make sure it is released */ latch.countDown(); }
4
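A CountDownLatch must be counted down on failure as well as success; otherwise threads blocked in await() hang forever, which is why the countDown sits in finally. Sketch:

    import java.util.concurrent.CountDownLatch;

    public class LatchSignal {
      static void warmThenSignal(Runnable warmup, CountDownLatch latch) {
        try {
          warmup.run();
        } finally {
          // Release waiters even if warmup failed; they can then observe
          // the failure through other state instead of blocking forever.
          latch.countDown();
        }
      }
    }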
getByteStream 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
finally { /* some XML parsers are broken and don't close the byte stream (but they should according to spec) */ IOUtils.closeQuietly(configFile.getByteStream()); }
// in core/src/java/org/apache/solr/core/Config.java
finally { /* some XML parsers are broken and don't close the byte stream (but they should according to spec) */ IOUtils.closeQuietly(is.getByteStream()); }
3
getClassLoader 2
                  
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
finally { /* cache the shortname vs FQN if it is loaded by the webapp classloader and it is loaded using a shortname */ if (clazz != null && clazz.getClassLoader() == SolrResourceLoader.class.getClassLoader() && !cname.equals(clazz.getName()) && (subpackages.length == 0 || subpackages == packages)) { /* store in the cache */ classNameCache.put(cname, clazz.getName()); } }
11
isOpen 2
                  
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
finally { if (fis != null) try { fis.close(); fis = null; } catch (IOException xio) {} if (fos != null) try { fos.close(); fos = null; } catch (IOException xio) {} if (fcin != null && fcin.isOpen()) try { fcin.close(); fcin = null; } catch (IOException xio) {} if (fcout != null && fcout.isOpen()) try { fcout.close(); fcout = null; } catch (IOException xio) {} }
2
notifyAll 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
finally { synchronized (this) { notifyAll(); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { /* wake up anyone waiting for a searcher, even in the face of errors */ onDeckSearchers--; searcherLock.notifyAll(); }
4
setContextClassLoader 2
                  
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
finally { ct.setContextClassLoader(prev); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
finally { ct.setContextClassLoader(prev); }
4
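The context-classloader pair is a save/restore pattern: the previous loader is captured before the switch and restored in finally, because the worker thread outlives this call and other code relies on its original loader. Sketch:

    public class TcclScope {
      static void runWithClassLoader(ClassLoader cl, Runnable body) {
        Thread ct = Thread.currentThread();
        ClassLoader prev = ct.getContextClassLoader(); // save
        ct.setContextClassLoader(cl);
        try {
          body.run();
        } finally {
          ct.setContextClassLoader(prev); // restore on every exit path
        }
      }
    }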
setSessionAttribute 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
finally { context.setSessionAttribute(HAS_MORE,null,Context.SCOPE_ENTITY); context.setSessionAttribute(NEXT_URL,null,Context.SCOPE_ENTITY); }
11
setStatus 2
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
finally { setStatus(Status.IDLE); DocBuilder.INSTANCE.set(null); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
finally { setStatus(Status.IDLE); DocBuilder.INSTANCE.set(null); }
9
cleanup 1
                  
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { cleanup(); /* if cleanup succeeds, the file is downloaded fully; do an fsync */ fsyncService.submit(new Runnable() { public void run() { try { FileUtils.sync(file); } catch (IOException e) { fsyncException = e; } } }); }
1
closeEntityProcessorWrappers 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (writer != null) { writer.close(); } if (epwList != null) { closeEntityProcessorWrappers(epwList); } if(reqParams.isDebug()) { reqParams.getDebugInfo().debugVerboseOutput = getDebugLogger().output; } }
2
closeWhileHandlingException 1
                  
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
finally { if (searchers != null) { for (RefCounted<SolrIndexSearcher> searcher : searchers) { if (searcher != null) searcher.decref(); } } if (sourceCores != null) { for (SolrCore solrCore : sourceCores) { if (solrCore != null) solrCore.close(); } } if (readersToBeClosed != null) IOUtils.closeWhileHandlingException(readersToBeClosed); if (dirsToBeReleased != null) { for (Directory dir : dirsToBeReleased) { DirectoryFactory dirFactory = core.getDirectoryFactory(); dirFactory.release(dir); } } if (wrappedReq != null) wrappedReq.close(); core.close(); }
1
delete 1
                  
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
finally { if (tmpFile != null) { if (!tmpFile.delete()) tmpFile.deleteOnExit(); } }
26
deleteOnExit 1
                  
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
finally { if (tmpFile != null) { if (!tmpFile.delete()) tmpFile.deleteOnExit(); } }
1
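The delete/deleteOnExit pair above is a two-stage temp-file cleanup: try an immediate delete, and if that fails (for example because the file is still open or memory-mapped on Windows), fall back to asking the JVM to retry at shutdown. Sketch:

    import java.io.File;

    public class TempFileCleanup {
      static void discard(File tmpFile) {
        if (tmpFile != null && !tmpFile.delete()) {
          // Immediate delete failed; schedule a retry at JVM shutdown.
          tmpFile.deleteOnExit();
        }
      }
    }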
destroy 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { for (EntityProcessorWrapper entityWrapper : entitiesToDestroy) { entityWrapper.destroy(); } resetEntity(epw); }
9
disconnect 1
                  
// in core/src/java/org/apache/solr/util/SimplePostTool.java
finally { if(urlc!=null) urlc.disconnect(); }
1
dropBufferedUpdates 1
                  
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
finally { if (!replayed) { try { ulog.dropBufferedUpdates(); } catch (Throwable t) { SolrException.log(log, "", t); } } }
1
empty 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
finally { if ((isRecord || !recordStarted) && !stack.empty()) { Set<String> cleanThis = stack.pop(); if (cleanThis != null) { for (String fld : cleanThis) values.remove(fld); } } }
1
equals 1
                  
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
finally { /* cache the shortname vs FQN if it is loaded by the webapp classloader and it is loaded using a shortname */ if (clazz != null && clazz.getClassLoader() == SolrResourceLoader.class.getClassLoader() && !cname.equals(clazz.getName()) && (subpackages.length == 0 || subpackages == packages)) { /* store in the cache */ classNameCache.put(cname, clazz.getName()); } }
645
execute 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { /* remove it from the list of running things unless we are the last runner and the queue is full... in which case, the next queue.put() would block and there would be no runners to handle it. This case has been further handled by using offer instead of put, and using a retry loop to avoid blocking forever (see request()). */ synchronized (runners) { if (runners.size() == 1 && queue.remainingCapacity() == 0) { /* keep this runner alive */ scheduler.execute(this); } else { runners.remove(this); } } log.info("finished: {}", this); runnerLock.unlock(); }
23
fatal 1
                  
// in core/src/java/org/apache/solr/util/SimplePostTool.java
finally { try { if(is!=null) is.close(); } catch (IOException e) { fatal("IOException while closing file: "+ e); } }
14
flushBuffer 1
                  
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
finally { daos.flushBuffer(); }
9
getContent 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { try { if (response != null) { response.getEntity().getContent().close(); } } catch (Exception ex) { } }
8
getDebugInfo 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (writer != null) { writer.close(); } if (epwList != null) { closeEntityProcessorWrappers(epwList); } if(reqParams.isDebug()) { reqParams.getDebugInfo().debugVerboseOutput = getDebugLogger().output; } }
11
getDeletionPolicy 1
                  
// in core/src/java/org/apache/solr/handler/SnapShooter.java
finally { replicationHandler.core.getDeletionPolicy().releaseCommitPoint(indexCommit.getGeneration()); replicationHandler.snapShootDetails = details; if (lock != null) { try { lock.release(); } catch (IOException e) { LOG.error("Unable to release snapshoot lock: " + directoryName + ".lock"); } } }
13
getDirectoryFactory 1
                  
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
finally { if (searchers != null) { for (RefCounted<SolrIndexSearcher> searcher : searchers) { if (searcher != null) searcher.decref(); } } if (sourceCores != null) { for (SolrCore solrCore : sourceCores) { if (solrCore != null) solrCore.close(); } } if (readersToBeClosed != null) IOUtils.closeWhileHandlingException(readersToBeClosed); if (dirsToBeReleased != null) { for (Directory dir : dirsToBeReleased) { DirectoryFactory dirFactory = core.getDirectoryFactory(); dirFactory.release(dir); } } if (wrappedReq != null) wrappedReq.close(); core.close(); }
8
getErrorStream 1
                  
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
finally { if (process != null) { IOUtils.closeQuietly( process.getOutputStream() ); IOUtils.closeQuietly( process.getInputStream() ); IOUtils.closeQuietly( process.getErrorStream() ); } }
1
getGeneration 1
                  
// in core/src/java/org/apache/solr/handler/SnapShooter.java
finally { replicationHandler.core.getDeletionPolicy().releaseCommitPoint(indexCommit.getGeneration()); replicationHandler.snapShootDetails = details; if (lock != null) { try { lock.release(); } catch (IOException e) { LOG.error("Unable to release snapshoot lock: " + directoryName + ".lock"); } } }
18
getOutputStream 1
                  
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
finally { if (process != null) { IOUtils.closeQuietly( process.getOutputStream() ); IOUtils.closeQuietly( process.getInputStream() ); IOUtils.closeQuietly( process.getErrorStream() ); } }
5
getTimeElapsedSince 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
finally { log(DIHLogLevels.ENTITY_META, "time-taken", DocBuilder.getTimeElapsedSince(start)); }
4
info 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { /* remove it from the list of running things unless we are the last runner and the queue is full... in which case, the next queue.put() would block and there would be no runners to handle it. This case has been further handled by using offer instead of put, and using a retry loop to avoid blocking forever (see request()). */ synchronized (runners) { if (runners.size() == 1 && queue.remainingCapacity() == 0) { /* keep this runner alive */ scheduler.execute(this); } else { runners.remove(this); } } log.info("finished: {}", this); runnerLock.unlock(); }
372
isDebug 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (writer != null) { writer.close(); } if (epwList != null) { closeEntityProcessorWrappers(epwList); } if(reqParams.isDebug()) { reqParams.getDebugInfo().debugVerboseOutput = getDebugLogger().output; } }
16
isDocRoot 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ROW_END, epw.getEntity().getName(), null); if (epw.getEntity().isDocRoot()) getDebugLogger().log(DIHLogLevels.END_DOC, null, null); } }
9
isShutdown 1
                  
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { if (!successfulInstall) { logReplicationTimeAndConfFiles(null, successfulInstall); } filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null; replicationStartTime = 0; fileFetcher = null; if (fsyncService != null && !fsyncService.isShutdown()) fsyncService.shutdownNow(); fsyncService = null; stop = false; fsyncException = null; }
2
logReplicationTimeAndConfFiles 1
                  
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { if (!successfulInstall) { logReplicationTimeAndConfFiles(null, successfulInstall); } filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null; replicationStartTime = 0; fileFetcher = null; if (fsyncService != null && !fsyncService.isShutdown()) fsyncService.shutdownNow(); fsyncService = null; stop = false; fsyncException = null; }
3
msg 1
                  
// in core/src/java/org/apache/solr/update/PeerSync.java
finally { try { proc.finish(); } catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,finish()", e); return false; } }
49
notify 1
                  
// in core/src/java/org/apache/solr/core/SolrCore.java
finally { if (!success) { synchronized (searcherLock) { onDeckSearchers--; if (onDeckSearchers < 0) { /* sanity check... should never happen */ log.error(logid + "ERROR!!! onDeckSearchers after decrement=" + onDeckSearchers); onDeckSearchers = 0; /* try and recover */ } /* if we failed, we need to wake up at least one waiter to continue the process */ searcherLock.notify(); } if (currSearcherHolder != null) { currSearcherHolder.decref(); } if (searchHolder != null) { searchHolder.decref(); /* decrement 1 for _searcher (searchHolder will never become _searcher now) */ if (returnSearcher) { searchHolder.decref(); /* decrement 1 because we won't be returning the searcher to the user */ } } } /* we want to do this after we decrement onDeckSearchers so another thread doesn't increment first and throw a false warning */ openSearcherLock.unlock(); }
6
offer 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
finally { closeIt(data); if (!isEnd.get()) { offer(END_MARKER); } }
6
pop 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
finally { if ((isRecord || !recordStarted) && !stack.empty()) { Set<String> cleanThis = stack.pop(); if (cleanThis != null) { for (String fld : cleanThis) values.remove(fld); } } }
16
printStackTrace 1
                  
// in core/src/java/org/apache/solr/schema/CurrencyField.java
finally { try { if (is != null) { is.close(); } } catch (IOException e) { e.printStackTrace(); } }
17
put 1
                  
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
finally { /* cache the shortname vs FQN if it is loaded by the webapp classloader and it is loaded using a shortname */ if (clazz != null && clazz.getClassLoader() == SolrResourceLoader.class.getClassLoader() && !cname.equals(clazz.getName()) && (subpackages.length == 0 || subpackages == packages)) { /* store in the cache */ classNameCache.put(cname, clazz.getName()); } }
670
releaseCommitPoint 1
                  
// in core/src/java/org/apache/solr/handler/SnapShooter.java
finally { replicationHandler.core.getDeletionPolicy().releaseCommitPoint(indexCommit.getGeneration()); replicationHandler.snapShootDetails = details; if (lock != null) { try { lock.release(); } catch (IOException e) { LOG.error("Unable to release snapshoot lock: " + directoryName + ".lock"); } } }
1
remainingCapacity 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { /* remove it from the list of running things unless we are the last runner and the queue is full... in which case, the next queue.put() would block and there would be no runners to handle it. This case has been further handled by using offer instead of put, and using a retry loop to avoid blocking forever (see request()). */ synchronized (runners) { if (runners.size() == 1 && queue.remainingCapacity() == 0) { /* keep this runner alive */ scheduler.execute(this); } else { runners.remove(this); } } log.info("finished: {}", this); runnerLock.unlock(); }
2
removeNamespace 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { resolver.removeNamespace(entity); }
2
resetEntity 1
                  
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
finally { for (EntityProcessorWrapper entityWrapper : entitiesToDestroy) { entityWrapper.destroy(); } resetEntity(epw); }
3
shutdownNow 1
                  
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { if (!successfulInstall) { logReplicationTimeAndConfFiles(null, successfulInstall); } filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null; replicationStartTime = 0; fileFetcher = null; if (fsyncService != null && !fsyncService.isShutdown()) fsyncService.shutdownNow(); fsyncService = null; stop = false; fsyncException = null; }
11
size 1
                  
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
finally { /* remove it from the list of running things unless we are the last runner and the queue is full... in which case, the next queue.put() would block and there would be no runners to handle it. This case has been further handled by using offer instead of put, and using a retry loop to avoid blocking forever (see request()). */ synchronized (runners) { if (runners.size() == 1 && queue.remainingCapacity() == 0) { /* keep this runner alive */ scheduler.execute(this); } else { runners.remove(this); } } log.info("finished: {}", this); runnerLock.unlock(); }
650
stop 1
                  
// in core/src/java/org/apache/solr/core/CoreContainer.java
finally { if(zkController != null) { zkController.close(); } if (zkServer != null) { zkServer.stop(); } if (shardHandlerFactory != null) { shardHandlerFactory.close(); } isShutDown = true; }
25
submit 1
                  
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { cleanup(); /* if cleanup succeeds, the file is downloaded fully; do an fsync */ fsyncService.submit(new Runnable() { public void run() { try { FileUtils.sync(file); } catch (IOException e) { fsyncException = e; } } }); }
23
sync 1
                  
// in core/src/java/org/apache/solr/handler/SnapPuller.java
finally { cleanup(); /* if cleanup succeeds, the file is downloaded fully; do an fsync */ fsyncService.submit(new Runnable() { public void run() { try { FileUtils.sync(file); } catch (IOException e) { fsyncException = e; } } }); }
7

Reference Table

This table concatenates the results of the previous tables.

Checked/Runtime Type | Exception | Thrown | Thrown from Catch | Declared | Caught directly | Caught with Thrown | Caught with Thrown Runtime
unknown (Lib) . 0 0 0 0 0 0
unknown (Lib) AssertionError 2 0 0 0 0 0
unknown (Lib) AttributeNotFoundException 1
            
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public Object getAttribute(String attribute) throws AttributeNotFoundException, MBeanException, ReflectionException { Object val; if ("coreHashCode".equals(attribute)) { val = coreHashCode; } else if (staticStats.contains(attribute) && attribute != null && attribute.length() > 0) { try { String getter = "get" + attribute.substring(0, 1).toUpperCase(Locale.ENGLISH) + attribute.substring(1); Method meth = infoBean.getClass().getMethod(getter); val = meth.invoke(infoBean); } catch (Exception e) { throw new AttributeNotFoundException(attribute); } } else { NamedList list = infoBean.getStatistics(); val = list.get(attribute); } if (val != null) { /* It's a String or one of the simple types; just return it, as JMX suggests direct support for such types */ for (String simpleTypeName : SimpleType.ALLOWED_CLASSNAMES_LIST) { if (val.getClass().getName().equals(simpleTypeName)) { return val; } } /* It's an arbitrary object which could be something complex and odd; return its toString, assuming that is a workable representation of the object */ return val.toString(); } return null; }
1
            
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new AttributeNotFoundException(attribute); }
2
            
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public Object getAttribute(String attribute) throws AttributeNotFoundException, MBeanException, ReflectionException { Object val; if ("coreHashCode".equals(attribute)) { val = coreHashCode; } else if (staticStats.contains(attribute) && attribute != null && attribute.length() > 0) { try { String getter = "get" + attribute.substring(0, 1).toUpperCase(Locale.ENGLISH) + attribute.substring(1); Method meth = infoBean.getClass().getMethod(getter); val = meth.invoke(infoBean); } catch (Exception e) { throw new AttributeNotFoundException(attribute); } } else { NamedList list = infoBean.getStatistics(); val = list.get(attribute); } if (val != null) { /* It's a String or one of the simple types; just return it, as JMX suggests direct support for such types */ for (String simpleTypeName : SimpleType.ALLOWED_CLASSNAMES_LIST) { if (val.getClass().getName().equals(simpleTypeName)) { return val; } } /* It's an arbitrary object which could be something complex and odd; return its toString, assuming that is a workable representation of the object */ return val.toString(); } return null; }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public void setAttribute(Attribute attribute) throws AttributeNotFoundException, InvalidAttributeValueException, MBeanException, ReflectionException { throw new UnsupportedOperationException("Operation not Supported"); }
0 0 0
runtime (Domain) BindingException
public class BindingException extends RuntimeException {

  public BindingException(String message) {
    super(message);
  }

  public BindingException(String message, Throwable cause) {
    super(message, cause);
  }
}
8
            
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
private <T> T getBean(Class<T> clazz, List<DocField> fields, SolrDocument solrDoc) { if (fields == null) { fields = getDocFields(clazz); } try { T obj = clazz.newInstance(); for (DocField docField : fields) { docField.inject(obj, solrDoc); } return obj; } catch (Exception e) { throw new BindingException("Could not instantiate object of " + clazz, e); } }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
public SolrInputDocument toSolrInputDocument(Object obj) { List<DocField> fields = getDocFields(obj.getClass()); if (fields.isEmpty()) { throw new BindingException("class: " + obj.getClass() + " does not define any fields."); } SolrInputDocument doc = new SolrInputDocument(); for (DocField field : fields) { if (field.dynamicFieldNamePatternMatcher != null && field.get(obj) != null && field.isContainedInMap) { Map<String, Object> mapValue = (Map<String, Object>) field.get(obj); for (Map.Entry<String, Object> e : mapValue.entrySet()) { doc.setField(e.getKey(), e.getValue(), 1.0f); } } else { doc.setField(field.name, field.get(obj), 1.0f); } } return doc; }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
private void storeType() { if (field != null) { type = field.getType(); } else { Class[] params = setter.getParameterTypes(); if (params.length != 1) { throw new BindingException("Invalid setter method. Must have one and only one parameter"); } type = params[0]; } if (type == Collection.class || type == List.class || type == ArrayList.class) { type = Object.class; isList = true; } else if (type == byte[].class) { /* no op */ } else if (type.isArray()) { isArray = true; type = type.getComponentType(); } else if (type == Map.class || type == HashMap.class) { /* corresponding to the support for dynamicFields */ isContainedInMap = true; /* assigned a default type */ type = Object.class; if (field != null) { if (field.getGenericType() instanceof ParameterizedType) { /* check what the generic values are */ ParameterizedType parameterizedType = (ParameterizedType) field.getGenericType(); Type[] types = parameterizedType.getActualTypeArguments(); if (types != null && types.length == 2 && types[0] == String.class) { /* the key should always be String; raw and primitive types */ if (types[1] instanceof Class) { /* the value could be multivalued, then it is a List, Collection, or ArrayList */ if (types[1] == Collection.class || types[1] == List.class || types[1] == ArrayList.class) { type = Object.class; isList = true; } else { /* else assume it is a primitive and put in the source type itself */ type = (Class) types[1]; } } else if (types[1] instanceof ParameterizedType) { /* of all the parameterized types, only List is supported */ Type rawType = ((ParameterizedType) types[1]).getRawType(); if (rawType == Collection.class || rawType == List.class || rawType == ArrayList.class) { type = Object.class; isList = true; } } else if (types[1] instanceof GenericArrayType) { /* array types */ type = (Class) ((GenericArrayType) types[1]).getGenericComponentType(); isArray = true; } else { /* throw an exception if types are not known */ throw new BindingException("Allowed type for values of mapping a dynamicField are : " + "Object, Object[] and List"); } } } } } }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
private void set(Object obj, Object v) { if (v != null && type == ByteBuffer.class && v.getClass() == byte[].class) { v = ByteBuffer.wrap((byte[]) v); } try { if (field != null) { field.set(obj, v); } else if (setter != null) { setter.invoke(obj, v); } } catch (Exception e) { throw new BindingException("Exception while setting value : " + v + " on " + (field != null ? field : setter), e); } }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
public Object get(final Object obj) { if (field != null) { try { return field.get(obj); } catch (Exception e) { throw new BindingException("Exception while getting value: " + field, e); } } else if (getter == null) { throw new BindingException("Missing getter for field: " + name + " -- You can only call the 'get' for fields that have a field of 'get' method"); } try { return getter.invoke(obj, (Object[]) null); } catch (Exception e) { throw new BindingException("Exception while getting value: " + getter, e); } }
4
            
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Could not instantiate object of " + clazz, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while setting value : " + v + " on " + (field != null ? field : setter), e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + field, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + getter, e); }
0 0 0 0
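
To show where BindingException surfaces in client code, here is a minimal, hypothetical sketch of binding a SolrDocument onto an annotated bean with DocumentObjectBinder, the public entry point behind the private getBean listed above; the Item bean and its "id" mapping are assumptions for illustration, not part of Solr.

import org.apache.solr.client.solrj.beans.BindingException;
import org.apache.solr.client.solrj.beans.DocumentObjectBinder;
import org.apache.solr.client.solrj.beans.Field;
import org.apache.solr.common.SolrDocument;

public class BindingExceptionDemo {
  // Hypothetical bean: @Field maps the Solr "id" field onto the member.
  public static class Item {
    @Field("id")
    public String id;
  }

  public static void main(String[] args) {
    SolrDocument doc = new SolrDocument();
    doc.setField("id", "doc-1");
    DocumentObjectBinder binder = new DocumentObjectBinder();
    try {
      Item item = binder.getBean(Item.class, doc);
      System.out.println("Bound id: " + item.id);
    } catch (BindingException e) {
      // Unchecked: thrown for instantiation or reflection failures, as above.
      System.err.println("Binding failed: " + e.getMessage());
    }
  }
}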
unknown (Lib) CharacterCodingException 0 0 0 1
            
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (CharacterCodingException ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading resource (wrong encoding?): " + resource, ex); }
1
            
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (CharacterCodingException ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading resource (wrong encoding?): " + resource, ex); }
1
unknown (Lib) ClassCastException 0 0 0 7
            
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (ClassCastException e) { // known edge case where QueryResponse's extraction assumes "response" is a SolrDocumentList // (AnalysisRequestHandler emits a "response") e.printStackTrace(); rsp = new SolrResponseBase(); rsp.setResponse(parsedResponse); }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
catch (ClassCastException e) { log.warn("Exception reading log for updates", e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (ClassCastException cl) { log.warn("Unexpected log entry or corrupt log. Entry=" + o, cl); // would be caused by a corrupt transaction log }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (ClassCastException cl) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: Unexpected log entry or corrupt log. Entry=" + o, cl); // would be caused by a corrupt transaction log }
// in core/src/java/org/apache/solr/update/processor/MinFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/MaxFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
3
            
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); }
// in core/src/java/org/apache/solr/update/processor/MinFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/MaxFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
2
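
The two field-value catch blocks above follow a common translation pattern: a raw ClassCastException from comparing heterogeneous values is rethrown as a SolrException carrying an HTTP-style error code. A minimal sketch of that pattern follows; compareValues is a hypothetical stand-in for the update-processor logic, not Solr code.

import org.apache.solr.common.SolrException;
import static org.apache.solr.common.SolrException.ErrorCode.BAD_REQUEST;

public class ComparableValuesDemo {
  // Hypothetical helper mirroring Min/MaxFieldValueUpdateProcessorFactory.
  @SuppressWarnings({"unchecked", "rawtypes"})
  static int compareValues(Object a, Object b) {
    try {
      return ((Comparable) a).compareTo(b); // may throw ClassCastException
    } catch (ClassCastException e) {
      throw new SolrException(BAD_REQUEST,
          "Field values are not mutually comparable: " + e.getMessage(), e);
    }
  }

  public static void main(String[] args) {
    System.out.println(compareValues(1, 2)); // fine: both Integer
    compareValues(1, "two");                 // throws SolrException (400)
  }
}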
unknown (Lib) ClassNotFoundException 2
            
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public Collection<Class> findExtends(Class<?> clazz) throws ClassNotFoundException { HashSet<Class> results = new HashSet<Class>(); for (JarFile jarFile : jarFiles) { for (Enumeration<JarEntry> e = jarFile.entries(); e.hasMoreElements() ;) { String n = e.nextElement().getName(); if (n.endsWith(".class")) { String cn = n.replace("/",".").substring(0,n.length()-6); Class<?> target; try { target = cl.loadClass(cn); } catch (NoClassDefFoundError e1) { throw new ClassNotFoundException ("Can't load: " + cn, e1); } if (clazz.isAssignableFrom(target) && !target.isAnonymousClass()) { int mods = target.getModifiers(); if (!(Modifier.isAbstract(mods) || Modifier.isInterface(mods))) { results.add(target); } } } } } return results; }
2
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (NoClassDefFoundError e1) { throw new ClassNotFoundException ("Can't load: " + cn, e1); }
3
            
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { final File[] files = new File[args.length]; for (int i = 0; i < args.length; i++) { files[i] = new File(args[i]); } final FindClasses finder = new FindClasses(files); final ClassLoader cl = finder.getClassLoader(); final Class TOKENSTREAM = cl.loadClass("org.apache.lucene.analysis.TokenStream"); final Class TOKENIZER = cl.loadClass("org.apache.lucene.analysis.Tokenizer"); final Class TOKENFILTER = cl.loadClass("org.apache.lucene.analysis.TokenFilter"); final Class TOKENIZERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenizerFactory"); final Class TOKENFILTERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenFilterFactory"); final HashSet<Class> result = new HashSet<Class>(finder.findExtends(TOKENIZER)); result.addAll(finder.findExtends(TOKENFILTER)); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENIZERFACTORY), "create", Reader.class).values()); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENFILTERFACTORY), "create", TOKENSTREAM).values()); for (final Class c : result) { System.out.println(c.getName()); } }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { FindClasses finder = new FindClasses(new File(args[1])); ClassLoader cl = finder.getClassLoader(); Class clazz = cl.loadClass(args[0]); if (args.length == 2) { System.out.println("Finding all extenders of " + clazz.getName()); for (Class c : finder.findExtends(clazz)) { System.out.println(c.getName()); } } else { String methName = args[2]; System.out.println("Finding all extenders of " + clazz.getName() + " with method: " + methName); Class[] methArgs = new Class[args.length-3]; for (int i = 3; i < args.length; i++) { methArgs[i-3] = cl.loadClass(args[i]); } Map<Class,Class> map = finder.findMethodReturns (finder.findExtends(clazz),methName, methArgs); for (Class key : map.keySet()) { System.out.println(key.getName() + " => " + map.get(key).getName()); } } }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public Collection<Class> findExtends(Class<?> clazz) throws ClassNotFoundException { HashSet<Class> results = new HashSet<Class>(); for (JarFile jarFile : jarFiles) { for (Enumeration<JarEntry> e = jarFile.entries(); e.hasMoreElements() ;) { String n = e.nextElement().getName(); if (n.endsWith(".class")) { String cn = n.replace("/",".").substring(0,n.length()-6); Class<?> target; try { target = cl.loadClass(cn); } catch (NoClassDefFoundError e1) { throw new ClassNotFoundException ("Can't load: " + cn, e1); } if (clazz.isAssignableFrom(target) && !target.isAnonymousClass()) { int mods = target.getModifiers(); if (!(Modifier.isAbstract(mods) || Modifier.isInterface(mods))) { results.add(target); } } } } } return results; }
7
            
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2StemmerFactory.java
catch (ClassNotFoundException e) { logger .warn( "Could not instantiate Lucene stemmer for Arabic, clustering quality " + "of Arabic content may be degraded. For best quality clusters, " + "make sure Lucene's Arabic analyzer JAR is in the classpath", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (ClassNotFoundException e) { wrapAndThrow(SEVERE, e, "Could not load driver: " + driver); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (ClassNotFoundException e) { throw new InitializationException("Can't find class for stemmer language " + language, e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { //this is unlikely log.error("Unable to load cached class-name : "+ c +" for shortname : "+cname + e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { // ignore... assume first exception is best. } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e1) { // ignore... assume first exception is best. }
3
            
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (ClassNotFoundException e) { throw new InitializationException("Can't find class for stemmer language " + language, e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { // ignore... assume first exception is best. } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }
1
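
The SolrResourceLoader catch above retries a short class name under a list of candidate subpackages and, if nothing matches, reports the first failure ("assume first exception is best"). A minimal, library-free sketch of that fallback search; findClass and the package list here are illustrative, not the Solr API.

public class ShortnameLoaderDemo {
  static Class<?> findClass(String cname, String... packages)
      throws ClassNotFoundException {
    try {
      return Class.forName(cname); // exact name first
    } catch (ClassNotFoundException first) {
      for (String pkg : packages) {
        try {
          return Class.forName(pkg + "." + cname);
        } catch (ClassNotFoundException ignored) {
          // keep trying; the first exception stays the best diagnostic
        }
      }
      throw first; // nothing matched: surface the original failure
    }
  }

  public static void main(String[] args) throws Exception {
    System.out.println(findClass("ArrayList", "java.util").getName());
  }
}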
unknown (Lib) CloneNotSupportedException 0 0 0 3
            
// in core/src/java/org/apache/solr/internal/csv/CSVStrategy.java
catch (CloneNotSupportedException e) { throw new RuntimeException(e); // impossible }
// in core/src/java/org/apache/solr/search/DocSlice.java
catch (CloneNotSupportedException e) {}
// in core/src/java/org/apache/solr/update/UpdateCommand.java
catch (CloneNotSupportedException e) { return null; }
1
            
// in core/src/java/org/apache/solr/internal/csv/CSVStrategy.java
catch (CloneNotSupportedException e) { throw new RuntimeException(e); // impossible }
1
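
The CSVStrategy catch above is the standard "impossible exception" idiom: a class that implements Cloneable must still handle the checked CloneNotSupportedException, so it wraps it in a RuntimeException it knows can never actually be thrown. A minimal sketch (Settings is an illustrative class, not Solr code):

public class Settings implements Cloneable {
  private char delimiter = ',';

  @Override
  public Settings clone() {
    try {
      return (Settings) super.clone();
    } catch (CloneNotSupportedException e) {
      throw new RuntimeException(e); // unreachable: we implement Cloneable
    }
  }

  public static void main(String[] args) {
    Settings a = new Settings();
    Settings b = a.clone();
    System.out.println(a != b); // true: distinct copies
  }
}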
unknown (Lib) ConfigException 4
            
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public static Properties getProperties(String path) throws ConfigException { File configFile = new File(path); LOG.info("Reading configuration from: " + configFile); try { if (!configFile.exists()) { throw new IllegalArgumentException(configFile.toString() + " file is missing"); } Properties cfg = new Properties(); FileInputStream in = new FileInputStream(configFile); try { cfg.load(in); } finally { in.close(); } return cfg; } catch (IOException e) { throw new ConfigException("Error processing " + path, e); } catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); } }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
Override public void parseProperties(Properties zkProp) throws IOException, ConfigException { for (Entry<Object, Object> entry : zkProp.entrySet()) { String key = entry.getKey().toString().trim(); String value = entry.getValue().toString().trim(); if (key.equals("dataDir")) { dataDir = value; } else if (key.equals("dataLogDir")) { dataLogDir = value; } else if (key.equals("clientPort")) { setClientPort(Integer.parseInt(value)); } else if (key.equals("tickTime")) { tickTime = Integer.parseInt(value); } else if (key.equals("initLimit")) { initLimit = Integer.parseInt(value); } else if (key.equals("syncLimit")) { syncLimit = Integer.parseInt(value); } else if (key.equals("electionAlg")) { electionAlg = Integer.parseInt(value); } else if (key.equals("maxClientCnxns")) { maxClientCnxns = Integer.parseInt(value); } else if (key.startsWith("server.")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); String parts[] = value.split(":"); if ((parts.length != 2) && (parts.length != 3)) { LOG.error(value + " does not have the form host:port or host:port:port"); } InetSocketAddress addr = new InetSocketAddress(parts[0], Integer.parseInt(parts[1])); if (parts.length == 2) { servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr)); } else if (parts.length == 3) { InetSocketAddress electionAddr = new InetSocketAddress( parts[0], Integer.parseInt(parts[2])); servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr, electionAddr)); } } else if (key.startsWith("group")) { int dot = key.indexOf('.'); long gid = Long.parseLong(key.substring(dot + 1)); numGroups++; String parts[] = value.split(":"); for(String s : parts){ long sid = Long.parseLong(s); if(serverGroup.containsKey(sid)) throw new ConfigException("Server " + sid + "is in multiple groups"); else serverGroup.put(sid, gid); } } else if(key.startsWith("weight")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); serverWeight.put(sid, Long.parseLong(value)); } else { System.setProperty("zookeeper." + key, value); } } if (dataDir == null) { throw new IllegalArgumentException("dataDir is not set"); } if (dataLogDir == null) { dataLogDir = dataDir; } else { if (!new File(dataLogDir).isDirectory()) { throw new IllegalArgumentException("dataLogDir " + dataLogDir + " is missing."); } } if (tickTime == 0) { throw new IllegalArgumentException("tickTime is not set"); } if (servers.size() > 1) { if (initLimit == 0) { throw new IllegalArgumentException("initLimit is not set"); } if (syncLimit == 0) { throw new IllegalArgumentException("syncLimit is not set"); } /* * If using FLE, then every server requires a separate election * port. */ if (electionAlg != 0) { for (QuorumPeer.QuorumServer s : servers.values()) { if (s.electionAddr == null) throw new IllegalArgumentException( "Missing election port for server: " + s.id); } } /* * Default of quorum config is majority */ if(serverGroup.size() > 0){ if(servers.size() != serverGroup.size()) throw new ConfigException("Every server must be in exactly one group"); /* * The deafult weight of a server is 1 */ for(QuorumPeer.QuorumServer s : servers.values()){ if(!serverWeight.containsKey(s.id)) serverWeight.put(s.id, (long) 1); } /* * Set the quorumVerifier to be QuorumHierarchical */ quorumVerifier = new QuorumHierarchical(numGroups, serverWeight, serverGroup); } else { /* * The default QuorumVerifier is QuorumMaj */ LOG.info("Defaulting to majority quorums"); quorumVerifier = new QuorumMaj(servers.size()); } File myIdFile = new File(dataDir, "myid"); if (!myIdFile.exists()) { ///////////////// ADDED FOR SOLR ////// Long myid = getMySeverId(); if (myid != null) { serverId = myid; return; } if (zkRun == null) return; //////////////// END ADDED FOR SOLR ////// throw new IllegalArgumentException(myIdFile.toString() + " file is missing"); } BufferedReader br = new BufferedReader(new FileReader(myIdFile)); String myIdString; try { myIdString = br.readLine(); } finally { br.close(); } try { serverId = Long.parseLong(myIdString); } catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); } } }
2
            
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); }
2
            
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public static Properties getProperties(String path) throws ConfigException { File configFile = new File(path); LOG.info("Reading configuration from: " + configFile); try { if (!configFile.exists()) { throw new IllegalArgumentException(configFile.toString() + " file is missing"); } Properties cfg = new Properties(); FileInputStream in = new FileInputStream(configFile); try { cfg.load(in); } finally { in.close(); } return cfg; } catch (IOException e) { throw new ConfigException("Error processing " + path, e); } catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); } }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
Override public void parseProperties(Properties zkProp) throws IOException, ConfigException { for (Entry<Object, Object> entry : zkProp.entrySet()) { String key = entry.getKey().toString().trim(); String value = entry.getValue().toString().trim(); if (key.equals("dataDir")) { dataDir = value; } else if (key.equals("dataLogDir")) { dataLogDir = value; } else if (key.equals("clientPort")) { setClientPort(Integer.parseInt(value)); } else if (key.equals("tickTime")) { tickTime = Integer.parseInt(value); } else if (key.equals("initLimit")) { initLimit = Integer.parseInt(value); } else if (key.equals("syncLimit")) { syncLimit = Integer.parseInt(value); } else if (key.equals("electionAlg")) { electionAlg = Integer.parseInt(value); } else if (key.equals("maxClientCnxns")) { maxClientCnxns = Integer.parseInt(value); } else if (key.startsWith("server.")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); String parts[] = value.split(":"); if ((parts.length != 2) && (parts.length != 3)) { LOG.error(value + " does not have the form host:port or host:port:port"); } InetSocketAddress addr = new InetSocketAddress(parts[0], Integer.parseInt(parts[1])); if (parts.length == 2) { servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr)); } else if (parts.length == 3) { InetSocketAddress electionAddr = new InetSocketAddress( parts[0], Integer.parseInt(parts[2])); servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr, electionAddr)); } } else if (key.startsWith("group")) { int dot = key.indexOf('.'); long gid = Long.parseLong(key.substring(dot + 1)); numGroups++; String parts[] = value.split(":"); for(String s : parts){ long sid = Long.parseLong(s); if(serverGroup.containsKey(sid)) throw new ConfigException("Server " + sid + "is in multiple groups"); else serverGroup.put(sid, gid); } } else if(key.startsWith("weight")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); serverWeight.put(sid, Long.parseLong(value)); } else { System.setProperty("zookeeper." + key, value); } } if (dataDir == null) { throw new IllegalArgumentException("dataDir is not set"); } if (dataLogDir == null) { dataLogDir = dataDir; } else { if (!new File(dataLogDir).isDirectory()) { throw new IllegalArgumentException("dataLogDir " + dataLogDir + " is missing."); } } if (tickTime == 0) { throw new IllegalArgumentException("tickTime is not set"); } if (servers.size() > 1) { if (initLimit == 0) { throw new IllegalArgumentException("initLimit is not set"); } if (syncLimit == 0) { throw new IllegalArgumentException("syncLimit is not set"); } /* * If using FLE, then every server requires a separate election * port. */ if (electionAlg != 0) { for (QuorumPeer.QuorumServer s : servers.values()) { if (s.electionAddr == null) throw new IllegalArgumentException( "Missing election port for server: " + s.id); } } /* * Default of quorum config is majority */ if(serverGroup.size() > 0){ if(servers.size() != serverGroup.size()) throw new ConfigException("Every server must be in exactly one group"); /* * The deafult weight of a server is 1 */ for(QuorumPeer.QuorumServer s : servers.values()){ if(!serverWeight.containsKey(s.id)) serverWeight.put(s.id, (long) 1); } /* * Set the quorumVerifier to be QuorumHierarchical */ quorumVerifier = new QuorumHierarchical(numGroups, serverWeight, serverGroup); } else { /* * The default QuorumVerifier is QuorumMaj */ LOG.info("Defaulting to majority quorums"); quorumVerifier = new QuorumMaj(servers.size()); } File myIdFile = new File(dataDir, "myid"); if (!myIdFile.exists()) { ///////////////// ADDED FOR SOLR ////// Long myid = getMySeverId(); if (myid != null) { serverId = myid; return; } if (zkRun == null) return; //////////////// END ADDED FOR SOLR ////// throw new IllegalArgumentException(myIdFile.toString() + " file is missing"); } BufferedReader br = new BufferedReader(new FileReader(myIdFile)); String myIdString; try { myIdString = br.readLine(); } finally { br.close(); } try { serverId = Long.parseLong(myIdString); } catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); } } }
1
            
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (QuorumPeerConfig.ConfigException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
1
            
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (QuorumPeerConfig.ConfigException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
1
unknown (Lib) ConnectException 0 0 0 2
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (ConnectException e) { throw new SolrServerException("Server refused connection at: " + getBaseURL(), e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch( ConnectException cex ) { srsp.setException(cex); //???? }
1
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (ConnectException e) { throw new SolrServerException("Server refused connection at: " + getBaseURL(), e); }
0
unknown (Lib) ConnectionLossException 0 0 0 2
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } }
3
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } }
1
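
Both catch blocks above treat ZooKeeper connection loss as transient: ZkCmdExecutor remembers the first failure and retries after a delay. A minimal, self-contained sketch of that retry shape follows; ZkOperation, maxRetries, and the backoff are assumptions for the example, not Solr's exact API.

public class RetryOnConnectionLossDemo {
  // Stand-ins for the ZooKeeper types used by ZkCmdExecutor.
  static class ConnectionLossException extends Exception {}
  interface ZkOperation<T> { T execute() throws Exception; }

  static <T> T retryOperation(ZkOperation<T> op, int maxRetries)
      throws Exception {
    ConnectionLossException first = null;
    for (int attempt = 0; attempt < maxRetries; attempt++) {
      try {
        return op.execute();
      } catch (ConnectionLossException e) {
        if (first == null) first = e;       // keep the first failure, as above
        Thread.sleep(50L * (attempt + 1));  // crude linear backoff
      }
    }
    throw first; // retry budget exhausted: surface the connection loss
  }

  public static void main(String[] args) throws Exception {
    System.out.println(retryOperation(() -> "ok", 3));
  }
}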
runtime (Domain) DataImportHandlerException
public class DataImportHandlerException extends RuntimeException {
  private int errCode;

  public boolean debugged = false;

  public static final int SEVERE = 500, WARN = 400, SKIP = 300, SKIP_ROW =301;

  public DataImportHandlerException(int err) {
    super();
    errCode = err;
  }

  public DataImportHandlerException(int err, String message) {
    super(message + (SolrWriter.getDocCount() == null ? "" : MSG + SolrWriter.getDocCount()));
    errCode = err;
  }

  public DataImportHandlerException(int err, String message, Throwable cause) {
    super(message + (SolrWriter.getDocCount() == null ? "" : MSG + SolrWriter.getDocCount()), cause);
    errCode = err;
  }

  public DataImportHandlerException(int err, Throwable cause) {
    super(cause);
    errCode = err;
  }

  public int getErrCode() {
    return errCode;
  }

  public static void wrapAndThrow(int err, Exception e) {
    if (e instanceof DataImportHandlerException) {
      throw (DataImportHandlerException) e;
    } else {
      throw new DataImportHandlerException(err, e);
    }
  }

  public static void wrapAndThrow(int err, Exception e, String msg) {
    if (e instanceof DataImportHandlerException) {
      throw (DataImportHandlerException) e;
    } else {
      throw new DataImportHandlerException(err, msg, e);
    }
  }


  public static final String MSG = " Processing Document # ";
}
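
wrapAndThrow gives the throw sites below a uniform shape: a checked low-level failure becomes a DataImportHandlerException with a severity code, while an existing DataImportHandlerException passes through unchanged so its original error code survives. A minimal, hypothetical usage sketch; loadConfig is illustrative, standing in for callers like DataImporter.loadDataConfig.

import static org.apache.solr.handler.dataimport.DataImportHandlerException.SEVERE;
import static org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow;

public class WrapAndThrowDemo {
  // Hypothetical worker: any checked failure is wrapped with a severity code.
  static String loadConfig(String path) {
    try {
      if (path == null) throw new java.io.IOException("no config path given");
      return "<dataConfig/>"; // placeholder for the real parsing work
    } catch (Exception e) {
      wrapAndThrow(SEVERE, e, "Exception occurred while initializing context");
      return null; // unreachable: wrapAndThrow always throws
    }
  }

  public static void main(String[] args) {
    loadConfig(null); // throws DataImportHandlerException with errCode SEVERE
  }
}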
68
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
Override protected void firstInit(Context context) { try { String tikaConfigFile = context.getResolvedEntityAttribute("tikaConfig"); if (tikaConfigFile == null) { ClassLoader classLoader = context.getSolrCore().getResourceLoader().getClassLoader(); tikaConfig = new TikaConfig(classLoader); } else { File configFile = new File(tikaConfigFile); if (!configFile.isAbsolute()) { configFile = new File(context.getSolrCore().getResourceLoader().getConfigDir(), tikaConfigFile); } tikaConfig = new TikaConfig(configFile); } } catch (Exception e) { wrapAndThrow (SEVERE, e,"Unable to load Tika Config"); } format = context.getResolvedEntityAttribute("format"); if(format == null) format = "text"; if (!"html".equals(format) && !"xml".equals(format) && !"text".equals(format)&& !"none".equals(format) ) throw new DataImportHandlerException(SEVERE, "'format' can be one of text|html|xml|none"); parser = context.getResolvedEntityAttribute("parser"); if(parser == null) { parser = AUTO_PARSER; } done = false; }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
Override public void init(Context context) { super.init(context); // set attributes using XXX getXXXFromContext(attribute, defualtValue); // applies variable resolver and return default if value is not found or null // REQUIRED : connection and folder info user = getStringFromContext("user", null); password = getStringFromContext("password", null); host = getStringFromContext("host", null); protocol = getStringFromContext("protocol", null); folderNames = getStringFromContext("folders", null); // validate if (host == null || protocol == null || user == null || password == null || folderNames == null) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "'user|password|protocol|host|folders' are required attributes"); //OPTIONAL : have defaults and are optional recurse = getBoolFromContext("recurse", true); String excludes = getStringFromContext("exclude", ""); if (excludes != null && !excludes.trim().equals("")) { exclude = Arrays.asList(excludes.split(",")); } String includes = getStringFromContext("include", ""); if (includes != null && !includes.trim().equals("")) { include = Arrays.asList(includes.split(",")); } batchSize = getIntFromContext("batchSize", 20); customFilter = getStringFromContext("customFilter", ""); String s = getStringFromContext("fetchMailsSince", ""); if (s != null) try { fetchMailsSince = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(s); } catch (ParseException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid value for fetchMailSince: " + s, e); } fetchSize = getIntFromContext("fetchSize", 32 * 1024); cTimeout = getIntFromContext("connectTimeout", 30 * 1000); rTimeout = getIntFromContext("readTimeout", 60 * 1000); processAttachment = getBoolFromContext( getStringFromContext("processAttachment",null) == null ? "processAttachement":"processAttachment" , true); tika = new Tika(); logConfig(); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
private boolean connectToMailBox() { try { Properties props = new Properties(); props.setProperty("mail.store.protocol", protocol); props.setProperty("mail.imap.fetchsize", "" + fetchSize); props.setProperty("mail.imap.timeout", "" + rTimeout); props.setProperty("mail.imap.connectiontimeout", "" + cTimeout); Session session = Session.getDefaultInstance(props, null); mailbox = session.getStore(protocol); mailbox.connect(host, user, password); LOG.info("Connected to mailbox"); return true; } catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Connection failed", e); } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
private void createFilters() { if (fetchMailsSince != null) { filters.add(new MailsSinceLastCheckFilter(fetchMailsSince)); } if (customFilter != null && !customFilter.equals("")) { try { Class cf = Class.forName(customFilter); Object obj = cf.newInstance(); if (obj instanceof CustomFilter) { filters.add((CustomFilter) obj); } } catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Custom filter could not be created", e); } } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
private void getTopLevelFolders(Store mailBox) { if (folderNames != null) topLevelFolders = Arrays.asList(folderNames.split(",")); for (int i = 0; topLevelFolders != null && i < topLevelFolders.size(); i++) { try { folders.add(mailbox.getFolder(topLevelFolders.get(i))); } catch (MessagingException e) { // skip bad ones unless its the last one and still no good folder if (folders.size() == 0 && i == topLevelFolders.size() - 1) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); } } if (topLevelFolders == null || topLevelFolders.size() == 0) { try { folders.add(mailBox.getDefaultFolder()); } catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); } } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
public boolean hasNext() { boolean hasMore = current < messagesInCurBatch.length; if (!hasMore && doBatching && currentBatch * batchSize < totalInFolder) { // try next batch try { getNextBatch(batchSize, folder); hasMore = current < messagesInCurBatch.length; } catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); } } return hasMore; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
private void loadDataConfig(InputSource configFile) { try { DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance(); // only enable xinclude, if a a SolrCore and SystemId is present (makes no sense otherwise) if (core != null && configFile.getSystemId() != null) { try { dbf.setXIncludeAware(true); dbf.setNamespaceAware(true); } catch( UnsupportedOperationException e ) { LOG.warn( "XML parser doesn't support XInclude option" ); } } DocumentBuilder builder = dbf.newDocumentBuilder(); if (core != null) builder.setEntityResolver(new SystemIdResolver(core.getResourceLoader())); builder.setErrorHandler(XMLLOG); Document document; try { document = builder.parse(configFile); } finally { // some XML parsers are broken and don't close the byte stream (but they should according to spec) IOUtils.closeQuietly(configFile.getByteStream()); } config = readFromXml(document); LOG.info("Data Configuration loaded successfully"); } catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Exception occurred while initializing context", e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
public DIHConfiguration readFromXml(Document xmlDocument) { DIHConfiguration config; List<Map<String, String >> functions = new ArrayList<Map<String ,String>>(); Script script = null; Map<String, Properties> dataSources = new HashMap<String, Properties>(); NodeList dataConfigTags = xmlDocument.getElementsByTagName("dataConfig"); if(dataConfigTags == null || dataConfigTags.getLength() == 0) { throw new DataImportHandlerException(SEVERE, "the root node '<dataConfig>' is missing"); } Element e = (Element) dataConfigTags.item(0); List<Element> documentTags = ConfigParseUtil.getChildNodes(e, "document"); if (documentTags.isEmpty()) { throw new DataImportHandlerException(SEVERE, "DataImportHandler " + "configuration file must have one <document> node."); } List<Element> scriptTags = ConfigParseUtil.getChildNodes(e, ConfigNameConstants.SCRIPT); if (!scriptTags.isEmpty()) { script = new Script(scriptTags.get(0)); } // Add the provided evaluators List<Element> functionTags = ConfigParseUtil.getChildNodes(e, ConfigNameConstants.FUNCTION); if (!functionTags.isEmpty()) { for (Element element : functionTags) { String func = ConfigParseUtil.getStringAttribute(element, NAME, null); String clz = ConfigParseUtil.getStringAttribute(element, ConfigNameConstants.CLASS, null); if (func == null || clz == null){ throw new DataImportHandlerException( SEVERE, "<function> must have a 'name' and 'class' attributes"); } else { functions.add(ConfigParseUtil.getAllAttributes(element)); } } } List<Element> dataSourceTags = ConfigParseUtil.getChildNodes(e, DATA_SRC); if (!dataSourceTags.isEmpty()) { for (Element element : dataSourceTags) { Properties p = new Properties(); HashMap<String, String> attrs = ConfigParseUtil.getAllAttributes(element); for (Map.Entry<String, String> entry : attrs.entrySet()) { p.setProperty(entry.getKey(), entry.getValue()); } dataSources.put(p.getProperty("name"), p); } } if(dataSources.get(null) == null){ for (Properties properties : dataSources.values()) { dataSources.put(null,properties); break; } } return new DIHConfiguration(documentTags.get(0), this, functions, script, dataSources); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
DataSource getDataSourceInstance(Entity key, String name, Context ctx) { Properties p = dataSourceProps.get(name); if (p == null) p = config.getDataSources().get(name); if (p == null) p = dataSourceProps.get(null);// for default data source if (p == null) p = config.getDataSources().get(null); if (p == null) throw new DataImportHandlerException(SEVERE, "No dataSource :" + name + " available for entity :" + key.getName()); String type = p.getProperty(TYPE); DataSource dataSrc = null; if (type == null) { dataSrc = new JdbcDataSource(); } else { try { dataSrc = (DataSource) DocBuilder.loadClass(type, getCore()).newInstance(); } catch (Exception e) { wrapAndThrow(SEVERE, e, "Invalid type for data source: " + type); } } try { Properties copyProps = new Properties(); copyProps.putAll(p); Map<String, Object> map = ctx.getRequestParameters(); if (map.containsKey("rows")) { int rows = Integer.parseInt((String) map.get("rows")); if (map.containsKey("start")) { rows += Integer.parseInt((String) map.get("start")); } copyProps.setProperty("maxRows", String.valueOf(rows)); } dataSrc.init(ctx, copyProps); } catch (Exception e) { wrapAndThrow(SEVERE, e, "Failed to initialize DataSource: " + key.getDataSourceName()); } return dataSrc; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
private void checkWritablePersistFile(SolrWriter writer) { // File persistFile = propWriter.getPersistFile(); // boolean isWritable = persistFile.exists() ? persistFile.canWrite() : persistFile.getParentFile().canWrite(); if (isDeltaImportSupported && !propWriter.isWritable()) { throw new DataImportHandlerException(SEVERE, "Properties is not writable. Delta imports are supported by data config but will not work."); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
private void initXpathReader() { useSolrAddXml = Boolean.parseBoolean(context .getEntityAttribute(USE_SOLR_ADD_SCHEMA)); streamRows = Boolean.parseBoolean(context .getEntityAttribute(STREAM)); if (context.getResolvedEntityAttribute("batchSize") != null) { blockingQueueSize = Integer.parseInt(context.getEntityAttribute("batchSize")); } if (context.getResolvedEntityAttribute("readTimeOut") != null) { blockingQueueTimeOut = Integer.parseInt(context.getEntityAttribute("readTimeOut")); } String xslt = context.getEntityAttribute(XSL); if (xslt != null) { xslt = context.replaceTokens(xslt); try { // create an instance of TransformerFactory TransformerFactory transFact = TransformerFactory.newInstance(); final SolrCore core = context.getSolrCore(); final StreamSource xsltSource; if (core != null) { final ResourceLoader loader = core.getResourceLoader(); transFact.setURIResolver(new SystemIdResolver(loader).asURIResolver()); xsltSource = new StreamSource(loader.openResource(xslt), SystemIdResolver.createSystemIdFromResourceName(xslt)); } else { // fallback for tests xsltSource = new StreamSource(xslt); } transFact.setErrorListener(xmllog); try { xslTransformer = transFact.newTransformer(xsltSource); } finally { // some XML parsers are broken and don't close the byte stream (but they should according to spec) IOUtils.closeQuietly(xsltSource.getInputStream()); } LOG.info("Using xslTransformer: " + xslTransformer.getClass().getName()); } catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Error initializing XSL ", e); } } if (useSolrAddXml) { // Support solr add documents xpathReader = new XPathRecordReader("/add/doc"); xpathReader.addField("name", "/add/doc/field/@name", true); xpathReader.addField("value", "/add/doc/field", true); } else { String forEachXpath = context.getEntityAttribute(FOR_EACH); if (forEachXpath == null) throw new DataImportHandlerException(SEVERE, "Entity : " + context.getEntityAttribute("name") + " must have a 'forEach' attribute"); try { xpathReader = new XPathRecordReader(forEachXpath); for (Map<String, String> field : context.getAllEntityFields()) { if (field.get(XPATH) == null) continue; int flags = 0; if ("true".equals(field.get("flatten"))) { flags = XPathRecordReader.FLATTEN; } String xpath = field.get(XPATH); xpath = context.replaceTokens(xpath); xpathReader.addField(field.get(DataImporter.COLUMN), xpath, Boolean.parseBoolean(field.get(DataImporter.MULTI_VALUED)), flags); } } catch (RuntimeException e) { throw new DataImportHandlerException(SEVERE, "Exception while reading xpaths for fields", e); } } String url = context.getEntityAttribute(URL); List<String> l = url == null ? Collections.EMPTY_LIST : TemplateString.getVariables(url); for (String s : l) { if (s.startsWith(entityName + ".")) { if (placeHolderVariables == null) placeHolderVariables = new ArrayList<String>(); placeHolderVariables.add(s.substring(entityName.length() + 1)); } } for (Map<String, String> fld : context.getAllEntityFields()) { if (fld.get(COMMON_FIELD) != null && "true".equals(fld.get(COMMON_FIELD))) { if (commonFields == null) commonFields = new ArrayList<String>(); commonFields.add(fld.get(DataImporter.COLUMN)); } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
Override public DebugInfo pop() { if (size() == 1) throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Stack is becoming empty"); return super.pop(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
Override public void init(Context context) { super.init(context); String s; // init a regex to locate files from the input we want to index s = context.getResolvedEntityAttribute(ACCEPT_LINE_REGEX); if (s != null) { acceptLineRegex = Pattern.compile(s); } // init a regex to locate files from the input to be skipped s = context.getResolvedEntityAttribute(SKIP_LINE_REGEX); if (s != null) { skipLineRegex = Pattern.compile(s); } // the FileName is required. url = context.getResolvedEntityAttribute(URL); if (url == null) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "'"+ URL +"' is a required attribute"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
Override public Map<String, Object> nextRow() { if (reader == null) { reader = new BufferedReader((Reader) context.getDataSource().getData(url)); } String line; while ( true ) { // read a line from the input file try { line = reader.readLine(); } catch (IOException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Problem reading from input", exp); } if (line == null) return null; // end of input // First scan whole line to see if we want it if (acceptLineRegex != null && ! acceptLineRegex.matcher(line).find()) continue; if (skipLineRegex != null && skipLineRegex.matcher(line).find()) continue; // Contruct the 'row' of fields Map<String, Object> row = new HashMap<String, Object>(); row.put("rawLine", line); return row; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
private void initEngine(Context context) { String scriptText = context.getScript(); String scriptLang = context.getScriptLanguage(); if (scriptText == null) { throw new DataImportHandlerException(SEVERE, "<script> tag is not present under <dataConfig>"); } ScriptEngineManager scriptEngineMgr = new ScriptEngineManager(); ScriptEngine scriptEngine = scriptEngineMgr.getEngineByName(scriptLang); if (scriptEngine == null) { throw new DataImportHandlerException(SEVERE, "Cannot load Script Engine for language: " + scriptLang); } if (scriptEngine instanceof Invocable) { engine = (Invocable) scriptEngine; } else { throw new DataImportHandlerException(SEVERE, "The installed ScriptEngine for: " + scriptLang + " does not implement Invocable. Class is " + scriptEngine.getClass().getName()); } try { scriptEngine.eval(scriptText); } catch (ScriptException e) { wrapAndThrow(SEVERE, e, "'eval' failed with language: " + scriptLang + " and script: \n" + scriptText); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldStreamDataSource.java
Override public InputStream getData(String query) { Object o = wrapper.getVariableResolver().resolve(dataField); if (o == null) { throw new DataImportHandlerException(SEVERE, "No field available for name : " + dataField); } if (o instanceof Blob) { Blob blob = (Blob) o; try { //Most of the JDBC drivers have getBinaryStream defined as public // so let us just check it Method m = blob.getClass().getDeclaredMethod("getBinaryStream"); if (Modifier.isPublic(m.getModifiers())) { return (InputStream) m.invoke(blob); } else { // force invoke m.setAccessible(true); return (InputStream) m.invoke(blob); } } catch (Exception e) { LOG.info("Unable to get data from BLOB"); return null; } } else if (o instanceof byte[]) { byte[] bytes = (byte[]) o; return new ByteArrayInputStream(bytes); } else { throw new RuntimeException("unsupported type : " + o.getClass()); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
Override public void persist(Properties p) { OutputStream propOutput = null; Properties props = readIndexerProperties(); try { props.putAll(p); String filePath = configDir; if (configDir != null && !configDir.endsWith(File.separator)) filePath += File.separator; filePath += persistFilename; propOutput = new FileOutputStream(filePath); props.store(propOutput, null); log.info("Wrote last indexed time to " + persistFilename); } catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to persist Index Start Time", e); } finally { try { if (propOutput != null) propOutput.close(); } catch (IOException e) { propOutput = null; } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
Override public String evaluate(String expression, Context context) { List l = parseParams(expression, context.getVariableResolver()); if (l.size() != 1) { throw new DataImportHandlerException(SEVERE, "'escapeSql' must have at least one parameter "); } String s = l.get(0).toString(); // escape single quote with two single quotes, double quote // with two doule quotes, and backslash with double backslash. // See: http://dev.mysql.com/doc/refman/4.1/en/mysql-real-escape-string.html return s.replaceAll("'", "''").replaceAll("\"", "\"\"").replaceAll("\\\\", "\\\\\\\\"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
Override public String evaluate(String expression, Context context) { List l = parseParams(expression, context.getVariableResolver()); if (l.size() != 1) { throw new DataImportHandlerException(SEVERE, "'escapeQueryChars' must have at least one parameter "); } String s = l.get(0).toString(); return ClientUtils.escapeQueryChars(s); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
Override public String evaluate(String expression, Context context) { List l = parseParams(expression, context.getVariableResolver()); if (l.size() != 1) { throw new DataImportHandlerException(SEVERE, "'encodeUrl' must have at least one parameter "); } String s = l.get(0).toString(); try { return URLEncoder.encode(s.toString(), "UTF-8"); } catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to encode expression: " + expression + " with value: " + s); return null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
Override public String evaluate(String expression, Context context) { List l = parseParams(expression, context.getVariableResolver()); if (l.size() != 2) { throw new DataImportHandlerException(SEVERE, "'formatDate()' must have two parameters "); } Object o = l.get(0); Object format = l.get(1); if (format instanceof VariableWrapper) { VariableWrapper wrapper = (VariableWrapper) format; o = wrapper.resolve(); if (o == null) { format = wrapper.varName; LOG.warn("Deprecated syntax used. The syntax of formatDate has been changed to formatDate(<var>, '<date_format_string>'). " + "The old syntax will stop working in Solr 1.5"); } else { format = o.toString(); } } String dateFmt = format.toString(); SimpleDateFormat fmt = new SimpleDateFormat(dateFmt); Date date = null; if (o instanceof VariableWrapper) { VariableWrapper variableWrapper = (VariableWrapper) o; Object variableval = variableWrapper.resolve(); if (variableval instanceof Date) { date = (Date) variableval; } else { String s = variableval.toString(); try { date = DataImporter.DATE_TIME_FORMAT.get().parse(s); } catch (ParseException exp) { wrapAndThrow(SEVERE, exp, "Invalid expression for date"); } } } else { String datemathfmt = o.toString(); datemathfmt = datemathfmt.replaceAll("NOW", ""); try { date = dateMathParser.parseMath(datemathfmt); } catch (ParseException e) { wrapAndThrow(SEVERE, e, "Invalid expression for date"); } } return fmt.format(date); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
public static List parseParams(String expression, VariableResolver vr) { List result = new ArrayList(); expression = expression.trim(); String[] ss = expression.split(","); for (int i = 0; i < ss.length; i++) { ss[i] = ss[i].trim(); if (ss[i].startsWith("'")) {//a string param has started StringBuilder sb = new StringBuilder(); while (true) { sb.append(ss[i]); if (ss[i].endsWith("'")) break; i++; if (i >= ss.length) throw new DataImportHandlerException(SEVERE, "invalid string at " + ss[i - 1] + " in function params: " + expression); sb.append(","); } String s = sb.substring(1, sb.length() - 1); s = s.replaceAll("\\\\'", "'"); result.add(s); } else { if (Character.isDigit(ss[i].charAt(0))) { try { Double doub = Double.parseDouble(ss[i]); result.add(doub); } catch (NumberFormatException e) { if (vr.resolve(ss[i]) == null) { wrapAndThrow( SEVERE, e, "Invalid number :" + ss[i] + "in parameters " + expression); } } } else { result.add(new VariableWrapper(ss[i], vr)); } } } return result; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
protected Callable<Connection> createConnectionFactory(final Context context, final Properties initProps) { // final VariableResolver resolver = context.getVariableResolver(); resolveVariables(context, initProps); final String jndiName = initProps.getProperty(JNDI_NAME); final String url = initProps.getProperty(URL); final String driver = initProps.getProperty(DRIVER); if (url == null && jndiName == null) throw new DataImportHandlerException(SEVERE, "JDBC URL or JNDI name has to be specified"); if (driver != null) { try { DocBuilder.loadClass(driver, context.getSolrCore()); } catch (ClassNotFoundException e) { wrapAndThrow(SEVERE, e, "Could not load driver: " + driver); } } else { if(jndiName == null){ throw new DataImportHandlerException(SEVERE, "One of driver or jndiName must be specified in the data source"); } } String s = initProps.getProperty("maxRows"); if (s != null) { maxRows = Integer.parseInt(s); } return factory = new Callable<Connection>() { public Connection call() throws Exception { LOG.info("Creating a connection for entity " + context.getEntityAttribute(DataImporter.NAME) + " with URL: " + url); long start = System.currentTimeMillis(); Connection c = null; try { if(url != null){ c = DriverManager.getConnection(url, initProps); } else if(jndiName != null){ InitialContext ctx = new InitialContext(); Object jndival = ctx.lookup(jndiName); if (jndival instanceof javax.sql.DataSource) { javax.sql.DataSource dataSource = (javax.sql.DataSource) jndival; String user = (String) initProps.get("user"); String pass = (String) initProps.get("password"); if(user == null || user.trim().equals("")){ c = dataSource.getConnection(); } else { c = dataSource.getConnection(user, pass); } } else { throw new DataImportHandlerException(SEVERE, "the jndi name : '"+jndiName +"' is not a valid javax.sql.DataSource"); } } } catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); } if (c != null) { if (Boolean.parseBoolean(initProps.getProperty("readOnly"))) { c.setReadOnly(true); // Add other sane defaults c.setAutoCommit(true); c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } if (!Boolean.parseBoolean(initProps.getProperty("autoCommit"))) { c.setAutoCommit(false); } String transactionIsolation = initProps.getProperty("transactionIsolation"); if ("TRANSACTION_READ_UNCOMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); } else if ("TRANSACTION_READ_COMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED); } else if ("TRANSACTION_REPEATABLE_READ".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_REPEATABLE_READ); } else if ("TRANSACTION_SERIALIZABLE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE); } else if ("TRANSACTION_NONE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_NONE); } String holdability = initProps.getProperty("holdability"); if ("CLOSE_CURSORS_AT_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } else if ("HOLD_CURSORS_OVER_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.HOLD_CURSORS_OVER_COMMIT); } } LOG.info("Time taken for getConnection(): " + (System.currentTimeMillis() - start)); return c; } }; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
public Connection call() throws Exception { LOG.info("Creating a connection for entity " + context.getEntityAttribute(DataImporter.NAME) + " with URL: " + url); long start = System.currentTimeMillis(); Connection c = null; try { if(url != null){ c = DriverManager.getConnection(url, initProps); } else if(jndiName != null){ InitialContext ctx = new InitialContext(); Object jndival = ctx.lookup(jndiName); if (jndival instanceof javax.sql.DataSource) { javax.sql.DataSource dataSource = (javax.sql.DataSource) jndival; String user = (String) initProps.get("user"); String pass = (String) initProps.get("password"); if(user == null || user.trim().equals("")){ c = dataSource.getConnection(); } else { c = dataSource.getConnection(user, pass); } } else { throw new DataImportHandlerException(SEVERE, "the jndi name : '"+jndiName +"' is not a valid javax.sql.DataSource"); } } } catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); } if (c != null) { if (Boolean.parseBoolean(initProps.getProperty("readOnly"))) { c.setReadOnly(true); // Add other sane defaults c.setAutoCommit(true); c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } if (!Boolean.parseBoolean(initProps.getProperty("autoCommit"))) { c.setAutoCommit(false); } String transactionIsolation = initProps.getProperty("transactionIsolation"); if ("TRANSACTION_READ_UNCOMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); } else if ("TRANSACTION_READ_COMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED); } else if ("TRANSACTION_REPEATABLE_READ".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_REPEATABLE_READ); } else if ("TRANSACTION_SERIALIZABLE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE); } else if ("TRANSACTION_NONE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_NONE); } String holdability = initProps.getProperty("holdability"); if ("CLOSE_CURSORS_AT_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } else if ("HOLD_CURSORS_OVER_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.HOLD_CURSORS_OVER_COMMIT); } } LOG.info("Time taken for getConnection(): " + (System.currentTimeMillis() - start)); return c; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
private DIHCache instantiateCache(Context context) { DIHCache cache = null; try { @SuppressWarnings("unchecked") Class<DIHCache> cacheClass = DocBuilder.loadClass(cacheImplName, context .getSolrCore()); Constructor<DIHCache> constr = cacheClass.getConstructor(); cache = constr.newInstance(); cache.open(context); } catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Cache implementation:" + cacheImplName, e); } return cache; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
protected Map<String,Object> getIdCacheData(Context context, String query, Iterator<Map<String,Object>> rowIterator) { Object key = context.resolve(cacheForeignKey); if (key == null) { throw new DataImportHandlerException(DataImportHandlerException.WARN, "The cache lookup value : " + cacheForeignKey + " is resolved to be null in the entity :" + context.getEntityAttribute("name")); } if (dataSourceRowCache == null) { DIHCache cache = queryVsCache.get(query); if (cache == null) { cache = instantiateCache(context); queryVsCache.put(query, cache); populateCache(query, rowIterator); } dataSourceRowCache = cache.iterator(key); } return getFromRowCacheTransformed(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandlerException.java
public static void wrapAndThrow(int err, Exception e) { if (e instanceof DataImportHandlerException) { throw (DataImportHandlerException) e; } else { throw new DataImportHandlerException(err, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandlerException.java
public static void wrapAndThrow(int err, Exception e, String msg) { if (e instanceof DataImportHandlerException) { throw (DataImportHandlerException) e; } else { throw new DataImportHandlerException(err, msg, e); } }
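Taken together, the two wrapAndThrow overloads implement a wrap-once idiom: a DataImportHandlerException passes through unchanged (preserving its error code), while anything else is wrapped exactly once. A self-contained sketch of the idiom, using a simplified stand-in for the domain exception:

    // Sketch: DomainException is a hypothetical stand-in, not Solr code.
    public class WrapAndThrowSketch {
      static class DomainException extends RuntimeException {
        final int errCode;
        DomainException(int errCode, Throwable cause) { super(cause); this.errCode = errCode; }
      }

      // Rethrow domain exceptions untouched so the original error code survives;
      // wrap everything else exactly once.
      static void wrapAndThrow(int err, Exception e) {
        if (e instanceof DomainException) {
          throw (DomainException) e;
        }
        throw new DomainException(err, e);
      }

      public static void main(String[] args) {
        try {
          wrapAndThrow(500, new java.io.IOException("disk full"));
        } catch (DomainException de) {
          // Prints: 500 caused by java.io.IOException: disk full
          System.out.println(de.errCode + " caused by " + de.getCause());
        }
      }
    }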
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinContentStreamDataSource.java
@Override public InputStream getData(String query) { contentStream = context.getDocBuilder().getReqParams().getContentStream(); if (contentStream == null) throw new DataImportHandlerException(SEVERE, "No stream available. The request has no body"); try { return in = contentStream.getStream(); } catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
@Override public void close() { try { processor.finish(); } catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to call finish() on UpdateRequestProcessor", e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
@Override public void doDeleteAll() { try { DeleteUpdateCommand deleteCommand = new DeleteUpdateCommand(req); deleteCommand.query = "*:*"; processor.processDelete(deleteCommand); } catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in full dump while deleting all documents.", e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/HTMLStripTransformer.java
private Object stripHTML(String value, String column) { StringBuilder out = new StringBuilder(); StringReader strReader = new StringReader(value); try { HTMLStripCharFilter html = new HTMLStripCharFilter(CharReader.get(strReader.markSupported() ? strReader : new BufferedReader(strReader))); char[] cbuf = new char[1024 * 10]; while (true) { int count = html.read(cbuf); if (count == -1) break; // end of stream mark is -1 if (count > 0) out.append(cbuf, 0, count); } html.close(); } catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Failed stripping HTML for column: " + column, e); } return out.toString(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
@Override public Reader getData(String query) { URL url = null; try { if (URIMETHOD.matcher(query).find()) url = new URL(query); else url = new URL(baseUrl + query); LOG.debug("Accessing URL: " + url.toString()); URLConnection conn = url.openConnection(); conn.setConnectTimeout(connectionTimeout); conn.setReadTimeout(readTimeout); InputStream in = conn.getInputStream(); String enc = encoding; if (enc == null) { String cType = conn.getContentType(); if (cType != null) { Matcher m = CHARSET_PATTERN.matcher(cType); if (m.find()) { enc = m.group(1); } } } if (enc == null) enc = UTF_8; DataImporter.QUERY_COUNT.get().incrementAndGet(); return new InputStreamReader(in, enc); } catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
@Override public void init(Context context) { super.init(context); fileName = context.getEntityAttribute(FILE_NAME); if (fileName != null) { fileName = context.replaceTokens(fileName); fileNamePattern = Pattern.compile(fileName); } baseDir = context.getEntityAttribute(BASE_DIR); if (baseDir == null) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "'baseDir' is a required attribute"); baseDir = context.replaceTokens(baseDir); File dir = new File(baseDir); if (!dir.isDirectory()) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "'baseDir' value: " + baseDir + " is not a directory"); String r = context.getEntityAttribute(RECURSIVE); if (r != null) recursive = Boolean.parseBoolean(r); excludes = context.getEntityAttribute(EXCLUDES); if (excludes != null) { excludes = context.replaceTokens(excludes); excludesPattern = Pattern.compile(excludes); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
private Date getDate(String dateStr) { if (dateStr == null) return null; Matcher m = PLACE_HOLDER_PATTERN.matcher(dateStr); if (m.find()) { Object o = context.resolve(m.group(1)); if (o instanceof Date) return (Date)o; dateStr = (String) o; } else { dateStr = context.replaceTokens(dateStr); } m = EvaluatorBag.IN_SINGLE_QUOTES.matcher(dateStr); if (m.find()) { String expr = null; expr = m.group(1).replaceAll("NOW", ""); try { return EvaluatorBag.dateMathParser.parseMath(expr); } catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); } } try { return DataImporter.DATE_TIME_FORMAT.get().parse(dateStr); } catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
@Override protected void firstInit(Context context) { super.firstInit(context); try { String serverPath = context.getResolvedEntityAttribute(SOLR_SERVER); if (serverPath == null) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "SolrEntityProcessor: parameter 'url' is required"); } HttpClient client = getHttpClient(); URL url = new URL(serverPath); // (wt="javabin|xml") default is javabin if ("xml".equals(context.getResolvedEntityAttribute(CommonParams.WT))) { solrServer = new HttpSolrServer(url.toExternalForm(), client, new XMLResponseParser()); LOG.info("using XMLResponseParser"); } else { solrServer = new HttpSolrServer(url.toExternalForm(), client); LOG.info("using BinaryResponseParser"); } } catch (MalformedURLException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
protected SolrDocumentList doQuery(int start) { this.queryString = context.getResolvedEntityAttribute(QUERY); if (this.queryString == null) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "SolrEntityProcessor: parameter 'query' is required" ); } String rowsP = context.getResolvedEntityAttribute(CommonParams.ROWS); if (rowsP != null) { rows = Integer.parseInt(rowsP); } String fqAsString = context.getResolvedEntityAttribute(CommonParams.FQ); if (fqAsString != null) { this.filterQueries = fqAsString.split(","); } String fieldsAsString = context.getResolvedEntityAttribute(CommonParams.FL); if (fieldsAsString != null) { this.fields = fieldsAsString.split(","); } this.queryType = context.getResolvedEntityAttribute(CommonParams.QT); String timeoutAsString = context.getResolvedEntityAttribute(TIMEOUT); if (timeoutAsString != null) { this.timeout = Integer.parseInt(timeoutAsString); } SolrQuery solrQuery = new SolrQuery(queryString); solrQuery.setRows(rows); solrQuery.setStart(start); if (fields != null) { for (String field : fields) { solrQuery.addField(field); } } solrQuery.setQueryType(queryType); solrQuery.setFilterQueries(filterQueries); solrQuery.setTimeAllowed(timeout * 1000); QueryResponse response = null; try { response = solrServer.query(solrQuery); } catch (SolrServerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP_ROW, e); } } return response == null ? null : response.getResults(); }
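Note how doQuery above routes the same SolrServerException to different severities depending on the configured onError policy: ABORT escalates to SEVERE, SKIP downgrades to SKIP_ROW. A sketch of that routing under assumed names (Policy and RowException are illustrative):

    import java.util.concurrent.Callable;

    public class OnErrorRoutingSketch {
      enum Policy { ABORT, SKIP, CONTINUE }

      static class RowException extends RuntimeException {
        final boolean skipRowOnly;
        RowException(boolean skipRowOnly, Throwable cause) { super(cause); this.skipRowOnly = skipRowOnly; }
      }

      static <T> T run(Callable<T> query, Policy onError) {
        try {
          return query.call();
        } catch (Exception e) {
          if (onError == Policy.ABORT) throw new RowException(false, e); // kill the whole run
          if (onError == Policy.SKIP) throw new RowException(true, e);   // caller drops one row
          return null;                                                   // CONTINUE: no result
        }
      }

      public static void main(String[] args) {
        // CONTINUE swallows the failure and yields null instead of a row.
        System.out.println(run(() -> { throw new java.io.IOException(); }, Policy.CONTINUE));
      }
    }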
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContentStreamDataSource.java
@Override public Reader getData(String query) { contentStream = context.getDocBuilder().getReqParams().getContentStream(); if (contentStream == null) throw new DataImportHandlerException(SEVERE, "No stream available. The request has no body"); try { return reader = contentStream.getReader(); } catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
@Override public Reader getData(String query) { Object o = entityProcessor.getVariableResolver().resolve(dataField); if (o == null) { throw new DataImportHandlerException (SEVERE, "No field available for name : " +dataField); } if (o instanceof String) { return new StringReader((String) o); } else if (o instanceof Clob) { Clob clob = (Clob) o; try { //Most of the JDBC drivers have getCharacterStream defined as public // so let us just check it return readCharStream(clob); } catch (Exception e) { LOG.info("Unable to get data from CLOB"); return null; } } else if (o instanceof Blob) { Blob blob = (Blob) o; try { return getReader(blob); } catch (Exception e) { LOG.info("Unable to get data from BLOB"); return null; } } else { return new StringReader(o.toString()); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
private void handleSpecialCommands(Map<String, Object> arow, DocWrapper doc) { Object value = arow.get("$deleteDocById"); if (value != null) { if (value instanceof Collection) { Collection collection = (Collection) value; for (Object o : collection) { writer.deleteDoc(o.toString()); importStatistics.deletedDocCount.incrementAndGet(); } } else { writer.deleteDoc(value); importStatistics.deletedDocCount.incrementAndGet(); } } value = arow.get("$deleteDocByQuery"); if (value != null) { if (value instanceof Collection) { Collection collection = (Collection) value; for (Object o : collection) { writer.deleteByQuery(o.toString()); importStatistics.deletedDocCount.incrementAndGet(); } } else { writer.deleteByQuery(value.toString()); importStatistics.deletedDocCount.incrementAndGet(); } } value = arow.get("$docBoost"); if (value != null) { float value1 = 1.0f; if (value instanceof Number) { value1 = ((Number) value).floatValue(); } else { value1 = Float.parseFloat(value.toString()); } doc.setDocumentBoost(value1); } value = arow.get("$skipDoc"); if (value != null) { if (Boolean.parseBoolean(value.toString())) { throw new DataImportHandlerException(DataImportHandlerException.SKIP, "Document skipped :" + arow); } } value = arow.get("$skipRow"); if (value != null) { if (Boolean.parseBoolean(value.toString())) { throw new DataImportHandlerException(DataImportHandlerException.SKIP_ROW); } } }
26
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (ParseException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid value for fetchMailSince: " + s, e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Connection failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Custom filter could not be created", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { // skip bad ones unless its the last one and still no good folder if (folders.size() == 0 && i == topLevelFolders.size() - 1) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Exception occurred while initializing context", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Error initializing XSL ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (RuntimeException e) { throw new DataImportHandlerException(SEVERE, "Exception while reading xpaths for fields", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
catch (IOException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Problem reading from input", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (Exception e) { LOG.error( "The query failed '" + q + "'", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to persist Index Start Time", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Cache implementation:" + cacheImplName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to call finish() on UpdateRequestProcessor", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in full dump while deleting all documents.", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("method invocation failed on transformer : " + trans, e); throw new DataImportHandlerException(WARN, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/HTMLStripTransformer.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Failed stripping HTML for column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (MalformedURLException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Writer implementation:" + writerClassStr, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
0 5
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.ENTITY_EXCEPTION, null, de); throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, de); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e); } if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){ continue; } if (isRoot) { if (e.getErrCode() == DataImportHandlerException.SKIP) { importStatistics.skipDocCount.getAndIncrement(); doc = null; } else { SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e); } if (e.getErrCode() == DataImportHandlerException.SEVERE) throw e; } else throw e; }
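The DocBuilder catch block above is the consumer side of the error codes seen throughout this sheet: SKIP_ROW continues the loop, SKIP counts a skipped document, SEVERE aborts. A compact sketch of that dispatch with illustrative names:

    public class ErrorCodeDispatchSketch {
      static final int SEVERE = 500, SKIP = 1, SKIP_ROW = 2;

      static class DIHException extends RuntimeException {
        final int errCode;
        DIHException(int errCode) { this.errCode = errCode; }
      }

      static void processRows(int rowCount) {
        for (int i = 0; i < rowCount; i++) {
          try {
            processRow(i);
          } catch (DIHException e) {
            if (e.errCode == SKIP_ROW) continue; // drop this row, keep the loop going
            if (e.errCode == SKIP) continue;     // drop the whole document, keep going
            throw e;                             // SEVERE: abort the entire run
          }
        }
      }

      static void processRow(int i) {
        if (i % 2 == 1) throw new DIHException(SKIP_ROW); // stand-in for a transformer failure
        System.out.println("row " + i + " indexed");
      }

      public static void main(String[] args) { processRows(4); } // prints rows 0 and 2
    }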
6
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.ENTITY_EXCEPTION, null, de); throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (DataImportHandlerException de) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, de); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (DataImportHandlerException e) { throw e; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (DataImportHandlerException e) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), e); } if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){ continue; } if (isRoot) { if (e.getErrCode() == DataImportHandlerException.SKIP) { importStatistics.skipDocCount.getAndIncrement(); doc = null; } else { SolrException.log(LOG, "Exception while processing: " + epw.getEntity().getName() + " document : " + doc, e); } if (e.getErrCode() == DataImportHandlerException.SEVERE) throw e; } else throw e; }
0
unknown (Lib) EOFException 3
            
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readUnsignedByte() throws IOException { if (pos >= end) { refill(); if (pos >= end) { throw new EOFException(); } } return buf[pos++] & 0xff; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public void readFully(byte b[], int off, int len) throws IOException { while (len>0) { int ret = read(b, off, len); if (ret==-1) { throw new EOFException(); } off += ret; len -= ret; } }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public byte readByte() throws IOException { if (pos >= end) { refill(); if (pos >= end) throw new EOFException(); } return buf[pos++]; }
0 0 1
            
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (EOFException e) { break; // this is expected }
0 0
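The JavabinLoader catch above is the one place in this section where EOFException is expected rather than an error: it terminates the read loop for a stream of unknown length. A self-contained sketch of the pattern using DataInputStream:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.EOFException;
    import java.io.IOException;

    public class ExpectedEofSketch {
      public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeInt(7); out.writeInt(42);           // two records, then the stream ends

        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bos.toByteArray()));
        while (true) {
          int record;
          try {
            record = in.readInt();
          } catch (EOFException e) {
            break;                                   // this is expected: no more records
          }
          System.out.println("record " + record);
        }
      }
    }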
checked (Lib) Exception 5
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
private NamedList<Object> processResponse(XMLStreamReader parser) { try { NamedList<Object> response = null; for (int event = parser.next(); event != XMLStreamConstants.END_DOCUMENT; event = parser.next()) { switch (event) { case XMLStreamConstants.START_ELEMENT: if( response != null ) { throw new Exception( "already read the response!" ); } // only top-level element is "response String name = parser.getLocalName(); if( name.equals( "response" ) || name.equals( "result" ) ) { response = readNamedList( parser ); } else if( name.equals( "solr" ) ) { return new SimpleOrderedMap<Object>(); } else { throw new Exception( "really needs to be response or result. " + "not:"+parser.getLocalName() ); } break; } } return response; } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); } finally { try { parser.close(); } catch( Exception ex ){} } }
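processResponse above funnels every parser failure into a single unchecked SolrException and closes the parser quietly in finally. A reduced sketch of that shape (ServerError stands in for SolrException and its error code):

    import java.io.StringReader;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    public class ParseWrapCloseSketch {
      static class ServerError extends RuntimeException {
        ServerError(String msg, Throwable cause) { super(msg, cause); }
      }

      static String firstLocalName(String xml) {
        XMLStreamReader parser = null;
        try {
          parser = XMLInputFactory.newInstance().createXMLStreamReader(new StringReader(xml));
          while (parser.hasNext()) {
            if (parser.next() == XMLStreamConstants.START_ELEMENT) return parser.getLocalName();
          }
          return null;
        } catch (Exception ex) {
          throw new ServerError("parsing error", ex);  // one translation point for all failures
        } finally {
          if (parser != null) {
            try { parser.close(); } catch (Exception ex) { /* ignored, as in the original */ }
          }
        }
      }

      public static void main(String[] args) {
        System.out.println(firstLocalName("<response/>"));  // prints: response
      }
    }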
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
@Override protected void init(IndexSchema schema, Map<String, String> args) { super.init(schema, args); String implName = args.get(PARSER_IMPL); if (implName == null) { parser = new JsonPreAnalyzedParser(); } else { try { Class<?> implClazz = Class.forName(implName); if (!PreAnalyzedParser.class.isAssignableFrom(implClazz)) { throw new Exception("must implement " + PreAnalyzedParser.class.getName()); } Constructor<?> c = implClazz.getConstructor(new Class<?>[0]); parser = (PreAnalyzedParser) c.newInstance(new Object[0]); } catch (Exception e) { LOG.warn("Can't use the configured PreAnalyzedParser class '" + implName + "' (" + e.getMessage() + "), using default " + DEFAULT_IMPL); parser = new JsonPreAnalyzedParser(); } } }
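PreAnalyzedField.init above shows a load-or-fall-back recovery: instantiate a configured class reflectively, verify the expected interface, and on any failure warn and use the built-in default. A sketch with an illustrative Parser interface:

    public class PluginFallbackSketch {
      interface Parser { String name(); }

      static class DefaultParser implements Parser {
        public String name() { return "default"; }
      }

      static Parser loadParser(String implName) {
        if (implName == null) return new DefaultParser();
        try {
          Class<?> clazz = Class.forName(implName);
          if (!Parser.class.isAssignableFrom(clazz)) {
            throw new Exception("must implement " + Parser.class.getName());
          }
          return (Parser) clazz.getConstructor().newInstance();
        } catch (Exception e) {
          // Same recovery as the original: warn and fall back to the default.
          System.err.println("Can't use '" + implName + "' (" + e + "), using default");
          return new DefaultParser();
        }
      }

      public static void main(String[] args) {
        System.out.println(loadParser("no.such.Class").name()); // falls back: prints "default"
      }
    }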
// in core/src/java/org/apache/solr/core/RequestHandlers.java
void initHandlersFromConfig(SolrConfig config ){ // use link map so we iterate in the same order Map<PluginInfo,SolrRequestHandler> handlers = new LinkedHashMap<PluginInfo,SolrRequestHandler>(); for (PluginInfo info : config.getPluginInfos(SolrRequestHandler.class.getName())) { try { SolrRequestHandler requestHandler; String startup = info.attributes.get("startup") ; if( startup != null ) { if( "lazy".equals(startup) ) { log.info("adding lazy requestHandler: " + info.className); requestHandler = new LazyRequestHandlerWrapper( core, info.className, info.initArgs ); } else { throw new Exception( "Unknown startup value: '"+startup+"' for: "+info.className ); } } else { requestHandler = core.createRequestHandler(info.className); } handlers.put(info,requestHandler); SolrRequestHandler old = register(info.name, requestHandler); if(old != null) { log.warn("Multiple requestHandler registered to the same name: " + info.name + " ignoring: " + old.getClass().getName()); } if(info.isDefault()){ old = register("",requestHandler); if(old != null) log.warn("Multiple default requestHandler registered" + " ignoring: " + old.getClass().getName()); } log.info("created "+info.name+": " + info.className); } catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); } } // we've now registered all handlers, time to init them in the same order for (Map.Entry<PluginInfo,SolrRequestHandler> entry : handlers.entrySet()) { PluginInfo info = entry.getKey(); SolrRequestHandler requestHandler = entry.getValue(); if (requestHandler instanceof PluginInfoInitialized) { ((PluginInfoInitialized) requestHandler).init(info); } else{ requestHandler.init(info.initArgs); } } if(get("") == null) register("", get("/select"));//defacto default handler if(get("") == null) register("", get("standard"));//old default handler name; TODO remove? if(get("") == null) log.warn("no default request handler is registered (either '/select' or 'standard')"); }
// in core/src/java/org/apache/solr/core/SolrCore.java
private void initWriters() { // use link map so we iterate in the same order Map<PluginInfo,QueryResponseWriter> writers = new LinkedHashMap<PluginInfo,QueryResponseWriter>(); for (PluginInfo info : solrConfig.getPluginInfos(QueryResponseWriter.class.getName())) { try { QueryResponseWriter writer; String startup = info.attributes.get("startup") ; if( startup != null ) { if( "lazy".equals(startup) ) { log.info("adding lazy queryResponseWriter: " + info.className); writer = new LazyQueryResponseWriterWrapper(this, info.className, info.initArgs ); } else { throw new Exception( "Unknown startup value: '"+startup+"' for: "+info.className ); } } else { writer = createQueryResponseWriter(info.className); } writers.put(info,writer); QueryResponseWriter old = registerResponseWriter(info.name, writer); if(old != null) { log.warn("Multiple queryResponseWriter registered to the same name: " + info.name + " ignoring: " + old.getClass().getName()); } if(info.isDefault()){ if(defaultResponseWriter != null) log.warn("Multiple default queryResponseWriter registered, using: " + info.name); defaultResponseWriter = writer; } log.info("created "+info.name+": " + info.className); } catch (Exception ex) { SolrException e = new SolrException (SolrException.ErrorCode.SERVER_ERROR, "QueryResponseWriter init failure", ex); SolrException.log(log,null,e); throw e; } } // we've now registered all handlers, time to init them in the same order for (Map.Entry<PluginInfo,QueryResponseWriter> entry : writers.entrySet()) { PluginInfo info = entry.getKey(); QueryResponseWriter writer = entry.getValue(); responseWriters.put(info.name, writer); if (writer instanceof PluginInfoInitialized) { ((PluginInfoInitialized) writer).init(info); } else{ writer.init(info.initArgs); } } NamedList emptyList = new NamedList(); for (Map.Entry<String, QueryResponseWriter> entry : DEFAULT_RESPONSE_WRITERS.entrySet()) { if(responseWriters.get(entry.getKey()) == null) { responseWriters.put(entry.getKey(), entry.getValue()); // call init so any logic in the default writers gets invoked entry.getValue().init(emptyList); } } // configure the default response writer; this one should never be null if (defaultResponseWriter == null) { defaultResponseWriter = responseWriters.get("standard"); } }
0 77
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { Parser parser = null; String streamType = req.getParams().get(ExtractingParams.STREAM_TYPE, null); if (streamType != null) { //Cache? Parsers are lightweight to construct and thread-safe, so I'm told MediaType mt = MediaType.parse(streamType.trim().toLowerCase(Locale.ENGLISH)); parser = new DefaultParser(config.getMediaTypeRegistry()).getParsers().get(mt); } else { parser = autoDetectParser; } if (parser != null) { Metadata metadata = new Metadata(); // If you specify the resource name (the filename, roughly) with this parameter, // then Tika can make use of it in guessing the appropriate MIME type: String resourceName = req.getParams().get(ExtractingParams.RESOURCE_NAME, null); if (resourceName != null) { metadata.add(TikaMetadataKeys.RESOURCE_NAME_KEY, resourceName); } // Provide stream's content type as hint for auto detection if(stream.getContentType() != null) { metadata.add(HttpHeaders.CONTENT_TYPE, stream.getContentType()); } InputStream inputStream = null; try { inputStream = stream.getStream(); metadata.add(ExtractingMetadataConstants.STREAM_NAME, stream.getName()); metadata.add(ExtractingMetadataConstants.STREAM_SOURCE_INFO, stream.getSourceInfo()); metadata.add(ExtractingMetadataConstants.STREAM_SIZE, String.valueOf(stream.getSize())); metadata.add(ExtractingMetadataConstants.STREAM_CONTENT_TYPE, stream.getContentType()); // HtmlParser and TXTParser regard Metadata.CONTENT_ENCODING in metadata String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); if(charset != null){ metadata.add(HttpHeaders.CONTENT_ENCODING, charset); } String xpathExpr = params.get(ExtractingParams.XPATH_EXPRESSION); boolean extractOnly = params.getBool(ExtractingParams.EXTRACT_ONLY, false); SolrContentHandler handler = factory.createSolrContentHandler(metadata, params, schema); ContentHandler parsingHandler = handler; StringWriter writer = null; BaseMarkupSerializer serializer = null; if (extractOnly == true) { String extractFormat = params.get(ExtractingParams.EXTRACT_FORMAT, "xml"); writer = new StringWriter(); if (extractFormat.equals(TEXT_FORMAT)) { serializer = new TextSerializer(); serializer.setOutputCharStream(writer); serializer.setOutputFormat(new OutputFormat("Text", "UTF-8", true)); } else { serializer = new XMLSerializer(writer, new OutputFormat("XML", "UTF-8", true)); } if (xpathExpr != null) { Matcher matcher = PARSER.parse(xpathExpr); serializer.startDocument();//The MatchingContentHandler does not invoke startDocument. See http://tika.markmail.org/message/kknu3hw7argwiqin parsingHandler = new MatchingContentHandler(serializer, matcher); } else { parsingHandler = serializer; } } else if (xpathExpr != null) { Matcher matcher = PARSER.parse(xpathExpr); parsingHandler = new MatchingContentHandler(handler, matcher); } //else leave it as is try{ //potentially use a wrapper handler for parsing, but we still need the SolrContentHandler for getting the document. ParseContext context = new ParseContext();//TODO: should we design a way to pass in parse context? parser.parse(inputStream, parsingHandler, metadata, context); } catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()).append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } if (extractOnly == false) { addDoc(handler); } else { //serializer is not null, so we need to call endDoc on it if using xpath if (xpathExpr != null){ serializer.endDocument(); } rsp.add(stream.getName(), writer.toString()); writer.close(); String[] names = metadata.names(); NamedList metadataNL = new NamedList(); for (int i = 0; i < names.length; i++) { String[] vals = metadata.getValues(names[i]); metadataNL.add(names[i], vals); } rsp.add(stream.getName() + "_metadata", metadataNL); } } catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } finally { IOUtils.closeQuietly(inputStream); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Stream type of " + streamType + " didn't match any known parsers. Please supply the " + ExtractingParams.STREAM_TYPE + " parameter."); } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
public void addPartToDocument(Part part, Map<String, Object> row, boolean outerMost) throws Exception { if (part instanceof Message) { addEnvelopToDocument(part, row); } String ct = part.getContentType(); ContentType ctype = new ContentType(ct); if (part.isMimeType("multipart/*")) { Multipart mp = (Multipart) part.getContent(); int count = mp.getCount(); if (part.isMimeType("multipart/alternative")) count = 1; for (int i = 0; i < count; i++) addPartToDocument(mp.getBodyPart(i), row, false); } else if (part.isMimeType("message/rfc822")) { addPartToDocument((Part) part.getContent(), row, false); } else { String disp = part.getDisposition(); if (!processAttachment || (disp != null && disp.equalsIgnoreCase(Part.ATTACHMENT))) return; InputStream is = part.getInputStream(); String fileName = part.getFileName(); Metadata md = new Metadata(); md.set(HttpHeaders.CONTENT_TYPE, ctype.getBaseType().toLowerCase(Locale.ENGLISH)); md.set(TikaMetadataKeys.RESOURCE_NAME_KEY, fileName); String content = tika.parseToString(is, md); if (disp != null && disp.equalsIgnoreCase(Part.ATTACHMENT)) { if (row.get(ATTACHMENT) == null) row.put(ATTACHMENT, new ArrayList<String>()); List<String> contents = (List<String>) row.get(ATTACHMENT); contents.add(content); row.put(ATTACHMENT, contents); if (row.get(ATTACHMENT_NAMES) == null) row.put(ATTACHMENT_NAMES, new ArrayList<String>()); List<String> names = (List<String>) row.get(ATTACHMENT_NAMES); names.add(fileName); row.put(ATTACHMENT_NAMES, names); } else { if (row.get(CONTENT) == null) row.put(CONTENT, new ArrayList<String>()); List<String> contents = (List<String>) row.get(CONTENT); contents.add(content); row.put(CONTENT, contents); } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
protected Callable<Connection> createConnectionFactory(final Context context, final Properties initProps) { // final VariableResolver resolver = context.getVariableResolver(); resolveVariables(context, initProps); final String jndiName = initProps.getProperty(JNDI_NAME); final String url = initProps.getProperty(URL); final String driver = initProps.getProperty(DRIVER); if (url == null && jndiName == null) throw new DataImportHandlerException(SEVERE, "JDBC URL or JNDI name has to be specified"); if (driver != null) { try { DocBuilder.loadClass(driver, context.getSolrCore()); } catch (ClassNotFoundException e) { wrapAndThrow(SEVERE, e, "Could not load driver: " + driver); } } else { if(jndiName == null){ throw new DataImportHandlerException(SEVERE, "One of driver or jndiName must be specified in the data source"); } } String s = initProps.getProperty("maxRows"); if (s != null) { maxRows = Integer.parseInt(s); } return factory = new Callable<Connection>() { public Connection call() throws Exception { LOG.info("Creating a connection for entity " + context.getEntityAttribute(DataImporter.NAME) + " with URL: " + url); long start = System.currentTimeMillis(); Connection c = null; try { if(url != null){ c = DriverManager.getConnection(url, initProps); } else if(jndiName != null){ InitialContext ctx = new InitialContext(); Object jndival = ctx.lookup(jndiName); if (jndival instanceof javax.sql.DataSource) { javax.sql.DataSource dataSource = (javax.sql.DataSource) jndival; String user = (String) initProps.get("user"); String pass = (String) initProps.get("password"); if(user == null || user.trim().equals("")){ c = dataSource.getConnection(); } else { c = dataSource.getConnection(user, pass); } } else { throw new DataImportHandlerException(SEVERE, "the jndi name : '"+jndiName +"' is not a valid javax.sql.DataSource"); } } } catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); } if (c != null) { if (Boolean.parseBoolean(initProps.getProperty("readOnly"))) { c.setReadOnly(true); // Add other sane defaults c.setAutoCommit(true); c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } if (!Boolean.parseBoolean(initProps.getProperty("autoCommit"))) { c.setAutoCommit(false); } String transactionIsolation = initProps.getProperty("transactionIsolation"); if ("TRANSACTION_READ_UNCOMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); } else if ("TRANSACTION_READ_COMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED); } else if ("TRANSACTION_REPEATABLE_READ".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_REPEATABLE_READ); } else if ("TRANSACTION_SERIALIZABLE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE); } else if ("TRANSACTION_NONE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_NONE); } String holdability = initProps.getProperty("holdability"); if ("CLOSE_CURSORS_AT_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } else if ("HOLD_CURSORS_OVER_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.HOLD_CURSORS_OVER_COMMIT); } } LOG.info("Time taken for getConnection(): " + (System.currentTimeMillis() - start)); return c; } }; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
public Connection call() throws Exception { LOG.info("Creating a connection for entity " + context.getEntityAttribute(DataImporter.NAME) + " with URL: " + url); long start = System.currentTimeMillis(); Connection c = null; try { if(url != null){ c = DriverManager.getConnection(url, initProps); } else if(jndiName != null){ InitialContext ctx = new InitialContext(); Object jndival = ctx.lookup(jndiName); if (jndival instanceof javax.sql.DataSource) { javax.sql.DataSource dataSource = (javax.sql.DataSource) jndival; String user = (String) initProps.get("user"); String pass = (String) initProps.get("password"); if(user == null || user.trim().equals("")){ c = dataSource.getConnection(); } else { c = dataSource.getConnection(user, pass); } } else { throw new DataImportHandlerException(SEVERE, "the jndi name : '"+jndiName +"' is not a valid javax.sql.DataSource"); } } } catch (SQLException e) { // DriverManager does not allow you to use a driver which is not loaded through // the class loader of the class which is trying to make the connection. // This is a workaround for cases where the user puts the driver jar in the // solr.home/lib or solr.home/core/lib directories. Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance(); c = d.connect(url, initProps); } if (c != null) { if (Boolean.parseBoolean(initProps.getProperty("readOnly"))) { c.setReadOnly(true); // Add other sane defaults c.setAutoCommit(true); c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } if (!Boolean.parseBoolean(initProps.getProperty("autoCommit"))) { c.setAutoCommit(false); } String transactionIsolation = initProps.getProperty("transactionIsolation"); if ("TRANSACTION_READ_UNCOMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED); } else if ("TRANSACTION_READ_COMMITTED".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED); } else if ("TRANSACTION_REPEATABLE_READ".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_REPEATABLE_READ); } else if ("TRANSACTION_SERIALIZABLE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE); } else if ("TRANSACTION_NONE".equals(transactionIsolation)) { c.setTransactionIsolation(Connection.TRANSACTION_NONE); } String holdability = initProps.getProperty("holdability"); if ("CLOSE_CURSORS_AT_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.CLOSE_CURSORS_AT_COMMIT); } else if ("HOLD_CURSORS_OVER_COMMIT".equals(holdability)) { c.setHoldability(ResultSet.HOLD_CURSORS_OVER_COMMIT); } } LOG.info("Time taken for getConnection(): " + (System.currentTimeMillis() - start)); return c; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
private Connection getConnection() throws Exception { long currTime = System.currentTimeMillis(); if (currTime - connLastUsed > CONN_TIME_OUT) { synchronized (this) { Connection tmpConn = factory.call(); closeConnection(); connLastUsed = System.currentTimeMillis(); return conn = tmpConn; } } else { connLastUsed = currTime; return conn; } }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
@Override protected NamedList doAnalysis(SolrQueryRequest req) throws Exception { DocumentAnalysisRequest analysisRequest = resolveAnalysisRequest(req); return handleAnalysisRequest(analysisRequest, req.getSchema()); }
// in core/src/java/org/apache/solr/handler/UpdateRequestHandler.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { String type = req.getParams().get(UpdateParams.ASSUME_CONTENT_TYPE); if(type == null) { type = stream.getContentType(); } if( type == null ) { // Normal requests will not get here. throw new SolrException(ErrorCode.BAD_REQUEST, "Missing ContentType"); } int idx = type.indexOf(';'); if(idx>0) { type = type.substring(0,idx); } ContentStreamLoader loader = loaders.get(type); if(loader==null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unsupported ContentType: " +type+ " Not in: "+loaders.keySet()); } if(loader.getDefaultWT()!=null) { setDefaultWT(req,loader); } loader.load(req, rsp, stream, processor); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { rsp.add("analysis", doAnalysis(req)); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void terminateAndWaitFsyncService() throws Exception { if (fsyncService.isTerminated()) return; fsyncService.shutdown(); // give a long wait say 1 hr fsyncService.awaitTermination(3600, TimeUnit.SECONDS); // if any fsync failed, throw that exception back Exception fsyncExceptionCopy = fsyncException; if (fsyncExceptionCopy != null) throw fsyncExceptionCopy; }
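terminateAndWaitFsyncService above pairs with the fsyncService.submit call in fetchFile further down: a background task records its failure in a shared field, and the coordinator rethrows it once the executor has drained, so asynchronous failures are not lost. A self-contained sketch of that deferred-failure pattern:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class DeferredFailureSketch {
      static volatile Exception asyncFailure;             // written by worker, read by coordinator

      public static void main(String[] args) throws Exception {
        ExecutorService service = Executors.newSingleThreadExecutor();
        service.submit(() -> {
          try {
            throw new java.io.IOException("fsync failed"); // stand-in for FileUtils.sync(file)
          } catch (Exception e) {
            asyncFailure = e;                              // defer, don't swallow
          }
        });
        service.shutdown();
        service.awaitTermination(10, TimeUnit.SECONDS);
        Exception copy = asyncFailure;                     // copy the field, then throw, as above
        if (copy != null) throw copy;
      }
    }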
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void downloadConfFiles(List<Map<String, Object>> confFilesToDownload, long latestGeneration) throws Exception { LOG.info("Starting download of configuration files from master: " + confFilesToDownload); confFilesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>()); File tmpconfDir = new File(solrCore.getResourceLoader().getConfigDir(), "conf." + getDateAsStr(new Date())); try { boolean status = tmpconfDir.mkdirs(); if (!status) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to create temporary config folder: " + tmpconfDir.getName()); } for (Map<String, Object> file : confFilesToDownload) { String saveAs = (String) (file.get(ALIAS) == null ? file.get(NAME) : file.get(ALIAS)); fileFetcher = new FileFetcher(tmpconfDir, file, saveAs, true, latestGeneration); currentFile = file; fileFetcher.fetchFile(); confFilesDownloaded.add(new HashMap<String, Object>(file)); } // this is called before copying the files to the original conf dir // so that if there is an exception avoid corrupting the original files. terminateAndWaitFsyncService(); copyTmpConfFiles2Conf(tmpconfDir); } finally { delTree(tmpconfDir); } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void downloadIndexFiles(boolean downloadCompleteIndex, File tmpIdxDir, long latestGeneration) throws Exception { for (Map<String, Object> file : filesToDownload) { File localIndexFile = new File(solrCore.getIndexDir(), (String) file.get(NAME)); if (!localIndexFile.exists() || downloadCompleteIndex) { fileFetcher = new FileFetcher(tmpIdxDir, file, (String) file.get(NAME), false, latestGeneration); currentFile = file; fileFetcher.fetchFile(); filesDownloaded.add(new HashMap<String, Object>(file)); } else { LOG.info("Skipping download for " + localIndexFile); } } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
void fetchFile() throws Exception { try { while (true) { final FastInputStream is = getStream(); int result; try { //fetch packets one by one in a single request result = fetchPackets(is); if (result == 0 || result == NO_CONTENT) { // if the file is downloaded properly set the // timestamp same as that in the server if (file.exists() && lastmodified > 0) file.setLastModified(lastmodified); return; } //if there is an error continue. But continue from the point where it got broken } finally { IOUtils.closeQuietly(is); } } } finally { cleanup(); //if cleanup suceeds . The file is downloaded fully. do an fsync fsyncService.submit(new Runnable(){ public void run() { try { FileUtils.sync(file); } catch (IOException e) { fsyncException = e; } } }); } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private int fetchPackets(FastInputStream fis) throws Exception { byte[] intbytes = new byte[4]; byte[] longbytes = new byte[8]; try { while (true) { if (stop) { stop = false; aborted = true; throw new ReplicationHandlerException("User aborted replication"); } long checkSumServer = -1; fis.readFully(intbytes); //read the size of the packet int packetSize = readInt(intbytes); if (packetSize <= 0) { LOG.warn("No content recieved for file: " + currentFile); return NO_CONTENT; } if (buf.length < packetSize) buf = new byte[packetSize]; if (checksum != null) { //read the checksum fis.readFully(longbytes); checkSumServer = readLong(longbytes); } //then read the packet of bytes fis.readFully(buf, 0, packetSize); //compare the checksum as sent from the master if (includeChecksum) { checksum.reset(); checksum.update(buf, 0, packetSize); long checkSumClient = checksum.getValue(); if (checkSumClient != checkSumServer) { LOG.error("Checksum not matched between client and server for: " + currentFile); //if checksum is wrong it is a problem return for retry return 1; } } //if everything is fine, write down the packet to the file fileChannel.write(ByteBuffer.wrap(buf, 0, packetSize)); bytesDownloaded += packetSize; if (bytesDownloaded >= size) return 0; //errorcount is always set to zero after a successful packet errorCount = 0; } } catch (ReplicationHandlerException e) { throw e; } catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; } }
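fetchPackets above distinguishes three outcomes: a deliberate user abort passes through untouched, a transient failure bumps an error counter and asks the caller to retry, and the counter escalates to a fatal exception once it exceeds MAX_RETRIES. A sketch of that accounting with illustrative names:

    public class RetryEscalationSketch {
      static class AbortException extends RuntimeException {}

      static class FatalException extends RuntimeException {
        FatalException(String msg, Throwable cause) { super(msg, cause); }
      }

      static final int MAX_RETRIES = 5;
      static int errorCount = 0;

      // Returns 0 on success, 1 to ask the caller to retry.
      static int fetchOnce(boolean abort) {
        try {
          if (abort) throw new AbortException();
          throw new java.io.IOException("connection reset"); // stand-in network failure
        } catch (AbortException e) {
          throw e;                                           // deliberate aborts are not retried
        } catch (Exception e) {
          if (++errorCount > MAX_RETRIES) {
            throw new FatalException("Fetch failed", e);     // give up after repeated failures
          }
          return 1;                                          // transient: let the caller retry
        }
      }

      public static void main(String[] args) {
        try {
          while (fetchOnce(false) != 0) { }                  // retries until the counter trips
        } catch (FatalException e) {
          System.out.println("gave up after " + errorCount + " errors: " + e.getCause());
        }
      }
    }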
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { final String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); InputStream is = null; XMLStreamReader parser = null; String tr = req.getParams().get(CommonParams.TR,null); if(tr!=null) { Transformer t = getTransformer(tr,req); final DOMResult result = new DOMResult(); // first step: read XML and build DOM using Transformer (this is no overhead, as XSL always produces // an internal result DOM tree, we just access it directly as input for StAX): try { is = stream.getStream(); final InputSource isrc = new InputSource(is); isrc.setEncoding(charset); final SAXSource source = new SAXSource(isrc); t.transform(source, result); } catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); } finally { IOUtils.closeQuietly(is); } // second step feed the intermediate DOM tree into StAX parser: try { parser = inputFactory.createXMLStreamReader(new DOMSource(result.getNode())); this.processUpdate(req, processor, parser); } catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); } finally { if (parser != null) parser.close(); } } // Normal XML Loader else { try { is = stream.getStream(); if (UpdateRequestHandler.log.isTraceEnabled()) { final byte[] body = IOUtils.toByteArray(is); // TODO: The charset may be wrong, as the real charset is later // determined by the XML parser, the content-type is only used as a hint! UpdateRequestHandler.log.trace("body", new String(body, (charset == null) ? ContentStreamBase.DEFAULT_CHARSET : charset)); IOUtils.closeQuietly(is); is = new ByteArrayInputStream(body); } parser = (charset == null) ? inputFactory.createXMLStreamReader(is) : inputFactory.createXMLStreamReader(is, charset); this.processUpdate(req, processor, parser); } catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); } finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); } } }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { InputStream is = null; try { is = stream.getStream(); parseAndLoadDocs(req, rsp, is, processor); } finally { if(is != null) { is.close(); } } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { new SingleThreadedJsonLoader(req,processor).load(req, rsp, stream, processor); }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { Reader reader = null; try { reader = stream.getReader(); if (log.isTraceEnabled()) { String body = IOUtils.toString(reader); log.trace("body", body); reader = new StringReader(body); } parser = new JSONParser(reader); this.processUpdate(); } finally { IOUtils.closeQuietly(reader); } }
// in core/src/java/org/apache/solr/handler/loader/CSVLoader.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { new SingleThreadedCSVLoader(req,processor).load(req, rsp, stream, processor); }
// in core/src/java/org/apache/solr/handler/FieldAnalysisRequestHandler.java
@Override protected NamedList doAnalysis(SolrQueryRequest req) throws Exception { FieldAnalysisRequest analysisRequest = resolveAnalysisRequest(req); IndexSchema indexSchema = req.getCore().getSchema(); return handleAnalysisRequest(analysisRequest, indexSchema); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { rsp.setHttpCaching(false); final SolrParams solrParams = req.getParams(); String command = solrParams.get(COMMAND); if (command == null) { rsp.add(STATUS, OK_STATUS); rsp.add("message", "No command"); return; } // This command does not give the current index version of the master // It gives the current 'replicateable' index version if (command.equals(CMD_INDEX_VERSION)) { IndexCommit commitPoint = indexCommitPoint; // make a copy so it won't change if (commitPoint == null) { // if this handler is 'lazy', we may not have tracked the last commit // because our commit listener is registered on inform commitPoint = core.getDeletionPolicy().getLatestCommit(); } if (commitPoint != null && replicationEnabled.get()) { // // There is a race condition here. The commit point may be changed / deleted by the time // we get around to reserving it. This is a very small window though, and should not result // in a catastrophic failure, but will result in the client getting an empty file list for // the CMD_GET_FILE_LIST command. // core.getDeletionPolicy().setReserveDuration(commitPoint.getGeneration(), reserveCommitDuration); rsp.add(CMD_INDEX_VERSION, IndexDeletionPolicyWrapper.getCommitTimestamp(commitPoint)); rsp.add(GENERATION, commitPoint.getGeneration()); } else { // This happens when replication is not configured to happen after startup and no commit/optimize // has happened yet. rsp.add(CMD_INDEX_VERSION, 0L); rsp.add(GENERATION, 0L); } } else if (command.equals(CMD_GET_FILE)) { getFileStream(solrParams, rsp); } else if (command.equals(CMD_GET_FILE_LIST)) { getFileList(solrParams, rsp); } else if (command.equalsIgnoreCase(CMD_BACKUP)) { doSnapShoot(new ModifiableSolrParams(solrParams), rsp,req); rsp.add(STATUS, OK_STATUS); } else if (command.equalsIgnoreCase(CMD_FETCH_INDEX)) { String masterUrl = solrParams.get(MASTER_URL); if (!isSlave && masterUrl == null) { rsp.add(STATUS,ERR_STATUS); rsp.add("message","No slave configured or no 'masterUrl' Specified"); return; } final SolrParams paramsCopy = new ModifiableSolrParams(solrParams); new Thread() { @Override public void run() { doFetch(paramsCopy, false); } }.start(); rsp.add(STATUS, OK_STATUS); } else if (command.equalsIgnoreCase(CMD_DISABLE_POLL)) { if (snapPuller != null){ snapPuller.disablePoll(); rsp.add(STATUS, OK_STATUS); } else { rsp.add(STATUS, ERR_STATUS); rsp.add("message","No slave configured"); } } else if (command.equalsIgnoreCase(CMD_ENABLE_POLL)) { if (snapPuller != null){ snapPuller.enablePoll(); rsp.add(STATUS, OK_STATUS); }else { rsp.add(STATUS,ERR_STATUS); rsp.add("message","No slave configured"); } } else if (command.equalsIgnoreCase(CMD_ABORT_FETCH)) { SnapPuller temp = tempSnapPuller; if (temp != null){ temp.abortPull(); rsp.add(STATUS, OK_STATUS); } else { rsp.add(STATUS,ERR_STATUS); rsp.add("message","No slave configured"); } } else if (command.equals(CMD_FILE_CHECKSUM)) { // this command is not used by anyone getFileChecksum(solrParams, rsp); } else if (command.equals(CMD_SHOW_COMMITS)) { rsp.add(CMD_SHOW_COMMITS, getCommits()); } else if (command.equals(CMD_DETAILS)) { rsp.add(CMD_DETAILS, getReplicationDetails(solrParams.getBool("slave",true))); RequestHandlerUtils.addExperimentalFormatWarning(rsp); } else if (CMD_ENABLE_REPL.equalsIgnoreCase(command)) { replicationEnabled.set(true); rsp.add(STATUS, OK_STATUS); } else if (CMD_DISABLE_REPL.equalsIgnoreCase(command)) { replicationEnabled.set(false); rsp.add(STATUS, OK_STATUS); } }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  SolrParams params = req.getParams();
  // Set field flags
  ReturnFields returnFields = new ReturnFields(req);
  rsp.setReturnFields(returnFields);
  int flags = 0;
  if (returnFields.wantsScore()) {
    flags |= SolrIndexSearcher.GET_SCORES;
  }
  String defType = params.get(QueryParsing.DEFTYPE, QParserPlugin.DEFAULT_QTYPE);
  String q = params.get(CommonParams.Q);
  Query query = null;
  SortSpec sortSpec = null;
  List<Query> filters = null;
  try {
    if (q != null) {
      QParser parser = QParser.getParser(q, defType, req);
      query = parser.getQuery();
      sortSpec = parser.getSort(true);
    }
    String[] fqs = req.getParams().getParams(CommonParams.FQ);
    if (fqs != null && fqs.length != 0) {
      filters = new ArrayList<Query>();
      for (String fq : fqs) {
        if (fq != null && fq.trim().length() != 0) {
          QParser fqp = QParser.getParser(fq, null, req);
          filters.add(fqp.getQuery());
        }
      }
    }
  } catch (ParseException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
  }
  SolrIndexSearcher searcher = req.getSearcher();
  MoreLikeThisHelper mlt = new MoreLikeThisHelper(params, searcher);
  // Hold on to the interesting terms if relevant
  TermStyle termStyle = TermStyle.get(params.get(MoreLikeThisParams.INTERESTING_TERMS));
  List<InterestingTerm> interesting = (termStyle == TermStyle.NONE)
      ? null : new ArrayList<InterestingTerm>(mlt.mlt.getMaxQueryTerms());
  DocListAndSet mltDocs = null;
  // Parse Required Params
  // This will either have a single Reader or valid query
  Reader reader = null;
  try {
    if (q == null || q.trim().length() < 1) {
      Iterable<ContentStream> streams = req.getContentStreams();
      if (streams != null) {
        Iterator<ContentStream> iter = streams.iterator();
        if (iter.hasNext()) {
          reader = iter.next().getReader();
        }
        if (iter.hasNext()) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "MoreLikeThis does not support multiple ContentStreams");
        }
      }
    }
    int start = params.getInt(CommonParams.START, 0);
    int rows = params.getInt(CommonParams.ROWS, 10);
    // Find documents MoreLikeThis - either with a reader or a query
    // --------------------------------------------------------------------------------
    if (reader != null) {
      mltDocs = mlt.getMoreLikeThis(reader, start, rows, filters, interesting, flags);
    } else if (q != null) {
      // Matching options
      boolean includeMatch = params.getBool(MoreLikeThisParams.MATCH_INCLUDE, true);
      int matchOffset = params.getInt(MoreLikeThisParams.MATCH_OFFSET, 0);
      // Find the base match
      DocList match = searcher.getDocList(query, null, null, matchOffset, 1, flags); // only get the first one...
      if (includeMatch) {
        rsp.add("match", match);
      }
      // This is an iterator, but we only handle the first match
      DocIterator iterator = match.iterator();
      if (iterator.hasNext()) {
        // do a MoreLikeThis query for each document in results
        int id = iterator.nextDoc();
        mltDocs = mlt.getMoreLikeThis(id, start, rows, filters, interesting, flags);
      }
    } else {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "MoreLikeThis requires either a query (?q=) or text to find similar documents.");
    }
  } finally {
    if (reader != null) {
      reader.close();
    }
  }
  if (mltDocs == null) {
    mltDocs = new DocListAndSet(); // avoid NPE
  }
  rsp.add("response", mltDocs.docList);
  if (interesting != null) {
    if (termStyle == TermStyle.DETAILS) {
      NamedList<Float> it = new NamedList<Float>();
      for (InterestingTerm t : interesting) {
        it.add(t.term.toString(), t.boost);
      }
      rsp.add("interestingTerms", it);
    } else {
      List<String> it = new ArrayList<String>(interesting.size());
      for (InterestingTerm t : interesting) {
        it.add(t.term.text());
      }
      rsp.add("interestingTerms", it);
    }
  }
  // maybe facet the results
  if (params.getBool(FacetParams.FACET, false)) {
    if (mltDocs.docSet == null) {
      rsp.add("facet_counts", null);
    } else {
      SimpleFacets f = new SimpleFacets(req, mltDocs.docSet, params);
      rsp.add("facet_counts", f.getFacetCounts());
    }
  }
  boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false);
  boolean dbgQuery = false, dbgResults = false;
  if (dbg == false) { // if it's true, we are doing everything anyway.
    String[] dbgParams = req.getParams().getParams(CommonParams.DEBUG);
    if (dbgParams != null) {
      for (int i = 0; i < dbgParams.length; i++) {
        if (dbgParams[i].equals(CommonParams.QUERY)) {
          dbgQuery = true;
        } else if (dbgParams[i].equals(CommonParams.RESULTS)) {
          dbgResults = true;
        }
      }
    }
  } else {
    dbgQuery = true;
    dbgResults = true;
  }
  // Copied from StandardRequestHandler... perhaps it should be added to doStandardDebug?
  if (dbg == true) {
    try {
      NamedList<Object> dbgInfo = SolrPluginUtils.doStandardDebug(req, q, mlt.getRawMLTQuery(), mltDocs.docList, dbgQuery, dbgResults);
      if (null != dbgInfo) {
        if (null != filters) {
          dbgInfo.add("filter_queries", req.getParams().getParams(CommonParams.FQ));
          List<String> fqs = new ArrayList<String>(filters.size());
          for (Query fq : filters) {
            fqs.add(QueryParsing.toString(fq, req.getSchema()));
          }
          dbgInfo.add("parsed_filter_queries", fqs);
        }
        rsp.add("debug", dbgInfo);
      }
    } catch (Exception e) {
      SolrException.log(SolrCore.log, "Exception during debug", e);
      rsp.add("exception_during_debug", SolrException.toStr(e));
    }
  }
}
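The catch (ParseException) block above shows the translation idiom that dominates this code base: a checked exception from a lower layer is converted into the unchecked domain SolrException, with the ErrorCode selecting the HTTP status (here 400). A minimal, self-contained sketch of the idiom; parse() is a hypothetical stand-in for the QParser machinery, not Solr API:

    import org.apache.solr.common.SolrException;

    public class TranslateSketch {
      // Hypothetical checked-exception parser standing in for QParser.getParser(...)
      static Object parse(String q) throws Exception {
        if (q == null) throw new Exception("no query");
        return q;
      }

      public static Object parseOrBadRequest(String q) {
        try {
          return parse(q);
        } catch (Exception e) {
          // the checked parser failure becomes an unchecked domain exception;
          // the HTTP layer later maps BAD_REQUEST to a 400 response
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
        }
      }
    }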
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
public void submit(final ShardRequest sreq, final String shard, final ModifiableSolrParams params) {
  // do this outside of the callable for thread safety reasons
  final List<String> urls = getURLs(shard);
  Callable<ShardResponse> task = new Callable<ShardResponse>() {
    public ShardResponse call() throws Exception {
      ShardResponse srsp = new ShardResponse();
      srsp.setShardRequest(sreq);
      srsp.setShard(shard);
      SimpleSolrResponse ssr = new SimpleSolrResponse();
      srsp.setSolrResponse(ssr);
      long startTime = System.currentTimeMillis();
      try {
        params.remove(CommonParams.WT); // use default (currently javabin)
        params.remove(CommonParams.VERSION);
        // SolrRequest req = new QueryRequest(SolrRequest.METHOD.POST, "/select");
        // use generic request to avoid extra processing of queries
        QueryRequest req = new QueryRequest(params);
        req.setMethod(SolrRequest.METHOD.POST);
        // no need to set the response parser as binary is the default
        // req.setResponseParser(new BinaryResponseParser());
        // if there are no shards available for a slice, urls.size()==0
        if (urls.size() == 0) {
          // TODO: what's the right error code here? We should use the same thing when
          // all of the servers for a shard are down.
          throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE,
              "no servers hosting shard: " + shard);
        }
        if (urls.size() <= 1) {
          String url = urls.get(0);
          srsp.setShardAddress(url);
          SolrServer server = new HttpSolrServer(url, httpClient);
          ssr.nl = server.request(req);
        } else {
          LBHttpSolrServer.Rsp rsp = httpShardHandlerFactory.loadbalancer.request(new LBHttpSolrServer.Req(req, urls));
          ssr.nl = rsp.getResponse();
          srsp.setShardAddress(rsp.getServer());
        }
      } catch (ConnectException cex) {
        srsp.setException(cex); // ????
      } catch (Throwable th) {
        srsp.setException(th);
        if (th instanceof SolrException) {
          srsp.setResponseCode(((SolrException) th).code());
        } else {
          srsp.setResponseCode(-1);
        }
      }
      ssr.elapsedTime = System.currentTimeMillis() - startTime;
      return srsp;
    }
  };
  pending.add(completionService.submit(task));
}
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
public ShardResponse call() throws Exception {
  ShardResponse srsp = new ShardResponse();
  srsp.setShardRequest(sreq);
  srsp.setShard(shard);
  SimpleSolrResponse ssr = new SimpleSolrResponse();
  srsp.setSolrResponse(ssr);
  long startTime = System.currentTimeMillis();
  try {
    params.remove(CommonParams.WT); // use default (currently javabin)
    params.remove(CommonParams.VERSION);
    // SolrRequest req = new QueryRequest(SolrRequest.METHOD.POST, "/select");
    // use generic request to avoid extra processing of queries
    QueryRequest req = new QueryRequest(params);
    req.setMethod(SolrRequest.METHOD.POST);
    // no need to set the response parser as binary is the default
    // req.setResponseParser(new BinaryResponseParser());
    // if there are no shards available for a slice, urls.size()==0
    if (urls.size() == 0) {
      // TODO: what's the right error code here? We should use the same thing when
      // all of the servers for a shard are down.
      throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE,
          "no servers hosting shard: " + shard);
    }
    if (urls.size() <= 1) {
      String url = urls.get(0);
      srsp.setShardAddress(url);
      SolrServer server = new HttpSolrServer(url, httpClient);
      ssr.nl = server.request(req);
    } else {
      LBHttpSolrServer.Rsp rsp = httpShardHandlerFactory.loadbalancer.request(new LBHttpSolrServer.Req(req, urls));
      ssr.nl = rsp.getResponse();
      srsp.setShardAddress(rsp.getServer());
    }
  } catch (ConnectException cex) {
    srsp.setException(cex); // ????
  } catch (Throwable th) {
    srsp.setException(th);
    if (th instanceof SolrException) {
      srsp.setResponseCode(((SolrException) th).code());
    } else {
      srsp.setResponseCode(-1);
    }
  }
  ssr.elapsedTime = System.currentTimeMillis() - startTime;
  return srsp;
}
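Note the design choice in this Callable: no failure is allowed to escape the executor. Even Throwable is caught and recorded on the response object, so the coordinating thread can decide later whether to tolerate, retry, or rethrow. A minimal sketch of the pattern under that assumption; Response and doWork() are hypothetical placeholders, not the Solr types:

    import java.util.concurrent.Callable;
    import org.apache.solr.common.SolrException;

    public class RecordingTask implements Callable<RecordingTask.Response> {
      // Hypothetical response holder standing in for ShardResponse.
      public static class Response {
        public Throwable exception;
        public int responseCode;
        public Object result;
      }

      // Hypothetical unit of work; the HTTP request to a shard plays this role above.
      protected Object doWork() throws Exception {
        return "ok";
      }

      @Override
      public Response call() {
        Response r = new Response();
        try {
          r.result = doWork();
        } catch (Throwable th) {
          // record the failure on the response instead of letting it escape
          // the executor; keep the HTTP code when it is a SolrException
          r.exception = th;
          r.responseCode = (th instanceof SolrException) ? ((SolrException) th).code() : -1;
        }
        return r;
      }
    }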
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
Map<String, ElevationObj> getElevationMap(IndexReader reader, SolrCore core) throws Exception {
  synchronized (elevationCache) {
    Map<String, ElevationObj> map = elevationCache.get(null);
    if (map != null) return map;
    map = elevationCache.get(reader);
    if (map == null) {
      String f = initArgs.get(CONFIG_FILE);
      if (f == null) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "QueryElevationComponent must specify argument: " + CONFIG_FILE);
      }
      log.info("Loading QueryElevation from data dir: " + f);
      Config cfg;
      ZkController zkController = core.getCoreDescriptor().getCoreContainer().getZkController();
      if (zkController != null) {
        cfg = new Config(core.getResourceLoader(), f, null, null);
      } else {
        InputStream is = VersionedFile.getLatestFile(core.getDataDir(), f);
        cfg = new Config(core.getResourceLoader(), f, new InputSource(is), null);
      }
      map = loadElevationMap(cfg);
      elevationCache.put(reader, map);
    }
    return map;
  }
}
// in core/src/java/org/apache/solr/handler/component/SearchHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception, ParseException, InstantiationException, IllegalAccessException {
  // int sleep = req.getParams().getInt("sleep",0);
  // if (sleep > 0) {log.error("SLEEPING for " + sleep); Thread.sleep(sleep);}
  ResponseBuilder rb = new ResponseBuilder(req, rsp, components);
  if (rb.requestInfo != null) {
    rb.requestInfo.setResponseBuilder(rb);
  }
  boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false);
  rb.setDebug(dbg);
  if (dbg == false) { // if it's true, we are doing everything anyway.
    SolrPluginUtils.getDebugInterests(req.getParams().getParams(CommonParams.DEBUG), rb);
  }
  final RTimer timer = rb.isDebug() ? new RTimer() : null;
  ShardHandler shardHandler1 = shardHandlerFactory.getShardHandler();
  shardHandler1.checkDistributed(rb);
  if (timer == null) {
    // non-debugging prepare phase
    for (SearchComponent c : components) {
      c.prepare(rb);
    }
  } else {
    // debugging prepare phase
    RTimer subt = timer.sub("prepare");
    for (SearchComponent c : components) {
      rb.setTimer(subt.sub(c.getName()));
      c.prepare(rb);
      rb.getTimer().stop();
    }
    subt.stop();
  }
  if (!rb.isDistrib) {
    // a normal non-distributed request
    // The semantics of debugging vs not debugging are different enough that
    // it makes sense to have two control loops
    if (!rb.isDebug()) {
      // Process
      for (SearchComponent c : components) {
        c.process(rb);
      }
    } else {
      // Process
      RTimer subt = timer.sub("process");
      for (SearchComponent c : components) {
        rb.setTimer(subt.sub(c.getName()));
        c.process(rb);
        rb.getTimer().stop();
      }
      subt.stop();
      timer.stop();
      // add the timing info
      if (rb.isDebugTimings()) {
        rb.addDebugInfo("timing", timer.asNamedList());
      }
    }
  } else {
    // a distributed request
    if (rb.outgoing == null) {
      rb.outgoing = new LinkedList<ShardRequest>();
    }
    rb.finished = new ArrayList<ShardRequest>();
    int nextStage = 0;
    do {
      rb.stage = nextStage;
      nextStage = ResponseBuilder.STAGE_DONE;
      // call all components
      for (SearchComponent c : components) {
        // the next stage is the minimum of what all components report
        nextStage = Math.min(nextStage, c.distributedProcess(rb));
      }
      // check the outgoing queue and send requests
      while (rb.outgoing.size() > 0) {
        // submit all current request tasks at once
        while (rb.outgoing.size() > 0) {
          ShardRequest sreq = rb.outgoing.remove(0);
          sreq.actualShards = sreq.shards;
          if (sreq.actualShards == ShardRequest.ALL_SHARDS) {
            sreq.actualShards = rb.shards;
          }
          sreq.responses = new ArrayList<ShardResponse>();
          // TODO: map from shard to address[]
          for (String shard : sreq.actualShards) {
            ModifiableSolrParams params = new ModifiableSolrParams(sreq.params);
            params.remove(ShardParams.SHARDS);      // not a top-level request
            params.set("distrib", "false");         // not a top-level request
            params.remove("indent");
            params.remove(CommonParams.HEADER_ECHO_PARAMS);
            params.set(ShardParams.IS_SHARD, true); // a sub (shard) request
            params.set(ShardParams.SHARD_URL, shard); // so the shard knows what was asked
            if (rb.requestInfo != null) {
              // we could try and detect when this is needed, but it could be tricky
              params.set("NOW", Long.toString(rb.requestInfo.getNOW().getTime()));
            }
            String shardQt = params.get(ShardParams.SHARDS_QT);
            if (shardQt == null) {
              params.remove(CommonParams.QT);
            } else {
              params.set(CommonParams.QT, shardQt);
            }
            shardHandler1.submit(sreq, shard, params);
          }
        }
        // now wait for replies, but if anyone puts more requests on
        // the outgoing queue, send them out immediately (by exiting
        // this loop)
        boolean tolerant = rb.req.getParams().getBool(ShardParams.SHARDS_TOLERANT, false);
        while (rb.outgoing.size() == 0) {
          ShardResponse srsp = tolerant
              ? shardHandler1.takeCompletedIncludingErrors()
              : shardHandler1.takeCompletedOrError();
          if (srsp == null) break; // no more requests to wait for
          // Was there an exception?
          if (srsp.getException() != null) {
            // If things are not tolerant, abort everything and rethrow
            if (!tolerant) {
              shardHandler1.cancelAll();
              if (srsp.getException() instanceof SolrException) {
                throw (SolrException) srsp.getException();
              } else {
                throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, srsp.getException());
              }
            }
          }
          rb.finished.add(srsp.getShardRequest());
          // let the components see the responses to the request
          for (SearchComponent c : components) {
            c.handleResponses(rb, srsp.getShardRequest());
          }
        }
      }
      for (SearchComponent c : components) {
        c.finishStage(rb);
      }
      // we are done when the next stage is MAX_VALUE
    } while (nextStage != Integer.MAX_VALUE);
  }
}
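This is the receiving side of the shard pattern: the exception that the HttpShardHandler callable recorded is rethrown on the coordinating node, cast back if it is already a SolrException (so the original 400/404/503 code survives), wrapped as a 500 otherwise. A minimal sketch of the rethrow-or-wrap step:

    import org.apache.solr.common.SolrException;

    public final class RethrowSketch {
      // Rethrow a recorded shard failure, keeping the original error code when possible.
      public static void rethrowOrWrap(Throwable recorded) {
        if (recorded instanceof SolrException) {
          throw (SolrException) recorded; // preserve the original HTTP code
        }
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, recorded); // default to 500
      }
    }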
// in core/src/java/org/apache/solr/handler/ContentStreamHandlerBase.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  SolrParams params = req.getParams();
  UpdateRequestProcessorChain processorChain = req.getCore().getUpdateProcessingChain(params.get(UpdateParams.UPDATE_CHAIN));
  UpdateRequestProcessor processor = processorChain.createProcessor(req, rsp);
  try {
    ContentStreamLoader documentLoader = newLoader(req, processor);
    Iterable<ContentStream> streams = req.getContentStreams();
    if (streams == null) {
      if (!RequestHandlerUtils.handleCommit(req, processor, params, false)
          && !RequestHandlerUtils.handleRollback(req, processor, params, false)) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "missing content stream");
      }
    } else {
      for (ContentStream stream : streams) {
        documentLoader.load(req, rsp, stream, processor);
      }
      // Perhaps commit from the parameters
      RequestHandlerUtils.handleCommit(req, processor, params, false);
      RequestHandlerUtils.handleRollback(req, processor, params, false);
    }
  } finally {
    // finish the request
    processor.finish();
  }
}
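This handler uses try/finally with no catch at all: the processor is always finished, while any exception still propagates untouched to the dispatcher. It is the shape behind many of the "transmitting" methods counted in the statistics. A minimal sketch under the same shape; Processor here is a hypothetical interface, not the Solr type:

    public class CleanupSketch {
      // Hypothetical processor; UpdateRequestProcessor plays this role above.
      interface Processor {
        void process() throws Exception;
        void finish();
      }

      public static void run(Processor p) throws Exception {
        try {
          p.process();
        } finally {
          p.finish(); // runs whether process() returned or threw; the exception still propagates
        }
      }
    }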
// in core/src/java/org/apache/solr/handler/admin/PluginInfoHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  SolrParams params = req.getParams();
  boolean stats = params.getBool("stats", false);
  rsp.add("plugins", getSolrInfoBeans(req.getCore(), stats));
  rsp.setHttpCaching(false);
}
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  IndexSchema schema = req.getSchema();
  SolrIndexSearcher searcher = req.getSearcher();
  DirectoryReader reader = searcher.getIndexReader();
  SolrParams params = req.getParams();
  ShowStyle style = ShowStyle.get(params.get("show"));
  // If no doc is given, show all fields and top terms
  rsp.add("index", getIndexInfo(reader));
  if (ShowStyle.INDEX == style) {
    return; // that's all we need
  }
  Integer docId = params.getInt(DOC_ID);
  if (docId == null && params.get(ID) != null) {
    // Look for something with a given solr ID
    SchemaField uniqueKey = schema.getUniqueKeyField();
    String v = uniqueKey.getType().toInternal(params.get(ID));
    Term t = new Term(uniqueKey.getName(), v);
    docId = searcher.getFirstMatch(t);
    if (docId < 0) {
      throw new SolrException(SolrException.ErrorCode.NOT_FOUND, "Can't find document: " + params.get(ID));
    }
  }
  // Read the document from the index
  if (docId != null) {
    if (style != null && style != ShowStyle.DOC) {
      throw new SolrException(ErrorCode.BAD_REQUEST, "missing doc param for doc style");
    }
    Document doc = null;
    try {
      doc = reader.document(docId);
    } catch (Exception ex) {}
    if (doc == null) {
      throw new SolrException(SolrException.ErrorCode.NOT_FOUND, "Can't find document: " + docId);
    }
    SimpleOrderedMap<Object> info = getDocumentFieldsInfo(doc, docId, reader, schema);
    SimpleOrderedMap<Object> docinfo = new SimpleOrderedMap<Object>();
    docinfo.add("docId", docId);
    docinfo.add("lucene", info);
    docinfo.add("solr", doc);
    rsp.add("doc", docinfo);
  } else if (ShowStyle.SCHEMA == style) {
    rsp.add("schema", getSchemaInfo(req.getSchema()));
  } else {
    rsp.add("fields", getIndexedFieldsInfo(req));
  }
  // Add some generally helpful information
  NamedList<Object> info = new SimpleOrderedMap<Object>();
  info.add("key", getFieldFlagsKey());
  info.add("NOTE", "Document Frequency (df) is not updated when a document is marked for deletion. df values include deleted documents.");
  rsp.add("info", info);
  rsp.setHttpCaching(false);
}
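The bare catch (Exception ex) {} around reader.document(docId) is one of the empty catch blocks counted in the statistics, but it is not an accident: every failure mode is deliberately folded into the null check that follows, which converts it into a uniform 404. A minimal sketch of that idiom; lookup() is a hypothetical stand-in for reader.document():

    import org.apache.solr.common.SolrException;

    public class SwallowSketch {
      // Hypothetical lookup that can fail in several unrelated ways.
      static String lookup(int id) throws Exception {
        return id >= 0 ? "doc" + id : null;
      }

      public static String fetchOr404(int id) {
        String doc = null;
        try {
          doc = lookup(id);
        } catch (Exception ignored) {
          // deliberately empty: every failure mode is folded into the null check below
        }
        if (doc == null) {
          throw new SolrException(SolrException.ErrorCode.NOT_FOUND, "Can't find document: " + id);
        }
        return doc;
      }
    }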
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
private static SimpleOrderedMap<Object> getIndexedFieldsInfo(SolrQueryRequest req) throws Exception {
  SolrIndexSearcher searcher = req.getSearcher();
  SolrParams params = req.getParams();
  Set<String> fields = null;
  String fl = params.get(CommonParams.FL);
  if (fl != null) {
    fields = new TreeSet<String>(Arrays.asList(fl.split("[,\\s]+")));
  }
  AtomicReader reader = searcher.getAtomicReader();
  IndexSchema schema = searcher.getSchema();
  // Don't be tempted to put this in the loop below, the whole point here is to alphabetize the fields!
  Set<String> fieldNames = new TreeSet<String>();
  for (FieldInfo fieldInfo : reader.getFieldInfos()) {
    fieldNames.add(fieldInfo.name);
  }
  // Walk the term enum and keep a priority queue for each map in our set
  SimpleOrderedMap<Object> finfo = new SimpleOrderedMap<Object>();
  for (String fieldName : fieldNames) {
    if (fields != null && !fields.contains(fieldName) && !fields.contains("*")) {
      continue; // we're not interested in this field. Still an issue here
    }
    SimpleOrderedMap<Object> fieldMap = new SimpleOrderedMap<Object>();
    SchemaField sfield = schema.getFieldOrNull(fieldName);
    FieldType ftype = (sfield == null) ? null : sfield.getType();
    fieldMap.add("type", (ftype == null) ? null : ftype.getTypeName());
    fieldMap.add("schema", getFieldFlags(sfield));
    if (sfield != null && schema.isDynamicField(sfield.getName()) && schema.getDynamicPattern(sfield.getName()) != null) {
      fieldMap.add("dynamicBase", schema.getDynamicPattern(sfield.getName()));
    }
    Terms terms = reader.fields().terms(fieldName);
    if (terms == null) {
      // Not indexed, so we need to report what we can (it made it through the fl param if specified)
      finfo.add(fieldName, fieldMap);
      continue;
    }
    if (sfield != null && sfield.indexed()) {
      // In the pre-4.0 days, this did a veeeery expensive range query. But we can be much faster now,
      // so just do this all the time.
      Document doc = getFirstLiveDoc(reader, fieldName, terms);
      if (doc != null) {
        // Found a document with this field
        try {
          IndexableField fld = doc.getField(fieldName);
          if (fld != null) {
            fieldMap.add("index", getFieldFlags(fld));
          } else {
            // it is a non-stored field...
            fieldMap.add("index", "(unstored field)");
          }
        } catch (Exception ex) {
          log.warn("error reading field: " + fieldName);
        }
      }
      fieldMap.add("docs", terms.getDocCount());
    }
    if (fields != null && (fields.contains(fieldName) || fields.contains("*"))) {
      getDetailedFieldInfo(req, fieldName, fieldMap);
    }
    // Add the field
    finfo.add(fieldName, fieldMap);
  }
  return finfo;
}
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  NamedList<NamedList<NamedList<Object>>> cats = getMBeanInfo(req);
  if (req.getParams().getBool("diff", false)) {
    ContentStream body = null;
    try {
      body = req.getContentStreams().iterator().next();
    } catch (Exception ex) {
      throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff");
    }
    String content = IOUtils.toString(body.getReader());
    NamedList<NamedList<NamedList<Object>>> ref = fromXML(content);
    // Normalize the output
    SolrQueryResponse wrap = new SolrQueryResponse();
    wrap.add("solr-mbeans", cats);
    cats = (NamedList<NamedList<NamedList<Object>>>) BinaryResponseWriter.getParsedResponse(req, wrap).get("solr-mbeans");
    // Get rid of irrelevant things
    ref = normalize(ref);
    cats = normalize(cats);
    // Only the changes
    boolean showAll = req.getParams().getBool("all", false);
    rsp.add("solr-mbeans", getDiff(ref, cats, showAll));
  } else {
    rsp.add("solr-mbeans", cats);
  }
  rsp.setHttpCaching(false); // never cache, no matter what init config looks like
}
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  // Make sure the cores is enabled
  CoreContainer cores = getCoreContainer();
  if (cores == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core container instance missing");
  }
  boolean doPersist = false;
  // Pick the action
  SolrParams params = req.getParams();
  CoreAdminAction action = CoreAdminAction.STATUS;
  String a = params.get(CoreAdminParams.ACTION);
  if (a != null) {
    action = CoreAdminAction.get(a);
    if (action == null) {
      doPersist = this.handleCustomAction(req, rsp);
    }
  }
  if (action != null) {
    switch (action) {
      case CREATE: {
        doPersist = this.handleCreateAction(req, rsp);
        break;
      }
      case RENAME: {
        doPersist = this.handleRenameAction(req, rsp);
        break;
      }
      case UNLOAD: {
        doPersist = this.handleUnloadAction(req, rsp);
        break;
      }
      case STATUS: {
        doPersist = this.handleStatusAction(req, rsp);
        break;
      }
      case PERSIST: {
        doPersist = this.handlePersistAction(req, rsp);
        break;
      }
      case RELOAD: {
        doPersist = this.handleReloadAction(req, rsp);
        break;
      }
      case SWAP: {
        doPersist = this.handleSwapAction(req, rsp);
        break;
      }
      case MERGEINDEXES: {
        doPersist = this.handleMergeAction(req, rsp);
        break;
      }
      case PREPRECOVERY: {
        this.handleWaitForStateAction(req, rsp);
        break;
      }
      case REQUESTRECOVERY: {
        this.handleRequestRecoveryAction(req, rsp);
        break;
      }
      case DISTRIBURL: {
        this.handleDistribUrlAction(req, rsp);
        break;
      }
      default: {
        doPersist = this.handleCustomAction(req, rsp);
        break;
      }
      case LOAD:
        break;
    }
  }
  // Should we persist the changes?
  if (doPersist) {
    cores.persist();
    rsp.add("saved", cores.getConfigFile().getAbsolutePath());
  }
  rsp.setHttpCaching(false);
}
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  rsp.add("core", getCoreInfo(req.getCore()));
  rsp.add("lucene", getLuceneInfo());
  rsp.add("jvm", getJvmInfo());
  rsp.add("system", getSystemInfo());
  rsp.setHttpCaching(false);
}
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
private SimpleOrderedMap<Object> getCoreInfo(SolrCore core) throws Exception {
  SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>();
  IndexSchema schema = core.getSchema();
  info.add("schema", schema != null ? schema.getSchemaName() : "no schema!");
  // Host
  info.add("host", hostname);
  // Now
  info.add("now", new Date());
  // Start Time
  info.add("start", new Date(core.getStartTime()));
  // Solr Home
  SimpleOrderedMap<Object> dirs = new SimpleOrderedMap<Object>();
  dirs.add("cwd", new File(System.getProperty("user.dir")).getAbsolutePath());
  dirs.add("instance", new File(core.getResourceLoader().getInstanceDir()).getAbsolutePath());
  dirs.add("data", new File(core.getDataDir()).getAbsolutePath());
  dirs.add("index", new File(core.getIndexDir()).getAbsolutePath());
  info.add("directory", dirs);
  return info;
}
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
public static SimpleOrderedMap<Object> getSystemInfo() throws Exception {
  SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>();
  OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
  info.add("name", os.getName());
  info.add("version", os.getVersion());
  info.add("arch", os.getArch());
  info.add("systemLoadAverage", os.getSystemLoadAverage());
  // com.sun.management.OperatingSystemMXBean
  addGetterIfAvaliable(os, "committedVirtualMemorySize", info);
  addGetterIfAvaliable(os, "freePhysicalMemorySize", info);
  addGetterIfAvaliable(os, "freeSwapSpaceSize", info);
  addGetterIfAvaliable(os, "processCpuTime", info);
  addGetterIfAvaliable(os, "totalPhysicalMemorySize", info);
  addGetterIfAvaliable(os, "totalSwapSpaceSize", info);
  // com.sun.management.UnixOperatingSystemMXBean
  addGetterIfAvaliable(os, "openFileDescriptorCount", info);
  addGetterIfAvaliable(os, "maxFileDescriptorCount", info);
  try {
    if (!os.getName().toLowerCase(Locale.ENGLISH).startsWith("windows")) {
      // Try some command line things
      info.add("uname", execute("uname -a"));
      info.add("uptime", execute("uptime"));
    }
  } catch (Throwable ex) {
    ex.printStackTrace();
  }
  return info;
}
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
private static SimpleOrderedMap<Object> getLuceneInfo() throws Exception {
  SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>();
  Package p = SolrCore.class.getPackage();
  info.add("solr-spec-version", p.getSpecificationVersion());
  info.add("solr-impl-version", p.getImplementationVersion());
  p = LucenePackage.class.getPackage();
  info.add("lucene-spec-version", p.getSpecificationVersion());
  info.add("lucene-impl-version", p.getImplementationVersion());
  return info;
}
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  // Don't do anything if the framework is unknown
  if (watcher == null) {
    rsp.add("error", "Logging Not Initalized");
    return;
  }
  rsp.add("watcher", watcher.getName());
  SolrParams params = req.getParams();
  if (params.get("threshold") != null) {
    watcher.setThreshold(params.get("threshold"));
  }
  // Write something at each level
  if (params.get("test") != null) {
    log.trace("trace message");
    log.debug("debug message");
    log.info("info (with exception)", new RuntimeException("test"));
    log.warn("warn (with exception)", new RuntimeException("test"));
    log.error("error (with exception)", new RuntimeException("test"));
  }
  String[] set = params.getParams("set");
  if (set != null) {
    for (String pair : set) {
      String[] split = pair.split(":");
      if (split.length != 2) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "Invalid format, expected level:value, got " + pair);
      }
      String category = split[0];
      String level = split[1];
      watcher.setLogLevel(category, level);
    }
  }
  String since = req.getParams().get("since");
  if (since != null) {
    long time = -1;
    try {
      time = Long.parseLong(since);
    } catch (Exception ex) {
      throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: " + since);
    }
    AtomicBoolean found = new AtomicBoolean(false);
    SolrDocumentList docs = watcher.getHistory(time, found);
    if (docs == null) {
      rsp.add("error", "History not enabled");
      return;
    } else {
      SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>();
      if (time > 0) {
        info.add("since", time);
        info.add("found", found);
      } else {
        info.add("levels", watcher.getAllLevels()); // show for the first request
      }
      info.add("last", watcher.getLastEvent());
      info.add("buffer", watcher.getHistorySize());
      info.add("threshold", watcher.getThreshold());
      rsp.add("info", info);
      rsp.add("history", docs);
    }
  } else {
    rsp.add("levels", watcher.getAllLevels());
    List<LoggerInfo> loggers = new ArrayList<LoggerInfo>(watcher.getAllLoggers());
    Collections.sort(loggers);
    List<SimpleOrderedMap<?>> info = new ArrayList<SimpleOrderedMap<?>>();
    for (LoggerInfo wrap : loggers) {
      info.add(wrap.getInfo());
    }
    rsp.add("loggers", info);
  }
  rsp.setHttpCaching(false);
}
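The 'since' parameter above shows the standard validation idiom for numeric request parameters: parse early, and translate the parse failure into a BAD_REQUEST before any work is done. A minimal sketch; the handler catches the broader Exception, while this sketch uses the narrower NumberFormatException:

    import org.apache.solr.common.SolrException;

    public class ParamSketch {
      // Validate a numeric request parameter up front, as the handler above
      // does for the 'since' timestamp.
      public static long parseSince(String since) {
        try {
          return Long.parseLong(since);
        } catch (NumberFormatException ex) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "invalid timestamp: " + since);
        }
      }
    }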
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  SolrParams params = req.getParams();
  // in this case, we want to default distrib to false so
  // we only ping the single node
  Boolean distrib = params.getBool("distrib");
  if (distrib == null) {
    ModifiableSolrParams mparams = new ModifiableSolrParams(params);
    mparams.set("distrib", false);
    req.setParams(mparams);
  }
  String actionParam = params.get("action");
  ACTIONS action = null;
  if (actionParam == null) {
    action = ACTIONS.PING;
  } else {
    try {
      action = ACTIONS.valueOf(actionParam.toUpperCase(Locale.ENGLISH));
    } catch (IllegalArgumentException iae) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Unknown action: " + actionParam);
    }
  }
  switch (action) {
    case PING:
      if (isPingDisabled()) {
        throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "Service disabled");
      }
      handlePing(req, rsp);
      break;
    case ENABLE:
      handleEnable(true);
      break;
    case DISABLE:
      handleEnable(false);
      break;
    case STATUS:
      if (healthcheck == null) {
        SolrException e = new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "healthcheck not configured");
        rsp.setException(e);
      } else {
        rsp.add("status", isPingDisabled() ? "disabled" : "enabled");
      }
  }
}
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
protected void handlePing(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  SolrParams params = req.getParams();
  SolrCore core = req.getCore();
  // Get the RequestHandler
  String qt = params.get(CommonParams.QT); // optional; you get the default otherwise
  SolrRequestHandler handler = core.getRequestHandler(qt);
  if (handler == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Unknown RequestHandler (qt): " + qt);
  }
  if (handler instanceof PingRequestHandler) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Cannot execute the PingRequestHandler recursively");
  }
  // Execute the ping query and catch any possible exception
  Throwable ex = null;
  try {
    SolrQueryResponse pingrsp = new SolrQueryResponse();
    core.execute(handler, req, pingrsp);
    ex = pingrsp.getException();
  } catch (Throwable th) {
    ex = th;
  }
  // Send an error or an 'OK' message (response code will be 200)
  if (ex != null) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "Ping query caused exception: " + ex.getMessage(), ex);
  }
  rsp.add("status", "OK");
}
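The shape here is catch-remember-decide: any Throwable from the delegated query is captured into a local, and only afterwards is it normalized into a single SERVER_ERROR that carries both the message and the original cause. A minimal sketch; doPing() is a hypothetical stand-in for core.execute(...):

    import org.apache.solr.common.SolrException;

    public class PingSketch {
      // Hypothetical ping action; core.execute(...) plays this role above.
      static void doPing() throws Exception {
      }

      public static void pingOr500() {
        Throwable failure = null;
        try {
          doPing();
        } catch (Throwable th) {
          failure = th; // remember now, decide later
        }
        if (failure != null) {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
              "Ping query caused exception: " + failure.getMessage(), failure);
        }
      }
    }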
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public void start() throws Exception { start(true); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public void start(boolean waitForSolr) throws Exception {
  // if started before, make a new server
  if (startedBefore) {
    waitOnSolr = false;
    init(solrHome, context, lastPort, stopAtShutdown);
  } else {
    startedBefore = true;
  }
  if (dataDir != null) {
    System.setProperty("solr.data.dir", dataDir);
  }
  if (shards != null) {
    System.setProperty("shard", shards);
  }
  if (!server.isRunning()) {
    server.start();
  }
  synchronized (JettySolrRunner.this) {
    int cnt = 0;
    while (!waitOnSolr) {
      this.wait(100);
      if (cnt++ == 5) {
        throw new RuntimeException("Jetty/Solr unresponsive");
      }
    }
  }
  System.clearProperty("shard");
  System.clearProperty("solr.data.dir");
}
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public void stop() throws Exception {
  if (!server.isStopped() && !server.isStopping()) {
    server.stop();
  }
  server.join();
}
// in core/src/java/org/apache/solr/SolrLogFormatter.java
public static void main(String[] args) throws Exception {
  Handler[] handlers = Logger.getLogger("").getHandlers();
  boolean foundConsoleHandler = false;
  for (int index = 0; index < handlers.length; index++) {
    // set console handler to SEVERE
    if (handlers[index] instanceof ConsoleHandler) {
      handlers[index].setLevel(Level.ALL);
      handlers[index].setFormatter(new SolrLogFormatter());
      foundConsoleHandler = true;
    }
  }
  if (!foundConsoleHandler) {
    // no console handler found
    System.err.println("No consoleHandler found, adding one.");
    ConsoleHandler consoleHandler = new ConsoleHandler();
    consoleHandler.setLevel(Level.ALL);
    consoleHandler.setFormatter(new SolrLogFormatter());
    Logger.getLogger("").addHandler(consoleHandler);
  }
  final org.slf4j.Logger log = LoggerFactory.getLogger(SolrLogFormatter.class);
  log.error("HELLO");
  ThreadGroup tg = new MyThreadGroup("YCS");
  Thread th = new Thread(tg, "NEW_THREAD") {
    @Override
    public void run() {
      try {
        go();
      } catch (Throwable e) {
        e.printStackTrace();
      }
    }
  };
  th.start();
  th.join();
}
// in core/src/java/org/apache/solr/SolrLogFormatter.java
public static void go() throws Exception {
  final org.slf4j.Logger log = LoggerFactory.getLogger(SolrLogFormatter.class);
  Thread thread1 = new Thread() {
    @Override
    public void run() {
      threadLocal.set("from thread1");
      log.error("[] webapp=/solr path=/select params={hello} wow");
    }
  };
  Thread thread2 = new Thread() {
    @Override
    public void run() {
      threadLocal.set("from thread2");
      log.error("InThread2");
    }
  };
  thread1.start();
  thread2.start();
  thread1.join();
  thread2.join();
}
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public Object getValue(SchemaField sf, IndexableField f) throws Exception {
  FieldType ft = null;
  if (sf != null) ft = sf.getType();
  if (ft == null) {
    // handle fields not in the schema
    BytesRef bytesRef = f.binaryValue();
    if (bytesRef != null) {
      if (bytesRef.offset == 0 && bytesRef.length == bytesRef.bytes.length) {
        return bytesRef.bytes;
      } else {
        final byte[] bytes = new byte[bytesRef.length];
        System.arraycopy(bytesRef.bytes, bytesRef.offset, bytes, 0, bytesRef.length);
        return bytes;
      }
    } else {
      return f.stringValue();
    }
  } else {
    if (useFieldObjects && KNOWN_TYPES.contains(ft.getClass())) {
      return ft.toObject(f);
    } else {
      return ft.toExternal(f);
    }
  }
}
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
NamedList<Integer> getFacetCounts(Executor executor) throws IOException {
  CompletionService<SegFacet> completionService = new ExecutorCompletionService<SegFacet>(executor);
  // reuse the translation logic to go from top level set to per-segment set
  baseSet = docs.getTopFilter();
  final AtomicReaderContext[] leaves = searcher.getTopReaderContext().leaves();
  // The list of pending tasks that aren't immediately submitted
  // TODO: Is there a completion service, or a delegating executor that can
  // limit the number of concurrent tasks submitted to a bigger executor?
  LinkedList<Callable<SegFacet>> pending = new LinkedList<Callable<SegFacet>>();
  int threads = nThreads <= 0 ? Integer.MAX_VALUE : nThreads;
  for (int i = 0; i < leaves.length; i++) {
    final SegFacet segFacet = new SegFacet(leaves[i]);
    Callable<SegFacet> task = new Callable<SegFacet>() {
      public SegFacet call() throws Exception {
        segFacet.countTerms();
        return segFacet;
      }
    };
    // TODO: if limiting threads, submit by largest segment first?
    if (--threads >= 0) {
      completionService.submit(task);
    } else {
      pending.add(task);
    }
  }
  // now merge the per-segment results
  PriorityQueue<SegFacet> queue = new PriorityQueue<SegFacet>(leaves.length) {
    @Override
    protected boolean lessThan(SegFacet a, SegFacet b) {
      return a.tempBR.compareTo(b.tempBR) < 0;
    }
  };
  boolean hasMissingCount = false;
  int missingCount = 0;
  for (int i = 0; i < leaves.length; i++) {
    SegFacet seg = null;
    try {
      Future<SegFacet> future = completionService.take();
      seg = future.get();
      if (!pending.isEmpty()) {
        completionService.submit(pending.removeFirst());
      }
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    } catch (ExecutionException e) {
      Throwable cause = e.getCause();
      if (cause instanceof RuntimeException) {
        throw (RuntimeException) cause;
      } else {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "Error in per-segment faceting on field: " + fieldName, cause);
      }
    }
    if (seg.startTermIndex < seg.endTermIndex) {
      if (seg.startTermIndex == 0) {
        hasMissingCount = true;
        missingCount += seg.counts[0];
        seg.pos = 1;
      } else {
        seg.pos = seg.startTermIndex;
      }
      if (seg.pos < seg.endTermIndex) {
        seg.tenum = seg.si.getTermsEnum();
        seg.tenum.seekExact(seg.pos);
        seg.tempBR = seg.tenum.term();
        queue.add(seg);
      }
    }
  }
  FacetCollector collector;
  if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) {
    collector = new CountSortedFacetCollector(offset, limit, mincount);
  } else {
    collector = new IndexSortedFacetCollector(offset, limit, mincount);
  }
  BytesRef val = new BytesRef();
  while (queue.size() > 0) {
    SegFacet seg = queue.top();
    // make a shallow copy
    val.bytes = seg.tempBR.bytes;
    val.offset = seg.tempBR.offset;
    val.length = seg.tempBR.length;
    int count = 0;
    do {
      count += seg.counts[seg.pos - seg.startTermIndex];
      // TODO: OPTIMIZATION...
      // if mincount>0 then seg.pos++ can skip ahead to the next non-zero entry.
      seg.pos++;
      if (seg.pos >= seg.endTermIndex) {
        queue.pop();
        seg = queue.top();
      } else {
        seg.tempBR = seg.tenum.next();
        seg = queue.updateTop();
      }
    } while (seg != null && val.compareTo(seg.tempBR) == 0);
    boolean stop = collector.collect(val, count);
    if (stop) break;
  }
  NamedList<Integer> res = collector.getFacetCounts();
  // convert labels to readable form
  FieldType ft = searcher.getSchema().getFieldType(fieldName);
  int sz = res.size();
  for (int i = 0; i < sz; i++) {
    res.setName(i, ft.indexedToReadable(res.getName(i)));
  }
  if (missing) {
    if (!hasMissingCount) {
      missingCount = SimpleFacets.getFieldMissingCount(searcher, docs, fieldName);
    }
    res.add(null, missingCount);
  }
  return res;
}
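The two catch blocks around completionService.take()/get() are worth isolating: an InterruptedException re-asserts the thread's interrupted flag before being converted, and an ExecutionException is unwrapped so that the worker's original RuntimeException is rethrown rather than the executor's wrapper. A minimal sketch of just that translation step:

    import java.util.concurrent.CompletionService;
    import java.util.concurrent.ExecutionException;
    import org.apache.solr.common.SolrException;

    public class UnwrapSketch {
      // Take one task result, translating executor failures the way the
      // per-segment faceting code above does.
      public static <T> T takeOne(CompletionService<T> cs) {
        try {
          return cs.take().get();
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt(); // restore the flag before converting
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
        } catch (ExecutionException e) {
          Throwable cause = e.getCause();
          if (cause instanceof RuntimeException) {
            throw (RuntimeException) cause; // unwrap: rethrow the worker's original failure
          }
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "task failed", cause);
        }
      }
    }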
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
public SegFacet call() throws Exception { segFacet.countTerms(); return segFacet; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrQueryRequest parse(SolrCore core, String path, HttpServletRequest req) throws Exception {
  SolrRequestParser parser = standard;
  // TODO -- in the future, we could pick a different parser based on the request
  // Pick the parser from the request...
  ArrayList<ContentStream> streams = new ArrayList<ContentStream>(1);
  SolrParams params = parser.parseParamsAndFillStreams(req, streams);
  SolrQueryRequest sreq = buildRequestFrom(core, params, streams);
  // Handlers and login will want to know the path. If it contains a ':'
  // the handler could use it for RESTful URLs
  sreq.getContext().put("path", path);
  return sreq;
}
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrQueryRequest buildRequestFrom(SolrCore core, SolrParams params, Collection<ContentStream> streams) throws Exception {
  // The content type will be applied to all streaming content
  String contentType = params.get(CommonParams.STREAM_CONTENTTYPE);
  // Handle anything with a remoteURL
  String[] strs = params.getParams(CommonParams.STREAM_URL);
  if (strs != null) {
    if (!enableRemoteStreams) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Remote Streaming is disabled.");
    }
    for (final String url : strs) {
      ContentStreamBase stream = new ContentStreamBase.URLStream(new URL(url));
      if (contentType != null) {
        stream.setContentType(contentType);
      }
      streams.add(stream);
    }
  }
  // Handle streaming files
  strs = params.getParams(CommonParams.STREAM_FILE);
  if (strs != null) {
    if (!enableRemoteStreams) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Remote Streaming is disabled.");
    }
    for (final String file : strs) {
      ContentStreamBase stream = new ContentStreamBase.FileStream(new File(file));
      if (contentType != null) {
        stream.setContentType(contentType);
      }
      streams.add(stream);
    }
  }
  // Check for streams in the request parameters
  strs = params.getParams(CommonParams.STREAM_BODY);
  if (strs != null) {
    for (final String body : strs) {
      ContentStreamBase stream = new ContentStreamBase.StringStream(body);
      if (contentType != null) {
        stream.setContentType(contentType);
      }
      streams.add(stream);
    }
  }
  SolrQueryRequestBase q = new SolrQueryRequestBase(core, params) {};
  if (streams != null && streams.size() > 0) {
    q.setContentStreams(streams);
  }
  return q;
}
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams( final HttpServletRequest req, ArrayList<ContentStream> streams ) throws Exception { return new ServletSolrParams(req); }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams(final HttpServletRequest req, ArrayList<ContentStream> streams) throws Exception {
  // The javadocs for HttpServletRequest are clear that req.getReader() should take
  // care of any character encoding issues. BUT, there are problems while running on
  // some servlet containers: including Tomcat 5 and resin.
  //
  // Rather than return req.getReader(), this uses the default ContentStreamBase method
  // that checks for charset definitions in the ContentType.
  streams.add(new HttpRequestContentStream(req));
  return SolrRequestParsers.parseQueryString(req.getQueryString());
}
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams(final HttpServletRequest req, ArrayList<ContentStream> streams) throws Exception {
  if (!ServletFileUpload.isMultipartContent(req)) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Not multipart content! " + req.getContentType());
  }
  MultiMapSolrParams params = SolrRequestParsers.parseQueryString(req.getQueryString());
  // Create a factory for disk-based file items
  DiskFileItemFactory factory = new DiskFileItemFactory();
  // Set factory constraints
  // TODO - configure factory.setSizeThreshold(yourMaxMemorySize);
  // TODO - configure factory.setRepository(yourTempDirectory);
  // Create a new file upload handler
  ServletFileUpload upload = new ServletFileUpload(factory);
  upload.setSizeMax(uploadLimitKB * 1024);
  // Parse the request
  List items = upload.parseRequest(req);
  Iterator iter = items.iterator();
  while (iter.hasNext()) {
    FileItem item = (FileItem) iter.next();
    // If its a form field, put it in our parameter map
    if (item.isFormField()) {
      MultiMapSolrParams.addParam(item.getFieldName(), item.getString(), params.getMap());
    }
    // Add the stream
    else {
      streams.add(new FileItemContentStream(item));
    }
  }
  return params;
}
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams(final HttpServletRequest req, ArrayList<ContentStream> streams) throws Exception {
  String method = req.getMethod().toUpperCase(Locale.ENGLISH);
  if ("GET".equals(method) || "HEAD".equals(method)) {
    return new ServletSolrParams(req);
  }
  if ("POST".equals(method)) {
    String contentType = req.getContentType();
    if (contentType != null) {
      int idx = contentType.indexOf(';');
      if (idx > 0) { // remove the charset definition "; charset=utf-8"
        contentType = contentType.substring(0, idx);
      }
      if ("application/x-www-form-urlencoded".equals(contentType.toLowerCase(Locale.ENGLISH))) {
        return new ServletSolrParams(req); // just get the params from parameterMap
      }
      if (ServletFileUpload.isMultipartContent(req)) {
        return multipart.parseParamsAndFillStreams(req, streams);
      }
    }
    return raw.parseParamsAndFillStreams(req, streams);
  }
  throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unsupported method: " + method);
}
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
public String request(String pathAndParams, String body) throws Exception {
  String path = null;
  SolrParams params = null;
  int idx = pathAndParams.indexOf('?');
  if (idx > 0) {
    path = pathAndParams.substring(0, idx);
    params = SolrRequestParsers.parseQueryString(pathAndParams.substring(idx + 1));
  } else {
    path = pathAndParams;
    params = new MapSolrParams(new HashMap<String, String>());
  }
  return request(path, params, body);
}
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
public String request(String path, SolrParams params, String body) throws Exception {
  // Extract the handler from the path or params
  SolrRequestHandler handler = core.getRequestHandler(path);
  if (handler == null) {
    if ("/select".equals(path) || "/select/".equalsIgnoreCase(path)) {
      if (params == null) params = new MapSolrParams(new HashMap<String, String>());
      String qt = params.get(CommonParams.QT);
      handler = core.getRequestHandler(qt);
      if (handler == null) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unknown handler: " + qt);
      }
    }
  }
  if (handler == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unknown handler: " + path);
  }
  return request(handler, params, body);
}
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
public String request(SolrRequestHandler handler, SolrParams params, String body) throws Exception {
  if (params == null) params = new MapSolrParams(new HashMap<String, String>());
  // Make a stream for the 'body' content
  List<ContentStream> streams = new ArrayList<ContentStream>(1);
  if (body != null && body.length() > 0) {
    streams.add(new ContentStreamBase.StringStream(body));
  }
  SolrQueryRequest req = null;
  try {
    req = parser.buildRequestFrom(core, params, streams);
    SolrQueryResponse rsp = new SolrQueryResponse();
    SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp));
    core.execute(handler, req, rsp);
    if (rsp.getException() != null) {
      throw rsp.getException();
    }
    // Now write it out
    QueryResponseWriter responseWriter = core.getQueryResponseWriter(req);
    StringWriter out = new StringWriter();
    responseWriter.write(out, req, rsp);
    return out.toString();
  } finally {
    if (req != null) {
      req.close();
    }
    SolrRequestInfo.clearRequestInfo();
  }
}
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
@Override
protected FieldType create(ResourceLoader loader, String name, String className, Node node) throws Exception {
  FieldType ft = loader.newInstance(className, FieldType.class);
  ft.setTypeName(name);
  String expression = "./analyzer[@type='query']";
  Node anode = (Node) xpath.evaluate(expression, node, XPathConstants.NODE);
  Analyzer queryAnalyzer = readAnalyzer(anode);
  expression = "./analyzer[@type='multiterm']";
  anode = (Node) xpath.evaluate(expression, node, XPathConstants.NODE);
  Analyzer multiAnalyzer = readAnalyzer(anode);
  // An analyzer without a type specified, or with type="index"
  expression = "./analyzer[not(@type)] | ./analyzer[@type='index']";
  anode = (Node) xpath.evaluate(expression, node, XPathConstants.NODE);
  Analyzer analyzer = readAnalyzer(anode);
  // a custom similarity[Factory]
  expression = "./similarity";
  anode = (Node) xpath.evaluate(expression, node, XPathConstants.NODE);
  SimilarityFactory simFactory = IndexSchema.readSimilarity(loader, anode);
  if (queryAnalyzer == null) queryAnalyzer = analyzer;
  if (analyzer == null) analyzer = queryAnalyzer;
  if (multiAnalyzer == null) {
    multiAnalyzer = constructMultiTermAnalyzer(queryAnalyzer);
  }
  if (analyzer != null) {
    ft.setAnalyzer(analyzer);
    ft.setQueryAnalyzer(queryAnalyzer);
    if (ft instanceof TextField) ((TextField) ft).setMultiTermAnalyzer(multiAnalyzer);
  }
  if (simFactory != null) {
    ft.setSimilarity(simFactory.getSimilarity());
  }
  if (ft instanceof SchemaAware) {
    schemaAware.add((SchemaAware) ft);
  }
  return ft;
}
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
@Override
protected void init(FieldType plugin, Node node) throws Exception {
  Map<String, String> params = DOMUtil.toMapExcept(node.getAttributes(), "name", "class");
  plugin.setArgs(schema, params);
}
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
@Override
protected FieldType register(String name, FieldType plugin) throws Exception {
  log.trace("fieldtype defined: " + plugin);
  return fieldTypes.put(name, plugin);
}
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
private Analyzer readAnalyzer(Node node) throws XPathExpressionException {
  final SolrResourceLoader loader = schema.getResourceLoader();
  // parent node used to be passed in as "fieldtype"
  // if (!fieldtype.hasChildNodes()) return null;
  // Node node = DOMUtil.getChild(fieldtype,"analyzer");
  if (node == null) return null;
  NamedNodeMap attrs = node.getAttributes();
  String analyzerName = DOMUtil.getAttr(attrs, "class");
  if (analyzerName != null) {
    try {
      // No need to be core-aware as Analyzers are not in the core-aware list
      final Class<? extends Analyzer> clazz = loader.findClass(analyzerName, Analyzer.class);
      try {
        // first try to use a ctor with version parameter
        // (needed for many new Analyzers that have no default one anymore)
        Constructor<? extends Analyzer> cnstr = clazz.getConstructor(Version.class);
        final String matchVersionStr = DOMUtil.getAttr(attrs, LUCENE_MATCH_VERSION_PARAM);
        final Version luceneMatchVersion = (matchVersionStr == null)
            ? schema.getDefaultLuceneMatchVersion()
            : Config.parseLuceneVersionString(matchVersionStr);
        if (luceneMatchVersion == null) {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
              "Configuration Error: Analyzer '" + clazz.getName() + "' needs a 'luceneMatchVersion' parameter");
        }
        return cnstr.newInstance(luceneMatchVersion);
      } catch (NoSuchMethodException nsme) {
        // otherwise use default ctor
        return clazz.newInstance();
      }
    } catch (Exception e) {
      log.error("Cannot load analyzer: " + analyzerName, e);
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Cannot load analyzer: " + analyzerName, e);
    }
  }
  // Load the CharFilters
  final ArrayList<CharFilterFactory> charFilters = new ArrayList<CharFilterFactory>();
  AbstractPluginLoader<CharFilterFactory> charFilterLoader =
      new AbstractPluginLoader<CharFilterFactory>("[schema.xml] analyzer/charFilter", CharFilterFactory.class, false, false) {
    @Override
    protected void init(CharFilterFactory plugin, Node node) throws Exception {
      if (plugin != null) {
        final Map<String, String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
        String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
        plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
        plugin.init(params);
        charFilters.add(plugin);
      }
    }
    @Override
    protected CharFilterFactory register(String name, CharFilterFactory plugin) {
      return null; // used for map registration
    }
  };
  charFilterLoader.load(loader, (NodeList) xpath.evaluate("./charFilter", node, XPathConstants.NODESET));
  // Load the Tokenizer
  // Although an analyzer only allows a single Tokenizer, we load a list to make sure
  // the configuration is ok
  final ArrayList<TokenizerFactory> tokenizers = new ArrayList<TokenizerFactory>(1);
  AbstractPluginLoader<TokenizerFactory> tokenizerLoader =
      new AbstractPluginLoader<TokenizerFactory>("[schema.xml] analyzer/tokenizer", TokenizerFactory.class, false, false) {
    @Override
    protected void init(TokenizerFactory plugin, Node node) throws Exception {
      if (!tokenizers.isEmpty()) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "The schema defines multiple tokenizers for: " + node);
      }
      final Map<String, String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
      String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
      plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
      plugin.init(params);
      tokenizers.add(plugin);
    }
    @Override
    protected TokenizerFactory register(String name, TokenizerFactory plugin) {
      return null; // used for map registration
    }
  };
  tokenizerLoader.load(loader, (NodeList) xpath.evaluate("./tokenizer", node, XPathConstants.NODESET));
  // Make sure something was loaded
  if (tokenizers.isEmpty()) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "analyzer without class or tokenizer & filter list");
  }
  // Load the Filters
  final ArrayList<TokenFilterFactory> filters = new ArrayList<TokenFilterFactory>();
  AbstractPluginLoader<TokenFilterFactory> filterLoader =
      new AbstractPluginLoader<TokenFilterFactory>("[schema.xml] analyzer/filter", TokenFilterFactory.class, false, false) {
    @Override
    protected void init(TokenFilterFactory plugin, Node node) throws Exception {
      if (plugin != null) {
        final Map<String, String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
        String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
        plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
        plugin.init(params);
        filters.add(plugin);
      }
    }
    @Override
    protected TokenFilterFactory register(String name, TokenFilterFactory plugin) throws Exception {
      return null; // used for map registration
    }
  };
  filterLoader.load(loader, (NodeList) xpath.evaluate("./filter", node, XPathConstants.NODESET));
  return new TokenizerChain(charFilters.toArray(new CharFilterFactory[charFilters.size()]),
      tokenizers.get(0), filters.toArray(new TokenFilterFactory[filters.size()]));
}
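The inner try/catch at the top of readAnalyzer() is a reflection fallback: NoSuchMethodException is used as an expected signal, not an error, switching from the Version-aware constructor to the default one. A minimal sketch of just that step, with hypothetical parameter names:

    import java.lang.reflect.Constructor;

    public class CtorFallback {
      // Prefer a one-argument constructor, falling back to the no-arg one,
      // mirroring the Version-aware analyzer construction above.
      public static <T> T newInstance(Class<T> clazz, Class<?> argType, Object arg) throws Exception {
        try {
          Constructor<T> cnstr = clazz.getConstructor(argType);
          return cnstr.newInstance(arg);
        } catch (NoSuchMethodException nsme) {
          return clazz.newInstance(); // no such ctor: use the default one
        }
      }
    }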
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
@Override
protected void init(CharFilterFactory plugin, Node node) throws Exception {
  if (plugin != null) {
    final Map<String, String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
    String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
    plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
    plugin.init(params);
    charFilters.add(plugin);
  }
}
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
@Override
protected void init(TokenizerFactory plugin, Node node) throws Exception {
  if (!tokenizers.isEmpty()) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
        "The schema defines multiple tokenizers for: " + node);
  }
  final Map<String, String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
  String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
  plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
  plugin.init(params);
  tokenizers.add(plugin);
}
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
@Override
protected void init(TokenFilterFactory plugin, Node node) throws Exception {
  if (plugin != null) {
    final Map<String, String> params = DOMUtil.toMapExcept(node.getAttributes(), "class");
    String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM);
    plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName()));
    plugin.init(params);
    filters.add(plugin);
  }
}
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
@Override
protected TokenFilterFactory register(String name, TokenFilterFactory plugin) throws Exception {
  return null; // used for map registration
}
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
public IndexableField fromString(SchemaField field, String val, float boost) throws Exception {
  if (val == null || val.trim().length() == 0) {
    return null;
  }
  PreAnalyzedTokenizer parse = new PreAnalyzedTokenizer(new StringReader(val), parser);
  Field f = (Field) super.createField(field, val, boost);
  if (parse.getStringValue() != null) {
    f.setStringValue(parse.getStringValue());
  } else if (parse.getBinaryValue() != null) {
    f.setBytesValue(parse.getBinaryValue());
  } else {
    f.fieldType().setStored(false);
  }
  if (parse.hasTokenStream()) {
    f.fieldType().setIndexed(true);
    f.fieldType().setTokenized(true);
    f.setTokenStream(parse);
  }
  return f;
}
// in core/src/java/org/apache/solr/internal/csv/writer/CSVWriter.java
protected String writeValue(CSVField field, String value) throws Exception {
  if (config.isFixedWidth()) {
    if (value.length() < field.getSize()) {
      int fillPattern = config.getFill();
      if (field.overrideFill()) {
        fillPattern = field.getFill();
      }
      StringBuffer sb = new StringBuffer();
      int fillSize = (field.getSize() - value.length());
      char[] fill = new char[fillSize];
      Arrays.fill(fill, config.getFillChar());
      if (fillPattern == CSVConfig.FILLLEFT) {
        sb.append(fill);
        sb.append(value);
        value = sb.toString();
      } else {
        // defaults to fillpattern FILLRIGHT when fixedwidth is used
        sb.append(value);
        sb.append(fill);
        value = sb.toString();
      }
    } else if (value.length() > field.getSize()) {
      // value too big: truncate it to the field size
      value = value.substring(0, field.getSize());
    }
    if (!config.isValueDelimiterIgnored()) {
      // add the value delimiter
      value = config.getValueDelimiter() + value + config.getValueDelimiter();
    }
  }
  return value;
}
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  FileFloatSource.resetCache();
  log.debug("readerCache has been reset.");
  UpdateRequestProcessor processor = req.getCore().getUpdateProcessingChain(null).createProcessor(req, rsp);
  try {
    RequestHandlerUtils.handleCommit(req, processor, req.getParams(), true);
  } finally {
    processor.finish();
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void main(String[] args) throws Exception {
  // start up a tmp zk server first
  String zkServerAddress = args[0];
  String solrHome = args[1];
  String solrPort = null;
  if (args.length > 2) {
    solrPort = args[2];
  }
  SolrZkServer zkServer = null;
  if (solrPort != null) {
    zkServer = new SolrZkServer("true", null, solrHome + "/zoo_data", solrHome, solrPort);
    zkServer.parseConfig();
    zkServer.start();
  }
  SolrZkClient zkClient = new SolrZkClient(zkServerAddress, 15000, 5000, new OnReconnect() {
    @Override
    public void command() {
    }
  });
  SolrResourceLoader loader = new SolrResourceLoader(solrHome);
  solrHome = loader.getInstanceDir();
  InputSource cfgis = new InputSource(new File(solrHome, "solr.xml").toURI().toASCIIString());
  Config cfg = new Config(loader, null, cfgis, null, false);
  bootstrapConf(zkClient, cfg, solrHome);
  if (solrPort != null) {
    zkServer.stop();
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String register(String coreName, final CoreDescriptor desc) throws Exception { return register(coreName, desc, false); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String register(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores) throws Exception {
  final String baseUrl = getBaseUrl();
  final CloudDescriptor cloudDesc = desc.getCloudDescriptor();
  final String collection = cloudDesc.getCollectionName();
  final String coreZkNodeName = getNodeName() + "_" + coreName;
  String shardId = cloudDesc.getShardId();
  Map<String,String> props = new HashMap<String,String>();
  // we only put a subset of props into the leader node
  props.put(ZkStateReader.BASE_URL_PROP, baseUrl);
  props.put(ZkStateReader.CORE_NAME_PROP, coreName);
  props.put(ZkStateReader.NODE_NAME_PROP, getNodeName());
  if (log.isInfoEnabled()) {
    log.info("Register shard - core:" + coreName + " address:" + baseUrl + " shardId:" + shardId);
  }
  ZkNodeProps leaderProps = new ZkNodeProps(props);
  try {
    joinElection(desc);
  } catch (InterruptedException e) {
    // Restore the interrupted status
    Thread.currentThread().interrupt();
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (KeeperException e) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  } catch (IOException e) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
  // rather than look in the cluster state file, we go straight to the zknodes
  // here, because on cluster restart there could be stale leader info in the
  // cluster state node that won't be updated for a moment
  String leaderUrl = getLeaderProps(collection, cloudDesc.getShardId()).getCoreUrl();
  // now wait until our current cloud state contains the latest leader
  String cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000);
  int tries = 0;
  while (!leaderUrl.equals(cloudStateLeader)) {
    if (tries == 60) {
      throw new SolrException(ErrorCode.SERVER_ERROR,
          "There is conflicting information about the leader of shard: " + cloudDesc.getShardId());
    }
    Thread.sleep(1000);
    tries++;
    cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000);
  }
  String ourUrl = ZkCoreNodeProps.getCoreUrl(baseUrl, coreName);
  log.info("We are " + ourUrl + " and leader is " + leaderUrl);
  boolean isLeader = leaderUrl.equals(ourUrl);
  SolrCore core = null;
  if (cc != null) { // CoreContainer only null in tests
    try {
      core = cc.getCore(desc.getName());
      // recover from local transaction log and wait for it to complete before
      // going active
      // TODO: should this be moved to another thread? To recoveryStrat?
      // TODO: should this actually be done earlier, before (or as part of)
      // leader election perhaps?
      // TODO: if I'm the leader, ensure that a replica that is trying to recover waits
      // until I'm active (or don't make me the leader until my local replay is done).
      UpdateLog ulog = core.getUpdateHandler().getUpdateLog();
      if (!core.isReloaded() && ulog != null) {
        Future<UpdateLog.RecoveryInfo> recoveryFuture = core.getUpdateHandler().getUpdateLog().recoverFromLog();
        if (recoveryFuture != null) {
          recoveryFuture.get(); // NOTE: this could potentially block for minutes or more!
          // TODO: publish as recovering in the mean time?
          // TODO: in the future we could do peersync in parallel with recoverFromLog
        } else {
          log.info("No LogReplay needed for core=" + core.getName() + " baseURL=" + baseUrl);
        }
      }
      boolean didRecovery = checkRecovery(coreName, desc, recoverReloadedCores, isLeader,
          cloudDesc, collection, coreZkNodeName, shardId, leaderProps, core, cc);
      if (!didRecovery) {
        publishAsActive(baseUrl, desc, coreZkNodeName, coreName);
      }
    } finally {
      if (core != null) {
        core.close();
      }
    }
  } else {
    publishAsActive(baseUrl, desc, coreZkNodeName, coreName);
  }
  // make sure we have an updated cluster state right away
  zkStateReader.updateCloudState(true);
  return shardId;
}
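The three catch clauses around joinElection show the project's standard treatment of InterruptedException: restore the thread's interrupted status before wrapping the cause in a domain RuntimeException (ZooKeeperException here). A minimal, self-contained sketch of the idiom; the class and method names below are illustrative, not from Solr:

    public class InterruptIdiom {
      // stand-in for a blocking ZooKeeper call
      static void doZkWork() throws InterruptedException { Thread.sleep(10); }

      static void runSafely() {
        try {
          doZkWork();
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt(); // restore the interrupted status
          throw new RuntimeException("interrupted while talking to ZooKeeper", e);
        }
      }

      public static void main(String[] args) {
        Thread.currentThread().interrupt(); // simulate an interrupt arriving
        try {
          runSafely();
        } catch (RuntimeException e) {
          System.out.println("wrapped cause: " + e.getCause());
          System.out.println("flag restored: " + Thread.interrupted());
        }
      }
    }

Restoring the flag matters because wrapping the exception would otherwise hide the interrupt from callers that poll the thread's interrupted status.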
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
public void submit(final Request sreq) {
  if (completionService == null) {
    completionService = new ExecutorCompletionService<Request>(commExecutor);
    pending = new HashSet<Future<Request>>();
  }
  final String url = sreq.node.getUrl();
  Callable<Request> task = new Callable<Request>() {
    @Override
    public Request call() throws Exception {
      Request clonedRequest = new Request();
      clonedRequest.node = sreq.node;
      clonedRequest.ureq = sreq.ureq;
      clonedRequest.retries = sreq.retries;
      try {
        String fullUrl;
        if (!url.startsWith("http://") && !url.startsWith("https://")) {
          fullUrl = "http://" + url;
        } else {
          fullUrl = url;
        }
        HttpSolrServer server = new HttpSolrServer(fullUrl, client);
        clonedRequest.ursp = server.request(clonedRequest.ureq);
        // currently no way to get the request body.
      } catch (Exception e) {
        clonedRequest.exception = e;
        if (e instanceof SolrException) {
          clonedRequest.rspCode = ((SolrException) e).code();
        } else {
          clonedRequest.rspCode = -1;
        }
      }
      return clonedRequest;
    }
  };
  pending.add(completionService.submit(task));
}
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
@Override
public Request call() throws Exception {
  Request clonedRequest = new Request();
  clonedRequest.node = sreq.node;
  clonedRequest.ureq = sreq.ureq;
  clonedRequest.retries = sreq.retries;
  try {
    String fullUrl;
    if (!url.startsWith("http://") && !url.startsWith("https://")) {
      fullUrl = "http://" + url;
    } else {
      fullUrl = url;
    }
    HttpSolrServer server = new HttpSolrServer(fullUrl, client);
    clonedRequest.ursp = server.request(clonedRequest.ureq);
    // currently no way to get the request body.
  } catch (Exception e) {
    clonedRequest.exception = e;
    if (e instanceof SolrException) {
      clonedRequest.rspCode = ((SolrException) e).code();
    } else {
      clonedRequest.rspCode = -1;
    }
  }
  return clonedRequest;
}
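Both SolrCmdDistributor snippets rely on the same defensive pattern: the Callable records any failure on the cloned request (the exception plus a response code, -1 when it is not a SolrException) instead of letting it propagate through the Future. A hedged sketch of the pattern with an illustrative result holder:

    import java.util.concurrent.*;

    public class CapturedResultDemo {
      // illustrative result holder, analogous to SolrCmdDistributor.Request
      static class Result {
        Exception exception;
        int rspCode;
      }

      public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        CompletionService<Result> cs = new ExecutorCompletionService<>(pool);
        cs.submit(() -> {
          Result r = new Result();
          try {
            throw new IllegalStateException("simulated remote failure");
          } catch (Exception e) {
            r.exception = e; // record, don't propagate
            r.rspCode = -1;
          }
          return r;
        });
        Result r = cs.take().get(); // never throws ExecutionException here
        System.out.println("captured: " + r.exception + ", code=" + r.rspCode);
        pool.shutdown();
      }
    }

The thread that drains the completion service then inspects the recorded exception and response code at its leisure, which is what makes retry accounting possible on the submitting side.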
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception { latch.await(); return null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> getSearcher(boolean forceNew, boolean returnSearcher,
    final Future[] waitSearcher, boolean updateHandlerReopens) throws IOException {
  // it may take some time to open an index.... we may need to make
  // sure that two threads aren't trying to open one at the same time
  // if it isn't necessary.
  synchronized (searcherLock) {
    // see if we can return the current searcher
    if (_searcher != null && !forceNew) {
      if (returnSearcher) {
        _searcher.incref();
        return _searcher;
      } else {
        return null;
      }
    }
    // check to see if we can wait for someone else's searcher to be set
    if (onDeckSearchers > 0 && !forceNew && _searcher == null) {
      try {
        searcherLock.wait();
      } catch (InterruptedException e) {
        log.info(SolrException.toStr(e));
      }
    }
    // check again: see if we can return right now
    if (_searcher != null && !forceNew) {
      if (returnSearcher) {
        _searcher.incref();
        return _searcher;
      } else {
        return null;
      }
    }
    // At this point, we know we need to open a new searcher...
    // first: increment count to signal other threads that we are
    // opening a new searcher.
    onDeckSearchers++;
    if (onDeckSearchers < 1) {
      // should never happen... just a sanity check
      log.error(logid + "ERROR!!! onDeckSearchers is " + onDeckSearchers);
      onDeckSearchers = 1; // reset
    } else if (onDeckSearchers > maxWarmingSearchers) {
      onDeckSearchers--;
      String msg = "Error opening new searcher. exceeded limit of maxWarmingSearchers="
          + maxWarmingSearchers + ", try again later.";
      log.warn(logid + "" + msg);
      // HTTP 503==service unavailable, or 409==Conflict
      throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, msg);
    } else if (onDeckSearchers > 1) {
      log.warn(logid + "PERFORMANCE WARNING: Overlapping onDeckSearchers=" + onDeckSearchers);
    }
  }
  // a signal to decrement onDeckSearchers if something goes wrong.
  final boolean[] decrementOnDeckCount = new boolean[]{true};
  RefCounted<SolrIndexSearcher> currSearcherHolder = null; // searcher we are autowarming from
  RefCounted<SolrIndexSearcher> searchHolder = null;
  boolean success = false;
  openSearcherLock.lock();
  try {
    searchHolder = openNewSearcher(updateHandlerReopens, false);
    // the searchHolder will be incremented once already (and it will eventually be
    // assigned to _searcher when registered). increment it again if we are going
    // to return it to the caller.
    if (returnSearcher) {
      searchHolder.incref();
    }
    final RefCounted<SolrIndexSearcher> newSearchHolder = searchHolder;
    final SolrIndexSearcher newSearcher = newSearchHolder.get();
    boolean alreadyRegistered = false;
    synchronized (searcherLock) {
      if (_searcher == null) {
        // if there isn't a current searcher then we may
        // want to register this one before warming is complete instead of waiting.
        if (solrConfig.useColdSearcher) {
          registerSearcher(newSearchHolder);
          decrementOnDeckCount[0] = false;
          alreadyRegistered = true;
        }
      } else {
        // get a reference to the current searcher for purposes of autowarming.
        currSearcherHolder = _searcher;
        currSearcherHolder.incref();
      }
    }
    final SolrIndexSearcher currSearcher = currSearcherHolder == null ? null : currSearcherHolder.get();
    Future future = null;
    // warm the new searcher based on the current searcher.
    // should this go before the other event handlers or after?
    if (currSearcher != null) {
      future = searcherExecutor.submit(new Callable() {
        public Object call() throws Exception {
          try {
            newSearcher.warm(currSearcher);
          } catch (Throwable e) {
            SolrException.log(log, e);
          }
          return null;
        }
      });
    }
    if (currSearcher == null && firstSearcherListeners.size() > 0) {
      future = searcherExecutor.submit(new Callable() {
        public Object call() throws Exception {
          try {
            for (SolrEventListener listener : firstSearcherListeners) {
              listener.newSearcher(newSearcher, null);
            }
          } catch (Throwable e) {
            SolrException.log(log, null, e);
          }
          return null;
        }
      });
    }
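The warming and listener tasks above all catch Throwable and route it through SolrException.log, so a failed warm-up can neither kill the searcher executor nor surface later as an ExecutionException. A small sketch of that shape, with a plain logger standing in for SolrException.log:

    import java.util.concurrent.*;

    public class WarmingTaskDemo {
      public static void main(String[] args) throws Exception {
        ExecutorService searcherExecutor = Executors.newSingleThreadExecutor();
        Future<?> future = searcherExecutor.submit(() -> {
          try {
            // stand-in for newSearcher.warm(currSearcher)
            throw new AssertionError("warming blew up");
          } catch (Throwable t) {
            // swallow and log: a failed warm-up must not kill the executor
            System.err.println("warming failed: " + t);
          }
        });
        future.get(); // completes normally even though warming failed
        searcherExecutor.shutdown();
      }
    }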
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception { try { newSearcher.warm(currSearcher); } catch (Throwable e) { SolrException.log(log,e); } return null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception { try { for (SolrEventListener listener : firstSearcherListeners) { listener.newSearcher(newSearcher,null); } } catch (Throwable e) { SolrException.log(log,null,e); } return null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception { try { for (SolrEventListener listener : newSearcherListeners) { listener.newSearcher(newSearcher, currSearcher); } } catch (Throwable e) { SolrException.log(log,null,e); } return null; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public Object call() throws Exception {
  try {
    // registerSearcher will decrement onDeckSearchers and
    // do a notify, even if it fails.
    registerSearcher(newSearchHolder);
  } catch (Throwable e) {
    SolrException.log(log, e);
  } finally {
    // we are all done with the old searcher we used for warming...
    if (currSearcherHolderF != null) currSearcherHolderF.decref();
  }
  return null;
}
// in core/src/java/org/apache/solr/util/plugin/MapPluginLoader.java
@Override protected void init(T plugin, Node node) throws Exception { Map<String,String> params = DOMUtil.toMapExcept( node.getAttributes(), "name","class" ); plugin.init( params ); }
// in core/src/java/org/apache/solr/util/plugin/MapPluginLoader.java
@Override protected T register(String name, T plugin) throws Exception { if( registry != null ) { return registry.put( name, plugin ); } return null; }
// in core/src/java/org/apache/solr/util/plugin/NamedListPluginLoader.java
@Override protected void init(T plugin,Node node) throws Exception { plugin.init( DOMUtil.childNodesToNamedList(node) ); }
// in core/src/java/org/apache/solr/util/plugin/NamedListPluginLoader.java
@Override protected T register(String name, T plugin) throws Exception { if( registry != null ) { return registry.put( name, plugin ); } return null; }
Exception: 254 catch blocks
// in solrj/src/java/org/apache/zookeeper/SolrZooKeeper.java
catch (Exception e) { }
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
catch (Exception e) { SolrException.log(log, "Reconnect to ZooKeeper failed", e); log.info("Reconnect to ZooKeeper failed"); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (Exception e) { throw new RuntimeException("Problem pretty printing XML", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
catch (Exception e) { SolrException.log(log, "", e); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
catch(Exception ex) {}
// in solrj/src/java/org/apache/solr/common/params/CoreAdminParams.java
catch( Exception ex ) {}
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
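The twelve identical SolrParams blocks above are one pattern stamped out repeatedly: any parse failure of a request parameter is converted to ErrorCode.BAD_REQUEST, so the client sees an HTTP 400 rather than a 500. A hedged sketch of the shape, with an illustrative stand-in for SolrException:

    public class BadRequestWrapDemo {
      // stand-in for SolrException with ErrorCode.BAD_REQUEST
      static class BadRequestException extends RuntimeException {
        BadRequestException(String msg, Throwable cause) { super(msg, cause); }
      }

      static int parseIntParam(String raw) {
        try {
          return Integer.parseInt(raw);
        } catch (Exception ex) {
          // preserve the parser's message and chain the cause
          throw new BadRequestException(ex.getMessage(), ex);
        }
      }

      public static void main(String[] args) {
        System.out.println(parseIntParam("42"));
        try { parseIntParam("forty-two"); }
        catch (BadRequestException e) { System.out.println("400: " + e.getMessage()); }
      }
    }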
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (Exception e) { throw new SolrServerException("Error executing query", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (Exception ex) { }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (Exception ex) {}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) {
  // Expected. The server is still down.
  zombieServer.failedPings++;
  // If the server doesn't belong in the standard set belonging to this load balancer
  // then simply drop it after a certain number of failed pings.
  if (!zombieServer.standard && zombieServer.failedPings >= NONSTANDARD_PING_LIMIT) {
    zombieServers.remove(zombieServer.getKey());
  }
}
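This LBHttpSolrServer block treats the exception itself as the expected outcome of pinging a dead server and uses it to drive eviction. A self-contained sketch of the bookkeeping; the class is illustrative although the field and constant names mirror the snippet:

    import java.util.concurrent.ConcurrentHashMap;

    public class ZombieEvictionDemo {
      static final int NONSTANDARD_PING_LIMIT = 5; // illustrative value

      static class ZombieServer {
        final String key; final boolean standard; int failedPings;
        ZombieServer(String key, boolean standard) { this.key = key; this.standard = standard; }
      }

      static final ConcurrentHashMap<String, ZombieServer> zombieServers = new ConcurrentHashMap<>();

      static void pingFailed(ZombieServer z) {
        z.failedPings++;
        // non-standard servers are dropped after repeated failed pings
        if (!z.standard && z.failedPings >= NONSTANDARD_PING_LIMIT) {
          zombieServers.remove(z.key);
        }
      }

      public static void main(String[] args) {
        ZombieServer z = new ZombieServer("http://host:8983/solr", false);
        zombieServers.put(z.key, z);
        for (int i = 0; i < NONSTANDARD_PING_LIMIT; i++) pingFailed(z);
        System.out.println("still tracked? " + zombieServers.containsKey(z.key)); // false
      }
    }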
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ){}
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { ex.printStackTrace(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) {}
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Could not instantiate object of " + clazz, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception ex) {
  // no getter -- don't worry about it...
  if (type == Boolean.class) {
    gname = "is" + setter.getName().substring(3);
    try {
      getter = setter.getDeclaringClass().getMethod(gname, (Class[]) null);
    } catch (Exception ex2) {
      // no getter -- don't worry about it...
    }
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception ex2) {
  // no getter -- don't worry about it...
}
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while setting value : " + v + " on " + (field != null ? field : setter), e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + field, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + getter, e); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
catch( Exception ex ){}
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/SolrContentHandler.java
catch (Exception e) {
  // Let the specific fieldType handle errors
  // throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid value: " + val + " for field: " + schFld, e);
}
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) {
  String logField = solrUIMAConfiguration.getLogField();
  if (logField == null) {
    SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField();
    if (uniqueKeyField != null) {
      logField = uniqueKeyField.getName();
    }
  }
  String optionalFieldInfo = logField == null ? "."
      : new StringBuilder(". ").append(logField).append("=")
          .append((String) cmd.getSolrInputDocument().getField(logField).getValue())
          .append(", ").toString();
  int len = Math.min(text.length(), 100);
  if (solrUIMAConfiguration.isIgnoreErrors()) {
    log.warn(new StringBuilder("skip the text processing due to ")
        .append(e.getLocalizedMessage()).append(optionalFieldInfo)
        .append(" text=\"").append(text.substring(0, len)).append("...\"").toString());
  } else {
    throw new SolrException(ErrorCode.SERVER_ERROR,
        new StringBuilder("processing error: ")
            .append(e.getLocalizedMessage()).append(optionalFieldInfo)
            .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e);
  }
}
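The UIMA processor shows a configuration-driven policy: the same failure is either logged (when isIgnoreErrors() is set) or escalated as a SERVER_ERROR carrying a snippet of the offending text. A reduced sketch of the branch; all names here are illustrative:

    public class IgnoreErrorsDemo {
      static void process(String text, boolean ignoreErrors) {
        try {
          throw new IllegalStateException("pipeline failed"); // simulated failure
        } catch (Exception e) {
          String snippet = text.substring(0, Math.min(text.length(), 100));
          if (ignoreErrors) {
            // log and keep going
            System.err.println("skip the text processing: " + e.getMessage()
                + " text=\"" + snippet + "...\"");
          } else {
            // escalate with context attached
            throw new RuntimeException("processing error: " + e.getMessage()
                + " text=\"" + snippet + "...\"", e);
          }
        }
      }

      public static void main(String[] args) {
        process("some document body", true);          // logs and continues
        try { process("some document body", false); } // escalates
        catch (RuntimeException e) { System.out.println("aborted: " + e.getMessage()); }
      }
    }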
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
catch (Exception e) { throw new FieldMappingException(e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Exception e) { throw ExceptionUtils.wrapAsRuntimeException(e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2StemmerFactory.java
catch (Exception e) { logger.warn("Could not instantiate snowball stemmer" + " for language: " + language.name() + ". Quality of clustering may be degraded.", e); return IdentityStemmer.INSTANCE; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception ignored) {
  // If we get the exception, the resource loader implementation
  // probably does not support getConfigDir(). Not a big problem.
}
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); }
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
catch (Exception e) { throw new RuntimeException("Couldn't load profile data, will return empty languages always!", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
catch (Exception e) { wrapAndThrow (SEVERE, e,"Unable to load Tika Config"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to read content"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (Exception e) { return null; }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Custom filter could not be created", e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (Exception e) {
  // io error or invalid rules
  throw new RuntimeException(e);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Exception occurred while initializing context", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Invalid type for data source: " + type); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Failed to initialize DataSource: " + key.getDataSourceName()); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Error initializing XSL ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { if (LOG.isDebugEnabled()) LOG.debug("Skipping url : " + s, e); wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { String msg = "Parsing failed for xml, url:" + s + " rows processed:" + rows.size(); if (rows.size() > 0) msg += " last row: " + rows.get(rows.size() - 1); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, msg); } else if (SKIP.equals(onError)) { LOG.warn(msg, e); Map<String, Object> map = new HashMap<String, Object>(); map.put(SKIP_DOC, Boolean.TRUE); rows.add(map); } else if (CONTINUE.equals(onError)) { LOG.warn(msg, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { /* Ignore */ }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { isEnd.set(true); return; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { if(throwExp.get()) exp.set(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/RegexTransformer.java
catch (Exception e) { LOG.warn("Parsing failed for field : " + columnName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.ENTITY_EXCEPTION, null, e); DataImportHandlerException de = new DataImportHandlerException( DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, e); DataImportHandlerException de = new DataImportHandlerException(DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (Exception e) { wrapAndThrow(SEVERE,e, "Error invoking script for entity " + context.getEntityAttribute("name")); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE,e,"Unable to open File : "+f.getAbsolutePath()); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldStreamDataSource.java
catch (Exception e) { LOG.info("Unable to get data from BLOB"); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (Exception e) { LOG.error( "The query failed '" + q + "'", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to persist Index Start Time", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { log.warn("Unable to read: " + persistFilename); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to encode expression: " + expression + " with value: " + s); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to instantiate evaluator: " + map.get(CLASS)); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to execute query: " + query); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (Exception e) { logError("Exception while closing result set", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (Exception e) { LOG.error("Ignoring Error when closing connection", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Cache implementation:" + cacheImplName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { SolrException.log(log, "getNextFromCache() failed for query '" + query + "'", e); wrapAndThrow(DataImportHandlerException.WARN, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Exception e) { log.warn("Error creating document : " + d, e); return false; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Exception e) { }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.error("Unable to load Transformer: " + aTransArr, e); wrapAndThrow(SEVERE, e,"Unable to load Transformer: " + trans); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("method invocation failed on transformer : " + trans, e); throw new DataImportHandlerException(WARN, e);
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("transformer threw error", e); if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } // onError = continue }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) {
  if (ABORT.equals(onError)) {
    wrapAndThrow(SEVERE, e);
  } else {
    // SKIP is not really possible. If this calls the nextRow() again the
    // EntityProcessor would be in an inconsistent state.
    SolrException.log(log, "Exception in entity : " + entityName, e);
    return null;
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (Exception e) { log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); wrapAndThrow (SEVERE, e, "Exception in invoking url " + url); return null;//unreachable }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
catch (Exception e) { LOG.info("Unable to get data from CLOB"); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
catch (Exception e) { LOG.info("Unable to get data from BLOB"); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
catch (Exception e) { wrapAndThrow(SEVERE, e,"Unable to get reader from clob"); return null;//unreachable }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorBase.java
catch (Exception e) { SolrException.log(log, "getNext() failed for query '" + query + "'", e); query = null; rowIterator = null; wrapAndThrow(DataImportHandlerException.WARN, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Writer implementation:" + writerClassStr, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) {
  wrapAndThrow(SEVERE, e);
  // unreachable statement
  return null;
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Unable to load class : " + className); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch(Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { LOG.error("Could not write property file", e); statusMessages.put("error", "Could not write property file. Delta imports will not work. " + "Make sure your conf directory is writable"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { wrapAndThrow (SEVERE,e, "Unable to load EntityProcessor implementation for entity:" + entity.getName()); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); }
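The nested DocBuilder catches implement two-step class resolution: try the name as given, retry with a package prefix, and chain the original failure into the ClassNotFoundException if both attempts miss. A runnable sketch; the prefix here is illustrative (java.util., chosen so the fallback actually resolves):

    public class FallbackLoadDemo {
      static final String DEFAULT_PACKAGE = "java.util."; // illustrative prefix

      static Class<?> loadClass(String name) throws ClassNotFoundException {
        try {
          return Class.forName(name);
        } catch (Exception e) {
          try {
            return Class.forName(DEFAULT_PACKAGE + name);
          } catch (Exception e1) {
            // chain the *first* failure, as the DocBuilder snippet does
            throw new ClassNotFoundException(
                "Unable to load " + name + " or " + DEFAULT_PACKAGE + name, e);
          }
        }
      }

      public static void main(String[] args) throws Exception {
        System.out.println(loadClass("ArrayList")); // resolved via the fallback prefix
      }
    }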
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/PlainTextEntityProcessor.java
catch (Exception e) { wrapAndThrow(SEVERE, e, "Exception reading url : " + url); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
catch (Exception e) {
  // ignore analysis exceptions since we are applying arbitrary text to all fields
  termsToMatch = EMPTY_BYTES_SET;
}
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
catch (Exception e) {
  // ignore analysis exceptions since we are applying arbitrary text to all fields
}
// in core/src/java/org/apache/solr/handler/RequestHandlerBase.java
catch (Exception e) {
  if (e instanceof SolrException) {
    SolrException se = (SolrException) e;
    if (se.code() == SolrException.ErrorCode.CONFLICT.code) {
      // TODO: should we allow this to be counted as an error (numErrors++)?
    } else {
      SolrException.log(SolrCore.log, e);
    }
  } else {
    SolrException.log(SolrCore.log, e);
    if (e instanceof ParseException) {
      e = new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
    }
  }
  rsp.setException(e);
  numErrors++;
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Exception in fetching index", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage()); return false; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Exception while updating statistics", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Could not restart core ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Unable to load index.properties"); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write index.properties", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) {/* noop */ LOG.error("Error closing the file stream: "+ this.saveAs ,e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.error("Error deleting file in cleanup" + e.getMessage()); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (Exception e) { log.error("Exception while processing update request", e); break; }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception in finding checksum of " + f, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { SolrException.log(LOG, "SnapPull failed ", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception during creating a snapshot", e); rsp.add("exception", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception while invoking 'details' method for replication on master ", e); slave.add(ERR_STATUS, "invalid_master"); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.error("Exception while writing replication details: ", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.warn("Exception while reading " + SnapPuller.REPLICATION_PROPERTIES); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (Exception e) { LOG.error("Exception while snapshooting", e); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (Exception e) { SolrException.log(SolrCore.log, "Exception during debug", e); rsp.add("exception_during_debug", SolrException.toStr(e)); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error initializing QueryElevationComponent.", ex); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex); }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
catch (Exception e) { log.error( "Exception in building spell check index for spellchecker: " + checker.getDictionaryName(), e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( Exception ex ) {}
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( Exception ex ) { log.warn( "error writing term vector", ex ); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( Exception ex ) { log.warn( "error reading field: "+fieldName ); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unable to read original XML", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'reload' action", ex); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch( Exception ex ) {}
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch (Exception ex) {
  // ignore - log.warn("Error executing command", ex);
  return "(error executing: " + cmd + ")";
}
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch (Exception e) { log.warn("Error getting JMX properties", e); }
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch (Exception e) { SnapPuller.delTree(snapShotDir); LOG.error("Exception while creating snapshot", e); details.add("snapShootException", e.getMessage()); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch(Exception e) { this.dir = null; this.timestamp = null; }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { throw new InitializationException("Encoder " + name + " / " + clazz + " does not support " + MAX_CODE_LENGTH, e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e; throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t); }
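The PhoneticFilterFactory block unwraps InvocationTargetException before wrapping, so the InitializationException carries the encoder's real failure rather than the reflection wrapper. A sketch of the unwrap step:

    import java.lang.reflect.InvocationTargetException;

    public class UnwrapDemo {
      public static void boom() { throw new IllegalStateException("encoder init failed"); }

      public static void main(String[] args) {
        try {
          UnwrapDemo.class.getMethod("boom").invoke(null);
        } catch (Exception e) {
          // a reflective call wraps the real failure; dig it out before reporting
          Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e;
          System.out.println("root failure: " + t);
        }
      }
    }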
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading synonyms", e); }
// in core/src/java/org/apache/solr/analysis/TrimFilterFactory.java
catch( Exception ex ) { throw new InitializationException("Error reading updateOffsets value. Must be true or false.", ex); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
catch (Exception e) {
  // TODO: getHyphenationTree really shouldn't throw "Exception"
  throw new InitializationException("Exception thrown while loading dictionary and hyphenation file", e);
}
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (Exception e) { throw new InitializationException("Error instantiating stemmer for language " + language + "from class " + stemClass, e); }
// in core/src/java/org/apache/solr/analysis/JapaneseTokenizerFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( Exception ex ) { throw new InitializationException("invalid group argument: " + g); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( Exception ex ) { throw new SolrServerException( ex ); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
catch (Exception ex) { ex.printStackTrace(); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch (Exception e) {
  // TODO should our parent interface throw (IO)Exception?
  throw new RuntimeException("getTransformer fails in getContentType", e);
}
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unable to parse "+type+"="+val, ex ); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unknown Explain Style: "+str ); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception e) { LOG.warn("Error reading a field : " + o, e); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception e) {
  // There is a chance of the underlying field not really matching the
  // actual field type. So, it can throw an exception.
  LOG.warn("Error reading a field from document : " + solrDoc, e);
  // if it happens log it and continue
  continue;
}
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Exception e) {
  // unlikely
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
}
// in core/src/java/org/apache/solr/servlet/cache/Method.java
catch (Exception e) { return OTHER; }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { ratesJsonStream = resourceLoader.openResource(ratesFileLocation); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (Exception e) {
  // unexpected exception...
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Schema Parsing Failed: " + e.getMessage(), e);
}
// in core/src/java/org/apache/solr/schema/DateField.java
catch (Exception e) { return DateUtil.parseDate(s); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error instansiating exhange rate provider "+exchangeRateProviderClass+". Please check your FieldType configuration", e); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (Exception e) { LOG.warn("Can't use the configured PreAnalyzedParser class '" + implName + "' (" + e.getMessage() + "), using default " + DEFAULT_IMPL); parser = new JsonPreAnalyzedParser(); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (Exception e) { e.printStackTrace(); return null; }
// in core/src/java/org/apache/solr/internal/csv/writer/CSVConfigGuesser.java
catch(Exception e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/internal/csv/writer/CSVConfigGuesser.java
catch (Exception e) {
  // ignore exception.
}
// in core/src/java/org/apache/solr/internal/csv/writer/CSVWriter.java
catch(Exception e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (Exception e) { LOG.warn("Couldn't parse maxBasicQueries value " + mbqparam +", using default of 1000"); this.maxBasicQueries = DEFMAXBASICQUERIES; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
catch (Exception e) {
  // ignore failure and reparse later after escaping reserved chars
  up.exceptions = false;
}
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
catch (Exception e) {
  // an exception here is due to the field query not being compatible with the input text
  // for example, passing a string to a numeric field.
  return null;
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) {
  // hang onto this in case the string isn't a full field name either
  qParserException = e;
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { out.append("EXCEPTION(val="); out.append(val); out.append(")"); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { out.append("EXCEPTION(val="); out.append(val.utf8ToString()); out.append(")"); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch (Exception e) {
  if (++otherErrors <= 10) {
    SolrCore.log.error("Error loading external value source + fileName + " + e
        + (otherErrors < 10 ? "" : "\tSkipping future errors for this file."));
  }
  continue; // go to next line in file.. leave values as default.
}
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch(Exception e){}
// in core/src/java/org/apache/solr/search/SolrCacheBase.java
catch (Exception e) { throw new RuntimeException("Can't parse autoWarm value: " + configValue, e); }
// in core/src/java/org/apache/solr/search/CacheConfig.java
catch (Exception e) {
  SolrException.log(SolrCache.log, "Error instantiating cache", e);
  // we can carry on without a cache... but should we?
  // in some cases (like an OOM) we probably should try to continue.
  return null;
}
// in core/src/java/org/apache/solr/spelling/SpellCheckCollator.java
catch (Exception e) { LOG.warn("Exception trying to re-query to check if a spell check possibility would return any hits.", e); }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
catch (Exception e) { LOG.error("Error while building or storing Suggester data", e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + file, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { SolrException.log(log, "Sync Failed", e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { SolrException.log(log, "Sync Failed", e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { SolrException.log(log, "Error syncing replica to leader", e); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
catch (Exception e) { log.info("Could not tell a replica to recover", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (Exception e) { log.warn("Error processing state change", e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Exception e) { log.error("STARTING ZOOKEEPER", e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Exception e) { log.error("", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (Exception e) { log.warn("", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception e) { SolrException.log(log, "Failure to open existing log file (non fatal) " + f, e); f.delete(); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Exception ex) { log.warn("Exception reverse reading log", ex); break; }
// in core/src/java/org/apache/solr/update/CommitTracker.java
catch (Exception e) { SolrException.log(log, "auto commit error...", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (Exception e) { log.info("Could not tell a replica to recover", e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (Exception e) { throw new SolrException(SERVER_ERROR, "Can't resolve typeClass: " + t, e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (Exception e) { clonedRequest.exception = e; if (e instanceof SolrException) { clonedRequest.rspCode = ((SolrException) e).code(); } else { clonedRequest.rspCode = -1; } }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,finish()", e); return false; }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrConfig.java
catch (Exception e) { log.warn( "Unrecognized value for lastModFrom: " + s, e); return BOGUS; }
// in core/src/java/org/apache/solr/core/SolrConfig.java
catch (Exception e) { log.warn( "Ignoring exception while attempting to " + "extract max-age from cacheControl config: " + cacheControlHeader, e); }
// in core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
catch (Exception e) { log.warn("Unmap not supported on this JVM, continuing on without setting unmap", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " +cast.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " + UpdateHandler.class.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error opening new searcher", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { if (e instanceof SolrException) throw (SolrException)e; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
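This SolrCore block is the rethrow-or-wrap test: an exception that is already a SolrException passes through unchanged, while anything else is wrapped as SERVER_ERROR so callers handle a single type. A sketch with an illustrative domain exception:

    public class RethrowOrWrapDemo {
      static class DomainException extends RuntimeException {
        DomainException(String msg) { super(msg); }
        DomainException(Throwable cause) { super(cause); }
      }

      static void run(Runnable body) {
        try {
          body.run();
        } catch (Exception e) {
          if (e instanceof DomainException) throw (DomainException) e; // pass through
          throw new DomainException(e); // wrap everything else
        }
      }

      public static void main(String[] args) {
        try { run(() -> { throw new DomainException("already domain"); }); }
        catch (DomainException e) { System.out.println("passed through: " + e.getMessage()); }
        try { run(() -> { throw new IllegalStateException("foreign"); }); }
        catch (DomainException e) { System.out.println("wrapped: " + e.getCause()); }
      }
    }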
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception ex) { SolrException e = new SolrException (SolrException.ErrorCode.SERVER_ERROR, "QueryResponseWriter init failure", ex); SolrException.log(log,null,e); throw e; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) {
  // if register fails, this is really bad - close the zkController to
  // minimize any damage we can cause
  zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN);
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
catch (Exception e) { sb.append(e); }
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
catch (Exception e) { log.warn("Exception while checking commit point's age for deletion", e); }
// in core/src/java/org/apache/solr/core/QuerySenderListener.java
catch (Exception e) {
  // do nothing... we want to continue with the other requests.
  // the failure should have already been logged.
}
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) {
  // Release the reference
  server = null;
  throw new RuntimeException("Could not start JMX monitoring ", e);
}
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn( "Failed to register info bean: " + key, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to unregister info bean: " + key, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn("Could not getStatistics on info bean {}", infoBean.getName(), e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new AttributeNotFoundException(attribute); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { LOG.warn("Could not get attibute " + attribute); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (Exception e) { return null; }
// in core/src/java/org/apache/solr/util/VersionedFile.java
catch (Exception e) {
  // swallow exception for now
}
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; }
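TransformerProvider chains the cause with initCause because this code predates the IOException(String, Throwable) constructor, which arrived in Java 6. A sketch of that manual chaining:

    import java.io.IOException;

    public class InitCauseDemo {
      public static void main(String[] args) {
        try {
          try {
            throw new IllegalArgumentException("bad stylesheet"); // simulated failure
          } catch (Exception e) {
            // attach the cause by hand before throwing the wrapper
            final IOException ioe = new IOException("Unable to initialize Templates");
            ioe.initCause(e);
            throw ioe;
          }
        } catch (IOException e) {
          System.out.println(e + " caused by " + e.getCause());
        }
      }
    }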
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type + (null != name ? (" \"" + name + "\"") : "") + ": " + ex.getMessage(), ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch( Exception ex ) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin Initializing failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
119
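
Nearly every entry above is the same wrap-and-rethrow move: catch a broad Exception and rethrow it as a SolrException that carries an ErrorCode plus the original cause. A minimal sketch of the idiom follows; the Config type, loadConfig method, and message are hypothetical, not taken from the listings:

    // Hypothetical sketch of the wrap-and-rethrow idiom repeated above.
    public Config loadConfig(String path) {
      try {
        return Config.parse(path); // may fail in many ways
      } catch (Exception e) {
        // pass e as the cause so the original stack trace survives
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "Error loading config: " + path, e);
      }
    }
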
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (Exception e) { throw new RuntimeException("Problem pretty printing XML", e); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (Exception e) { throw new SolrServerException("Error executing query", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Could not instantiate object of " + clazz, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while setting value : " + v + " on " + (field != null ? field : setter), e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + field, e); }
// in solrj/src/java/org/apache/solr/client/solrj/beans/DocumentObjectBinder.java
catch (Exception e) { throw new BindingException("Exception while getting value: " + getter, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
catch (Exception e) { throw new FieldMappingException(e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Exception e) { throw ExceptionUtils.wrapAsRuntimeException(e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); }
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
catch (Exception e) { throw new RuntimeException("Couldn't load profile data, will return empty languages always!", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Custom filter could not be created", e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (Exception e) {
  // io error or invalid rules
  throw new RuntimeException(e);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Exception occurred while initializing context", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (Exception e) { throw new DataImportHandlerException(SEVERE, "Error initializing XSL ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.ENTITY_EXCEPTION, null, e); DataImportHandlerException de = new DataImportHandlerException( DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
catch (Exception e) { log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, e); DataImportHandlerException de = new DataImportHandlerException(DataImportHandlerException.SEVERE, "", e); de.debugged = true; throw de; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java
catch (Exception e) { LOG.error( "The query failed '" + q + "'", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to persist Index Start Time", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DIHCacheSupport.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Cache implementation:" + cacheImplName, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (Exception e) { log.warn("method invocation failed on transformer : " + trans, e); throw new DataImportHandlerException(WARN, e);
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (Exception e) { LOG.error("Exception thrown while getting data", e); throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in invoking url " + url, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Writer implementation:" + writerClassStr, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch(Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { try { String n = DocBuilder.class.getPackage().getName() + "." + name; return core != null ? core.getResourceLoader().findClass(n, Object.class) : Class.forName(n); } catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e1) { throw new ClassNotFoundException("Unable to load " + name + " or " + DocBuilder.class.getPackage().getName() + "." + name, e); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write index.properties", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error initializing QueryElevationComponent.", ex); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unable to read original XML", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'reload' action", ex); }
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { throw new InitializationException("Encoder " + name + " / " + clazz + " does not support " + MAX_CODE_LENGTH, e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e; throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t); }
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading synonyms", e); }
// in core/src/java/org/apache/solr/analysis/TrimFilterFactory.java
catch( Exception ex ) { throw new InitializationException("Error reading updateOffsets value. Must be true or false.", ex); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
catch (Exception e) {
  // TODO: getHyphenationTree really shouldn't throw "Exception"
  throw new InitializationException("Exception thrown while loading dictionary and hyphenation file", e);
}
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (Exception e) { throw new InitializationException("Error instantiating stemmer for language " + language + "from class " + stemClass, e); }
// in core/src/java/org/apache/solr/analysis/JapaneseTokenizerFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( Exception ex ) { throw new InitializationException("invalid group argument: " + g); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( Exception ex ) { throw new SolrServerException( ex ); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(Exception e) {
  // TODO should our parent interface throw (IO)Exception?
  throw new RuntimeException("getTransformer fails in getContentType",e);
}
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unable to parse "+type+"="+val, ex ); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unknown Explain Style: "+str ); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Exception e) {
  //unlikely
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,e);
}
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch(Exception e) {
  // unexpected exception...
  throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Schema Parsing Failed: " + e.getMessage(), e);
}
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error instansiating exhange rate provider "+exchangeRateProviderClass+". Please check your FieldType configuration", e); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/search/SolrCacheBase.java
catch (Exception e) { throw new RuntimeException("Can't parse autoWarm value: " + configValue, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + file, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (Exception e) { throw new SolrException(SERVER_ERROR, "Can't resolve typeClass: " + t, e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " +cast.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " + UpdateHandler.class.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error opening new searcher", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { if (e instanceof SolrException) throw (SolrException)e; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception ex) { SolrException e = new SolrException (SolrException.ErrorCode.SERVER_ERROR, "QueryResponseWriter init failure", ex); SolrException.log(log,null,e); throw e; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) {
  // if register fails, this is really bad - close the zkController to
  // minimize any damage we can cause
  zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN);
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) {
  // Release the reference
  server = null;
  throw new RuntimeException("Could not start JMX monitoring ", e);
}
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to unregister info bean: " + key, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new AttributeNotFoundException(attribute); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type + (null != name ? (" \"" + name + "\"") : "") + ": " + ex.getMessage(), ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch( Exception ex ) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin Initializing failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; }
89
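
A stylistic detail in the AbstractPluginLoader and SolrCore entries above: the SolrException is assigned to a local before being thrown, which lets the code log it first (the QueryResponseWriter entry calls SolrException.log(log, null, e) before the throw). A sketch of that shape; PluginInfo, doInit, and the message are hypothetical:

    // Sketch (hypothetical helper): building the exception before the
    // throw lets the loader log it first, as some entries above do.
    private Object initPlugin(PluginInfo info, String type) {
      try {
        return doInit(info); // hypothetical
      } catch (Exception ex) {
        SolrException e = new SolrException(ErrorCode.SERVER_ERROR,
            "Plugin init failure for " + type, ex);
        SolrException.log(log, null, e); // log, then propagate
        throw e;
      }
    }
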
unknown (Lib) ExecutionException 0 0 2
            
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private Future<RecoveryInfo> replay(UpdateLog ulog) throws InterruptedException, ExecutionException, TimeoutException {
  Future<RecoveryInfo> future = ulog.applyBufferedUpdates();
  if (future == null) {
    // no replay needed
    log.info("No replay needed");
  } else {
    log.info("Replaying buffered documents");
    // wait for replay
    future.get();
  }
  // solrcloud_debug
  // try {
  //   RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false);
  //   SolrIndexSearcher searcher = searchHolder.get();
  //   try {
  //     System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replayed "
  //         + searcher.search(new MatchAllDocsQuery(), 1).totalHits);
  //   } finally {
  //     searchHolder.decref();
  //   }
  // } catch (Exception e) {
  //
  // }
  return future;
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
private boolean checkRecovery(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores, final boolean isLeader, final CloudDescriptor cloudDesc, final String collection, final String shardZkNodeName, String shardId, ZkNodeProps leaderProps, SolrCore core, CoreContainer cc) throws InterruptedException, KeeperException, IOException, ExecutionException { if (SKIP_AUTO_RECOVERY) { log.warn("Skipping recovery according to sys prop solrcloud.skip.autorecovery"); return false; } boolean doRecovery = true; if (!isLeader) { if (core.isReloaded() && !recoverReloadedCores) { doRecovery = false; } if (doRecovery) { log.info("Core needs to recover:" + core.getName()); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, coreName); return true; } } else { log.info("I am the leader, no recovery necessary"); } return false; }
5
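
ExecutionException in the two methods above comes out of Future.get(): the executor captures whatever the background task threw and rethrows it wrapped. A self-contained sketch (all names invented) of where the wrapper originates:

    import java.util.concurrent.*;

    // Sketch: Future.get() wraps anything the task threw in an
    // ExecutionException, which is what the methods above declare.
    public class ExecutionExceptionDemo {
      public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<Integer> f = pool.submit(new Callable<Integer>() {
          public Integer call() {
            throw new IllegalStateException("task failed");
          }
        });
        try {
          f.get(); // rethrows the task's exception, wrapped
        } catch (ExecutionException e) {
          System.out.println("cause: " + e.getCause()); // the IllegalStateException
        } finally {
          pool.shutdown();
        }
      }
    }
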
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ExecutionException e) { SolrException.log(LOG,e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (ExecutionException e) {
  // should be impossible... the problem with catching the exception
  // at this level is we don't know what ShardRequest it applied to
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Impossible Exception",e);
}
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ExecutionException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (ExecutionException e) {
  // shouldn't happen since we catch exceptions ourselves
  SolrException.log(SolrCore.log, "error sending update request to shard", e);
}
3
            
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (ExecutionException e) {
  // should be impossible... the problem with catching the exception
  // at this level is we don't know what ShardRequest it applied to
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Impossible Exception",e);
}
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } }
2
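
The two catch-rethrow entries above unwrap the cause first: an unchecked cause is propagated as-is, anything else is wrapped in a SolrException. A sketch of that unwrap-or-wrap idiom (the method name and message are hypothetical):

    // Sketch of the unwrap-or-wrap idiom used above.
    private void waitFor(Future<?> future) {
      try {
        future.get();
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
      } catch (ExecutionException e) {
        Throwable cause = e.getCause();
        if (cause instanceof RuntimeException) {
          throw (RuntimeException) cause;  // propagate unchanged
        }
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "background task failed", cause); // wrap checked causes
      }
    }
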
checked (Domain) FieldMappingException
public class FieldMappingException extends Exception {
  public FieldMappingException(Exception e) {
  }
}
1
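
Note that the constructor above accepts the underlying Exception but never forwards it to super, so the cause and its stack trace are lost when map() wraps and rethrows below. A cause-preserving variant would be a one-line change (a sketch, not the shipped code):

    // Sketch: a cause-preserving variant of the constructor shown above
    // (the actual Solr code discards its argument).
    public class FieldMappingException extends Exception {
      public FieldMappingException(Exception e) {
        super(e); // keep the cause so getCause() and stack traces work
      }
    }
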
            
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
public void map(String typeName, Map<String, MapField> featureFieldsmapping) throws FieldMappingException { try { Type type = cas.getTypeSystem().getType(typeName); for (FSIterator<FeatureStructure> iterator = cas.getFSIndexRepository().getAllIndexedFS(type); iterator .hasNext(); ) { FeatureStructure fs = iterator.next(); for (String featureName : featureFieldsmapping.keySet()) { MapField mapField = featureFieldsmapping.get(featureName); String fieldNameFeature = mapField.getFieldNameFeature(); String fieldNameFeatureValue = fieldNameFeature == null ? null : fs.getFeatureValueAsString(type.getFeatureByBaseName(fieldNameFeature)); String fieldName = mapField.getFieldName(fieldNameFeatureValue); log.info(new StringBuffer("mapping ").append(typeName).append("@").append(featureName) .append(" to ").append(fieldName).toString()); String featureValue = null; if (fs instanceof Annotation && "coveredText".equals(featureName)) { featureValue = ((Annotation) fs).getCoveredText(); } else { featureValue = fs.getFeatureValueAsString(type.getFeatureByBaseName(featureName)); } log.info(new StringBuffer("writing ").append(featureValue).append(" in ").append( fieldName).toString()); document.addField(fieldName, featureValue, 1.0f); } } } catch (Exception e) { throw new FieldMappingException(e); } }
1
            
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
catch (Exception e) { throw new FieldMappingException(e); }
1
            
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAToSolrMapper.java
public void map(String typeName, Map<String, MapField> featureFieldsmapping) throws FieldMappingException { try { Type type = cas.getTypeSystem().getType(typeName); for (FSIterator<FeatureStructure> iterator = cas.getFSIndexRepository().getAllIndexedFS(type); iterator .hasNext(); ) { FeatureStructure fs = iterator.next(); for (String featureName : featureFieldsmapping.keySet()) { MapField mapField = featureFieldsmapping.get(featureName); String fieldNameFeature = mapField.getFieldNameFeature(); String fieldNameFeatureValue = fieldNameFeature == null ? null : fs.getFeatureValueAsString(type.getFeatureByBaseName(fieldNameFeature)); String fieldName = mapField.getFieldName(fieldNameFeatureValue); log.info(new StringBuffer("mapping ").append(typeName).append("@").append(featureName) .append(" to ").append(fieldName).toString()); String featureValue = null; if (fs instanceof Annotation && "coveredText".equals(featureName)) { featureValue = ((Annotation) fs).getCoveredText(); } else { featureValue = fs.getFeatureValueAsString(type.getFeatureByBaseName(featureName)); } log.info(new StringBuffer("writing ").append(featureValue).append(" in ").append( fieldName).toString()); document.addField(fieldName, featureValue, 1.0f); } } } catch (Exception e) { throw new FieldMappingException(e); } }
0 0 0
unknown (Lib) FileNotFoundException 3
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
static File getFile(String basePath, String query) { try { File file0 = new File(query); File file = file0; if (!file.isAbsolute()) file = new File(basePath + query); if (file.isFile() && file.canRead()) { LOG.debug("Accessing File: " + file.toString()); return file; } else if (file != file0) if (file0.isFile() && file0.canRead()) { LOG.debug("Accessing File0: " + file0.toString()); return file0; } throw new FileNotFoundException("Could not find file: " + query); } catch (FileNotFoundException e) { throw new RuntimeException(e); } }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
public void copyFile(File source, File destination, boolean preserveFileDate) throws IOException {
  // check source exists
  if (!source.exists()) {
    String message = "File " + source + " does not exist";
    throw new FileNotFoundException(message);
  }
  // does destinations directory exist ?
  if (destination.getParentFile() != null && !destination.getParentFile().exists()) {
    destination.getParentFile().mkdirs();
  }
  // make sure we can write to destination
  if (destination.exists() && !destination.canWrite()) {
    String message = "Unable to open file " + destination + " for writing.";
    throw new IOException(message);
  }
  FileInputStream input = null;
  FileOutputStream output = null;
  try {
    input = new FileInputStream(source);
    output = new FileOutputStream(destination);
    int count = 0;
    int n = 0;
    int rcnt = 0;
    while (-1 != (n = input.read(buffer))) {
      output.write(buffer, 0, n);
      count += n;
      rcnt++;
      /***
      // reserve every 4.6875 MB
      if (rcnt == 150) {
        rcnt = 0;
        delPolicy.setReserveDuration(indexCommit.getVersion(), reserveTime);
      }
      ***/
    }
  } finally {
    try {
      IOUtils.closeQuietly(input);
    } finally {
      IOUtils.closeQuietly(output);
    }
  }
  if (source.length() != destination.length()) {
    String message = "Failed to copy full contents from " + source + " to " + destination;
    throw new IOException(message);
  }
  if (preserveFileDate) {
    // file copy should preserve file date
    destination.setLastModified(source.lastModified());
  }
}
// in core/src/java/org/apache/solr/util/FileUtils.java
public static void sync(File fullFile) throws IOException {
  if (fullFile == null || !fullFile.exists())
    throw new FileNotFoundException("File does not exist " + fullFile);
  boolean success = false;
  int retryCount = 0;
  IOException exc = null;
  while(!success && retryCount < 5) {
    retryCount++;
    RandomAccessFile file = null;
    try {
      try {
        file = new RandomAccessFile(fullFile, "rw");
        file.getFD().sync();
        success = true;
      } finally {
        if (file != null)
          file.close();
      }
    } catch (IOException ioe) {
      if (exc == null)
        exc = ioe;
      try {
        // Pause 5 msec
        Thread.sleep(5);
      } catch (InterruptedException ie) {
        Thread.currentThread().interrupt();
      }
    }
  }
  if (!success)
    // Throw original exception
    throw exc;
}
0 2
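
FileUtils.sync above is a compact retry idiom: up to five attempts, only the first IOException is kept, a short sleep between attempts, and the interrupt flag is restored if the sleep is interrupted. A stripped-down sketch of that shape, with a hypothetical doSync helper standing in for the RandomAccessFile fsync:

    // Stripped-down sketch of the retry shape in sync() above.
    static void syncWithRetry(File fullFile) throws IOException {
      IOException first = null;
      for (int attempt = 0; attempt < 5; attempt++) {
        try {
          doSync(fullFile);  // hypothetical: open, getFD().sync(), close
          return;            // success
        } catch (IOException ioe) {
          if (first == null) first = ioe;   // remember the original failure
          try {
            Thread.sleep(5); // brief pause before the next attempt
          } catch (InterruptedException ie) {
            Thread.currentThread().interrupt(); // preserve interrupt status
          }
        }
      }
      throw first; // all attempts failed: surface the first error
    }
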
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
protected Reader openStream(File file) throws FileNotFoundException, UnsupportedEncodingException { if (encoding == null) { return new InputStreamReader(new FileInputStream(file)); } else { return new InputStreamReader(new FileInputStream(file), encoding); } }
// in core/src/java/org/apache/solr/util/VersionedFile.java
public static InputStream getLatestFile(String dirName, String fileName) throws FileNotFoundException {
  Collection<File> oldFiles=null;
  final String prefix = fileName+'.';
  File f = new File(dirName, fileName);
  InputStream is = null;
  // there can be a race between checking for a file and opening it...
  // the user may have just put a new version in and deleted an old version.
  // try multiple times in a row.
  for (int retry=0; retry<10 && is==null; retry++) {
    try {
      if (!f.exists()) {
        File dir = new File(dirName);
        String[] names = dir.list(new FilenameFilter() {
          public boolean accept(File dir, String name) {
            return name.startsWith(prefix);
          }
        });
        Arrays.sort(names);
        f = new File(dir, names[names.length-1]);
        oldFiles = new ArrayList<File>();
        for (int i=0; i<names.length-1; i++) {
          oldFiles.add(new File(dir, names[i]));
        }
      }
      is = new FileInputStream(f);
    } catch (Exception e) {
      // swallow exception for now
    }
  }
3
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinFileDataSource.java
catch (FileNotFoundException e) { wrapAndThrow(SEVERE,e,"Unable to open file "+f.getAbsolutePath()); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (FileNotFoundException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.FileNotFoundException xnf) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xnf); }
2
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (FileNotFoundException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.FileNotFoundException xnf) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xnf); }
2
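
One detail in the BinFileDataSource entry above: wrapAndThrow always throws, but the compiler cannot prove that, so the catch block still needs a return. A simplified sketch of that shape (the body here is a stand-in, not the shipped code):

    // Sketch: why the catch above ends with "return null" - the DIH
    // helper wrapAndThrow(SEVERE, ...) always throws, but javac
    // cannot know that, so a return is still required.
    public InputStream getData(String query) {
      File f = new File(query);
      try {
        return new FileInputStream(f);
      } catch (FileNotFoundException e) {
        wrapAndThrow(SEVERE, e, "Unable to open file " + f.getAbsolutePath());
        return null; // unreachable, but required by the compiler
      }
    }
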
checked (Lib) IOException 25
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException {
  if (!(request instanceof UpdateRequest)) {
    return server.request(request);
  }
  UpdateRequest req = (UpdateRequest) request;
  // this happens for commit...
  if (req.getDocuments() == null || req.getDocuments().isEmpty()) {
    blockUntilFinished();
    return server.request(request);
  }
  SolrParams params = req.getParams();
  if (params != null) {
    // check if it is waiting for the searcher
    if (params.getBool(UpdateParams.WAIT_SEARCHER, false)) {
      log.info("blocking for commit/optimize");
      blockUntilFinished(); // empty the queue
      return server.request(request);
    }
  }
  try {
    CountDownLatch tmpLock = lock;
    if (tmpLock != null) {
      tmpLock.await();
    }
    boolean success = queue.offer(req);
    for (;;) {
      synchronized (runners) {
        if (runners.isEmpty()
            || (queue.remainingCapacity() < queue.size() // queue is half full and we can add more runners
                && runners.size() < threadCount)) {
          // We need more runners, so start a new one.
          Runner r = new Runner();
          runners.add(r);
          scheduler.execute(r);
        } else {
          // break out of the retry loop if we added the element to the queue
          // successfully, *and* while we are still holding the runners lock
          // to prevent race conditions.
          if (success) break;
        }
      }
      // Retry to add to the queue w/o the runners lock held (else we risk temporary deadlock)
      // This retry could also fail because
      // 1) existing runners were not able to take off any new elements in the queue
      // 2) the queue was filled back up since our last try
      // If we succeed, the queue may have been completely emptied, and all runners stopped.
      // In all cases, we should loop back to the top to see if we need to start more runners.
      if (!success) {
        success = queue.offer(req, 100, TimeUnit.MILLISECONDS);
      }
    }
  } catch (InterruptedException e) {
    log.error("interrupted", e);
    throw new IOException(e.getLocalizedMessage());
  }
  // RETURN A DUMMY result
  NamedList<Object> dummy = new NamedList<Object>();
  dummy.add("NOTE", "the request is processed in a background stream");
  return dummy;
}
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
  VelocityEngine engine = getEngine(request); // TODO: have HTTP headers available for configuring engine
  Template template = getTemplate(engine, request);
  VelocityContext context = new VelocityContext();
  context.put("request", request);
  // Turn the SolrQueryResponse into a SolrResponse.
  // QueryResponse has lots of conveniences suitable for a view
  // Problem is, which SolrResponse class to use?
  // One patch to SOLR-620 solved this by passing in a class name as
  // a parameter and using reflection and Solr's class loader to
  // create a new instance. But for now the implementation simply
  // uses QueryResponse, and if it chokes in a known way, falls back
  // to bare bones SolrResponseBase.
  // TODO: Can this writer know what the handler class is? With echoHandler=true it can get its string name at least
  SolrResponse rsp = new QueryResponse();
  NamedList<Object> parsedResponse = BinaryResponseWriter.getParsedResponse(request, response);
  try {
    rsp.setResponse(parsedResponse);
    // page only injected if QueryResponse works
    context.put("page", new PageTool(request, response)); // page tool only makes sense for a SearchHandler request... *sigh*
  } catch (ClassCastException e) {
    // known edge case where QueryResponse's extraction assumes "response" is a SolrDocumentList
    // (AnalysisRequestHandler emits a "response")
    e.printStackTrace();
    rsp = new SolrResponseBase();
    rsp.setResponse(parsedResponse);
  }
  context.put("response", rsp);
  // Velocity context tools - TODO: make these pluggable
  context.put("esc", new EscapeTool());
  context.put("date", new ComparisonDateTool());
  context.put("list", new ListTool());
  context.put("math", new MathTool());
  context.put("number", new NumberTool());
  context.put("sort", new SortTool());
  context.put("engine", engine); // for $engine.resourceExists(...)
  String layout_template = request.getParams().get("v.layout");
  String json_wrapper = request.getParams().get("v.json");
  boolean wrap_response = (layout_template != null) || (json_wrapper != null);
  // create output, optionally wrap it into a json object
  if (wrap_response) {
    StringWriter stringWriter = new StringWriter();
    template.merge(context, stringWriter);
    if (layout_template != null) {
      context.put("content", stringWriter.toString());
      stringWriter = new StringWriter();
      try {
        engine.getTemplate(layout_template + ".vm").merge(context, stringWriter);
      } catch (Exception e) {
        throw new IOException(e.getMessage());
      }
    }
    if (json_wrapper != null) {
      writer.write(request.getParams().get("v.json") + "(");
      writer.write(getJSONWrap(stringWriter.toString()));
      writer.write(')');
    } else {
      // using a layout, but not JSON wrapping
      writer.write(stringWriter.toString());
    }
  } else {
    template.merge(context, writer);
  }
}
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
private Template getTemplate(VelocityEngine engine, SolrQueryRequest request) throws IOException {
  Template template;
  String template_name = request.getParams().get("v.template");
  String qt = request.getParams().get("qt");
  String path = (String) request.getContext().get("path");
  if (template_name == null && path != null) {
    template_name = path;
  } // TODO: path is never null, so qt won't get picked up  maybe special case for '/select' to use qt, otherwise use path?
  if (template_name == null && qt != null) {
    template_name = qt;
  }
  if (template_name == null) template_name = "index";
  try {
    template = engine.getTemplate(template_name + ".vm");
  } catch (Exception e) {
    throw new IOException(e.getMessage());
  }
  return template;
}
// in core/src/java/org/apache/solr/handler/SnapShooter.java
public void copyFile(File source, File destination, boolean preserveFileDate) throws IOException {
  // check source exists
  if (!source.exists()) {
    String message = "File " + source + " does not exist";
    throw new FileNotFoundException(message);
  }
  // does destinations directory exist ?
  if (destination.getParentFile() != null && !destination.getParentFile().exists()) {
    destination.getParentFile().mkdirs();
  }
  // make sure we can write to destination
  if (destination.exists() && !destination.canWrite()) {
    String message = "Unable to open file " + destination + " for writing.";
    throw new IOException(message);
  }
  FileInputStream input = null;
  FileOutputStream output = null;
  try {
    input = new FileInputStream(source);
    output = new FileOutputStream(destination);
    int count = 0;
    int n = 0;
    int rcnt = 0;
    while (-1 != (n = input.read(buffer))) {
      output.write(buffer, 0, n);
      count += n;
      rcnt++;
      /***
      // reserve every 4.6875 MB
      if (rcnt == 150) {
        rcnt = 0;
        delPolicy.setReserveDuration(indexCommit.getVersion(), reserveTime);
      }
      ***/
    }
  } finally {
    try {
      IOUtils.closeQuietly(input);
    } finally {
      IOUtils.closeQuietly(output);
    }
  }
  if (source.length() != destination.length()) {
    String message = "Failed to copy full contents from " + source + " to " + destination;
    throw new IOException(message);
  }
  if (preserveFileDate) {
    // file copy should preserve file date
    destination.setLastModified(source.lastModified());
  }
}
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
protected Transformer getTransformer(SolrQueryRequest request) throws IOException {
  final String xslt = request.getParams().get(CommonParams.TR,null);
  if(xslt==null) {
    throw new IOException("'" + CommonParams.TR + "' request parameter is required to use the XSLTResponseWriter");
  }
  // not the cleanest way to achieve this
  SolrConfig solrConfig = request.getCore().getSolrConfig();
  // no need to synchronize access to context, right?
  // Nothing else happens with it at the same time
  final Map<Object,Object> ctx = request.getContext();
  Transformer result = (Transformer)ctx.get(CONTEXT_TRANSFORMER_KEY);
  if(result==null) {
    result = TransformerProvider.instance.getTransformer(solrConfig, xslt,xsltCacheLifetimeSeconds.intValue());
    result.setErrorListener(xmllog);
    ctx.put(CONTEXT_TRANSFORMER_KEY,result);
  }
  return result;
}
// in core/src/java/org/apache/solr/response/RawResponseWriter.java
public void write(OutputStream out, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
  Object obj = response.getValues().get( CONTENT );
  if( obj != null && (obj instanceof ContentStream ) ) {
    // copy the contents to the writer...
    ContentStream content = (ContentStream)obj;
    java.io.InputStream in = content.getStream();
    try {
      IOUtils.copy( in, out );
    } finally {
      in.close();
    }
  } else {
    //getBaseWriter( request ).write( writer, request, response );
    throw new IOException("did not find a CONTENT object");
  }
}
// in core/src/java/org/apache/solr/schema/SimplePreAnalyzedParser.java
@Override
public ParseResult parse(Reader reader, AttributeSource parent) throws IOException {
  ParseResult res = new ParseResult();
  StringBuilder sb = new StringBuilder();
  char[] buf = new char[128];
  int cnt;
  while ((cnt = reader.read(buf)) > 0) {
    sb.append(buf, 0, cnt);
  }
  String val = sb.toString();
  // empty string - accept even without version number
  if (val.length() == 0) {
    return res;
  }
  // first consume the version
  int idx = val.indexOf(' ');
  if (idx == -1) {
    throw new IOException("Missing VERSION token");
  }
  String version = val.substring(0, idx);
  if (!VERSION.equals(version)) {
    throw new IOException("Unknown VERSION " + version);
  }
  val = val.substring(idx + 1);
  // then consume the optional stored part
  int tsStart = 0;
  boolean hasStored = false;
  StringBuilder storedBuf = new StringBuilder();
  if (val.charAt(0) == '=') {
    hasStored = true;
    if (val.length() > 1) {
      for (int i = 1; i < val.length(); i++) {
        char c = val.charAt(i);
        if (c == '\\') {
          if (i < val.length() - 1) {
            c = val.charAt(++i);
            if (c == '=') { // we recognize only \= escape in the stored part
              storedBuf.append('=');
            } else {
              storedBuf.append('\\'); storedBuf.append(c); continue;
            }
          } else {
            storedBuf.append(c); continue;
          }
        } else if (c == '=') { // end of stored text
          tsStart = i + 1;
          break;
        } else {
          storedBuf.append(c);
        }
      }
      if (tsStart == 0) { // missing end-of-stored marker
        throw new IOException("Missing end marker of stored part");
      }
    } else {
      throw new IOException("Unexpected end of stored field");
    }
  }
  if (hasStored) {
    res.str = storedBuf.toString();
  }
  Tok tok = new Tok();
  StringBuilder attName = new StringBuilder();
  StringBuilder attVal = new StringBuilder();
  // parser state
  S s = S.UNDEF;
  int lastPos = 0;
  for (int i = tsStart; i < val.length(); i++) {
    char c = val.charAt(i);
    if (c == ' ') {
      // collect leftovers
      switch (s) {
        case VALUE :
          if (attVal.length() == 0) {
            throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute.");
          }
          if (attName.length() > 0) {
            tok.attr.put(attName.toString(), attVal.toString());
          }
          break;
        case NAME: // attr name without a value ?
          if (attName.length() > 0) {
            throw new IOException("Unexpected character '" + c + "' at position " + i + " - missing attribute value.");
          } else {
            // accept missing att name and value
          }
          break;
        case TOKEN:
        case UNDEF:
          // do nothing, advance to next token
      }
      attName.setLength(0);
      attVal.setLength(0);
      if (!tok.isEmpty() || s == S.NAME) {
        AttributeSource.State state = createState(parent, tok, lastPos);
        if (state != null) res.states.add(state.clone());
      }
      // reset tok
      s = S.UNDEF;
      tok.reset();
      // skip
      lastPos++;
      continue;
    }
    StringBuilder tgt = null;
    switch (s) {
      case TOKEN: tgt = tok.token; break;
      case NAME: tgt = attName; break;
      case VALUE: tgt = attVal; break;
      case UNDEF:
        tgt = tok.token;
        s = S.TOKEN;
    }
    if (c == '\\') {
      if (s == S.TOKEN) lastPos++;
      if (i >= val.length() - 1) { // end
        tgt.append(c);
        continue;
      } else {
        c = val.charAt(++i);
        switch (c) {
          case '\\' :
          case '=' :
          case ',' :
          case ' ' :
            tgt.append(c);
            break;
          case 'n': tgt.append('\n'); break;
          case 'r': tgt.append('\r'); break;
          case 't': tgt.append('\t'); break;
          default:
            tgt.append('\\');
            tgt.append(c);
            lastPos++;
        }
      }
    } else {
      // state switch
      if (c == ',') {
        if (s == S.TOKEN) {
          s = S.NAME;
        } else if (s == S.VALUE) {
          // end of value, start of next attr
          if (attVal.length() == 0) {
            throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute.");
          }
          if (attName.length() > 0 && attVal.length() > 0) {
            tok.attr.put(attName.toString(), attVal.toString());
          }
          // reset
          attName.setLength(0);
          attVal.setLength(0);
          s = S.NAME;
        } else {
          throw new IOException("Unexpected character '" + c + "' at position " + i + " - missing attribute value.");
        }
      } else if (c == '=') {
        if (s == S.NAME) {
          s = S.VALUE;
        } else {
          throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute.");
        }
      } else {
        tgt.append(c);
        if (s == S.TOKEN) lastPos++;
      }
    }
  }
  // collect leftovers
  if (!tok.isEmpty() || s == S.NAME || s == S.VALUE) {
    // remaining attrib?
    if (s == S.VALUE) {
      if (attName.length() > 0 && attVal.length() > 0) {
        tok.attr.put(attName.toString(), attVal.toString());
      }
    }
    AttributeSource.State state = createState(parent, tok, lastPos);
    if (state != null) res.states.add(state.clone());
  }
  return res;
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String nextValue() throws IOException {
  Token tkn = nextToken();
  String ret = null;
  switch (tkn.type) {
    case TT_TOKEN:
    case TT_EORECORD:
      ret = tkn.content.toString();
      break;
    case TT_EOF:
      ret = null;
      break;
    case TT_INVALID:
    default:
      // error no token available (or error)
      throw new IOException("(line " + getLineNumber() + ") invalid parse sequence");
      // unreachable: break;
  }
  return ret;
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String[] getLine() throws IOException {
  String[] ret = EMPTY_STRING_ARRAY;
  record.clear();
  while (true) {
    reusableToken.reset();
    nextToken(reusableToken);
    switch (reusableToken.type) {
      case TT_TOKEN:
        record.add(reusableToken.content.toString());
        break;
      case TT_EORECORD:
        record.add(reusableToken.content.toString());
        break;
      case TT_EOF:
        if (reusableToken.isReady) {
          record.add(reusableToken.content.toString());
        } else {
          ret = null;
        }
        break;
      case TT_INVALID:
      default:
        // error: throw IOException
        throw new IOException("(line " + getLineNumber() + ") invalid parse sequence");
        // unreachable: break;
    }
    if (reusableToken.type != TT_TOKEN) {
      break;
    }
  }
  if (!record.isEmpty()) {
    ret = (String[]) record.toArray(new String[record.size()]);
  }
  return ret;
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private Token encapsulatedTokenLexer(Token tkn, int c) throws IOException {
  // save current line
  int startLineNumber = getLineNumber();
  // ignore the given delimiter
  // assert c == delimiter;
  for (;;) {
    c = in.read();
    if (c == '\\' && strategy.getUnicodeEscapeInterpretation() && in.lookAhead()=='u') {
      tkn.content.append((char) unicodeEscapeLexer(c));
    } else if (c == strategy.getEscape()) {
      tkn.content.append((char)readEscape(c));
    } else if (c == strategy.getEncapsulator()) {
      if (in.lookAhead() == strategy.getEncapsulator()) {
        // double or escaped encapsulator -> add single encapsulator to token
        c = in.read();
        tkn.content.append((char) c);
      } else {
        // token finish mark (encapsulator) reached: ignore whitespace till delimiter
        for (;;) {
          c = in.read();
          if (c == strategy.getDelimiter()) {
            tkn.type = TT_TOKEN;
            tkn.isReady = true;
            return tkn;
          } else if (isEndOfFile(c)) {
            tkn.type = TT_EOF;
            tkn.isReady = true;
            return tkn;
          } else if (isEndOfLine(c)) {
            // ok eo token reached
            tkn.type = TT_EORECORD;
            tkn.isReady = true;
            return tkn;
          } else if (!isWhitespace(c)) {
            // error invalid char between token and next delimiter
            throw new IOException("(line " + getLineNumber() + ") invalid char between encapsulated token end delimiter");
          }
        }
      }
    } else if (isEndOfFile(c)) {
      // error condition (end of file before end of token)
      throw new IOException("(startline " + startLineNumber + ")" + "eof reached before encapsulated token finished");
    } else {
      // consume character
      tkn.content.append((char) c);
    }
  }
}
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected int unicodeEscapeLexer(int c) throws IOException {
  int ret = 0;
  // ignore 'u' (assume c==\ now) and read 4 hex digits
  c = in.read();
  code.clear();
  try {
    for (int i = 0; i < 4; i++) {
      c = in.read();
      if (isEndOfFile(c) || isEndOfLine(c)) {
        throw new NumberFormatException("number too short");
      }
      code.append((char) c);
    }
    ret = Integer.parseInt(code.toString(), 16);
  } catch (NumberFormatException e) {
    throw new IOException("(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString());
  }
  return ret;
}
5
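
The conversion step in unicodeEscapeLexer above is plain base-16 parsing; a short or malformed sequence surfaces as a NumberFormatException and is rewrapped as an IOException. A worked example of the happy path:

    // Worked example of the conversion step in unicodeEscapeLexer:
    int v = Integer.parseInt("0041", 16);   // v == 65
    char c = (char) v;                      // 'A', the decoded character
    // "00G1" would throw NumberFormatException -> rewrapped as IOException
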
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new IOException(e.getMessage()); }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (RuntimeException re) {
  // unfortunately XInclude fallback only works with IOException, but openResource() never throws that one
  throw (IOException) (new IOException(re.getMessage()).initCause(re));
}
1072
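
Two of the catches above show their Java 5 heritage: ConcurrentUpdateSolrServer converts InterruptedException to IOException without re-interrupting the thread or attaching the cause, and SystemIdResolver needs the initCause cast because it targets a cause-less IOException constructor. Since Java 6, IOException(String, Throwable) exists, so both could be written as below (a sketch, not the shipped code):

    // Sketch only - not the shipped Solr code.
    static void queueUpdate(BlockingQueue<UpdateRequest> queue, UpdateRequest req) throws IOException {
      try {
        queue.put(req);
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt(); // preserve interrupt status
        throw new IOException("interrupted while queueing update", e); // Java 6+ cause constructor
      }
    }
    // Likewise, SystemIdResolver's rewrap could read:
    //   throw new IOException(re.getMessage(), re);
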
            
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
@Override public void connect(String serverAddress, int timeout, Watcher watcher, ZkUpdate updater) throws IOException, InterruptedException, TimeoutException { updater.update(new SolrZooKeeper(serverAddress, timeout, watcher)); }
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
@Override public void reconnect(final String serverAddress, final int zkClientTimeout, final Watcher watcher, final ZkUpdate updater) throws IOException { log.info("Connection expired - starting a new one..."); try { updater.update(new SolrZooKeeper(serverAddress, zkClientTimeout, watcher)); log.info("Reconnected to ZooKeeper"); } catch (Exception e) { SolrException.log(log, "Reconnect to ZooKeeper failed", e); log.info("Reconnect to ZooKeeper failed"); } }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean failOnExists, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), CreateMode.PERSISTENT, null, failOnExists, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void setData(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { if (log.isInfoEnabled()) { log.info("Write to ZooKeepeer " + file.getAbsolutePath() + " to " + path); } String data = FileUtils.readFileToString(file); setData(path, data.getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void process(WatchedEvent event) {
  if (log.isInfoEnabled()) {
    log.info("Watcher " + this + " name:" + name + " got event " + event + " path:" + event.getPath() + " type:" + event.getType());
  }
  state = event.getState();
  if (state == KeeperState.SyncConnected) {
    connected = true;
    clientConnected.countDown();
  } else if (state == KeeperState.Expired) {
    connected = false;
    log.info("Attempting to reconnect to recover relationship with ZooKeeper...");
    try {
      connectionStrategy.reconnect(zkServerAddress, zkClientTimeout, this, new ZkClientConnectionStrategy.ZkUpdate() {
        @Override
        public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException {
          synchronized (connectionStrategy) {
            waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT);
            client.updateKeeper(keeper);
            if (onReconnect != null) {
              onReconnect.command();
            }
            synchronized (ConnectionManager.this) {
              ConnectionManager.this.connected = true;
            }
          }
        }
      });
    } catch (Exception e) {
      SolrException.log(log, "", e);
    }
    log.info("Connected:" + connected);
  }
}
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
@Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForConnected(long waitForConnection) throws InterruptedException, TimeoutException, IOException { long expire = System.currentTimeMillis() + waitForConnection; long left = waitForConnection; while (!connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (!connected) { throw new TimeoutException("Could not connect to ZooKeeper " + zkServerAddress + " within " + waitForConnection + " ms"); } }
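waitForConnected is the standard deadline-based wait loop: the remaining budget is recomputed after every wakeup, so spurious wakeups can only shorten the wait, never extend it. The skeleton in isolation (a sketch assuming a boolean connected field and java.util.concurrent.TimeoutException):

    synchronized void awaitConnected(long timeoutMs) throws InterruptedException, TimeoutException {
      long deadline = System.currentTimeMillis() + timeoutMs;
      long left = timeoutMs;
      while (!connected && left > 0) {
        wait(left);                                   // may return early or spuriously
        left = deadline - System.currentTimeMillis();
      }
      if (!connected) throw new TimeoutException("not connected after " + timeoutMs + " ms");
    }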
// in solrj/src/java/org/apache/solr/common/util/StrUtils.java
public static void partialURLEncodeVal(Appendable dest, String val) throws IOException { for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); if (ch < 32) { dest.append('%'); if (ch < 0x10) dest.append('0'); dest.append(Integer.toHexString(ch)); } else { switch (ch) { case ' ': dest.append('+'); break; case '&': dest.append("%26"); break; case '%': dest.append("%25"); break; case '=': dest.append("%3D"); break; case '+': dest.append("%2B"); break; default : dest.append(ch); break; } } } }
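Only characters significant to Solr's form-encoded query syntax (controls, space, '&', '%', '=', '+') are rewritten; everything else passes through. Assuming the method behaves as written above:

    StringBuilder dest = new StringBuilder();
    StrUtils.partialURLEncodeVal(dest, "a b&c=d"); // the IOException on the signature comes from Appendable
    // dest.toString() -> "a+b%26c%3Dd"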
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
public InputStream getStream() throws IOException { URLConnection conn = this.url.openConnection(); contentType = conn.getContentType(); name = url.toExternalForm(); size = new Long( conn.getContentLength() ); return conn.getInputStream(); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
public InputStream getStream() throws IOException { return new FileInputStream( file ); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
@Override public Reader getReader() throws IOException { String charset = getCharsetFromContentType( contentType ); return charset == null ? new FileReader( file ) : new InputStreamReader( getStream(), charset ); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
public InputStream getStream() throws IOException { return new ByteArrayInputStream( str.getBytes(DEFAULT_CHARSET) ); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
@Override public Reader getReader() throws IOException { String charset = getCharsetFromContentType( contentType ); return charset == null ? new StringReader( str ) : new InputStreamReader( getStream(), charset ); }
// in solrj/src/java/org/apache/solr/common/util/ContentStreamBase.java
public Reader getReader() throws IOException { String charset = getCharsetFromContentType( getContentType() ); return charset == null ? new InputStreamReader( getStream(), DEFAULT_CHARSET ) : new InputStreamReader( getStream(), charset ); }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public static void escapeCharData(String str, Writer out) throws IOException { escape(str, out, chardata_escapes); }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public static void escapeAttributeValue(String str, Writer out) throws IOException { escape(str, out, attribute_escapes); }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public static void escapeAttributeValue(char [] chars, int start, int length, Writer out) throws IOException { escape(chars, start, length, out, attribute_escapes); }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public final static void writeXML(Writer out, String tag, String val) throws IOException { out.write('<'); out.write(tag); if (val == null) { out.write('/'); out.write('>'); } else { out.write('>'); escapeCharData(val,out); out.write('<'); out.write('/'); out.write(tag); out.write('>'); } }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public final static void writeUnescapedXML(Writer out, String tag, String val, Object... attrs) throws IOException { out.write('<'); out.write(tag); for (int i=0; i<attrs.length; i++) { out.write(' '); out.write(attrs[i++].toString()); out.write('='); out.write('"'); out.write(attrs[i].toString()); out.write('"'); } if (val == null) { out.write('/'); out.write('>'); } else { out.write('>'); out.write(val); out.write('<'); out.write('/'); out.write(tag); out.write('>'); } }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public final static void writeXML(Writer out, String tag, String val, Object... attrs) throws IOException { out.write('<'); out.write(tag); for (int i=0; i<attrs.length; i++) { out.write(' '); out.write(attrs[i++].toString()); out.write('='); out.write('"'); escapeAttributeValue(attrs[i].toString(), out); out.write('"'); } if (val == null) { out.write('/'); out.write('>'); } else { out.write('>'); escapeCharData(val,out); out.write('<'); out.write('/'); out.write(tag); out.write('>'); } }
// in solrj/src/java/org/apache/solr/common/util/XML.java
public static void writeXML(Writer out, String tag, String val, Map<String, String> attrs) throws IOException { out.write('<'); out.write(tag); for (Map.Entry<String, String> entry : attrs.entrySet()) { out.write(' '); out.write(entry.getKey()); out.write('='); out.write('"'); escapeAttributeValue(entry.getValue(), out); out.write('"'); } if (val == null) { out.write('/'); out.write('>'); } else { out.write('>'); escapeCharData(val,out); out.write('<'); out.write('/'); out.write(tag); out.write('>'); } }
// in solrj/src/java/org/apache/solr/common/util/XML.java
private static void escape(char [] chars, int offset, int length, Writer out, String [] escapes) throws IOException{ for (int i=offset; i<length; i++) { char ch = chars[i]; if (ch<escapes.length) { String replacement = escapes[ch]; if (replacement != null) { out.write(replacement); continue; } } out.write(ch); } }
// in solrj/src/java/org/apache/solr/common/util/XML.java
private static void escape(String str, Writer out, String[] escapes) throws IOException { for (int i=0; i<str.length(); i++) { char ch = str.charAt(i); if (ch<escapes.length) { String replacement = escapes[ch]; if (replacement != null) { out.write(replacement); continue; } } out.write(ch); } }
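Both escape overloads are table-driven: the character's code point indexes an escapes array and a non-null entry is the replacement. Combining this with the public writeXML helper above gives, for example:

    StringWriter w = new StringWriter();
    XML.writeXML(w, "field", "a < b & c", "name", "title");
    // -> <field name="title">a &lt; b &amp; c</field>
    // (the exact entity set depends on the chardata_escapes/attribute_escapes
    //  tables, which are not shown in this excerpt)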
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
@Override public int read() throws IOException { if (pos >= end) { refill(); if (pos >= end) return -1; } return buf[pos++] & 0xff; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int peek() throws IOException { if (pos >= end) { refill(); if (pos >= end) return -1; } return buf[pos] & 0xff; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readUnsignedByte() throws IOException { if (pos >= end) { refill(); if (pos >= end) { throw new EOFException(); } } return buf[pos++] & 0xff; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readWrappedStream(byte[] target, int offset, int len) throws IOException { return in.read(target, offset, len); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public void refill() throws IOException {
  // this will set end to -1 at EOF
  end = readWrappedStream(buf, 0, buf.length);
  if (end > 0) readFromStream += end;
  pos = 0;
}
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
@Override public int available() throws IOException { return end - pos; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
@Override public int read(byte b[], int off, int len) throws IOException {
  int r = 0; // number of bytes we have read
  // first read from our buffer;
  if (end - pos > 0) {
    r = Math.min(end - pos, len);
    System.arraycopy(buf, pos, b, off, r);
    pos += r;
  }
  if (r == len) return r;
  // amount left to read is >= buffer size
  if (len - r >= buf.length) {
    int ret = readWrappedStream(b, off + r, len - r);
    if (ret >= 0) {
      readFromStream += ret;
      r += ret;
      return r;
    } else {
      // negative return code
      return r > 0 ? r : -1;
    }
  }
  refill();
  // read rest from our buffer
  if (end - pos > 0) {
    int toRead = Math.min(end - pos, len - r);
    System.arraycopy(buf, pos, b, off + r, toRead);
    pos += toRead;
    r += toRead;
    return r;
  }
  return r > 0 ? r : -1;
}
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
@Override public void close() throws IOException { in.close(); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public void readFully(byte b[]) throws IOException { readFully(b, 0, b.length); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public void readFully(byte b[], int off, int len) throws IOException { while (len>0) { int ret = read(b, off, len); if (ret==-1) { throw new EOFException(); } off += ret; len -= ret; } }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int skipBytes(int n) throws IOException { if (end-pos >= n) { pos += n; return n; } if (end-pos<0) return -1; int r = end-pos; pos = end; while (r < n) { refill(); if (end-pos <= 0) return r; int toRead = Math.min(end-pos, n-r); r += toRead; pos += toRead; } return r; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public boolean readBoolean() throws IOException { return readByte()==1; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public byte readByte() throws IOException { if (pos >= end) { refill(); if (pos >= end) throw new EOFException(); } return buf[pos++]; }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public short readShort() throws IOException { return (short)((readUnsignedByte() << 8) | readUnsignedByte()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readUnsignedShort() throws IOException { return (readUnsignedByte() << 8) | readUnsignedByte(); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public char readChar() throws IOException { return (char)((readUnsignedByte() << 8) | readUnsignedByte()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public int readInt() throws IOException { return ((readUnsignedByte() << 24) |(readUnsignedByte() << 16) |(readUnsignedByte() << 8) | readUnsignedByte()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public long readLong() throws IOException { return (((long)readUnsignedByte()) << 56) | (((long)readUnsignedByte()) << 48) | (((long)readUnsignedByte()) << 40) | (((long)readUnsignedByte()) << 32) | (((long)readUnsignedByte()) << 24) | (readUnsignedByte() << 16) | (readUnsignedByte() << 8) | (readUnsignedByte()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public float readFloat() throws IOException { return Float.intBitsToFloat(readInt()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public double readDouble() throws IOException { return Double.longBitsToDouble(readLong()); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public String readLine() throws IOException { return new DataInputStream(this).readLine(); }
// in solrj/src/java/org/apache/solr/common/util/FastInputStream.java
public String readUTF() throws IOException { return new DataInputStream(this).readUTF(); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void write(int b) throws IOException { write((byte)b); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void write(byte b[]) throws IOException { write(b,0,b.length); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void write(byte b) throws IOException { if (pos >= buf.length) { out.write(buf); written += pos; pos=0; } buf[pos++] = b; }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void write(byte arr[], int off, int len) throws IOException {
  int space = buf.length - pos;
  if (len < space) {
    System.arraycopy(arr, off, buf, pos, len);
    pos += len;
  } else if (len < buf.length) {
    // if the data to write is small enough, buffer it.
    System.arraycopy(arr, off, buf, pos, space);
    out.write(buf);
    written += buf.length;
    pos = len - space;
    System.arraycopy(arr, off + space, buf, 0, pos);
  } else {
    if (pos > 0) {
      out.write(buf, 0, pos); // flush
      written += pos;
      pos = 0;
    }
    // don't buffer, just write to sink
    out.write(arr, off, len);
    written += len;
  }
}
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void reserve(int len) throws IOException { if (len > (buf.length - pos)) flushBuffer(); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeBoolean(boolean v) throws IOException { write(v ? 1:0); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeByte(int v) throws IOException { write((byte)v); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeShort(int v) throws IOException { write((byte)(v >>> 8)); write((byte)v); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeChar(int v) throws IOException { writeShort(v); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeInt(int v) throws IOException { reserve(4); buf[pos] = (byte)(v>>>24); buf[pos+1] = (byte)(v>>>16); buf[pos+2] = (byte)(v>>>8); buf[pos+3] = (byte)(v); pos+=4; }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeLong(long v) throws IOException { reserve(8); buf[pos] = (byte)(v>>>56); buf[pos+1] = (byte)(v>>>48); buf[pos+2] = (byte)(v>>>40); buf[pos+3] = (byte)(v>>>32); buf[pos+4] = (byte)(v>>>24); buf[pos+5] = (byte)(v>>>16); buf[pos+6] = (byte)(v>>>8); buf[pos+7] = (byte)(v); pos+=8; }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeFloat(float v) throws IOException { writeInt(Float.floatToRawIntBits(v)); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeDouble(double v) throws IOException { writeLong(Double.doubleToRawLongBits(v)); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeBytes(String s) throws IOException {
  // non-optimized version, but this shouldn't be used anyway
  for (int i = 0; i < s.length(); i++) write((byte) s.charAt(i));
}
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeChars(String s) throws IOException {
  // non-optimized version
  for (int i = 0; i < s.length(); i++) writeChar(s.charAt(i));
}
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void writeUTF(String s) throws IOException {
  // non-optimized version, but this shouldn't be used anyway
  DataOutputStream daos = new DataOutputStream(this);
  daos.writeUTF(s);
}
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void flush() throws IOException { flushBuffer(); out.flush(); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
@Override public void close() throws IOException { flushBuffer(); out.close(); }
// in solrj/src/java/org/apache/solr/common/util/FastOutputStream.java
public void flushBuffer() throws IOException { if (pos > 0) { out.write(buf, 0, pos); written += pos; pos=0; } }
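FastOutputStream and FastInputStream are symmetric big-endian codecs over hand-managed buffers: whatever a write* method emits, the matching read* method consumes. A round-trip sketch (the wrap factories are the ones JavaBinCodec uses below; the byte-array streams are from java.io):

    ByteArrayOutputStream sink = new ByteArrayOutputStream();
    FastOutputStream out = FastOutputStream.wrap(sink);
    out.writeInt(42);
    out.writeLong(1L << 40);
    out.flush();                                                // push the buffer out
    FastInputStream in = FastInputStream.wrap(new ByteArrayInputStream(sink.toByteArray()));
    int i = in.readInt();    // 42
    long l = in.readLong();  // 1099511627776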
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void marshal(Object nl, OutputStream os) throws IOException { init(FastOutputStream.wrap(os)); try { daos.writeByte(VERSION); writeVal(nl); } finally { daos.flushBuffer(); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Object unmarshal(InputStream is) throws IOException { FastInputStream dis = FastInputStream.wrap(is); version = dis.readByte(); if (version != VERSION) { throw new RuntimeException("Invalid version (expected " + VERSION + ", but " + version + ") or the data in not in 'javabin' format"); } return readVal(dis); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public SimpleOrderedMap<Object> readOrderedMap(FastInputStream dis) throws IOException { int sz = readSize(dis); SimpleOrderedMap<Object> nl = new SimpleOrderedMap<Object>(); for (int i = 0; i < sz; i++) { String name = (String) readVal(dis); Object val = readVal(dis); nl.add(name, val); } return nl; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public NamedList<Object> readNamedList(FastInputStream dis) throws IOException { int sz = readSize(dis); NamedList<Object> nl = new NamedList<Object>(); for (int i = 0; i < sz; i++) { String name = (String) readVal(dis); Object val = readVal(dis); nl.add(name, val); } return nl; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeNamedList(NamedList<?> nl) throws IOException { writeTag(nl instanceof SimpleOrderedMap ? ORDERED_MAP : NAMED_LST, nl.size()); for (int i = 0; i < nl.size(); i++) { String name = nl.getName(i); writeExternString(name); Object val = nl.getVal(i); writeVal(val); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeVal(Object val) throws IOException {
  if (writeKnownType(val)) {
    return;
  } else {
    Object tmpVal = val;
    if (resolver != null) {
      tmpVal = resolver.resolve(val, this);
      if (tmpVal == null) return; // null means the resolver took care of it fully
      if (writeKnownType(tmpVal)) return;
    }
  }
  writeVal(val.getClass().getName() + ':' + val.toString());
}
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Object readVal(FastInputStream dis) throws IOException {
  tagByte = dis.readByte();
  // if ((tagByte & 0xe0) == 0) {
  // if top 3 bits are clear, this is a normal tag
  // OK, try type + size in single byte
  switch (tagByte >>> 5) {
    case STR >>> 5: return readStr(dis);
    case SINT >>> 5: return readSmallInt(dis);
    case SLONG >>> 5: return readSmallLong(dis);
    case ARR >>> 5: return readArray(dis);
    case ORDERED_MAP >>> 5: return readOrderedMap(dis);
    case NAMED_LST >>> 5: return readNamedList(dis);
    case EXTERN_STRING >>> 5: return readExternString(dis);
  }
  switch (tagByte) {
    case NULL: return null;
    case DATE: return new Date(dis.readLong());
    case INT: return dis.readInt();
    case BOOL_TRUE: return Boolean.TRUE;
    case BOOL_FALSE: return Boolean.FALSE;
    case FLOAT: return dis.readFloat();
    case DOUBLE: return dis.readDouble();
    case LONG: return dis.readLong();
    case BYTE: return dis.readByte();
    case SHORT: return dis.readShort();
    case MAP: return readMap(dis);
    case SOLRDOC: return readSolrDocument(dis);
    case SOLRDOCLST: return readSolrDocumentList(dis);
    case BYTEARR: return readByteArray(dis);
    case ITERATOR: return readIterator(dis);
    case END: return END_OBJ;
    case SOLRINPUTDOC: return readSolrInputDocument(dis);
  }
  throw new RuntimeException("Unknown type " + tagByte);
}
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public boolean writeKnownType(Object val) throws IOException {
  if (writePrimitive(val)) return true;
  if (val instanceof NamedList) { writeNamedList((NamedList<?>) val); return true; }
  if (val instanceof SolrDocumentList) {
    // SolrDocumentList is a List, so must come before List check
    writeSolrDocumentList((SolrDocumentList) val);
    return true;
  }
  if (val instanceof Collection) { writeArray((Collection) val); return true; }
  if (val instanceof Object[]) { writeArray((Object[]) val); return true; }
  if (val instanceof SolrDocument) {
    // this needs special treatment to know which fields are to be written
    if (resolver == null) {
      writeSolrDocument((SolrDocument) val);
    } else {
      Object retVal = resolver.resolve(val, this);
      if (retVal != null) {
        if (retVal instanceof SolrDocument) {
          writeSolrDocument((SolrDocument) retVal);
        } else {
          writeVal(retVal);
        }
      }
    }
    return true;
  }
  if (val instanceof SolrInputDocument) { writeSolrInputDocument((SolrInputDocument) val); return true; }
  if (val instanceof Map) { writeMap((Map) val); return true; }
  if (val instanceof Iterator) { writeIterator((Iterator) val); return true; }
  if (val instanceof Iterable) { writeIterator(((Iterable) val).iterator()); return true; }
  return false;
}
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeTag(byte tag) throws IOException { daos.writeByte(tag); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeTag(byte tag, int size) throws IOException { if ((tag & 0xe0) != 0) { if (size < 0x1f) { daos.writeByte(tag | size); } else { daos.writeByte(tag | 0x1f); writeVInt(size - 0x1f, daos); } } else { daos.writeByte(tag); writeVInt(size, daos); } }
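This is the core of the javabin header: for sized tags the top three bits carry the type and the low five bits the size, with 0x1f as a sentinel meaning the size continues as a VInt; readSize and the readVal switch above undo exactly this packing. A worked example, assuming STR is the tag constant 1 << 5:

    // writeTag(STR, 7)  -> single byte 0x27        (0x20 | 7)
    // writeTag(STR, 40) -> 0x3F (0x20 | 0x1f), then writeVInt(40 - 0x1f) -> 0x09
    int tagByte = 0x27;
    int type = tagByte >>> 5;  // 1, i.e. STR >>> 5
    int size = tagByte & 0x1f; // 7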
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeByteArray(byte[] arr, int offset, int len) throws IOException { writeTag(BYTEARR, len); daos.write(arr, offset, len); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public byte[] readByteArray(FastInputStream dis) throws IOException { byte[] arr = new byte[readVInt(dis)]; dis.readFully(arr); return arr; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeSolrDocument(SolrDocument doc) throws IOException { writeTag(SOLRDOC); writeTag(ORDERED_MAP, doc.size()); for (Map.Entry<String, Object> entry : doc) { String name = entry.getKey(); writeExternString(name); Object val = entry.getValue(); writeVal(val); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public SolrDocument readSolrDocument(FastInputStream dis) throws IOException { NamedList nl = (NamedList) readVal(dis); SolrDocument doc = new SolrDocument(); for (int i = 0; i < nl.size(); i++) { String name = nl.getName(i); Object val = nl.getVal(i); doc.setField(name, val); } return doc; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException { SolrDocumentList solrDocs = new SolrDocumentList(); List list = (List) readVal(dis); solrDocs.setNumFound((Long) list.get(0)); solrDocs.setStart((Long) list.get(1)); solrDocs.setMaxScore((Float) list.get(2)); @SuppressWarnings("unchecked") List<SolrDocument> l = (List<SolrDocument>) readVal(dis); solrDocs.addAll(l); return solrDocs; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { writeTag(SOLRDOCLST); List<Number> l = new ArrayList<Number>(3); l.add(docs.getNumFound()); l.add(docs.getStart()); l.add(docs.getMaxScore()); writeArray(l); writeArray(docs); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public SolrInputDocument readSolrInputDocument(FastInputStream dis) throws IOException { int sz = readVInt(dis); float docBoost = (Float)readVal(dis); SolrInputDocument sdoc = new SolrInputDocument(); sdoc.setDocumentBoost(docBoost); for (int i = 0; i < sz; i++) { float boost = 1.0f; String fieldName; Object boostOrFieldName = readVal(dis); if (boostOrFieldName instanceof Float) { boost = (Float)boostOrFieldName; fieldName = (String)readVal(dis); } else { fieldName = (String)boostOrFieldName; } Object fieldVal = readVal(dis); sdoc.setField(fieldName, fieldVal, boost); } return sdoc; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeSolrInputDocument(SolrInputDocument sdoc) throws IOException { writeTag(SOLRINPUTDOC, sdoc.size()); writeFloat(sdoc.getDocumentBoost()); for (SolrInputField inputField : sdoc.values()) { if (inputField.getBoost() != 1.0f) { writeFloat(inputField.getBoost()); } writeExternString(inputField.getName()); writeVal(inputField.getValue()); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Map<Object,Object> readMap(FastInputStream dis) throws IOException { int sz = readVInt(dis); Map<Object,Object> m = new LinkedHashMap<Object,Object>(); for (int i = 0; i < sz; i++) { Object key = readVal(dis); Object val = readVal(dis); m.put(key, val); } return m; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeIterator(Iterator iter) throws IOException { writeTag(ITERATOR); while (iter.hasNext()) { writeVal(iter.next()); } writeVal(END_OBJ); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public List<Object> readIterator(FastInputStream fis) throws IOException { ArrayList<Object> l = new ArrayList<Object>(); while (true) { Object o = readVal(fis); if (o == END_OBJ) break; l.add(o); } return l; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeArray(List l) throws IOException { writeTag(ARR, l.size()); for (int i = 0; i < l.size(); i++) { writeVal(l.get(i)); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeArray(Collection coll) throws IOException { writeTag(ARR, coll.size()); for (Object o : coll) { writeVal(o); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeArray(Object[] arr) throws IOException { writeTag(ARR, arr.length); for (int i = 0; i < arr.length; i++) { Object o = arr[i]; writeVal(o); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public List<Object> readArray(FastInputStream dis) throws IOException { int sz = readSize(dis); ArrayList<Object> l = new ArrayList<Object>(sz); for (int i = 0; i < sz; i++) { l.add(readVal(dis)); } return l; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeStr(String s) throws IOException { if (s == null) { writeTag(NULL); return; } int end = s.length(); int maxSize = end * 4; if (bytes == null || bytes.length < maxSize) bytes = new byte[maxSize]; int sz = ByteUtils.UTF16toUTF8(s, 0, end, bytes, 0); writeTag(STR, sz); daos.write(bytes, 0, sz); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public String readStr(FastInputStream dis) throws IOException { int sz = readSize(dis); if (bytes == null || bytes.length < sz) bytes = new byte[sz]; dis.readFully(bytes, 0, sz); arr.reset(); ByteUtils.UTF8toUTF16(bytes, 0, sz, arr); return arr.toString(); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeInt(int val) throws IOException { if (val > 0) { int b = SINT | (val & 0x0f); if (val >= 0x0f) { b |= 0x10; daos.writeByte(b); writeVInt(val >>> 4, daos); } else { daos.writeByte(b); } } else { daos.writeByte(INT); daos.writeInt(val); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public int readSmallInt(FastInputStream dis) throws IOException { int v = tagByte & 0x0F; if ((tagByte & 0x10) != 0) v = (readVInt(dis) << 4) | v; return v; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeLong(long val) throws IOException { if ((val & 0xff00000000000000L) == 0) { int b = SLONG | ((int) val & 0x0f); if (val >= 0x0f) { b |= 0x10; daos.writeByte(b); writeVLong(val >>> 4, daos); } else { daos.writeByte(b); } } else { daos.writeByte(LONG); daos.writeLong(val); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public long readSmallLong(FastInputStream dis) throws IOException { long v = tagByte & 0x0F; if ((tagByte & 0x10) != 0) v = (readVLong(dis) << 4) | v; return v; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeFloat(float val) throws IOException { daos.writeByte(FLOAT); daos.writeFloat(val); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public boolean writePrimitive(Object val) throws IOException { if (val == null) { daos.writeByte(NULL); return true; } else if (val instanceof String) { writeStr((String) val); return true; } else if (val instanceof Number) { if (val instanceof Integer) { writeInt(((Integer) val).intValue()); return true; } else if (val instanceof Long) { writeLong(((Long) val).longValue()); return true; } else if (val instanceof Float) { writeFloat(((Float) val).floatValue()); return true; } else if (val instanceof Double) { daos.writeByte(DOUBLE); daos.writeDouble(((Double) val).doubleValue()); return true; } else if (val instanceof Byte) { daos.writeByte(BYTE); daos.writeByte(((Byte) val).intValue()); return true; } else if (val instanceof Short) { daos.writeByte(SHORT); daos.writeShort(((Short) val).intValue()); return true; } return false; } else if (val instanceof Date) { daos.writeByte(DATE); daos.writeLong(((Date) val).getTime()); return true; } else if (val instanceof Boolean) { if ((Boolean) val) daos.writeByte(BOOL_TRUE); else daos.writeByte(BOOL_FALSE); return true; } else if (val instanceof byte[]) { writeByteArray((byte[]) val, 0, ((byte[]) val).length); return true; } else if (val instanceof ByteBuffer) { ByteBuffer buf = (ByteBuffer) val; writeByteArray(buf.array(),buf.position(),buf.limit() - buf.position()); return true; } else if (val == END_OBJ) { writeTag(END); return true; } return false; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeMap(Map<?,?> val) throws IOException { writeTag(MAP, val.size()); for (Map.Entry<?,?> entry : val.entrySet()) { Object key = entry.getKey(); if (key instanceof String) { writeExternString((String) key); } else { writeVal(key); } writeVal(entry.getValue()); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public int readSize(FastInputStream in) throws IOException { int sz = tagByte & 0x1f; if (sz == 0x1f) sz += readVInt(in); return sz; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public static void writeVInt(int i, FastOutputStream out) throws IOException { while ((i & ~0x7F) != 0) { out.writeByte((byte) ((i & 0x7f) | 0x80)); i >>>= 7; } out.writeByte((byte) i); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public static int readVInt(FastInputStream in) throws IOException { byte b = in.readByte(); int i = b & 0x7F; for (int shift = 7; (b & 0x80) != 0; shift += 7) { b = in.readByte(); i |= (b & 0x7F) << shift; } return i; }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public static void writeVLong(long i, FastOutputStream out) throws IOException { while ((i & ~0x7F) != 0) { out.writeByte((byte) ((i & 0x7f) | 0x80)); i >>>= 7; } out.writeByte((byte) i); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public static long readVLong(FastInputStream in) throws IOException { byte b = in.readByte(); long i = b & 0x7F; for (int shift = 7; (b & 0x80) != 0; shift += 7) { b = in.readByte(); i |= (long) (b & 0x7F) << shift; } return i; }
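writeVInt/writeVLong emit the usual base-128 varint: seven payload bits per byte, least significant group first, continuation bit set on every byte except the last; the readers shift each group back into place. A worked example:

    // 300 = 0b1_0010_1100
    //   byte 0: (300 & 0x7f) | 0x80 = 0xAC  (low 7 bits, continuation bit set)
    //   byte 1:  300 >>> 7          = 0x02  (high bit clear: last byte)
    // readVInt reassembles: 0x2C | (0x02 << 7) = 300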
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public void writeExternString(String s) throws IOException { if (s == null) { writeTag(NULL); return; } Integer idx = stringsMap == null ? null : stringsMap.get(s); if (idx == null) idx = 0; writeTag(EXTERN_STRING, idx); if (idx == 0) { writeStr(s); if (stringsMap == null) stringsMap = new HashMap<String, Integer>(); stringsMap.put(s, ++stringsCount); } }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public String readExternString(FastInputStream fis) throws IOException {
  int idx = readSize(fis);
  if (idx != 0) { // idx != 0 is the index of the extern string
    return stringsList.get(idx - 1);
  } else { // idx == 0 means it has a string value
    String s = (String) readVal(fis);
    if (stringsList == null) stringsList = new ArrayList<String>();
    stringsList.add(s);
    return s;
  }
}
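Together, writeExternString and readExternString maintain a streaming string dictionary: index 0 means a literal string follows and is assigned the next id, while any other index is a back-reference, so a repeated field name costs only its small index after first use. The decoder half of the protocol in miniature (illustrative, not the codec itself):

    List<String> seen = new ArrayList<>();          // mirrors stringsList
    String decode(int idx, String literalIfNew) {
      if (idx != 0) return seen.get(idx - 1);       // back-reference to an earlier string
      seen.add(literalIfNew);                       // first occurrence: remember it
      return literalIfNew;
    }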
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Calendar formatDate(Date date, Calendar cal, Appendable out) throws IOException {
  // using a stringBuilder for numbers can be nice since
  // a temporary string isn't used (it's added directly to the
  // builder's buffer.
  StringBuilder sb = out instanceof StringBuilder ? (StringBuilder) out : new StringBuilder();
  if (cal == null) cal = Calendar.getInstance(TimeZone.getTimeZone("GMT"), Locale.US);
  cal.setTime(date);
  int i = cal.get(Calendar.YEAR);
  sb.append(i);
  sb.append('-');
  i = cal.get(Calendar.MONTH) + 1; // 0 based, so add 1
  if (i < 10) sb.append('0');
  sb.append(i);
  sb.append('-');
  i = cal.get(Calendar.DAY_OF_MONTH);
  if (i < 10) sb.append('0');
  sb.append(i);
  sb.append('T');
  i = cal.get(Calendar.HOUR_OF_DAY); // 24 hour time format
  if (i < 10) sb.append('0');
  sb.append(i);
  sb.append(':');
  i = cal.get(Calendar.MINUTE);
  if (i < 10) sb.append('0');
  sb.append(i);
  sb.append(':');
  i = cal.get(Calendar.SECOND);
  if (i < 10) sb.append('0');
  sb.append(i);
  i = cal.get(Calendar.MILLISECOND);
  if (i != 0) {
    sb.append('.');
    if (i < 100) sb.append('0');
    if (i < 10) sb.append('0');
    sb.append(i);
    // handle canonical format specifying fractional
    // seconds shall not end in '0'. Given the slowness of
    // integer div/mod, simply checking the last character
    // is probably the fastest way to check.
    int lastIdx = sb.length() - 1;
    if (sb.charAt(lastIdx) == '0') {
      lastIdx--;
      if (sb.charAt(lastIdx) == '0') {
        lastIdx--;
      }
      sb.setLength(lastIdx + 1);
    }
  }
  sb.append('Z');
  if (out != sb) out.append(sb);
  return cal;
}
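formatDate hand-rolls the canonical ISO-8601 form used by Solr, zero-padding every field and trimming up to two trailing zeros from the millisecond fraction. Assuming the method behaves as written above:

    StringBuilder sb = new StringBuilder();
    DateUtil.formatDate(new Date(450L), null, sb);
    // sb -> "1970-01-01T00:00:00.45Z"  (the trailing zero of ".450" is trimmed)
    sb.setLength(0);
    DateUtil.formatDate(new Date(0L), null, sb);
    // sb -> "1970-01-01T00:00:00Z"     (zero millis: no fraction at all)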
// in solrj/src/java/org/apache/solr/client/solrj/request/DirectXmlRequest.java
@Override public UpdateResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); UpdateResponse res = new UpdateResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/SolrPing.java
@Override public SolrPingResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); SolrPingResponse res = new SolrPingResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/AbstractUpdateRequest.java
@Override public UpdateResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); UpdateResponse res = new UpdateResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/ContentStreamUpdateRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return contentStreams; }
// in solrj/src/java/org/apache/solr/client/solrj/request/ContentStreamUpdateRequest.java
public void addFile(File file, String contentType) throws IOException { ContentStreamBase cs = new ContentStreamBase.FileStream(file); cs.setContentType(contentType); addContentStream(cs); }
// in solrj/src/java/org/apache/solr/client/solrj/request/FieldAnalysisRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return null; }
// in solrj/src/java/org/apache/solr/client/solrj/request/FieldAnalysisRequest.java
@Override public FieldAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { if (fieldTypes == null && fieldNames == null) { throw new IllegalStateException("At least one field type or field name need to be specified"); } if (fieldValue == null) { throw new IllegalStateException("The field value must be set"); } long startTime = System.currentTimeMillis(); FieldAnalysisResponse res = new FieldAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public Collection<ContentStream> getContentStreams(SolrRequest req) throws IOException { if (req instanceof UpdateRequest) { UpdateRequest updateRequest = (UpdateRequest) req; if (isEmpty(updateRequest)) return null; List<ContentStream> l = new ArrayList<ContentStream>(); l.add(new LazyContentStream(updateRequest)); return l; } return req.getContentStreams(); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public ContentStream getContentStream(UpdateRequest req) throws IOException { return new ContentStreamBase.StringStream(req.getXML()); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public void write(SolrRequest request, OutputStream os) throws IOException { if (request instanceof UpdateRequest) { UpdateRequest updateRequest = (UpdateRequest) request; OutputStreamWriter writer = new OutputStreamWriter(os, UTF_8); updateRequest.writeXML(writer); writer.flush(); } }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public InputStream getStream() throws IOException { return getDelegate().getStream(); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public Reader getReader() throws IOException { return getDelegate().getReader(); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
public void writeTo(OutputStream os) throws IOException { write(req, os); }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return ClientUtils.toContentStreams( getXML(), ClientUtils.TEXT_XML ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequest.java
public String getXML() throws IOException {
  StringWriter writer = new StringWriter();
  writeXML( writer );
  writer.flush();
  // If action is COMMIT or OPTIMIZE, it is sent with params
  String xml = writer.toString();
  //System.out.println( "SEND:"+xml );
  return (xml.length() > 0) ? xml : null;
}
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequest.java
public void writeXML( Writer writer ) throws IOException {
  if( (documents != null && documents.size() > 0) || docIterator != null) {
    if( commitWithin > 0 ) {
      writer.write("<add commitWithin=\""+commitWithin+"\">");
    } else {
      writer.write("<add>");
    }
    if(documents != null) {
      for (SolrInputDocument doc : documents) {
        if (doc != null) {
          ClientUtils.writeXML(doc, writer);
        }
      }
    }
    if (docIterator != null) {
      while (docIterator.hasNext()) {
        SolrInputDocument doc = docIterator.next();
        if (doc != null) {
          ClientUtils.writeXML(doc, writer);
        }
      }
    }
    writer.write("</add>");
  }
  // Add the delete commands
  boolean deleteI = deleteById != null && deleteById.size() > 0;
  boolean deleteQ = deleteQuery != null && deleteQuery.size() > 0;
  if( deleteI || deleteQ ) {
    if(commitWithin>0) {
      writer.append( "<delete commitWithin=\"" + commitWithin + "\">" );
    } else {
      writer.append( "<delete>" );
    }
    if( deleteI ) {
      for( String id : deleteById ) {
        writer.append( "<id>" );
        XML.escapeCharData( id, writer );
        writer.append( "</id>" );
      }
    }
    if( deleteQ ) {
      for( String q : deleteQuery ) {
        writer.append( "<query>" );
        XML.escapeCharData( q, writer );
        writer.append( "</query>" );
      }
    }
    writer.append( "</delete>" );
  }
}
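Concretely, a request holding one document and one delete-by-id, with commitWithin left at its default, serializes to something like the following (SolrInputDocument.addField and UpdateRequest.add/deleteById are standard SolrJ calls):

    UpdateRequest req = new UpdateRequest();
    SolrInputDocument doc = new SolrInputDocument();
    doc.addField("id", "1");
    req.add(doc);
    req.deleteById("2");
    StringWriter w = new StringWriter();
    req.writeXML(w);
    // w -> <add><doc><field name="id">1</field></doc></add><delete><id>2</id></delete>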
// in solrj/src/java/org/apache/solr/client/solrj/request/LukeRequest.java
@Override public LukeResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); LukeResponse res = new LukeResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequestExt.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return ClientUtils.toContentStreams(getXML(), ClientUtils.TEXT_XML); }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequestExt.java
public String getXML() throws IOException { StringWriter writer = new StringWriter(); writeXML(writer); writer.flush(); String xml = writer.toString(); return (xml.length() > 0) ? xml : null; }
// in solrj/src/java/org/apache/solr/client/solrj/request/UpdateRequestExt.java
public void writeXML(Writer writer) throws IOException {
  List<List<SolrDoc>> getDocLists = getDocLists(documents);
  for (List<SolrDoc> docs : getDocLists) {
    if ((docs != null && docs.size() > 0)) {
      SolrDoc firstDoc = docs.get(0);
      int commitWithin = firstDoc.commitWithin != -1 ? firstDoc.commitWithin : this.commitWithin;
      boolean overwrite = firstDoc.overwrite;
      if (commitWithin > -1 || overwrite != true) {
        writer.write("<add commitWithin=\"" + commitWithin + "\" " + "overwrite=\"" + overwrite + "\">");
      } else {
        writer.write("<add>");
      }
      if (documents != null) {
        for (SolrDoc doc : documents) {
          if (doc != null) {
            ClientUtils.writeXML(doc.document, writer);
          }
        }
      }
      writer.write("</add>");
    }
  }
  // Add the delete commands
  boolean deleteI = deleteById != null && deleteById.size() > 0;
  boolean deleteQ = deleteQuery != null && deleteQuery.size() > 0;
  if (deleteI || deleteQ) {
    writer.append("<delete>");
    if (deleteI) {
      for (Map.Entry<String,Long> entry : deleteById.entrySet()) {
        writer.append("<id");
        Long version = entry.getValue();
        if (version != null) {
          writer.append(" version=\"" + version + "\"");
        }
        writer.append(">");
        XML.escapeCharData(entry.getKey(), writer);
        writer.append("</id>");
      }
    }
    if (deleteQ) {
      for (String q : deleteQuery) {
        writer.append("<query>");
        XML.escapeCharData(q, writer);
        writer.append("</query>");
      }
    }
    writer.append("</delete>");
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/request/DocumentAnalysisRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return ClientUtils.toContentStreams(getXML(), ClientUtils.TEXT_XML); }
// in solrj/src/java/org/apache/solr/client/solrj/request/DocumentAnalysisRequest.java
@Override public DocumentAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); DocumentAnalysisResponse res = new DocumentAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/DocumentAnalysisRequest.java
String getXML() throws IOException { StringWriter writer = new StringWriter(); writer.write("<docs>"); for (SolrInputDocument document : documents) { ClientUtils.writeXML(document, writer); } writer.write("</docs>"); writer.flush(); String xml = writer.toString(); return (xml.length() > 0) ? xml : null; }
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
public void marshal(UpdateRequest updateRequest, OutputStream os) throws IOException {
  NamedList nl = new NamedList();
  NamedList params = solrParamsToNamedList(updateRequest.getParams());
  if (updateRequest.getCommitWithin() != -1) {
    params.add("commitWithin", updateRequest.getCommitWithin());
  }
  Iterator<SolrInputDocument> docIter = null;
  if (updateRequest.getDocuments() != null) {
    docIter = updateRequest.getDocuments().iterator();
  }
  if (updateRequest.getDocIterator() != null) {
    docIter = updateRequest.getDocIterator();
  }
  nl.add("params", params); // 0: params
  nl.add("delById", updateRequest.getDeleteById());
  nl.add("delByQ", updateRequest.getDeleteQuery());
  nl.add("docs", docIter);
  JavaBinCodec codec = new JavaBinCodec();
  codec.marshal(nl, os);
}
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
public UpdateRequest unmarshal(InputStream is, final StreamingUpdateHandler handler) throws IOException {
  final UpdateRequest updateRequest = new UpdateRequest();
  List<List<NamedList>> doclist;
  List<String> delById;
  List<String> delByQ;
  final NamedList[] namedList = new NamedList[1];
  JavaBinCodec codec = new JavaBinCodec() {
    // NOTE: this only works because this is an anonymous inner class
    // which will only ever be used on a single stream -- if this class
    // is ever refactored, this will not work.
    private boolean seenOuterMostDocIterator = false;

    @Override
    public NamedList readNamedList(FastInputStream dis) throws IOException {
      int sz = readSize(dis);
      NamedList nl = new NamedList();
      if (namedList[0] == null) {
        namedList[0] = nl;
      }
      for (int i = 0; i < sz; i++) {
        String name = (String) readVal(dis);
        Object val = readVal(dis);
        nl.add(name, val);
      }
      return nl;
    }

    @Override
    public List readIterator(FastInputStream fis) throws IOException {
      // default behavior for reading any regular Iterator in the stream
      if (seenOuterMostDocIterator) return super.readIterator(fis);
      // special treatment for first outermost Iterator
      // (the list of documents)
      seenOuterMostDocIterator = true;
      return readOuterMostDocIterator(fis);
    }

    private List readOuterMostDocIterator(FastInputStream fis) throws IOException {
      NamedList params = (NamedList) namedList[0].getVal(0);
      updateRequest.setParams(new ModifiableSolrParams(SolrParams.toSolrParams(params)));
      if (handler == null) return super.readIterator(fis);
      while (true) {
        Object o = readVal(fis);
        if (o == END_OBJ) break;
        SolrInputDocument sdoc = null;
        if (o instanceof List) {
          sdoc = listToSolrInputDocument((List<NamedList>) o);
        } else if (o instanceof NamedList) {
          UpdateRequest req = new UpdateRequest();
          req.setParams(new ModifiableSolrParams(SolrParams.toSolrParams((NamedList) o)));
          handler.update(null, req);
        } else {
          sdoc = (SolrInputDocument) o;
        }
        handler.update(sdoc, updateRequest);
      }
      return Collections.EMPTY_LIST;
    }
  };
  codec.unmarshal(is);
  // NOTE: if the update request contains only delete commands the params
  // must be loaded now
  if (updateRequest.getParams() == null) {
    NamedList params = (NamedList) namedList[0].get("params");
    if (params != null) {
      updateRequest.setParams(new ModifiableSolrParams(SolrParams.toSolrParams(params)));
    }
  }
  delById = (List<String>) namedList[0].get("delById");
  delByQ = (List<String>) namedList[0].get("delByQ");
  doclist = (List) namedList[0].get("docs");
  if (doclist != null && !doclist.isEmpty()) {
    List<SolrInputDocument> solrInputDocs = new ArrayList<SolrInputDocument>();
    for (Object o : doclist) {
      if (o instanceof List) {
        solrInputDocs.add(listToSolrInputDocument((List<NamedList>) o));
      } else {
        solrInputDocs.add((SolrInputDocument) o);
      }
    }
    updateRequest.add(solrInputDocs);
  }
  if (delById != null) {
    for (String s : delById) {
      updateRequest.deleteById(s);
    }
  }
  if (delByQ != null) {
    for (String s : delByQ) {
      updateRequest.deleteByQuery(s);
    }
  }
  return updateRequest;
}
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
@Override public NamedList readNamedList(FastInputStream dis) throws IOException { int sz = readSize(dis); NamedList nl = new NamedList(); if (namedList[0] == null) { namedList[0] = nl; } for (int i = 0; i < sz; i++) { String name = (String) readVal(dis); Object val = readVal(dis); nl.add(name, val); } return nl; }
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
@Override public List readIterator(FastInputStream fis) throws IOException {
  // default behavior for reading any regular Iterator in the stream
  if (seenOuterMostDocIterator) return super.readIterator(fis);
  // special treatment for first outermost Iterator
  // (the list of documents)
  seenOuterMostDocIterator = true;
  return readOuterMostDocIterator(fis);
}
// in solrj/src/java/org/apache/solr/client/solrj/request/JavaBinUpdateRequestCodec.java
private List readOuterMostDocIterator(FastInputStream fis) throws IOException { NamedList params = (NamedList) namedList[0].getVal(0); updateRequest.setParams(new ModifiableSolrParams(SolrParams.toSolrParams(params))); if (handler == null) return super.readIterator(fis); while (true) { Object o = readVal(fis); if (o == END_OBJ) break; SolrInputDocument sdoc = null; if (o instanceof List) { sdoc = listToSolrInputDocument((List<NamedList>) o); } else if (o instanceof NamedList) { UpdateRequest req = new UpdateRequest(); req.setParams(new ModifiableSolrParams(SolrParams.toSolrParams((NamedList) o))); handler.update(null, req); } else { sdoc = (SolrInputDocument) o; } handler.update(sdoc, updateRequest); } return Collections.EMPTY_LIST; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public Collection<ContentStream> getContentStreams() throws IOException { return null; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public CoreAdminResponse process(SolrServer server) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); CoreAdminResponse res = new CoreAdminResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse reloadCore( String name, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName( name ); req.setAction( CoreAdminAction.RELOAD ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse unloadCore( String name, SolrServer server ) throws SolrServerException, IOException { return unloadCore(name, false, server); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse unloadCore( String name, boolean deleteIndex, SolrServer server ) throws SolrServerException, IOException { Unload req = new Unload(deleteIndex); req.setCoreName( name ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse renameCore(String coreName, String newName, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName(coreName); req.setOtherCoreName(newName); req.setAction( CoreAdminAction.RENAME ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse getStatus( String name, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName( name ); req.setAction( CoreAdminAction.STATUS ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse createCore( String name, String instanceDir, SolrServer server ) throws SolrServerException, IOException { return CoreAdminRequest.createCore(name, instanceDir, server, null, null); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse createCore( String name, String instanceDir, SolrServer server, String configFile, String schemaFile ) throws SolrServerException, IOException { CoreAdminRequest.Create req = new CoreAdminRequest.Create(); req.setCoreName( name ); req.setInstanceDir(instanceDir); if(configFile != null){ req.setConfigName(configFile); } if(schemaFile != null){ req.setSchemaName(schemaFile); } return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse persist(String fileName, SolrServer server) throws SolrServerException, IOException { CoreAdminRequest.Persist req = new CoreAdminRequest.Persist(); req.setFileName(fileName); return req.process(server); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse mergeIndexes(String name, String[] indexDirs, String[] srcCores, SolrServer server) throws SolrServerException, IOException { CoreAdminRequest.MergeIndexes req = new CoreAdminRequest.MergeIndexes(); req.setCoreName(name); req.setIndexDirs(Arrays.asList(indexDirs)); req.setSrcCores(Arrays.asList(srcCores)); return req.process(server); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public void run() {
  runnerLock.lock();
  // info is ok since this should only happen once for each thread
  log.info("starting runner: {}", this);
  HttpPost method = null;
  HttpResponse response = null;
  try {
    while (!queue.isEmpty()) {
      try {
        final UpdateRequest updateRequest = queue.poll(250, TimeUnit.MILLISECONDS);
        if (updateRequest == null) break;
        String contentType = server.requestWriter.getUpdateContentType();
        final boolean isXml = ClientUtils.TEXT_XML.equals(contentType);
        final ModifiableSolrParams origParams = new ModifiableSolrParams(updateRequest.getParams());
        EntityTemplate template = new EntityTemplate(new ContentProducer() {
          public void writeTo(OutputStream out) throws IOException {
            try {
              if (isXml) {
                out.write("<stream>".getBytes("UTF-8")); // can be anything
              }
              UpdateRequest req = updateRequest;
              while (req != null) {
                SolrParams currentParams = new ModifiableSolrParams(req.getParams());
                if (!origParams.toNamedList().equals(currentParams.toNamedList())) {
                  queue.add(req); // params are different, push back to queue
                  break;
                }
                server.requestWriter.write(req, out);
                if (isXml) {
                  // check for commit or optimize
                  SolrParams params = req.getParams();
                  if (params != null) {
                    String fmt = null;
                    if (params.getBool(UpdateParams.OPTIMIZE, false)) {
                      fmt = "<optimize waitSearcher=\"%s\" waitFlush=\"%s\" />";
                    } else if (params.getBool(UpdateParams.COMMIT, false)) {
                      fmt = "<commit waitSearcher=\"%s\" waitFlush=\"%s\" />";
                    }
                    if (fmt != null) {
                      byte[] content = String.format(fmt, params.getBool(UpdateParams.WAIT_SEARCHER, false) + "").getBytes("UTF-8");
                      out.write(content);
                    }
                  }
                }
                out.flush();
                req = queue.poll(250, TimeUnit.MILLISECONDS);
              }
              if (isXml) {
                out.write("</stream>".getBytes("UTF-8"));
              }
            } catch (InterruptedException e) {
              e.printStackTrace();
            }
          }
        });
        // The parser 'wt=' and 'version=' params are used instead of the
        // original params
        ModifiableSolrParams requestParams = new ModifiableSolrParams(origParams);
        requestParams.set(CommonParams.WT, server.parser.getWriterType());
        requestParams.set(CommonParams.VERSION, server.parser.getVersion());
        method = new HttpPost(server.getBaseURL() + "/update" + ClientUtils.toQueryString(requestParams, false));
        method.setEntity(template);
        method.addHeader("User-Agent", HttpSolrServer.AGENT);
        method.addHeader("Content-Type", contentType);
        response = server.getHttpClient().execute(method);
        int statusCode = response.getStatusLine().getStatusCode();
        log.info("Status for: " + updateRequest.getDocuments().get(0).getFieldValue("id") + " is " + statusCode);
        if (statusCode != HttpStatus.SC_OK) {
          StringBuilder msg = new StringBuilder();
          msg.append(response.getStatusLine().getReasonPhrase());
          msg.append("\n\n");
          msg.append("\n\n");
          msg.append("request: ").append(method.getURI());
          handleError(new Exception(msg.toString()));
        }
      } finally {
        try {
          if (response != null) {
            response.getEntity().getContent().close();
          }
        } catch (Exception ex) {
        }
      }
    }
  } catch (Throwable e) {
    handleError(e);
  } finally {
    // remove it from the list of running things unless we are the last
    // runner and the queue is full...
    // in which case, the next queue.put() would block and there would be no
    // runners to handle it.
    // This case has been further handled by using offer instead of put, and
    // using a retry loop to avoid blocking forever (see request()).
    synchronized (runners) {
      if (runners.size() == 1 && queue.remainingCapacity() == 0) {
        // keep this runner alive
        scheduler.execute(this);
      } else {
        runners.remove(this);
      }
    }
    log.info("finished: {}", this);
    runnerLock.unlock();
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public void writeTo(OutputStream out) throws IOException {
  try {
    if (isXml) {
      out.write("<stream>".getBytes("UTF-8")); // can be anything
    }
    UpdateRequest req = updateRequest;
    while (req != null) {
      SolrParams currentParams = new ModifiableSolrParams(req.getParams());
      if (!origParams.toNamedList().equals(currentParams.toNamedList())) {
        queue.add(req); // params are different, push back to queue
        break;
      }
      server.requestWriter.write(req, out);
      if (isXml) {
        // check for commit or optimize
        SolrParams params = req.getParams();
        if (params != null) {
          String fmt = null;
          if (params.getBool(UpdateParams.OPTIMIZE, false)) {
            fmt = "<optimize waitSearcher=\"%s\" waitFlush=\"%s\" />";
          } else if (params.getBool(UpdateParams.COMMIT, false)) {
            fmt = "<commit waitSearcher=\"%s\" waitFlush=\"%s\" />";
          }
          if (fmt != null) {
            byte[] content = String.format(fmt,
                params.getBool(UpdateParams.WAIT_SEARCHER, false) + "").getBytes("UTF-8");
            out.write(content);
          }
        }
      }
      out.flush();
      req = queue.poll(250, TimeUnit.MILLISECONDS);
    }
    if (isXml) {
      out.write("</stream>".getBytes("UTF-8"));
    }
  } catch (InterruptedException e) {
    e.printStackTrace();
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException {
  if (!(request instanceof UpdateRequest)) {
    return server.request(request);
  }
  UpdateRequest req = (UpdateRequest) request;
  // this happens for commit...
  if (req.getDocuments() == null || req.getDocuments().isEmpty()) {
    blockUntilFinished();
    return server.request(request);
  }
  SolrParams params = req.getParams();
  if (params != null) {
    // check if it is waiting for the searcher
    if (params.getBool(UpdateParams.WAIT_SEARCHER, false)) {
      log.info("blocking for commit/optimize");
      blockUntilFinished(); // empty the queue
      return server.request(request);
    }
  }
  try {
    CountDownLatch tmpLock = lock;
    if (tmpLock != null) {
      tmpLock.await();
    }
    boolean success = queue.offer(req);
    for (;;) {
      synchronized (runners) {
        if (runners.isEmpty()
            || (queue.remainingCapacity() < queue.size() // queue is half full and we can add more runners
                && runners.size() < threadCount)) {
          // We need more runners, so start a new one.
          Runner r = new Runner();
          runners.add(r);
          scheduler.execute(r);
        } else {
          // break out of the retry loop if we added the element to the queue
          // successfully, *and* while we are still holding the runners lock
          // to prevent race conditions.
          if (success)
            break;
        }
      }
      // Retry to add to the queue w/o the runners lock held (else we risk
      // temporary deadlock).
      // This retry could also fail because
      // 1) existing runners were not able to take off any new elements in the queue
      // 2) the queue was filled back up since our last try
      // If we succeed, the queue may have been completely emptied, and all
      // runners stopped. In all cases, we should loop back to the top to see
      // if we need to start more runners.
      if (!success) {
        success = queue.offer(req, 100, TimeUnit.MILLISECONDS);
      }
    }
  } catch (InterruptedException e) {
    log.error("interrupted", e);
    throw new IOException(e.getLocalizedMessage());
  }
  // RETURN A DUMMY result
  NamedList<Object> dummy = new NamedList<Object>();
  dummy.add("NOTE", "the request is processed in a background stream");
  return dummy;
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
@Override public Collection<ContentStream> getContentStreams(SolrRequest req) throws IOException { if (req instanceof UpdateRequest) { UpdateRequest updateRequest = (UpdateRequest) req; if (isNull(updateRequest.getDocuments()) && isNull(updateRequest.getDeleteById()) && isNull(updateRequest.getDeleteQuery()) && (updateRequest.getDocIterator() == null) ) { return null; } List<ContentStream> l = new ArrayList<ContentStream>(); l.add(new LazyContentStream(updateRequest)); return l; } else { return super.getContentStreams(req); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
@Override
public ContentStream getContentStream(final UpdateRequest request) throws IOException {
  final BAOS baos = new BAOS();
  new JavaBinUpdateRequestCodec().marshal(request, baos);
  return new ContentStream() {
    public String getName() { return null; }
    public String getSourceInfo() { return "javabin"; }
    public String getContentType() { return "application/javabin"; }
    public Long getSize() { // size if we know it, otherwise null
      return new Long(baos.size());
    }
    public InputStream getStream() throws IOException {
      return new ByteArrayInputStream(baos.getbuf(), 0, baos.size());
    }
    public Reader getReader() throws IOException {
      throw new RuntimeException("No reader available . this is a binarystream");
    }
  };
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
public InputStream getStream() throws IOException { return new ByteArrayInputStream(baos.getbuf(), 0, baos.size()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
public Reader getReader() throws IOException { throw new RuntimeException("No reader available . this is a binarystream"); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
@Override public void write(SolrRequest request, OutputStream os) throws IOException { if (request instanceof UpdateRequest) { UpdateRequest updateRequest = (UpdateRequest) request; new JavaBinUpdateRequestCodec().marshal(updateRequest, os); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
@Override
public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException {
  connect();
  // TODO: if you can hash here, you could favor the shard leader
  CloudState cloudState = zkStateReader.getCloudState();
  SolrParams reqParams = request.getParams();
  if (reqParams == null) {
    reqParams = new ModifiableSolrParams();
  }
  String collection = reqParams.get("collection", defaultCollection);
  if (collection == null) {
    throw new SolrServerException("No collection param specified on request and no default collection has been set.");
  }
  // Extract each comma separated collection name and store in a List.
  List<String> collectionList = StrUtils.splitSmart(collection, ",", true);
  // Retrieve slices from the cloud state and, for each collection specified,
  // add it to the Map of slices.
  Map<String,Slice> slices = new HashMap<String,Slice>();
  for (int i = 0; i < collectionList.size(); i++) {
    String coll = collectionList.get(i);
    ClientUtils.appendMap(coll, slices, cloudState.getSlices(coll));
  }
  Set<String> liveNodes = cloudState.getLiveNodes();
  // IDEA: have versions on various things... like a global cloudState version
  // or shardAddressVersion (which only changes when the shards change)
  // to allow caching.
  // build a map of unique nodes
  // TODO: allow filtering by group, role, etc
  Map<String,ZkNodeProps> nodes = new HashMap<String,ZkNodeProps>();
  List<String> urlList = new ArrayList<String>();
  for (Slice slice : slices.values()) {
    for (ZkNodeProps nodeProps : slice.getShards().values()) {
      ZkCoreNodeProps coreNodeProps = new ZkCoreNodeProps(nodeProps);
      String node = coreNodeProps.getNodeName();
      if (!liveNodes.contains(coreNodeProps.getNodeName())
          || !coreNodeProps.getState().equals(ZkStateReader.ACTIVE)) continue;
      if (nodes.put(node, nodeProps) == null) {
        String url = coreNodeProps.getCoreUrl();
        urlList.add(url);
      }
    }
  }
  Collections.shuffle(urlList, rand);
  //System.out.println("########################## MAKING REQUEST TO " + urlList);
  LBHttpSolrServer.Req req = new LBHttpSolrServer.Req(request, urlList);
  LBHttpSolrServer.Rsp rsp = lbServer.request(req);
  return rsp.getResponse();
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
@Override
public NamedList<Object> processResponse(InputStream body, String encoding) {
  try {
    JavaBinCodec codec = new JavaBinCodec() {
      @Override
      public SolrDocument readSolrDocument(FastInputStream dis) throws IOException {
        SolrDocument doc = super.readSolrDocument(dis);
        callback.streamSolrDocument( doc );
        return null;
      }
      @Override
      public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException {
        SolrDocumentList solrDocs = new SolrDocumentList();
        List list = (List) readVal(dis);
        solrDocs.setNumFound((Long) list.get(0));
        solrDocs.setStart((Long) list.get(1));
        solrDocs.setMaxScore((Float) list.get(2));
        callback.streamDocListInfo( solrDocs.getNumFound(), solrDocs.getStart(), solrDocs.getMaxScore() );
        // Read the Array
        tagByte = dis.readByte();
        if( (tagByte >>> 5) != (ARR >>> 5) ) {
          throw new RuntimeException( "doclist must have an array" );
        }
        int sz = readSize(dis);
        for (int i = 0; i < sz; i++) {
          // must be a SolrDocument
          readVal( dis );
        }
        return solrDocs;
      }
    };
    return (NamedList<Object>) codec.unmarshal(body);
  } catch (IOException e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e);
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
@Override public SolrDocument readSolrDocument(FastInputStream dis) throws IOException { SolrDocument doc = super.readSolrDocument(dis); callback.streamSolrDocument( doc ); return null; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
@Override
public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException {
  SolrDocumentList solrDocs = new SolrDocumentList();
  List list = (List) readVal(dis);
  solrDocs.setNumFound((Long) list.get(0));
  solrDocs.setStart((Long) list.get(1));
  solrDocs.setMaxScore((Float) list.get(2));
  callback.streamDocListInfo( solrDocs.getNumFound(), solrDocs.getStart(), solrDocs.getMaxScore() );
  // Read the Array
  tagByte = dis.readByte();
  if( (tagByte >>> 5) != (ARR >>> 5) ) {
    throw new RuntimeException( "doclist must have an array" );
  }
  int sz = readSize(dis);
  for (int i = 0; i < sz; i++) {
    // must be a SolrDocument
    readVal( dis );
  }
  return solrDocs;
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
@Override public void process(HttpRequest request, HttpContext context) throws HttpException, IOException { if (!request.containsHeader("Accept-Encoding")) { request.addHeader("Accept-Encoding", "gzip, deflate"); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public void process(final HttpResponse response, final HttpContext context) throws HttpException, IOException { HttpEntity entity = response.getEntity(); Header ceheader = entity.getContentEncoding(); if (ceheader != null) { HeaderElement[] codecs = ceheader.getElements(); for (int i = 0; i < codecs.length; i++) { if (codecs[i].getName().equalsIgnoreCase("gzip")) { response .setEntity(new GzipDecompressingEntity(response.getEntity())); return; } if (codecs[i].getName().equalsIgnoreCase("deflate")) { response.setEntity(new DeflateDecompressingEntity(response .getEntity())); return; } } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public InputStream getContent() throws IOException, IllegalStateException { return new GZIPInputStream(wrappedEntity.getContent()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public InputStream getContent() throws IOException, IllegalStateException { return new InflaterInputStream(wrappedEntity.getContent()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
@Override public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { ResponseParser responseParser = request.getResponseParser(); if (responseParser == null) { responseParser = parser; } return request(request, responseParser); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public NamedList<Object> request(final SolrRequest request, final ResponseParser processor) throws SolrServerException, IOException {
  HttpRequestBase method = null;
  InputStream is = null;
  SolrParams params = request.getParams();
  Collection<ContentStream> streams = requestWriter.getContentStreams(request);
  String path = requestWriter.getPath(request);
  if (path == null || !path.startsWith("/")) {
    path = DEFAULT_PATH;
  }
  ResponseParser parser = request.getResponseParser();
  if (parser == null) {
    parser = this.parser;
  }
  // The parser 'wt=' and 'version=' params are used instead of the original params
  ModifiableSolrParams wparams = new ModifiableSolrParams(params);
  wparams.set(CommonParams.WT, parser.getWriterType());
  wparams.set(CommonParams.VERSION, parser.getVersion());
  if (invariantParams != null) {
    wparams.add(invariantParams);
  }
  params = wparams;
  int tries = maxRetries + 1;
  try {
    while (tries-- > 0) {
      // Note: since we aren't doing intermittent time keeping ourselves, the
      // potential non-timeout latency could be as much as tries-times (plus
      // scheduling effects) the given timeAllowed.
      try {
        if (SolrRequest.METHOD.GET == request.getMethod()) {
          if (streams != null) {
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "GET can't send streams!");
          }
          method = new HttpGet(baseUrl + path + ClientUtils.toQueryString(params, false));
        } else if (SolrRequest.METHOD.POST == request.getMethod()) {
          String url = baseUrl + path;
          boolean isMultipart = (streams != null && streams.size() > 1);
          LinkedList<NameValuePair> postParams = new LinkedList<NameValuePair>();
          if (streams == null || isMultipart) {
            HttpPost post = new HttpPost(url);
            post.setHeader("Content-Charset", "UTF-8");
            if (!this.useMultiPartPost && !isMultipart) {
              post.addHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8");
            }
            List<FormBodyPart> parts = new LinkedList<FormBodyPart>();
            Iterator<String> iter = params.getParameterNamesIterator();
            while (iter.hasNext()) {
              String p = iter.next();
              String[] vals = params.getParams(p);
              if (vals != null) {
                for (String v : vals) {
                  if (this.useMultiPartPost || isMultipart) {
                    parts.add(new FormBodyPart(p, new StringBody(v, Charset.forName("UTF-8"))));
                  } else {
                    postParams.add(new BasicNameValuePair(p, v));
                  }
                }
              }
            }
            if (isMultipart) {
              for (ContentStream content : streams) {
                String contentType = content.getContentType();
                if (contentType == null) {
                  contentType = "application/octet-stream"; // default
                }
                parts.add(new FormBodyPart(content.getName(),
                    new InputStreamBody(content.getStream(), contentType, content.getName())));
              }
            }
            if (parts.size() > 0) {
              MultipartEntity entity = new MultipartEntity(HttpMultipartMode.STRICT);
              for (FormBodyPart p : parts) {
                entity.addPart(p);
              }
              post.setEntity(entity);
            } else {
              // not using multipart
              post.setEntity(new UrlEncodedFormEntity(postParams, "UTF-8"));
            }
            method = post;
          }
          // If it has one stream, it is the post body; put the params in the URL
          else {
            String pstr = ClientUtils.toQueryString(params, false);
            HttpPost post = new HttpPost(url + pstr);
            // Single stream as body
            // Using a loop just to get the first one
            final ContentStream[] contentStream = new ContentStream[1];
            for (ContentStream content : streams) {
              contentStream[0] = content;
              break;
            }
            if (contentStream[0] instanceof RequestWriter.LazyContentStream) {
              post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) {
                @Override
                public Header getContentType() {
                  return new BasicHeader("Content-Type", contentStream[0].getContentType());
                }
                @Override
                public boolean isRepeatable() {
                  return false;
                }
              });
            } else {
              post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) {
                @Override
                public Header getContentType() {
                  return new BasicHeader("Content-Type", contentStream[0].getContentType());
                }
                @Override
                public boolean isRepeatable() {
                  return false;
                }
              });
            }
            method = post;
          }
        } else {
          throw new SolrServerException("Unsupported method: " + request.getMethod());
        }
      } catch (NoHttpResponseException r) {
        method = null;
        if (is != null) {
          is.close();
        }
        // If out of tries then just rethrow (as normal error).
        if (tries < 1) {
          throw r;
        }
      }
    }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public UpdateResponse add(Iterator<SolrInputDocument> docIterator) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.setDocIterator(docIterator); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public UpdateResponse addBeans(final Iterator<?> beanIterator) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.setDocIterator(new Iterator<SolrInputDocument>() { public boolean hasNext() { return beanIterator.hasNext(); } public SolrInputDocument next() { Object o = beanIterator.next(); if (o == null) return null; return getBinder().toSolrInputDocument(o); } public void remove() { beanIterator.remove(); } }); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public Rsp request(Req req) throws SolrServerException, IOException {
  Rsp rsp = new Rsp();
  Exception ex = null;
  List<ServerWrapper> skipped = new ArrayList<ServerWrapper>(req.getNumDeadServersToTry());
  for (String serverStr : req.getServers()) {
    serverStr = normalize(serverStr);
    // if the server is currently a zombie, just skip to the next one
    ServerWrapper wrapper = zombieServers.get(serverStr);
    if (wrapper != null) {
      // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr);
      if (skipped.size() < req.getNumDeadServersToTry())
        skipped.add(wrapper);
      continue;
    }
    rsp.server = serverStr;
    HttpSolrServer server = makeServer(serverStr);
    try {
      rsp.rsp = server.request(req.getRequest());
      return rsp; // SUCCESS
    } catch (SolrException e) {
      // we retry on 404, 403, 503 or 500 - you can see this on solr shutdown
      if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
        ex = addZombie(server, e);
      } else {
        // Server is alive but the request was likely malformed or invalid
        throw e;
      }
      // TODO: consider using the checks below instead - currently they cause a problem
      // with distrib updates: they seem to match up against a failed forward to leader
      // exception as well...
      // || e.getMessage().contains("java.net.SocketException")
      // || e.getMessage().contains("java.net.ConnectException")
    } catch (SocketException e) {
      ex = addZombie(server, e);
    } catch (SocketTimeoutException e) {
      ex = addZombie(server, e);
    } catch (SolrServerException e) {
      Throwable rootCause = e.getRootCause();
      if (rootCause instanceof IOException) {
        ex = addZombie(server, e);
      } else {
        throw e;
      }
    } catch (Exception e) {
      throw new SolrServerException(e);
    }
  }
  // try the servers we previously skipped
  for (ServerWrapper wrapper : skipped) {
    try {
      rsp.rsp = wrapper.solrServer.request(req.getRequest());
      zombieServers.remove(wrapper.getKey());
      return rsp; // SUCCESS
    } catch (SolrException e) {
      // we retry on 404, 403, 503 or 500 - you can see this on solr shutdown
      if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
        ex = e; // already a zombie, no need to re-add
      } else {
        // Server is alive but the request was malformed or invalid
        zombieServers.remove(wrapper.getKey());
        throw e;
      }
    } catch (SocketException e) {
      ex = e;
    } catch (SocketTimeoutException e) {
      ex = e;
    } catch (SolrServerException e) {
      Throwable rootCause = e.getRootCause();
      if (rootCause instanceof IOException) {
        ex = e; // already a zombie, no need to re-add
      } else {
        throw e;
      }
    } catch (Exception e) {
      throw new SolrServerException(e);
    }
  }
  if (ex == null) {
    throw new SolrServerException("No live SolrServers available to handle this request");
  } else {
    throw new SolrServerException("No live SolrServers available to handle this request:" + zombieServers.keySet(), ex);
  }
}
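This method is a good example of a layered catch ladder: transport-level failures (SocketException, SocketTimeoutException, or a SolrServerException whose root cause is an IOException) mark the server as a zombie and move on, application-level SolrExceptions are only absorbed for the retryable HTTP codes, and anything else is wrapped into SolrServerException. A stripped-down sketch of the same dispatch structure; markServerDead and issueRequest are hypothetical stand-ins for Solr's zombie bookkeeping and HTTP call:

import java.net.SocketException;
import java.net.SocketTimeoutException;

public class CatchLadderSketch {
  // hypothetical helper; Solr's real bookkeeping lives in addZombie()/zombieServers
  static void markServerDead(String server, Exception cause) { /* record zombie */ }

  static Object tryServers(Iterable<String> servers) throws Exception {
    Exception last = null;
    for (String server : servers) {
      try {
        return issueRequest(server);        // success: return immediately
      } catch (SocketException e) {         // transport failure: try the next server
        markServerDead(server, e); last = e;
      } catch (SocketTimeoutException e) {  // transport failure: try the next server
        markServerDead(server, e); last = e;
      } catch (IllegalStateException e) {   // stand-in for a non-retryable application error
        throw e;                            // the server is alive; don't retry elsewhere
      }
    }
    throw new Exception("No live servers", last);
  }

  static Object issueRequest(String server) throws SocketException, SocketTimeoutException { return server; }
}

Ordering most-specific exceptions first and rethrowing anything the retry policy does not cover keeps genuine request errors visible to the caller instead of silently rotating servers.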
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
@Override
public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException {
  Exception ex = null;
  ServerWrapper[] serverList = aliveServerList;
  int maxTries = serverList.length;
  Map<String,ServerWrapper> justFailed = null;
  for (int attempts = 0; attempts < maxTries; attempts++) {
    int count = counter.incrementAndGet();
    ServerWrapper wrapper = serverList[count % serverList.length];
    wrapper.lastUsed = System.currentTimeMillis();
    try {
      return wrapper.solrServer.request(request);
    } catch (SolrException e) {
      // Server is alive but the request was malformed or invalid
      throw e;
    } catch (SolrServerException e) {
      if (e.getRootCause() instanceof IOException) {
        ex = e;
        moveAliveToDead(wrapper);
        if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>();
        justFailed.put(wrapper.getKey(), wrapper);
      } else {
        throw e;
      }
    } catch (Exception e) {
      throw new SolrServerException(e);
    }
  }
  // try other standard servers that we didn't try just now
  for (ServerWrapper wrapper : zombieServers.values()) {
    if (wrapper.standard == false || justFailed != null && justFailed.containsKey(wrapper.getKey())) continue;
    try {
      NamedList<Object> rsp = wrapper.solrServer.request(request);
      // remove from zombie list *before* adding to alive to avoid a race that could lose a server
      zombieServers.remove(wrapper.getKey());
      addToAlive(wrapper);
      return rsp;
    } catch (SolrException e) {
      // Server is alive but the request was malformed or invalid
      throw e;
    } catch (SolrServerException e) {
      if (e.getRootCause() instanceof IOException) {
        ex = e; // still dead
      } else {
        throw e;
      }
    } catch (Exception e) {
      throw new SolrServerException(e);
    }
  }
  if (ex == null) {
    throw new SolrServerException("No live SolrServers available to handle this request");
  } else {
    throw new SolrServerException("No live SolrServers available to handle this request", ex);
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(Collection<SolrInputDocument> docs) throws SolrServerException, IOException { return add(docs, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(Collection<SolrInputDocument> docs, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.add(docs); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBeans(Collection<?> beans ) throws SolrServerException, IOException { return addBeans(beans, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBeans(Collection<?> beans, int commitWithinMs) throws SolrServerException, IOException { DocumentObjectBinder binder = this.getBinder(); ArrayList<SolrInputDocument> docs = new ArrayList<SolrInputDocument>(beans.size()); for (Object bean : beans) { docs.add(binder.toSolrInputDocument(bean)); } return add(docs, commitWithinMs); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(SolrInputDocument doc ) throws SolrServerException, IOException { return add(doc, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(SolrInputDocument doc, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.add(doc); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBean(Object obj) throws IOException, SolrServerException { return addBean(obj, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBean(Object obj, int commitWithinMs) throws IOException, SolrServerException { return add(getBinder().toSolrInputDocument(obj),commitWithinMs); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( ) throws SolrServerException, IOException { return commit(true, true); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize( ) throws SolrServerException, IOException { return optimize(true, true, 1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( boolean waitFlush, boolean waitSearcher ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.COMMIT, waitFlush, waitSearcher ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( boolean waitFlush, boolean waitSearcher, boolean softCommit ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.COMMIT, waitFlush, waitSearcher, softCommit ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize( boolean waitFlush, boolean waitSearcher ) throws SolrServerException, IOException { return optimize(waitFlush, waitSearcher, 1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize(boolean waitFlush, boolean waitSearcher, int maxSegments ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.OPTIMIZE, waitFlush, waitSearcher, maxSegments ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse rollback() throws SolrServerException, IOException { return new UpdateRequest().rollback().process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(String id) throws SolrServerException, IOException { return deleteById(id, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(String id, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteById(id); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(List<String> ids) throws SolrServerException, IOException { return deleteById(ids, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(List<String> ids, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteById(ids); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteByQuery(String query) throws SolrServerException, IOException { return deleteByQuery(query, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteByQuery(String query, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteByQuery(query); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public SolrPingResponse ping() throws SolrServerException, IOException { return new SolrPing().process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public QueryResponse queryAndStreamResponse( SolrParams params, StreamingResponseCallback callback ) throws SolrServerException, IOException { ResponseParser parser = new StreamingBinaryResponseParser( callback ); QueryRequest req = new QueryRequest( params ); req.setStreamingResponseCallback( callback ); req.setResponseParser( parser ); return req.process(this); }
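For reference, a short usage sketch of this streaming entry point: the StreamingBinaryResponseParser above feeds each document to the callback as it is decoded off the wire, instead of materializing the whole SolrDocumentList. The server URL and query below are placeholders:

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.StreamingResponseCallback;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.params.ModifiableSolrParams;
import org.apache.solr.common.params.SolrParams;

public class StreamingQueryExample {
  public static void main(String[] args) throws Exception {
    SolrServer server = new HttpSolrServer("http://localhost:8983/solr"); // placeholder URL
    SolrParams params = new ModifiableSolrParams().set("q", "*:*");       // placeholder query
    server.queryAndStreamResponse(params, new StreamingResponseCallback() {
      @Override
      public void streamDocListInfo(long numFound, long start, Float maxScore) {
        System.out.println("expecting " + numFound + " documents");
      }
      @Override
      public void streamSolrDocument(SolrDocument doc) {
        System.out.println(doc.getFieldValue("id")); // handle one document at a time
      }
    });
  }
}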
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
public static void writeXML(SolrInputDocument doc, Writer writer) throws IOException {
  writer.write("<doc boost=\"" + doc.getDocumentBoost() + "\">");
  for (SolrInputField field : doc) {
    float boost = field.getBoost();
    String name = field.getName();
    for (Object v : field) {
      String update = null;
      if (v instanceof Map) {
        // currently only supports a single value
        for (Entry<Object,Object> entry : ((Map<Object,Object>) v).entrySet()) {
          update = entry.getKey().toString();
          Object fieldVal = entry.getValue();
          v = fieldVal;
        }
      }
      if (v instanceof Date) {
        v = DateUtil.getThreadLocalDateFormat().format((Date) v);
      } else if (v instanceof byte[]) {
        byte[] bytes = (byte[]) v;
        v = Base64.byteArrayToBase64(bytes, 0, bytes.length);
      } else if (v instanceof ByteBuffer) {
        ByteBuffer bytes = (ByteBuffer) v;
        v = Base64.byteArrayToBase64(bytes.array(), bytes.position(), bytes.limit() - bytes.position());
      }
      if (update == null) {
        if (boost != 1.0f) {
          XML.writeXML(writer, "field", v.toString(), "name", name, "boost", boost);
        } else if (v != null) {
          XML.writeXML(writer, "field", v.toString(), "name", name);
        }
      } else {
        if (boost != 1.0f) {
          XML.writeXML(writer, "field", v.toString(), "name", name, "boost", boost, "update", update);
        } else if (v != null) {
          XML.writeXML(writer, "field", v.toString(), "name", name, "update", update);
        }
      }
      // only write the boost for the first multi-valued field;
      // otherwise, the used boost is the product of all the boost values
      boost = 1.0f;
    }
  }
  writer.write("</doc>");
}
// in solrj/src/java/org/apache/noggit/CharArr.java
public int read() throws IOException { if (start>=end) return -1; return buf[start++]; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int read(CharBuffer cb) throws IOException { /*** int sz = size(); if (sz<=0) return -1; if (sz>0) cb.put(buf, start, sz); return -1; ***/ int sz = size(); if (sz>0) cb.put(buf, start, sz); start=end; while (true) { fill(); int s = size(); if (s==0) return sz==0 ? -1 : sz; sz += s; cb.put(buf, start, s); } }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int fill() throws IOException {
  return 0; // or -1?
}
// in solrj/src/java/org/apache/noggit/CharArr.java
public final Appendable append(CharSequence csq) throws IOException { return append(csq, 0, csq.length()); }
// in solrj/src/java/org/apache/noggit/CharArr.java
public Appendable append(CharSequence csq, int start, int end) throws IOException { write(csq.subSequence(start, end).toString()); return null; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public final Appendable append(char c) throws IOException { write(c); return this; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public Appendable append(CharSequence csq, int start, int end) throws IOException { return this; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int read() throws IOException { if (start>=end) fill(); return start>=end ? -1 : buf[start++]; }
// in solrj/src/java/org/apache/noggit/CharArr.java
public int read(CharBuffer cb) throws IOException {
  // empty the buffer and then read direct
  int sz = size();
  if (sz > 0) cb.put(buf, start, end);
  int sz2 = in.read(cb);
  if (sz2 >= 0) return sz + sz2;
  return sz > 0 ? sz : -1;
}
// in solrj/src/java/org/apache/noggit/CharArr.java
public int fill() throws IOException { if (start>=end) { reset(); } else if (start>0) { System.arraycopy(buf, start, buf, 0, size()); end=size(); start=0; } /*** // fill fully or not??? do { int sz = in.read(buf,end,buf.length-end); if (sz==-1) return; end+=sz; } while (end < buf.length); ***/ int sz = in.read(buf,end,buf.length-end); if (sz>0) end+=sz; return sz; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
protected void fill() throws IOException { if (in!=null) { gpos += end; start=0; int num = in.read(buf,0,buf.length); end = num>=0 ? num : 0; } if (start>=end) eof=true; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void getMore() throws IOException { fill(); if (start>=end) { throw err(null); } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
protected int getChar() throws IOException { if (start>=end) { fill(); if (start>=end) return -1; } return buf[start++]; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int getCharNWS() throws IOException { for (;;) { int ch = getChar(); if (!(ch==' ' || ch=='\t' || ch=='\n' || ch=='\r')) return ch; } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void expect(char[] arr) throws IOException { for (int i=1; i<arr.length; i++) { int ch = getChar(); if (ch != arr[i]) { throw err("Expected " + new String(arr)); } } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private long readNumber(int firstChar, boolean isNeg) throws IOException {
  out.unsafeWrite(firstChar); // unsafe OK since we know output is big enough
  // We build up the number in the negative plane since it's larger (by one) than
  // the positive plane.
  long v = '0' - firstChar;
  // can't overflow a long in 18 decimal digits (i.e. 17 additional after the first).
  // we also need 22 additional to handle double so we'll handle in 2 separate loops.
  int i;
  for (i = 0; i < 17; i++) {
    int ch = getChar();
    // TODO: is this switch faster as an if-then-else?
    switch (ch) {
      case '0': case '1': case '2': case '3': case '4':
      case '5': case '6': case '7': case '8': case '9':
        v = v * 10 - (ch - '0');
        out.unsafeWrite(ch);
        continue;
      case '.':
        out.unsafeWrite('.');
        valstate = readFrac(out, 22 - i);
        return 0;
      case 'e': case 'E':
        out.unsafeWrite(ch);
        nstate = 0;
        valstate = readExp(out, 22 - i);
        return 0;
      default:
        // return the number, relying on nextEvent() to return an error
        // for invalid chars following the number.
        if (ch != -1) --start; // push back last char if not EOF
        valstate = LONG;
        return isNeg ? v : -v;
    }
  }
  // after this, we could overflow a long and need to do extra checking
  boolean overflow = false;
  long maxval = isNeg ? Long.MIN_VALUE : -Long.MAX_VALUE;
  for (; i < 22; i++) {
    int ch = getChar();
    switch (ch) {
      case '0': case '1': case '2': case '3': case '4':
      case '5': case '6': case '7': case '8': case '9':
        if (v < (0x8000000000000000L / 10)) overflow = true; // can't multiply by 10 w/o overflowing
        v *= 10;
        int digit = ch - '0';
        if (v < maxval + digit) overflow = true; // can't add digit w/o overflowing
        v -= digit;
        out.unsafeWrite(ch);
        continue;
      case '.':
        out.unsafeWrite('.');
        valstate = readFrac(out, 22 - i);
        return 0;
      case 'e': case 'E':
        out.unsafeWrite(ch);
        nstate = 0;
        valstate = readExp(out, 22 - i);
        return 0;
      default:
        // return the number, relying on nextEvent() to return an error
        // for invalid chars following the number.
        if (ch != -1) --start; // push back last char if not EOF
        valstate = overflow ? BIGNUMBER : LONG;
        return isNeg ? v : -v;
    }
  }
  nstate = 0;
  valstate = BIGNUMBER;
  return 0;
}
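The "negative plane" accumulation in readNumber works because the magnitude of Long.MIN_VALUE is one larger than Long.MAX_VALUE, so a negatively accumulated value can represent Long.MIN_VALUE without ever overflowing. A two-line check makes the asymmetry concrete:

public class NegativePlaneDemo {
  public static void main(String[] args) {
    // the negative range is one wider, so negating Long.MIN_VALUE overflows back to itself
    System.out.println(Long.MAX_VALUE);                     //  9223372036854775807
    System.out.println(Long.MIN_VALUE);                     // -9223372036854775808
    System.out.println(-Long.MIN_VALUE == Long.MIN_VALUE);  // true: overflow on negation
    // accumulating digits as negative values avoids this edge case entirely
    long v = 0;
    for (char c : "9223372036854775808".toCharArray()) v = v * 10 - (c - '0');
    System.out.println(v == Long.MIN_VALUE);                // true: parsed without overflow
  }
}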
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int readFrac(CharArr arr, int lim) throws IOException {
  nstate = HAS_FRACTION; // deliberate set instead of '|'
  while (--lim >= 0) {
    int ch = getChar();
    if (ch >= '0' && ch <= '9') {
      arr.write(ch);
    } else if (ch == 'e' || ch == 'E') {
      arr.write(ch);
      return readExp(arr, lim);
    } else {
      if (ch != -1) start--; // back up
      return NUMBER;
    }
  }
  return BIGNUMBER;
}
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int readExp(CharArr arr, int lim) throws IOException {
  nstate |= HAS_EXPONENT;
  int ch = getChar();
  lim--;
  if (ch == '+' || ch == '-') {
    arr.write(ch);
    ch = getChar();
    lim--;
  }
  // make sure at least one digit is read.
  if (ch < '0' || ch > '9') {
    throw err("missing exponent number");
  }
  arr.write(ch);
  return readExpDigits(arr, lim);
}
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int readExpDigits(CharArr arr, int lim) throws IOException {
  while (--lim >= 0) {
    int ch = getChar();
    if (ch >= '0' && ch <= '9') {
      arr.write(ch);
    } else {
      if (ch != -1) start--; // back up
      return NUMBER;
    }
  }
  return BIGNUMBER;
}
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void continueNumber(CharArr arr) throws IOException { if (arr != out) arr.write(out); if ((nstate & HAS_EXPONENT)!=0){ readExpDigits(arr, Integer.MAX_VALUE); return; } if (nstate != 0) { readFrac(arr, Integer.MAX_VALUE); return; } for(;;) { int ch = getChar(); if (ch>='0' && ch <='9') { arr.write(ch); } else if (ch=='.') { arr.write(ch); readFrac(arr,Integer.MAX_VALUE); return; } else if (ch=='e' || ch=='E') { arr.write(ch); readExp(arr,Integer.MAX_VALUE); return; } else { if (ch!=-1) start--; return; } } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private char readEscapedChar() throws IOException { switch (getChar()) { case '"' : return '"'; case '\\' : return '\\'; case '/' : return '/'; case 'n' : return '\n'; case 'r' : return '\r'; case 't' : return '\t'; case 'f' : return '\f'; case 'b' : return '\b'; case 'u' : return (char)( (hexval(getChar()) << 12) | (hexval(getChar()) << 8) | (hexval(getChar()) << 4) | (hexval(getChar()))); } throw err("Invalid character escape in string"); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private CharArr readStringChars() throws IOException {
  char c = 0;
  int i;
  for (i = start; i < end; i++) {
    c = buf[i];
    if (c == '"') {
      tmp.set(buf, start, i); // directly use input buffer
      start = i + 1; // advance past last '"'
      return tmp;
    } else if (c == '\\') {
      break;
    }
  }
  out.reset();
  readStringChars2(out, i);
  return out;
}
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void readStringChars2(CharArr arr, int middle) throws IOException { for (;;) { if (middle>=end) { arr.write(buf,start,middle-start); start=middle; getMore(); middle=start; } int ch = buf[middle++]; if (ch=='"') { int len = middle-start-1; if (len>0) arr.write(buf,start,len); start=middle; return; } else if (ch=='\\') { int len = middle-start-1; if (len>0) arr.write(buf,start,len); start=middle; arr.write(readEscapedChar()); middle=start; } } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
private int next(int ch) throws IOException {
  for (;;) {
    switch (ch) {
      case ' ':
      case '\t': break;
      case '\r':
      case '\n': break; // try and keep track of linecounts?
      case '"':
        valstate = STRING;
        return STRING;
      case '{':
        push();
        state = DID_OBJSTART;
        return OBJECT_START;
      case '[':
        push();
        state = DID_ARRSTART;
        return ARRAY_START;
      case '0':
        out.reset();
        // special case '0'? If next char isn't '.' val=0
        ch = getChar();
        if (ch == '.') {
          start--;
          ch = '0';
          readNumber('0', false);
          return valstate;
        } else if (ch > '9' || ch < '0') {
          out.unsafeWrite('0');
          if (ch != -1) start--;
          lval = 0;
          valstate = LONG;
          return LONG;
        } else {
          throw err("Leading zeros not allowed");
        }
      case '1': case '2': case '3': case '4':
      case '5': case '6': case '7': case '8': case '9':
        out.reset();
        lval = readNumber(ch, false);
        return valstate;
      case '-':
        out.reset();
        out.unsafeWrite('-');
        ch = getChar();
        if (ch < '0' || ch > '9') throw err("expected digit after '-'");
        lval = readNumber(ch, true);
        return valstate;
      case 't':
        valstate = BOOLEAN;
        // TODO: test performance of this non-branching inline version.
        // if ((('r'-getChar())|('u'-getChar())|('e'-getChar())) != 0) err("");
        expect(JSONUtil.TRUE_CHARS);
        bool = true;
        return BOOLEAN;
      case 'f':
        valstate = BOOLEAN;
        expect(JSONUtil.FALSE_CHARS);
        bool = false;
        return BOOLEAN;
      case 'n':
        valstate = NULL;
        expect(JSONUtil.NULL_CHARS);
        return NULL;
      case -1:
        if (getLevel() > 0) throw err("Premature EOF");
        return EOF;
      default:
        throw err(null);
    }
    ch = getChar();
  }
}
// in solrj/src/java/org/apache/noggit/JSONParser.java
public int nextEvent() throws IOException {
  if (valstate == STRING) {
    readStringChars2(devNull, start);
  } else if (valstate == BIGNUMBER) {
    continueNumber(devNull);
  }
  valstate = 0;
  int ch;
  // TODO: factor out getCharNWS() to here and check speed
  switch (state) {
    case 0:
      return event = next(getCharNWS());
    case DID_OBJSTART:
      ch = getCharNWS();
      if (ch == '}') {
        pop();
        return event = OBJECT_END;
      }
      if (ch != '"') {
        throw err("Expected string");
      }
      state = DID_MEMNAME;
      valstate = STRING;
      return event = STRING;
    case DID_MEMNAME:
      ch = getCharNWS();
      if (ch != ':') {
        throw err("Expected key,value separator ':'");
      }
      state = DID_MEMVAL; // set state first because it might be pushed...
      return event = next(getChar());
    case DID_MEMVAL:
      ch = getCharNWS();
      if (ch == '}') {
        pop();
        return event = OBJECT_END;
      } else if (ch != ',') {
        throw err("Expected ',' or '}'");
      }
      ch = getCharNWS();
      if (ch != '"') {
        throw err("Expected string");
      }
      state = DID_MEMNAME;
      valstate = STRING;
      return event = STRING;
    case DID_ARRSTART:
      ch = getCharNWS();
      if (ch == ']') {
        pop();
        return event = ARRAY_END;
      }
      state = DID_ARRELEM; // set state first, might be pushed...
      return event = next(ch);
    case DID_ARRELEM:
      ch = getCharNWS();
      if (ch == ']') {
        pop();
        return event = ARRAY_END;
      } else if (ch != ',') {
        throw err("Expected ',' or ']'");
      }
      // state = DID_ARRELEM;
      return event = next(getChar());
  }
  return 0;
}
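JSONParser is a pull parser: callers drive it by calling nextEvent() until EOF and fetch a value only when the event type announces one. A small usage sketch (the JSON literal is arbitrary):

import org.apache.noggit.JSONParser;

public class PullParseExample {
  public static void main(String[] args) throws Exception {
    JSONParser p = new JSONParser("{\"name\":\"solr\",\"port\":8983}");
    for (int ev = p.nextEvent(); ev != JSONParser.EOF; ev = p.nextEvent()) {
      switch (ev) {
        case JSONParser.STRING:  System.out.println("string: " + p.getString()); break;
        case JSONParser.LONG:    System.out.println("long:   " + p.getLong());   break;
        case JSONParser.OBJECT_START:
        case JSONParser.OBJECT_END: break; // structural events carry no value
        default: break;
      }
    }
  }
}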
// in solrj/src/java/org/apache/noggit/JSONParser.java
private void goTo(int what) throws IOException {
  if (valstate == what) {
    valstate = 0;
    return;
  }
  if (valstate == 0) {
    int ev = nextEvent(); // TODO
    if (valstate != what) {
      throw err("type mismatch");
    }
    valstate = 0;
  } else {
    throw err("type mismatch");
  }
}
// in solrj/src/java/org/apache/noggit/JSONParser.java
public String getString() throws IOException { return getStringChars().toString(); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public CharArr getStringChars() throws IOException { goTo(STRING); return readStringChars(); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public void getString(CharArr output) throws IOException { goTo(STRING); readStringChars2(output,start); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public long getLong() throws IOException { goTo(LONG); return lval; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public double getDouble() throws IOException { return Double.parseDouble(getNumberChars().toString()); }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public CharArr getNumberChars() throws IOException { int ev=0; if (valstate==0) ev = nextEvent(); if (valstate == LONG || valstate == NUMBER) { valstate=0; return out; } else if (valstate==BIGNUMBER) { continueNumber(out); valstate=0; return out; } else { throw err("Unexpected " + ev); } }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public void getNumberChars(CharArr output) throws IOException { int ev=0; if (valstate==0) ev=nextEvent(); if (valstate == LONG || valstate == NUMBER) output.write(this.out); else if (valstate==BIGNUMBER) { continueNumber(output); } else { throw err("Unexpected " + ev); } valstate=0; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public boolean getBoolean() throws IOException { goTo(BOOLEAN); return bool; }
// in solrj/src/java/org/apache/noggit/JSONParser.java
public void getNull() throws IOException { goTo(NULL); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public static Object fromJSON(String json) throws IOException { JSONParser p = new JSONParser(json); return getVal(p); }
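A short usage sketch of this entry point: fromJSON drives the parser through the getVal() dispatch below and materializes plain Java collections, with objects becoming LinkedHashMap, arrays becoming List, and integers becoming Long:

import java.util.List;
import java.util.Map;
import org.apache.noggit.ObjectBuilder;

public class ObjectBuilderExample {
  public static void main(String[] args) throws Exception {
    Object o = ObjectBuilder.fromJSON("{\"core\":\"collection1\",\"shards\":[1,2,3]}");
    Map<?,?> map = (Map<?,?>) o;                   // objects become LinkedHashMap
    List<?> shards = (List<?>) map.get("shards");  // arrays become List, numbers become Long
    System.out.println(map.get("core") + " has " + shards.size() + " shards");
  }
}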
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public static Object getVal(JSONParser parser) throws IOException { return new ObjectBuilder(parser).getVal(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getVal() throws IOException {
  int ev = parser.lastEvent();
  switch (ev) {
    case JSONParser.STRING: return getString();
    case JSONParser.LONG: return getLong();
    case JSONParser.NUMBER: return getNumber();
    case JSONParser.BIGNUMBER: return getBigNumber();
    case JSONParser.BOOLEAN: return getBoolean();
    case JSONParser.NULL: return getNull();
    case JSONParser.OBJECT_START: return getObject();
    case JSONParser.OBJECT_END: return null; // OR ERROR?
    case JSONParser.ARRAY_START: return getArray();
    case JSONParser.ARRAY_END: return null; // OR ERROR?
    case JSONParser.EOF: return null; // OR ERROR?
    default: return null; // OR ERROR?
  }
}
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getString() throws IOException { return parser.getString(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getLong() throws IOException { return Long.valueOf(parser.getLong()); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getNumber() throws IOException {
  CharArr num = parser.getNumberChars();
  String numstr = num.toString();
  double d = Double.parseDouble(numstr);
  if (!Double.isInfinite(d)) return Double.valueOf(d);
  // TODO: use more efficient constructor in Java5
  return new BigDecimal(numstr);
}
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getBigNumber() throws IOException {
  CharArr num = parser.getNumberChars();
  String numstr = num.toString();
  for (int ch; (ch = num.read()) != -1;) {
    if (ch == '.' || ch == 'e' || ch == 'E') return new BigDecimal(numstr);
  }
  // no fraction or exponent seen: the value is integral
  return new BigInteger(numstr);
}
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getBoolean() throws IOException { return parser.getBoolean(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getNull() throws IOException { parser.getNull(); return null; }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object newObject() throws IOException { return new LinkedHashMap(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getKey() throws IOException { return parser.getString(); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public void addKeyVal(Object map, Object key, Object val) throws IOException {
  Object prev = ((Map) map).put(key, val);
  // TODO: test for repeated value?
}
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getObject() throws IOException { Object m = newObject(); for(;;) { int ev = parser.nextEvent(); if (ev==JSONParser.OBJECT_END) return objectEnd(m); Object key = getKey(); ev = parser.nextEvent(); Object val = getVal(); addKeyVal(m, key, val); } }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public void addArrayVal(Object arr, Object val) throws IOException { ((List)arr).add(val); }
// in solrj/src/java/org/apache/noggit/ObjectBuilder.java
public Object getArray() throws IOException { Object arr = newArray(); for(;;) { int ev = parser.nextEvent(); if (ev==JSONParser.ARRAY_END) return endArray(arr); Object val = getVal(); addArrayVal(arr, val); } }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
void doAdd(SolrContentHandler handler, AddUpdateCommand template) throws IOException { template.solrDoc = handler.newDocument(); processor.processAdd(template); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
void addDoc(SolrContentHandler handler) throws IOException { templateAdd.clear(); doAdd(handler, templateAdd); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
private TikaConfig getDefaultConfig(ClassLoader classLoader) throws MimeTypeException, IOException { return new TikaConfig(classLoader); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
@Override
public void processAdd(AddUpdateCommand cmd) throws IOException {
  String text = null;
  try {
    /* get Solr document */
    SolrInputDocument solrInputDocument = cmd.getSolrInputDocument();
    /* get the fields to analyze */
    String[] texts = getTextsToAnalyze(solrInputDocument);
    for (int i = 0; i < texts.length; i++) {
      text = texts[i];
      if (text != null && text.length() > 0) {
        /* process the text value */
        JCas jcas = processText(text);
        UIMAToSolrMapper uimaToSolrMapper = new UIMAToSolrMapper(solrInputDocument, jcas);
        /* get field mapping from config */
        Map<String, Map<String, MapField>> typesAndFeaturesFieldsMap = solrUIMAConfiguration
            .getTypesFeaturesFieldsMapping();
        /* map type features on fields */
        for (String typeFQN : typesAndFeaturesFieldsMap.keySet()) {
          uimaToSolrMapper.map(typeFQN, typesAndFeaturesFieldsMap.get(typeFQN));
        }
      }
    }
  } catch (Exception e) {
    // note: 'text' may still be null here if the exception was thrown before
    // the loop body ran, in which case the message building below would fail
    String logField = solrUIMAConfiguration.getLogField();
    if (logField == null) {
      SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField();
      if (uniqueKeyField != null) {
        logField = uniqueKeyField.getName();
      }
    }
    String optionalFieldInfo = logField == null ? "."
        : new StringBuilder(". ").append(logField).append("=")
            .append((String) cmd.getSolrInputDocument().getField(logField).getValue())
            .append(", ").toString();
    int len = Math.min(text.length(), 100);
    if (solrUIMAConfiguration.isIgnoreErrors()) {
      log.warn(new StringBuilder("skip the text processing due to ")
          .append(e.getLocalizedMessage()).append(optionalFieldInfo)
          .append(" text=\"").append(text.substring(0, len)).append("...\"").toString());
    } else {
      throw new SolrException(ErrorCode.SERVER_ERROR,
          new StringBuilder("processing error: ")
              .append(e.getLocalizedMessage()).append(optionalFieldInfo)
              .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e);
    }
  }
  super.processAdd(cmd);
}
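The catch block above implements a configurable log-or-rethrow policy: with ignoreErrors set, failures degrade to a warning and the document is still indexed without UIMA fields; otherwise they escalate to a SolrException with SERVER_ERROR. A stripped-down sketch of that policy, with a hypothetical ignoreErrors flag and process() step standing in for the UIMA pipeline:

public class LogOrRethrowSketch {
  private final boolean ignoreErrors; // hypothetical flag, cf. solrUIMAConfiguration.isIgnoreErrors()

  public LogOrRethrowSketch(boolean ignoreErrors) { this.ignoreErrors = ignoreErrors; }

  public void handle(String input) {
    try {
      process(input);
    } catch (Exception e) {
      if (ignoreErrors) {
        // degrade gracefully: record the failure and let the pipeline continue
        System.err.println("skipping input due to " + e.getMessage());
      } else {
        // escalate: wrap in an unchecked exception so callers up the chain see it
        throw new RuntimeException("processing error: " + e.getMessage(), e);
      }
    }
  }

  private void process(String input) throws Exception { /* analysis would go here */ }
}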
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
public short nextToken() throws IOException {
  final boolean hasNextToken = wordTokenFilter.incrementToken();
  if (hasNextToken) {
    short flags = 0;
    final char[] image = term.buffer();
    final int length = term.length();
    tempCharSequence.reset(image, 0, length);
    if (length == 1 && image[0] == ',') {
      // ChineseTokenizer seems to convert all punctuation to ',' characters
      flags = ITokenizer.TT_PUNCTUATION;
    } else if (numeric.matcher(tempCharSequence).matches()) {
      flags = ITokenizer.TT_NUMERIC;
    } else {
      flags = ITokenizer.TT_TERM;
    }
    return flags;
  }
  return ITokenizer.TT_EOF;
}
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
public void reset(Reader input) throws IOException { try { sentenceTokenizer.reset(input); wordTokenFilter = (TokenStream) tokenFilterClass.getConstructor( TokenStream.class).newInstance(sentenceTokenizer); term = wordTokenFilter.addAttribute(CharTermAttribute.class); } catch (Exception e) { throw ExceptionUtils.wrapAsRuntimeException(e); } }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
@Override
public IResource[] getAll(final String resource) {
  final String resourceName = carrot2ResourcesDir + "/" + resource;
  log.debug("Looking for Solr resource: " + resourceName);
  InputStream resourceStream = null;
  final byte[] asBytes;
  try {
    resourceStream = resourceLoader.openResource(resourceName);
    asBytes = IOUtils.toByteArray(resourceStream);
  } catch (RuntimeException e) {
    log.debug("Resource not found in Solr's config: " + resourceName
        + ". Using the default " + resource + " from Carrot JAR.");
    return new IResource[] {};
  } catch (IOException e) {
    log.warn("Could not read Solr resource " + resourceName);
    return new IResource[] {};
  } finally {
    if (resourceStream != null) Closeables.closeQuietly(resourceStream);
  }
  log.info("Loaded Solr resource: " + resourceName);
  final IResource foundResource = new IResource() {
    @Override
    public InputStream open() throws IOException {
      return new ByteArrayInputStream(asBytes);
    }
    @Override
    public int hashCode() {
      // In case multiple resources are found they will be deduped, but we don't use it in Solr,
      // so simply rely on instance equivalence.
      return super.hashCode();
    }
    @Override
    public boolean equals(Object obj) {
      // In case multiple resources are found they will be deduped, but we don't use it in Solr,
      // so simply rely on instance equivalence.
      return super.equals(obj);
    }
    @Override
    public String toString() {
      return "Solr config resource: " + resourceName;
    }
  };
  return new IResource[] { foundResource };
}
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
@Override public InputStream open() throws IOException { return new ByteArrayInputStream(asBytes); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
private List<Document> getDocuments(SolrDocumentList solrDocList, Map<SolrDocument, Integer> docIds,
    Query query, final SolrQueryRequest sreq) throws IOException {
  SolrHighlighter highlighter = null;
  SolrParams solrParams = sreq.getParams();
  SolrCore core = sreq.getCore();
  String urlField = solrParams.get(CarrotParams.URL_FIELD_NAME, "url");
  String titleFieldSpec = solrParams.get(CarrotParams.TITLE_FIELD_NAME, "title");
  String snippetFieldSpec = solrParams.get(CarrotParams.SNIPPET_FIELD_NAME, titleFieldSpec);
  String languageField = solrParams.get(CarrotParams.LANGUAGE_FIELD_NAME, null);
  // Maps Solr field names to Carrot2 custom field names
  Map<String, String> customFields = getCustomFieldsMap(solrParams);
  // Parse language code map string into a map
  Map<String, String> languageCodeMap = Maps.newHashMap();
  if (StringUtils.isNotBlank(languageField)) {
    for (String pair : solrParams.get(CarrotParams.LANGUAGE_CODE_MAP, "").split("[, ]")) {
      final String[] split = pair.split(":");
      if (split.length == 2 && StringUtils.isNotBlank(split[0]) && StringUtils.isNotBlank(split[1])) {
        languageCodeMap.put(split[0], split[1]);
      } else {
        log.warn("Unsupported format for " + CarrotParams.LANGUAGE_CODE_MAP
            + ": '" + pair + "'. Skipping this mapping.");
      }
    }
  }
  // Get the documents
  boolean produceSummary = solrParams.getBool(CarrotParams.PRODUCE_SUMMARY, false);
  SolrQueryRequest req = null;
  String[] snippetFieldAry = null;
  if (produceSummary) {
    highlighter = HighlightComponent.getHighlighter(core);
    if (highlighter != null) {
      Map<String, Object> args = Maps.newHashMap();
      snippetFieldAry = snippetFieldSpec.split("[, ]");
      args.put(HighlightParams.FIELDS, snippetFieldAry);
      args.put(HighlightParams.HIGHLIGHT, "true");
      args.put(HighlightParams.SIMPLE_PRE, ""); // we don't care about actually highlighting the area
      args.put(HighlightParams.SIMPLE_POST, "");
      args.put(HighlightParams.FRAGSIZE, solrParams.getInt(CarrotParams.SUMMARY_FRAGSIZE,
          solrParams.getInt(HighlightParams.FRAGSIZE, 100)));
      args.put(HighlightParams.SNIPPETS, solrParams.getInt(CarrotParams.SUMMARY_SNIPPETS,
          solrParams.getInt(HighlightParams.SNIPPETS, 1)));
      req = new LocalSolrQueryRequest(core, query.toString(), "", 0, 1, args) {
        @Override
        public SolrIndexSearcher getSearcher() {
          return sreq.getSearcher();
        }
      };
    } else {
      log.warn("No highlighter configured, cannot produce summary");
      produceSummary = false;
    }
  }
  Iterator<SolrDocument> docsIter = solrDocList.iterator();
  List<Document> result = new ArrayList<Document>(solrDocList.size());
  float[] scores = {1.0f};
  int[] docsHolder = new int[1];
  Query theQuery = query;
  while (docsIter.hasNext()) {
    SolrDocument sdoc = docsIter.next();
    String snippet = null;
    // TODO: docIds will be null when running distributed search.
    // See comment in ClusteringComponent#finishStage().
    if (produceSummary && docIds != null) {
      docsHolder[0] = docIds.get(sdoc).intValue();
      DocList docAsList = new DocSlice(0, 1, docsHolder, scores, 1, 1.0f);
      NamedList<Object> highlights = highlighter.doHighlighting(docAsList, theQuery, req, snippetFieldAry);
      if (highlights != null && highlights.size() == 1) {
        // should only be one value given our setup; should only be one document
        @SuppressWarnings("unchecked")
        NamedList<String[]> tmp = (NamedList<String[]>) highlights.getVal(0);
        final StringBuilder sb = new StringBuilder();
        for (int j = 0; j < snippetFieldAry.length; j++) {
          // Join fragments with a period, so that Carrot2 does not create
          // cross-fragment phrases; such phrases rarely make sense.
          String[] highlt = tmp.get(snippetFieldAry[j]);
          if (highlt != null && highlt.length > 0) {
            for (int i = 0; i < highlt.length; i++) {
              sb.append(highlt[i]);
              sb.append(" . ");
            }
          }
        }
        snippet = sb.toString();
      }
    }
    // If summaries not enabled or summary generation failed, use full content.
    if (snippet == null) {
      snippet = getConcatenated(sdoc, snippetFieldSpec);
    }
    // Create a Carrot2 document
    Document carrotDocument = new Document(getConcatenated(sdoc, titleFieldSpec),
        snippet, ObjectUtils.toString(sdoc.getFieldValue(urlField), ""));
    // Store Solr id of the document, we need it to map document instances
    // found in clusters back to identifiers.
    carrotDocument.setField(SOLR_DOCUMENT_ID, sdoc.getFieldValue(idFieldName));
    // Set language
    if (StringUtils.isNotBlank(languageField)) {
      Collection<Object> languages = sdoc.getFieldValues(languageField);
      if (languages != null) {
        // Use the first Carrot2-supported language
        for (Object l : languages) {
          String lang = ObjectUtils.toString(l, "");
          if (languageCodeMap.containsKey(lang)) {
            lang = languageCodeMap.get(lang);
          }
          // Language detection Library for Java uses dashes to separate
          // language variants, such as 'zh-cn', but Carrot2 uses underscores.
          if (lang.indexOf('-') > 0) {
            lang = lang.replace('-', '_');
          }
          // If the language is supported by Carrot2, we'll get a non-null value
          final LanguageCode carrot2Language = LanguageCode.forISOCode(lang);
          if (carrot2Language != null) {
            carrotDocument.setLanguage(carrot2Language);
            break;
          }
        }
      }
    }
    // Add custom fields
    if (customFields != null) {
      for (Entry<String, String> entry : customFields.entrySet()) {
        carrotDocument.setField(entry.getValue(), sdoc.getFieldValue(entry.getKey()));
      }
    }
    result.add(carrotDocument);
  }
  return result;
}
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/ClusteringComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); if (!params.getBool(COMPONENT_NAME, false)) { return; } }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/ClusteringComponent.java
@Override
public void process(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  if (!params.getBool(COMPONENT_NAME, false)) {
    return;
  }
  String name = getClusteringEngineName(rb);
  boolean useResults = params.getBool(ClusteringParams.USE_SEARCH_RESULTS, false);
  if (useResults == true) {
    SearchClusteringEngine engine = getSearchClusteringEngine(rb);
    if (engine != null) {
      DocListAndSet results = rb.getResults();
      Map<SolrDocument,Integer> docIds = new HashMap<SolrDocument, Integer>(results.docList.size());
      SolrDocumentList solrDocList = engine.getSolrDocumentList(results.docList, rb.req, docIds);
      Object clusters = engine.cluster(rb.getQuery(), solrDocList, docIds, rb.req);
      rb.rsp.add("clusters", clusters);
    } else {
      log.warn("No engine for: " + name);
    }
  }
  boolean useCollection = params.getBool(ClusteringParams.USE_COLLECTION, false);
  if (useCollection == true) {
    DocumentClusteringEngine engine = documentClusteringEngines.get(name);
    if (engine != null) {
      boolean useDocSet = params.getBool(ClusteringParams.USE_DOC_SET, false);
      NamedList nl = null;
      // TODO: This likely needs to be made into a background task that runs in an executor
      if (useDocSet == true) {
        nl = engine.cluster(rb.getResults().docSet, params);
      } else {
        nl = engine.cluster(params);
      }
      rb.rsp.add("clusters", nl);
    } else {
      log.warn("No engine for " + name);
    }
  }
}
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/SearchClusteringEngine.java
public SolrDocumentList getSolrDocumentList(DocList docList, SolrQueryRequest sreq,
    Map<SolrDocument, Integer> docIds) throws IOException {
  return SolrPluginUtils.docListToSolrDocumentList(
      docList, sreq.getSearcher(), getFieldsToLoad(sreq), docIds);
}
// in contrib/langid/src/java/org/apache/solr/update/processor/LanguageIdentifierUpdateProcessor.java
@Override
public void processAdd(AddUpdateCommand cmd) throws IOException {
  if (isEnabled()) {
    process(cmd.getSolrInputDocument());
  } else {
    log.debug("Processor not enabled, not running");
  }
  super.processAdd(cmd);
}
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
public static synchronized void loadData() throws IOException, LangDetectException {
  if (loaded) {
    return;
  }
  loaded = true;
  List<String> profileData = new ArrayList<String>();
  Charset encoding = Charset.forName("UTF-8");
  for (String language : languages) {
    InputStream stream = LangDetectLanguageIdentifierUpdateProcessor.class
        .getResourceAsStream("langdetect-profiles/" + language);
    BufferedReader reader = new BufferedReader(new InputStreamReader(stream, encoding));
    profileData.add(new String(IOUtils.toCharArray(reader)));
    reader.close();
  }
  DetectorFactory.loadProfile(profileData);
  DetectorFactory.setSeed(0);
}
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
@Override
public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException {
  writer.writeStr(name, f.stringValue(), true);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
static String getResourceAsString(InputStream in) throws IOException {
  ByteArrayOutputStream baos = new ByteArrayOutputStream(1024);
  byte[] buf = new byte[1024];
  int sz = 0;
  try {
    while ((sz = in.read(buf)) != -1) {
      baos.write(buf, 0, sz);
    }
  } finally {
    try {
      in.close();
    } catch (Exception e) { }
  }
  return new String(baos.toByteArray(), "UTF-8");
}
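The empty catch around in.close() above is the cleanup variant of the empty catch blocks tallied in the statistics: a failure while releasing the stream is deliberately ignored so it cannot mask an exception thrown from the read loop. A minimal, self-contained sketch of the same idiom (the helper name is illustrative, not Solr's):

import java.io.Closeable;
import java.io.IOException;

class QuietClose {
  // Swallow close() failures during cleanup so they never hide the
  // primary exception raised inside the try block.
  static void closeQuietly(Closeable c) {
    if (c == null) return;
    try {
      c.close();
    } catch (IOException ignored) {
      // deliberately empty: exactly the kind of block counted as an "empty catch"
    }
  }
}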
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void parse(XMLStreamReader parser,
                   Handler handler,
                   Map<String, Object> values,
                   Stack<Set<String>> stack, // lists of values to purge
                   boolean recordStarted) throws IOException, XMLStreamException {
  Set<String> valuesAddedinThisFrame = null;
  if (isRecord) {
    // This Node is a match for an XPATH from a forEach attribute;
    // prepare for the clean up that will occur when the record
    // is emitted after its END_ELEMENT is matched
    recordStarted = true;
    valuesAddedinThisFrame = new HashSet<String>();
    stack.push(valuesAddedinThisFrame);
  } else if (recordStarted) {
    // This node is a child of some parent which matched against a forEach
    // attribute. Continue to add values to an existing record.
    valuesAddedinThisFrame = stack.peek();
  }
  try {
    /* The input stream has deposited us at this Node in our tree of
     * interesting nodes. Depending on how this node is of interest,
     * process further tokens from the input stream and decide what
     * we do next */
    if (attributes != null) {
      // we are interested in storing attributes from the input stream
      for (Node node : attributes) {
        String value = parser.getAttributeValue(null, node.name);
        if (value != null || (recordStarted && !isRecord)) {
          putText(values, value, node.fieldName, node.multiValued);
          valuesAddedinThisFrame.add(node.fieldName);
        }
      }
    }
    Set<Node> childrenFound = new HashSet<Node>();
    int event = -1;
    int flattenedStarts = 0; // our tag depth when flattening elements
    StringBuilder text = new StringBuilder();
    while (true) {
      event = parser.next();
      if (event == END_ELEMENT) {
        if (flattenedStarts > 0) flattenedStarts--;
        else {
          if (hasText && valuesAddedinThisFrame != null) {
            valuesAddedinThisFrame.add(fieldName);
            putText(values, text.toString(), fieldName, multiValued);
          }
          if (isRecord) handler.handle(getDeepCopy(values), forEachPath);
          if (childNodes != null && recordStarted && !isRecord && !childrenFound.containsAll(childNodes)) {
            // non-record nodes where we have not collected text for ALL
            // the child nodes.
            for (Node n : childNodes) {
              // For the multivalued child nodes where we could have, but
              // didn't, collect text: push a null string into values.
              if (!childrenFound.contains(n)) n.putNulls(values);
            }
          }
          return;
        }
      } else if (hasText && (event == CDATA || event == CHARACTERS || event == SPACE)) {
        text.append(parser.getText());
      } else if (event == START_ELEMENT) {
        if (flatten) flattenedStarts++;
        else handleStartElement(parser, childrenFound, handler, values, stack, recordStarted);
      }
      // END_DOCUMENT is least likely to appear and should be
      // last in the if-then-else skip chain
      else if (event == END_DOCUMENT) return;
    }
  } finally {
    if ((isRecord || !recordStarted) && !stack.empty()) {
      Set<String> cleanThis = stack.pop();
      if (cleanThis != null) {
        for (String fld : cleanThis) values.remove(fld);
      }
    }
  }
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void handleStartElement(XMLStreamReader parser, Set<Node> childrenFound,
                                Handler handler, Map<String, Object> values,
                                Stack<Set<String>> stack, boolean recordStarted)
    throws IOException, XMLStreamException {
  Node n = getMatchingNode(parser, childNodes);
  Map<String, Object> decends = new HashMap<String, Object>();
  if (n != null) {
    childrenFound.add(n);
    n.parse(parser, handler, values, stack, recordStarted);
    return;
  }
  // The stream has diverged from the tree of interesting elements, but
  // are there any wildCardNodes ... anywhere in our path from the root?
  Node dn = this; // checking our Node first!
  do {
    if (dn.wildCardNodes != null) {
      // Check to see if the stream's tag matches one of the "//" all
      // descendants type expressions for this node.
      n = getMatchingNode(parser, dn.wildCardNodes);
      if (n != null) {
        childrenFound.add(n);
        n.parse(parser, handler, values, stack, recordStarted);
        break;
      }
      // add the list of this node's wild descendants to the cache
      for (Node nn : dn.wildCardNodes) decends.put(nn.name, nn);
    }
    dn = dn.wildAncestor; // leap back along the tree toward root
  } while (dn != null);
  if (n == null) {
    // we have a START_ELEMENT which is not within the tree of
    // interesting nodes. Skip over the contents of this element
    // but recursively repeat the above for any START_ELEMENTs
    // found within this element.
    int count = 1; // we have had our first START_ELEMENT
    while (count != 0) {
      int token = parser.next();
      if (token == START_ELEMENT) {
        Node nn = (Node) decends.get(parser.getLocalName());
        if (nn != null) {
          // We have a //Node which matches the stream's parser.localName
          childrenFound.add(nn);
          // Parse the contents of this stream element
          nn.parse(parser, handler, values, stack, recordStarted);
        } else count++;
      } else if (token == END_ELEMENT) count--;
    }
  }
}
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
  VelocityEngine engine = getEngine(request); // TODO: have HTTP headers available for configuring engine
  Template template = getTemplate(engine, request);
  VelocityContext context = new VelocityContext();
  context.put("request", request);
  // Turn the SolrQueryResponse into a SolrResponse.
  // QueryResponse has lots of conveniences suitable for a view.
  // Problem is, which SolrResponse class to use?
  // One patch to SOLR-620 solved this by passing in a class name as
  // a parameter and using reflection and Solr's class loader to
  // create a new instance. But for now the implementation simply
  // uses QueryResponse, and if it chokes in a known way, falls back
  // to bare-bones SolrResponseBase.
  // TODO: Can this writer know what the handler class is? With echoHandler=true it can get its string name at least
  SolrResponse rsp = new QueryResponse();
  NamedList<Object> parsedResponse = BinaryResponseWriter.getParsedResponse(request, response);
  try {
    rsp.setResponse(parsedResponse);
    // page only injected if QueryResponse works
    context.put("page", new PageTool(request, response)); // page tool only makes sense for a SearchHandler request... *sigh*
  } catch (ClassCastException e) {
    // known edge case where QueryResponse's extraction assumes "response" is a SolrDocumentList
    // (AnalysisRequestHandler emits a "response")
    e.printStackTrace();
    rsp = new SolrResponseBase();
    rsp.setResponse(parsedResponse);
  }
  context.put("response", rsp);
  // Velocity context tools - TODO: make these pluggable
  context.put("esc", new EscapeTool());
  context.put("date", new ComparisonDateTool());
  context.put("list", new ListTool());
  context.put("math", new MathTool());
  context.put("number", new NumberTool());
  context.put("sort", new SortTool());
  context.put("engine", engine); // for $engine.resourceExists(...)
  String layout_template = request.getParams().get("v.layout");
  String json_wrapper = request.getParams().get("v.json");
  boolean wrap_response = (layout_template != null) || (json_wrapper != null);
  // create output, optionally wrap it into a json object
  if (wrap_response) {
    StringWriter stringWriter = new StringWriter();
    template.merge(context, stringWriter);
    if (layout_template != null) {
      context.put("content", stringWriter.toString());
      stringWriter = new StringWriter();
      try {
        engine.getTemplate(layout_template + ".vm").merge(context, stringWriter);
      } catch (Exception e) {
        throw new IOException(e.getMessage());
      }
    }
    if (json_wrapper != null) {
      writer.write(request.getParams().get("v.json") + "(");
      writer.write(getJSONWrap(stringWriter.toString()));
      writer.write(')');
    } else {
      // using a layout, but not JSON wrapping
      writer.write(stringWriter.toString());
    }
  } else {
    template.merge(context, writer);
  }
}
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
private Template getTemplate(VelocityEngine engine, SolrQueryRequest request) throws IOException {
  Template template;
  String template_name = request.getParams().get("v.template");
  String qt = request.getParams().get("qt");
  String path = (String) request.getContext().get("path");
  if (template_name == null && path != null) {
    template_name = path;
  } // TODO: path is never null, so qt won't get picked up... maybe special case '/select' to use qt, otherwise use path?
  if (template_name == null && qt != null) {
    template_name = qt;
  }
  if (template_name == null) template_name = "index";
  try {
    template = engine.getTemplate(template_name + ".vm");
  } catch (Exception e) {
    throw new IOException(e.getMessage());
  }
  return template;
}
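Both catch blocks in this writer rethrow as new IOException(e.getMessage()), which discards the original exception and its stack trace. Since Java 6, IOException accepts a cause directly; a hedged sketch of the cause-preserving variant (not what the shipped code does, and resolve() is an illustrative stand-in for engine.getTemplate):

import java.io.IOException;

class WrapWithCause {
  static Object loadTemplate(String name) throws IOException {
    try {
      return resolve(name); // stand-in for engine.getTemplate(name + ".vm")
    } catch (Exception e) {
      // keeps the causal chain: logs show the template failure, not just its message
      throw new IOException(e.getMessage(), e);
    }
  }

  static Object resolve(String name) throws Exception { return name; /* illustrative */ }
}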
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
DocumentAnalysisRequest resolveAnalysisRequest(SolrQueryRequest req) throws IOException, XMLStreamException {
  DocumentAnalysisRequest request = new DocumentAnalysisRequest();
  SolrParams params = req.getParams();
  String query = params.get(AnalysisParams.QUERY, params.get(CommonParams.Q, null));
  request.setQuery(query);
  boolean showMatch = params.getBool(AnalysisParams.SHOW_MATCH, false);
  request.setShowMatch(showMatch);
  ContentStream stream = extractSingleContentStream(req);
  InputStream is = null;
  XMLStreamReader parser = null;
  try {
    is = stream.getStream();
    final String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType());
    parser = (charset == null) ?
        inputFactory.createXMLStreamReader(is) : inputFactory.createXMLStreamReader(is, charset);
    while (true) {
      int event = parser.next();
      switch (event) {
        case XMLStreamConstants.END_DOCUMENT: {
          parser.close();
          return request;
        }
        case XMLStreamConstants.START_ELEMENT: {
          String currTag = parser.getLocalName();
          if ("doc".equals(currTag)) {
            log.trace("Reading doc...");
            SolrInputDocument document = readDocument(parser, req.getSchema());
            request.addDocument(document);
          }
          break;
        }
      }
    }
  } finally {
    if (parser != null) parser.close();
    IOUtils.closeQuietly(is);
  }
}
// in core/src/java/org/apache/solr/handler/DumpRequestHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException {
  // Show params
  rsp.add("params", req.getParams().toNamedList());
  // Write the streams...
  if (req.getContentStreams() != null) {
    ArrayList<NamedList<Object>> streams = new ArrayList<NamedList<Object>>();
    // Cycle through each stream
    for (ContentStream content : req.getContentStreams()) {
      NamedList<Object> stream = new SimpleOrderedMap<Object>();
      stream.add("name", content.getName());
      stream.add("sourceInfo", content.getSourceInfo());
      stream.add("size", content.getSize());
      stream.add("contentType", content.getContentType());
      Reader reader = content.getReader();
      try {
        stream.add("stream", IOUtils.toString(reader));
      } finally {
        reader.close();
      }
      streams.add(stream);
    }
    rsp.add("streams", streams);
  }
  rsp.add("context", req.getContext());
}
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
@Override
public boolean incrementToken() throws IOException {
  if (tokenIterator.hasNext()) {
    clearAttributes();
    AttributeSource next = tokenIterator.next();
    Iterator<Class<? extends Attribute>> atts = next.getAttributeClassesIterator();
    while (atts.hasNext()) // make sure all att impls in the token exist here
      addAttribute(atts.next());
    next.copyTo(this);
    return true;
  } else {
    return false;
  }
}
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
@Override
public void reset() throws IOException {
  super.reset();
  tokenIterator = tokens.iterator();
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
NamedList getCommandResponse(NamedList<String> commands) throws IOException {
  HttpPost post = new HttpPost(masterUrl);
  List<BasicNameValuePair> formparams = new ArrayList<BasicNameValuePair>();
  formparams.add(new BasicNameValuePair("wt", "javabin"));
  for (Map.Entry<String, String> c : commands) {
    formparams.add(new BasicNameValuePair(c.getKey(), c.getValue()));
  }
  UrlEncodedFormEntity entity = new UrlEncodedFormEntity(formparams, "UTF-8");
  post.setEntity(entity);
  return getNamedListResponse(post);
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private NamedList<?> getNamedListResponse(HttpPost method) throws IOException {
  InputStream input = null;
  NamedList<?> result = null;
  try {
    HttpResponse response = myHttpClient.execute(method);
    int status = response.getStatusLine().getStatusCode();
    if (status != HttpStatus.SC_OK) {
      throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE,
          "Request failed for the url " + method);
    }
    input = response.getEntity().getContent();
    result = (NamedList<?>) new JavaBinCodec().unmarshal(input);
  } finally {
    try {
      if (input != null) {
        input.close();
      }
    } catch (Exception e) { }
  }
  return result;
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
void fetchFileList(long gen) throws IOException {
  HttpPost post = new HttpPost(masterUrl);
  List<BasicNameValuePair> formparams = new ArrayList<BasicNameValuePair>();
  formparams.add(new BasicNameValuePair("wt", "javabin"));
  formparams.add(new BasicNameValuePair(COMMAND, CMD_GET_FILE_LIST));
  formparams.add(new BasicNameValuePair(GENERATION, String.valueOf(gen)));
  UrlEncodedFormEntity entity = new UrlEncodedFormEntity(formparams, "UTF-8");
  post.setEntity(entity);
  @SuppressWarnings("unchecked")
  NamedList<List<Map<String, Object>>> nl =
      (NamedList<List<Map<String, Object>>>) getNamedListResponse(post);
  List<Map<String, Object>> f = nl.get(CMD_GET_FILE_LIST);
  if (f != null)
    filesToDownload = Collections.synchronizedList(f);
  else {
    filesToDownload = Collections.emptyList();
    LOG.error("No files to download for index generation: " + gen);
  }
  f = nl.get(CONF_FILES);
  if (f != null)
    confFilesToDownload = Collections.synchronizedList(f);
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
boolean fetchLatestIndex(SolrCore core, boolean force) throws IOException, InterruptedException {
  successfulInstall = false;
  replicationStartTime = System.currentTimeMillis();
  try {
    // get the current 'replicatable' index version on the master
    NamedList response = null;
    try {
      response = getLatestVersion();
    } catch (Exception e) {
      LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage());
      return false;
    }
    long latestVersion = (Long) response.get(CMD_INDEX_VERSION);
    long latestGeneration = (Long) response.get(GENERATION);
    IndexCommit commit;
    RefCounted<SolrIndexSearcher> searcherRefCounted = null;
    try {
      searcherRefCounted = core.getNewestSearcher(false);
      if (searcherRefCounted == null) {
        SolrException.log(LOG, "No open searcher found - fetch aborted");
        return false;
      }
      commit = searcherRefCounted.get().getIndexReader().getIndexCommit();
    } finally {
      if (searcherRefCounted != null)
        searcherRefCounted.decref();
    }
    if (latestVersion == 0L) {
      if (force && commit.getGeneration() != 0) {
        // since we won't get the files for an empty index,
        // we just clear ours and commit
        core.getUpdateHandler().getSolrCoreState().getIndexWriter(core).deleteAll();
        SolrQueryRequest req = new LocalSolrQueryRequest(core, new ModifiableSolrParams());
        core.getUpdateHandler().commit(new CommitUpdateCommand(req, false));
      }
      // there is nothing to be replicated
      successfulInstall = true;
      return true;
    }
    if (!force && IndexDeletionPolicyWrapper.getCommitTimestamp(commit) == latestVersion) {
      // master and slave are already in sync, just return
      LOG.info("Slave in sync with master.");
      successfulInstall = true;
      return true;
    }
    LOG.info("Master's generation: " + latestGeneration);
    LOG.info("Slave's generation: " + commit.getGeneration());
    LOG.info("Starting replication process");
    // get the list of files first
    fetchFileList(latestGeneration);
    // this can happen if the commit point is deleted before we fetch the file list
    if (filesToDownload.isEmpty())
      return false;
    LOG.info("Number of files in latest index in master: " + filesToDownload.size());
    // Create the sync service
    fsyncService = Executors.newSingleThreadExecutor();
    // use a synchronized list because the list is read by other threads (to show details)
    filesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>());
    // If the generation of the master is older than that of the slave, the indexes are not
    // compatible for copying, so a new index directory is created and all files are copied.
    boolean isFullCopyNeeded = IndexDeletionPolicyWrapper.getCommitTimestamp(commit) >= latestVersion || force;
    File tmpIndexDir = createTempindexDir(core);
    if (isIndexStale())
      isFullCopyNeeded = true;
    successfulInstall = false;
    boolean deleteTmpIdxDir = true;
    File indexDir = null;
    try {
      indexDir = new File(core.getIndexDir());
      downloadIndexFiles(isFullCopyNeeded, tmpIndexDir, latestGeneration);
      LOG.info("Total time taken for download : " + ((System.currentTimeMillis() - replicationStartTime) / 1000) + " secs");
      Collection<Map<String, Object>> modifiedConfFiles = getModifiedConfFiles(confFilesToDownload);
      if (!modifiedConfFiles.isEmpty()) {
        downloadConfFiles(confFilesToDownload, latestGeneration);
        if (isFullCopyNeeded) {
          successfulInstall = modifyIndexProps(tmpIndexDir.getName());
          deleteTmpIdxDir = false;
        } else {
          successfulInstall = copyIndexFiles(tmpIndexDir, indexDir);
        }
        if (successfulInstall) {
          LOG.info("Configuration files are modified, core will be reloaded");
          // write the time of replication and the conf files to a file
          logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall);
          reloadCore();
        }
      } else {
        terminateAndWaitFsyncService();
        if (isFullCopyNeeded) {
          successfulInstall = modifyIndexProps(tmpIndexDir.getName());
          deleteTmpIdxDir = false;
        } else {
          successfulInstall = copyIndexFiles(tmpIndexDir, indexDir);
        }
        if (successfulInstall) {
          logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall);
          doCommit();
        }
      }
      replicationStartTime = 0;
      return successfulInstall;
    } catch (ReplicationHandlerException e) {
      LOG.error("User aborted Replication");
      return false;
    } catch (SolrException e) {
      throw e;
    } catch (InterruptedException e) {
      throw new InterruptedException("Index fetch interrupted");
    } catch (Exception e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e);
    } finally {
      if (deleteTmpIdxDir)
        delTree(tmpIndexDir);
      else
        delTree(indexDir);
    }
  } finally {
    if (!successfulInstall) {
      logReplicationTimeAndConfFiles(null, successfulInstall);
    }
    filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null;
    replicationStartTime = 0;
    fileFetcher = null;
    if (fsyncService != null && !fsyncService.isShutdown())
      fsyncService.shutdownNow();
    fsyncService = null;
    stop = false;
    fsyncException = null;
  }
}
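fetchLatestIndex ends in a layered catch ladder: the domain's ReplicationHandlerException is logged and converted into a false return, SolrException passes through untouched, InterruptedException is re-raised with a clearer message, and any remaining Exception is wrapped exactly once into a SolrException with SERVER_ERROR. A minimal sketch of that shape, with hypothetical names standing in for Solr's types:

class CatchLadderSketch {
  static class DomainException extends RuntimeException {
    DomainException(String msg, Throwable cause) { super(msg, cause); }
  }

  boolean fetchLatest() {
    try {
      doFetch(); // hypothetical worker that can fail in many ways
      return true;
    } catch (DomainException e) {
      throw e; // already meaningful to callers: rethrow unchanged
    } catch (Exception e) {
      // wrap exactly once at the boundary, keeping the cause
      throw new DomainException("Index fetch failed", e);
    }
  }

  void doFetch() throws Exception { /* ... */ }
}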
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void doCommit() throws IOException {
  SolrQueryRequest req = new LocalSolrQueryRequest(solrCore, new ModifiableSolrParams());
  // reboot the writer on the new index and get a new searcher
  solrCore.getUpdateHandler().newIndexWriter();
  try {
    // first try to open an NRT searcher so that the new
    // IndexWriter is registered with the reader
    Future[] waitSearcher = new Future[1];
    solrCore.getSearcher(true, false, waitSearcher, true);
    if (waitSearcher[0] != null) {
      try {
        waitSearcher[0].get();
      } catch (InterruptedException e) {
        SolrException.log(LOG, e);
      } catch (ExecutionException e) {
        SolrException.log(LOG, e);
      }
    }
    // update our commit point to the right dir
    solrCore.getUpdateHandler().commit(new CommitUpdateCommand(req, false));
  } finally {
    req.close();
  }
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private boolean copyIndexFiles(File tmpIdxDir, File indexDir) throws IOException {
  String segmentsFile = null;
  List<String> copiedfiles = new ArrayList<String>();
  for (Map<String, Object> f : filesDownloaded) {
    String fname = (String) f.get(NAME);
    // The segments file must be copied last; otherwise, if the copy
    // fails partway through, the index ends up corrupted.
    if (fname.startsWith("segments_")) {
      segmentsFile = fname;
      continue;
    }
    if (!copyAFile(tmpIdxDir, indexDir, fname, copiedfiles))
      return false;
    copiedfiles.add(fname);
  }
  // copy the segments file last
  if (segmentsFile != null) {
    if (!copyAFile(tmpIdxDir, indexDir, segmentsFile, copiedfiles))
      return false;
  }
  return true;
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void copyTmpConfFiles2Conf(File tmpconfDir) throws IOException {
  File confDir = new File(solrCore.getResourceLoader().getConfigDir());
  for (File file : tmpconfDir.listFiles()) {
    File oldFile = new File(confDir, file.getName());
    if (oldFile.exists()) {
      File backupFile = new File(confDir,
          oldFile.getName() + "." + getDateAsStr(new Date(oldFile.lastModified())));
      boolean status = oldFile.renameTo(backupFile);
      if (!status) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "Unable to rename: " + oldFile + " to: " + backupFile);
      }
    }
    boolean status = file.renameTo(oldFile);
    if (!status) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Unable to rename: " + file + " to: " + oldFile);
    }
  }
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
FastInputStream getStream() throws IOException {
  post = new HttpPost(masterUrl);
  // the method is command=filecontent
  List<BasicNameValuePair> formparams = new ArrayList<BasicNameValuePair>();
  formparams.add(new BasicNameValuePair(COMMAND, CMD_GET_FILE));
  // add the generation to download; this is used to reserve the download
  formparams.add(new BasicNameValuePair(GENERATION, indexGen.toString()));
  if (isConf) {
    // set cf instead of file for a config file
    formparams.add(new BasicNameValuePair(CONF_FILE_SHORT, fileName));
  } else {
    formparams.add(new BasicNameValuePair(FILE, fileName));
  }
  if (useInternal) {
    formparams.add(new BasicNameValuePair(COMPRESSION, "true"));
  }
  if (useExternal) {
    formparams.add(new BasicNameValuePair("Accept-Encoding", "gzip,deflate"));
  }
  // use checksum
  if (this.includeChecksum)
    formparams.add(new BasicNameValuePair(CHECKSUM, "true"));
  // wt=filestream is a custom protocol
  formparams.add(new BasicNameValuePair("wt", FILE_STREAM));
  // If there was a failure, there is a retry; offset=<sizedownloaded> ensures
  // that the server resumes from that offset.
  if (bytesDownloaded > 0) {
    formparams.add(new BasicNameValuePair(OFFSET, "" + bytesDownloaded));
  }
  UrlEncodedFormEntity entity = new UrlEncodedFormEntity(formparams, "UTF-8");
  post.setEntity(entity);
  HttpResponse response = myHttpClient.execute(post);
  InputStream is = response.getEntity().getContent();
  // wrap it using FastInputStream
  if (useInternal) {
    is = new InflaterInputStream(is);
  } else if (useExternal) {
    is = checkCompressed(post, is);
  }
  return new FastInputStream(is);
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private InputStream checkCompressed(AbstractHttpMessage method, InputStream respBody) throws IOException {
  Header contentEncodingHeader = method.getFirstHeader("Content-Encoding");
  if (contentEncodingHeader != null) {
    String contentEncoding = contentEncodingHeader.getValue();
    if (contentEncoding.contains("gzip")) {
      respBody = new GZIPInputStream(respBody);
    } else if (contentEncoding.contains("deflate")) {
      respBody = new InflaterInputStream(respBody);
    }
  } else {
    Header contentTypeHeader = method.getFirstHeader("Content-Type");
    if (contentTypeHeader != null) {
      String contentType = contentTypeHeader.getValue();
      if (contentType != null) {
        if (contentType.startsWith("application/x-gzip-compressed")) {
          respBody = new GZIPInputStream(respBody);
        } else if (contentType.startsWith("application/x-deflate")) {
          respBody = new InflaterInputStream(respBody);
        }
      }
    }
  }
  return respBody;
}
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
Transformer getTransformer(String xslt, SolrQueryRequest request) throws IOException {
  // not the cleanest way to achieve this
  // no need to synchronize access to context, right?
  // Nothing else happens with it at the same time
  final Map<Object, Object> ctx = request.getContext();
  Transformer result = (Transformer) ctx.get(CONTEXT_TRANSFORMER_KEY);
  if (result == null) {
    SolrConfig solrConfig = request.getCore().getSolrConfig();
    result = TransformerProvider.instance.getTransformer(solrConfig, xslt, xsltCacheLifetimeSeconds);
    result.setErrorListener(xmllog);
    ctx.put(CONTEXT_TRANSFORMER_KEY, result);
  }
  return result;
}
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processUpdate(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser)
    throws XMLStreamException, IOException, FactoryConfigurationError,
           InstantiationException, IllegalAccessException, TransformerConfigurationException {
  AddUpdateCommand addCmd = null;
  SolrParams params = req.getParams();
  while (true) {
    int event = parser.next();
    switch (event) {
      case XMLStreamConstants.END_DOCUMENT:
        parser.close();
        return;
      case XMLStreamConstants.START_ELEMENT:
        String currTag = parser.getLocalName();
        if (currTag.equals(UpdateRequestHandler.ADD)) {
          log.trace("SolrCore.update(add)");
          addCmd = new AddUpdateCommand(req);
          // First look for the commitWithin parameter on the request; it will be overwritten for individual <add>'s
          addCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1);
          addCmd.overwrite = params.getBool(UpdateParams.OVERWRITE, true);
          for (int i = 0; i < parser.getAttributeCount(); i++) {
            String attrName = parser.getAttributeLocalName(i);
            String attrVal = parser.getAttributeValue(i);
            if (UpdateRequestHandler.OVERWRITE.equals(attrName)) {
              addCmd.overwrite = StrUtils.parseBoolean(attrVal);
            } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) {
              addCmd.commitWithin = Integer.parseInt(attrVal);
            } else {
              log.warn("Unknown attribute id in add:" + attrName);
            }
          }
        } else if ("doc".equals(currTag)) {
          if (addCmd != null) {
            log.trace("adding doc...");
            addCmd.clear();
            addCmd.solrDoc = readDoc(parser);
            processor.processAdd(addCmd);
          } else {
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                "Unexpected <doc> tag without an <add> tag surrounding it.");
          }
        } else if (UpdateRequestHandler.COMMIT.equals(currTag) || UpdateRequestHandler.OPTIMIZE.equals(currTag)) {
          log.trace("parsing " + currTag);
          CommitUpdateCommand cmd = new CommitUpdateCommand(req, UpdateRequestHandler.OPTIMIZE.equals(currTag));
          ModifiableSolrParams mp = new ModifiableSolrParams();
          for (int i = 0; i < parser.getAttributeCount(); i++) {
            String attrName = parser.getAttributeLocalName(i);
            String attrVal = parser.getAttributeValue(i);
            mp.set(attrName, attrVal);
          }
          RequestHandlerUtils.validateCommitParams(mp);
          SolrParams p = SolrParams.wrapDefaults(mp, req.getParams()); // default to the normal request params for commit options
          RequestHandlerUtils.updateCommit(cmd, p);
          processor.processCommit(cmd);
        } // end commit
        else if (UpdateRequestHandler.ROLLBACK.equals(currTag)) {
          log.trace("parsing " + currTag);
          RollbackUpdateCommand cmd = new RollbackUpdateCommand(req);
          processor.processRollback(cmd);
        } // end rollback
        else if (UpdateRequestHandler.DELETE.equals(currTag)) {
          log.trace("parsing delete");
          processDelete(req, processor, parser);
        } // end delete
        break;
    }
  }
}
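Note the fault classification here: malformed client input (a <doc> outside an <add>) is thrown as BAD_REQUEST, while internal failures elsewhere in the handlers use SERVER_ERROR, so the error code chosen at the throw site determines the HTTP status the client eventually sees. A hedged sketch of that decision, with an illustrative exception type rather than Solr's:

class FaultClassificationSketch {
  static class AppException extends RuntimeException {
    final int httpStatus;
    AppException(int httpStatus, String msg) { super(msg); this.httpStatus = httpStatus; }
  }

  void onDocOutsideAdd() {
    // the client sent invalid XML: report it as a 400-class error, not a 500
    throw new AppException(400, "Unexpected <doc> tag without an <add> tag surrounding it.");
  }
}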
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processDelete(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser)
    throws XMLStreamException, IOException {
  // Parse the command
  DeleteUpdateCommand deleteCmd = new DeleteUpdateCommand(req);
  // First look for the commitWithin parameter on the request; it will be overwritten for individual <delete>'s
  SolrParams params = req.getParams();
  deleteCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1);
  for (int i = 0; i < parser.getAttributeCount(); i++) {
    String attrName = parser.getAttributeLocalName(i);
    String attrVal = parser.getAttributeValue(i);
    if ("fromPending".equals(attrName)) {
      // deprecated
    } else if ("fromCommitted".equals(attrName)) {
      // deprecated
    } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) {
      deleteCmd.commitWithin = Integer.parseInt(attrVal);
    } else {
      log.warn("unexpected attribute delete/@" + attrName);
    }
  }
  StringBuilder text = new StringBuilder();
  while (true) {
    int event = parser.next();
    switch (event) {
      case XMLStreamConstants.START_ELEMENT:
        String mode = parser.getLocalName();
        if (!("id".equals(mode) || "query".equals(mode))) {
          log.warn("unexpected XML tag /delete/" + mode);
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + mode);
        }
        text.setLength(0);
        if ("id".equals(mode)) {
          for (int i = 0; i < parser.getAttributeCount(); i++) {
            String attrName = parser.getAttributeLocalName(i);
            String attrVal = parser.getAttributeValue(i);
            if (UpdateRequestHandler.VERSION.equals(attrName)) {
              deleteCmd.setVersion(Long.parseLong(attrVal));
            }
          }
        }
        break;
      case XMLStreamConstants.END_ELEMENT:
        String currTag = parser.getLocalName();
        if ("id".equals(currTag)) {
          deleteCmd.setId(text.toString());
        } else if ("query".equals(currTag)) {
          deleteCmd.setQuery(text.toString());
        } else if ("delete".equals(currTag)) {
          return;
        } else {
          log.warn("unexpected XML tag /delete/" + currTag);
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + currTag);
        }
        processor.processDelete(deleteCmd);
        deleteCmd.clear();
        break;
      // Add everything to the text
      case XMLStreamConstants.SPACE:
      case XMLStreamConstants.CDATA:
      case XMLStreamConstants.CHARACTERS:
        text.append(parser.getText());
        break;
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
private void parseAndLoadDocs(final SolrQueryRequest req, SolrQueryResponse rsp, InputStream stream,
    final UpdateRequestProcessor processor) throws IOException {
  UpdateRequest update = null;
  JavaBinUpdateRequestCodec.StreamingUpdateHandler handler = new JavaBinUpdateRequestCodec.StreamingUpdateHandler() {
    private AddUpdateCommand addCmd = null;

    @Override
    public void update(SolrInputDocument document, UpdateRequest updateRequest) {
      if (document == null) {
        // Perhaps commit from the parameters
        try {
          RequestHandlerUtils.handleCommit(req, processor, updateRequest.getParams(), false);
          RequestHandlerUtils.handleRollback(req, processor, updateRequest.getParams(), false);
        } catch (IOException e) {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback");
        }
        return;
      }
      if (addCmd == null) {
        addCmd = getAddCommand(req, updateRequest.getParams());
      }
      addCmd.solrDoc = document;
      try {
        processor.processAdd(addCmd);
        addCmd.clear();
      } catch (IOException e) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document);
      }
    }
  };
  FastInputStream in = FastInputStream.wrap(stream);
  for (;;) {
    try {
      update = new JavaBinUpdateRequestCodec().unmarshal(in, handler);
    } catch (EOFException e) {
      break; // this is expected
    } catch (Exception e) {
      log.error("Exception while processing update request", e);
      break;
    }
    if (update.getDeleteById() != null || update.getDeleteQuery() != null) {
      delete(req, update, processor);
    }
  }
}
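The StreamingUpdateHandler callback's update method declares no checked exceptions, so the IOException raised by processAdd cannot propagate as-is; the code above tunnels it through the callback as a runtime SolrException. A minimal sketch of the move (the interface and names are illustrative, not Solr's):

import java.io.IOException;

class CallbackTunnelSketch {
  interface Handler { void update(String doc); } // no "throws IOException" in the signature

  Handler handler = new Handler() {
    @Override
    public void update(String doc) {
      try {
        write(doc);
      } catch (IOException e) {
        // convert checked to unchecked so it can cross the callback boundary
        throw new RuntimeException("ERROR adding document " + doc, e);
      }
    }
  };

  void write(String doc) throws IOException { /* ... */ }
}

Observe that the two wrapping sites in the Solr method build the SolrException from a message only, so the original IOException's stack trace is dropped at this boundary.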
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
private void delete(SolrQueryRequest req, UpdateRequest update, UpdateRequestProcessor processor) throws IOException {
  SolrParams params = update.getParams();
  DeleteUpdateCommand delcmd = new DeleteUpdateCommand(req);
  if (params != null) {
    delcmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1);
  }
  if (update.getDeleteById() != null) {
    for (String s : update.getDeleteById()) {
      delcmd.id = s;
      processor.processDelete(delcmd);
    }
    delcmd.id = null;
  }
  if (update.getDeleteQuery() != null) {
    for (String s : update.getDeleteQuery()) {
      delcmd.query = s;
      processor.processDelete(delcmd);
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
DeleteUpdateCommand parseDelete() throws IOException {
  assertNextEvent(JSONParser.OBJECT_START);
  DeleteUpdateCommand cmd = new DeleteUpdateCommand(req);
  cmd.commitWithin = commitWithin;
  while (true) {
    int ev = parser.nextEvent();
    if (ev == JSONParser.STRING) {
      String key = parser.getString();
      if (parser.wasKey()) {
        if ("id".equals(key)) {
          cmd.setId(parser.getString());
        } else if ("query".equals(key)) {
          cmd.setQuery(parser.getString());
        } else if ("commitWithin".equals(key)) {
          cmd.commitWithin = Integer.parseInt(parser.getString());
        } else {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Unknown key: " + key + " [" + parser.getPosition() + "]");
        }
      } else {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "invalid string: " + key + " at [" + parser.getPosition() + "]");
      }
    } else if (ev == JSONParser.OBJECT_END) {
      if (cmd.getId() == null && cmd.getQuery() == null) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Missing id or query for delete [" + parser.getPosition() + "]");
      }
      return cmd;
    } else {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Got: " + JSONParser.getEventString(ev) + " at [" + parser.getPosition() + "]");
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
RollbackUpdateCommand parseRollback() throws IOException {
  assertNextEvent(JSONParser.OBJECT_START);
  assertNextEvent(JSONParser.OBJECT_END);
  return new RollbackUpdateCommand(req);
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
void parseCommitOptions(CommitUpdateCommand cmd) throws IOException {
  assertNextEvent(JSONParser.OBJECT_START);
  final Map<String, Object> map = (Map) ObjectBuilder.getVal(parser);
  // SolrParams currently expects string values...
  SolrParams p = new SolrParams() {
    @Override
    public String get(String param) {
      Object o = map.get(param);
      return o == null ? null : o.toString();
    }

    @Override
    public String[] getParams(String param) {
      return new String[]{get(param)};
    }

    @Override
    public Iterator<String> getParameterNamesIterator() {
      return map.keySet().iterator();
    }
  };
  RequestHandlerUtils.validateCommitParams(p);
  p = SolrParams.wrapDefaults(p, req.getParams()); // default to the normal request params for commit options
  RequestHandlerUtils.updateCommit(cmd, p);
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
AddUpdateCommand parseAdd() throws IOException {
  AddUpdateCommand cmd = new AddUpdateCommand(req);
  cmd.commitWithin = commitWithin;
  cmd.overwrite = overwrite;
  float boost = 1.0f;
  while (true) {
    int ev = parser.nextEvent();
    if (ev == JSONParser.STRING) {
      if (parser.wasKey()) {
        String key = parser.getString();
        if ("doc".equals(key)) {
          if (cmd.solrDoc != null) {
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "multiple docs in same add command");
          }
          ev = assertNextEvent(JSONParser.OBJECT_START);
          cmd.solrDoc = parseDoc(ev);
        } else if (UpdateRequestHandler.OVERWRITE.equals(key)) {
          cmd.overwrite = parser.getBoolean(); // reads next boolean
        } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(key)) {
          cmd.commitWithin = (int) parser.getLong();
        } else if ("boost".equals(key)) {
          boost = Float.parseFloat(parser.getNumberChars().toString());
        } else {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Unknown key: " + key + " [" + parser.getPosition() + "]");
        }
      } else {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Should be a key at [" + parser.getPosition() + "]");
      }
    } else if (ev == JSONParser.OBJECT_END) {
      if (cmd.solrDoc == null) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "missing solr document. " + parser.getPosition());
      }
      cmd.solrDoc.setDocumentBoost(boost);
      return cmd;
    } else {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Got: " + JSONParser.getEventString(ev) + " at [" + parser.getPosition() + "]");
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
void handleAdds() throws IOException {
  while (true) {
    AddUpdateCommand cmd = new AddUpdateCommand(req);
    cmd.commitWithin = commitWithin;
    cmd.overwrite = overwrite;
    int ev = parser.nextEvent();
    if (ev == JSONParser.ARRAY_END) break;
    assertEvent(ev, JSONParser.OBJECT_START);
    cmd.solrDoc = parseDoc(ev);
    processor.processAdd(cmd);
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
int assertNextEvent(int expected) throws IOException {
  int got = parser.nextEvent();
  assertEvent(got, expected);
  return got;
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private SolrInputDocument parseDoc(int ev) throws IOException {
  assert ev == JSONParser.OBJECT_START;
  SolrInputDocument sdoc = new SolrInputDocument();
  for (;;) {
    SolrInputField sif = parseField();
    if (sif == null) return sdoc;
    SolrInputField prev = sdoc.put(sif.getName(), sif);
    if (prev != null) {
      // blech - repeated keys
      sif.addValue(prev.getValue(), prev.getBoost());
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private SolrInputField parseField() throws IOException {
  int ev = parser.nextEvent();
  if (ev == JSONParser.OBJECT_END) {
    return null;
  }
  String fieldName = parser.getString();
  SolrInputField sif = new SolrInputField(fieldName);
  parseFieldValue(sif);
  return sif;
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private void parseFieldValue(SolrInputField sif) throws IOException {
  int ev = parser.nextEvent();
  if (ev == JSONParser.OBJECT_START) {
    parseExtendedFieldValue(sif, ev);
  } else {
    Object val = parseNormalFieldValue(ev);
    sif.setValue(val, 1.0f);
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private void parseExtendedFieldValue(SolrInputField sif, int ev) throws IOException {
  assert ev == JSONParser.OBJECT_START;
  float boost = 1.0f;
  Object normalFieldValue = null;
  Map<String, Object> extendedInfo = null;
  for (;;) {
    ev = parser.nextEvent();
    switch (ev) {
      case JSONParser.STRING:
        String label = parser.getString();
        if ("boost".equals(label)) {
          ev = parser.nextEvent();
          if (ev != JSONParser.NUMBER && ev != JSONParser.LONG && ev != JSONParser.BIGNUMBER) {
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                "boost should have number! " + JSONParser.getEventString(ev));
          }
          boost = (float) parser.getDouble();
        } else if ("value".equals(label)) {
          normalFieldValue = parseNormalFieldValue(parser.nextEvent());
        } else {
          // If we encounter other unknown map keys, then use a map
          if (extendedInfo == null) {
            extendedInfo = new HashMap<String, Object>(2);
          }
          // for now, the only extended info will be field values;
          // we could either store this as an Object or a SolrInputField
          Object val = parseNormalFieldValue(parser.nextEvent());
          extendedInfo.put(label, val);
        }
        break;
      case JSONParser.OBJECT_END:
        if (extendedInfo != null) {
          if (normalFieldValue != null) {
            extendedInfo.put("value", normalFieldValue);
          }
          sif.setValue(extendedInfo, boost);
        } else {
          sif.setValue(normalFieldValue, boost);
        }
        return;
      default:
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Error parsing JSON extended field value. Unexpected " + JSONParser.getEventString(ev));
    }
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private Object parseNormalFieldValue(int ev) throws IOException {
  if (ev == JSONParser.ARRAY_START) {
    List<Object> val = parseArrayFieldValue(ev);
    return val;
  } else {
    Object val = parseSingleFieldValue(ev);
    return val;
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private Object parseSingleFieldValue(int ev) throws IOException {
  switch (ev) {
    case JSONParser.STRING:
      return parser.getString();
    case JSONParser.LONG:
    case JSONParser.NUMBER:
    case JSONParser.BIGNUMBER:
      return parser.getNumberChars().toString();
    case JSONParser.BOOLEAN:
      return Boolean.toString(parser.getBoolean()); // for legacy reasons, single values are expected to be strings
    case JSONParser.NULL:
      parser.getNull();
      return null;
    case JSONParser.ARRAY_START:
      return parseArrayFieldValue(ev);
    default:
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "Error parsing JSON field value. Unexpected " + JSONParser.getEventString(ev));
  }
}
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private List<Object> parseArrayFieldValue(int ev) throws IOException {
  assert ev == JSONParser.ARRAY_START;
  ArrayList lst = new ArrayList(2);
  for (;;) {
    ev = parser.nextEvent();
    if (ev == JSONParser.ARRAY_END) {
      return lst;
    }
    Object val = parseSingleFieldValue(ev);
    lst.add(val);
  }
}
// in core/src/java/org/apache/solr/handler/loader/CSVLoader.java
@Override
void addDoc(int line, String[] vals) throws IOException {
  templateAdd.clear();
  SolrInputDocument doc = new SolrInputDocument();
  doAdd(line, vals, doc, templateAdd);
}
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
@Override
public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream,
    UpdateRequestProcessor processor) throws IOException {
  errHeader = "CSVLoader: input=" + stream.getSourceInfo();
  Reader reader = null;
  try {
    reader = stream.getReader();
    if (skipLines > 0) {
      if (!(reader instanceof BufferedReader)) {
        reader = new BufferedReader(reader);
      }
      BufferedReader r = (BufferedReader) reader;
      for (int i = 0; i < skipLines; i++) {
        r.readLine();
      }
    }
    CSVParser parser = new CSVParser(reader, strategy);
    // parse the field names from the header of the file
    if (fieldnames == null) {
      fieldnames = parser.getLine();
      if (fieldnames == null) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Expected fieldnames in CSV input");
      }
      prepareFields();
    }
    // read the rest of the CSV file
    for (;;) {
      int line = parser.getLineNumber(); // for error reporting in MT mode
      String[] vals = null;
      try {
        vals = parser.getLine();
      } catch (IOException e) {
        // Catch the exception and rethrow it with more line information
        input_err("can't read line: " + line, null, line, e);
      }
      if (vals == null) break;
      if (vals.length != fields.length) {
        input_err("expected " + fields.length + " values but got " + vals.length, vals, line);
      }
      addDoc(line, vals);
    }
  } finally {
    if (reader != null) {
      IOUtils.closeQuietly(reader);
    }
  }
}
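The inner catch around parser.getLine() exists only to attach position information before rethrowing through input_err; this is the classic "enrich and rethrow" pattern: the caller cannot reconstruct the failing line number, so it is added at the point where it is known. A small, self-contained sketch of the idea (input_err's real behavior lives in Solr; this helper is illustrative):

import java.io.BufferedReader;
import java.io.IOException;

class EnrichAndRethrowSketch {
  static String[] readRecord(BufferedReader r, int lineNo) throws IOException {
    try {
      String line = r.readLine();
      return line == null ? null : line.split(",");
    } catch (IOException e) {
      // rethrow with the context only this frame knows, keeping the cause
      throw new IOException("can't read line: " + lineNo, e);
    }
  }
}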
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
void doAdd(int line, String[] vals, SolrInputDocument doc, AddUpdateCommand template) throws IOException {
  // the line number is passed simply for error reporting in MT mode
  // first, create the lucene document
  for (int i = 0; i < vals.length; i++) {
    if (fields[i] == null) continue; // ignore this field
    String val = vals[i];
    adders[i].add(doc, line, i, val);
  }
  // add any literals
  for (SchemaField sf : literals.keySet()) {
    String fn = sf.getName();
    String val = literals.get(sf);
    doc.addField(fn, val);
  }
  template.solrDoc = doc;
  processor.processAdd(template);
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
private void registerFileStreamResponseWriter() {
  core.registerResponseWriter(FILE_STREAM, new BinaryQueryResponseWriter() {
    public void write(OutputStream out, SolrQueryRequest request, SolrQueryResponse resp) throws IOException {
      FileStream stream = (FileStream) resp.getValues().get(FILE_STREAM);
      stream.write(out);
    }

    public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
      throw new RuntimeException("This is a binary writer; cannot write to a character stream");
    }

    public String getContentType(SolrQueryRequest request, SolrQueryResponse response) {
      return "application/octet-stream";
    }

    public void init(NamedList args) { /* no op */ }
  });
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
public void write(OutputStream out, SolrQueryRequest request, SolrQueryResponse resp) throws IOException {
  FileStream stream = (FileStream) resp.getValues().get(FILE_STREAM);
  stream.write(out);
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException {
  throw new RuntimeException("This is a binary writer; cannot write to a character stream");
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
public void write(OutputStream out) throws IOException {
  String fileName = params.get(FILE);
  String cfileName = params.get(CONF_FILE_SHORT);
  String sOffset = params.get(OFFSET);
  String sLen = params.get(LEN);
  String compress = params.get(COMPRESSION);
  String sChecksum = params.get(CHECKSUM);
  String sGen = params.get(GENERATION);
  if (sGen != null) indexGen = Long.parseLong(sGen);
  if (Boolean.parseBoolean(compress)) {
    fos = new FastOutputStream(new DeflaterOutputStream(out));
  } else {
    fos = new FastOutputStream(out);
  }
  FileInputStream inputStream = null;
  int packetsWritten = 0;
  try {
    long offset = -1;
    int len = -1;
    // check if checksum is requested
    boolean useChecksum = Boolean.parseBoolean(sChecksum);
    if (sOffset != null) offset = Long.parseLong(sOffset);
    if (sLen != null) len = Integer.parseInt(sLen);
    if (fileName == null && cfileName == null) {
      // no filename, do nothing
      writeNothing();
    }
    File file = null;
    if (cfileName != null) {
      // if it is a conf file, read from the config directory
      file = new File(core.getResourceLoader().getConfigDir(), cfileName);
    } else {
      // else read from the index directory
      file = new File(core.getIndexDir(), fileName);
    }
    if (file.exists() && file.canRead()) {
      inputStream = new FileInputStream(file);
      FileChannel channel = inputStream.getChannel();
      // if an offset is given, move the pointer to that point
      if (offset != -1) channel.position(offset);
      byte[] buf = new byte[(len == -1 || len > PACKET_SZ) ? PACKET_SZ : len];
      Checksum checksum = null;
      if (useChecksum) checksum = new Adler32();
      ByteBuffer bb = ByteBuffer.wrap(buf);
      while (true) {
        bb.clear();
        long bytesRead = channel.read(bb);
        if (bytesRead <= 0) {
          writeNothing();
          fos.close();
          break;
        }
        fos.writeInt((int) bytesRead);
        if (useChecksum) {
          checksum.reset();
          checksum.update(buf, 0, (int) bytesRead);
          fos.writeLong(checksum.getValue());
        }
        fos.write(buf, 0, (int) bytesRead);
        fos.flush();
        if (indexGen != null && (packetsWritten % 5 == 0)) {
          // after every 5 packets, reserve the commit point for some time
          delPolicy.setReserveDuration(indexGen, reserveCommitDuration);
        }
        packetsWritten++;
      }
    } else {
      writeNothing();
    }
  } catch (IOException e) {
    LOG.warn("Exception while writing response for params: " + params, e);
  } finally {
    IOUtils.closeQuietly(inputStream);
  }
}
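The catch (IOException e) here logs and returns instead of rethrowing: once file packets have been streamed, the HTTP status line is already on the wire, so the failure can no longer be reported as a clean error response. A condensed sketch of that boundary decision (stream and logger are illustrative):

import java.io.IOException;
import java.io.OutputStream;
import java.util.logging.Logger;

class StreamBoundarySketch {
  private static final Logger LOG = Logger.getLogger("replication");

  void writePackets(OutputStream out, byte[] packet) {
    try {
      out.write(packet);
      out.flush();
    } catch (IOException e) {
      // too late to change the response status: record the failure and stop streaming
      LOG.warning("Exception while writing response: " + e);
    }
  }
}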
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
private void writeNothing() throws IOException {
  fos.writeInt(0);
  fos.flush();
}
// in core/src/java/org/apache/solr/handler/RequestHandlerUtils.java
public static boolean handleCommit(SolrQueryRequest req, UpdateRequestProcessor processor,
    SolrParams params, boolean force) throws IOException {
  if (params == null) {
    params = new MapSolrParams(new HashMap<String, String>());
  }
  boolean optimize = params.getBool(UpdateParams.OPTIMIZE, false);
  boolean commit = params.getBool(UpdateParams.COMMIT, false);
  boolean softCommit = params.getBool(UpdateParams.SOFT_COMMIT, false);
  boolean prepareCommit = params.getBool(UpdateParams.PREPARE_COMMIT, false);
  if (optimize || commit || softCommit || prepareCommit || force) {
    CommitUpdateCommand cmd = new CommitUpdateCommand(req, optimize);
    updateCommit(cmd, params);
    processor.processCommit(cmd);
    return true;
  }
  return false;
}
// in core/src/java/org/apache/solr/handler/RequestHandlerUtils.java
public static void updateCommit(CommitUpdateCommand cmd, SolrParams params) throws IOException {
  if (params == null) return;
  cmd.openSearcher = params.getBool(UpdateParams.OPEN_SEARCHER, cmd.openSearcher);
  cmd.waitSearcher = params.getBool(UpdateParams.WAIT_SEARCHER, cmd.waitSearcher);
  cmd.softCommit = params.getBool(UpdateParams.SOFT_COMMIT, cmd.softCommit);
  cmd.expungeDeletes = params.getBool(UpdateParams.EXPUNGE_DELETES, cmd.expungeDeletes);
  cmd.maxOptimizeSegments = params.getInt(UpdateParams.MAX_OPTIMIZE_SEGMENTS, cmd.maxOptimizeSegments);
  cmd.prepareCommit = params.getBool(UpdateParams.PREPARE_COMMIT, cmd.prepareCommit);
}
// in core/src/java/org/apache/solr/handler/RequestHandlerUtils.java
public static boolean handleRollback(SolrQueryRequest req, UpdateRequestProcessor processor,
    SolrParams params, boolean force) throws IOException {
  if (params == null) {
    params = new MapSolrParams(new HashMap<String, String>());
  }
  boolean rollback = params.getBool(UpdateParams.ROLLBACK, false);
  if (rollback || force) {
    RollbackUpdateCommand cmd = new RollbackUpdateCommand(req);
    processor.processRollback(cmd);
    return true;
  }
  return false;
}
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
public DocListAndSet getMoreLikeThis(int id, int start, int rows, List<Query> filters,
    List<InterestingTerm> terms, int flags) throws IOException {
  Document doc = reader.document(id);
  rawMLTQuery = mlt.like(id);
  boostedMLTQuery = getBoostedQuery(rawMLTQuery);
  if (terms != null) {
    fillInterestingTermsFromMLTQuery(rawMLTQuery, terms);
  }
  // exclude current document from results
  realMLTQuery = new BooleanQuery();
  realMLTQuery.add(boostedMLTQuery, BooleanClause.Occur.MUST);
  realMLTQuery.add(new TermQuery(new Term(uniqueKeyField.getName(),
      uniqueKeyField.getType().storedToIndexed(doc.getField(uniqueKeyField.getName())))),
      BooleanClause.Occur.MUST_NOT);
  DocListAndSet results = new DocListAndSet();
  if (this.needDocSet) {
    results = searcher.getDocListAndSet(realMLTQuery, filters, null, start, rows, flags);
  } else {
    results.docList = searcher.getDocList(realMLTQuery, filters, null, start, rows, flags);
  }
  return results;
}
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
public DocListAndSet getMoreLikeThis(Reader reader, int start, int rows, List<Query> filters,
    List<InterestingTerm> terms, int flags) throws IOException {
  // analyzing with the first field: previous (stupid) behavior
  rawMLTQuery = mlt.like(reader, mlt.getFieldNames()[0]);
  boostedMLTQuery = getBoostedQuery(rawMLTQuery);
  if (terms != null) {
    fillInterestingTermsFromMLTQuery(boostedMLTQuery, terms);
  }
  DocListAndSet results = new DocListAndSet();
  if (this.needDocSet) {
    results = searcher.getDocListAndSet(boostedMLTQuery, filters, null, start, rows, flags);
  } else {
    results.docList = searcher.getDocList(boostedMLTQuery, filters, null, start, rows, flags);
  }
  return results;
}
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
@Deprecated
public NamedList<DocList> getMoreLikeThese(DocList docs, int rows, int flags) throws IOException {
  IndexSchema schema = searcher.getSchema();
  NamedList<DocList> mlt = new SimpleOrderedMap<DocList>();
  DocIterator iterator = docs.iterator();
  while (iterator.hasNext()) {
    int id = iterator.nextDoc();
    DocListAndSet sim = getMoreLikeThis(id, 0, rows, null, null, flags);
    String name = schema.printableUniqueKey(reader.document(id));
    mlt.add(name, sim.docList);
  }
  return mlt;
}
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
@Override
public void prepare(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  if (params.getBool(TermsParams.TERMS, false)) {
    rb.doTerms = true;
  }
  // TODO: temporary... this should go in a different component.
  String shards = params.get(ShardParams.SHARDS);
  if (shards != null) {
    rb.isDistrib = true;
    if (params.get(ShardParams.SHARDS_QT) == null) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No shards.qt parameter specified");
    }
    List<String> lst = StrUtils.splitSmart(shards, ",", true);
    rb.shards = lst.toArray(new String[lst.size()]);
  }
}
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
@Override
public void process(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  if (!params.getBool(TermsParams.TERMS, false)) return;

  String[] fields = params.getParams(TermsParams.TERMS_FIELD);
  NamedList<Object> termsResult = new SimpleOrderedMap<Object>();
  rb.rsp.add("terms", termsResult);
  if (fields == null || fields.length == 0) return;

  int limit = params.getInt(TermsParams.TERMS_LIMIT, 10);
  if (limit < 0) {
    limit = Integer.MAX_VALUE;
  }
  String lowerStr = params.get(TermsParams.TERMS_LOWER);
  String upperStr = params.get(TermsParams.TERMS_UPPER);
  boolean upperIncl = params.getBool(TermsParams.TERMS_UPPER_INCLUSIVE, false);
  boolean lowerIncl = params.getBool(TermsParams.TERMS_LOWER_INCLUSIVE, true);
  boolean sort = !TermsParams.TERMS_SORT_INDEX.equals(
      params.get(TermsParams.TERMS_SORT, TermsParams.TERMS_SORT_COUNT));
  int freqmin = params.getInt(TermsParams.TERMS_MINCOUNT, 1);
  int freqmax = params.getInt(TermsParams.TERMS_MAXCOUNT, UNLIMITED_MAX_COUNT);
  if (freqmax < 0) {
    freqmax = Integer.MAX_VALUE;
  }
  String prefix = params.get(TermsParams.TERMS_PREFIX_STR);
  String regexp = params.get(TermsParams.TERMS_REGEXP_STR);
  Pattern pattern = regexp != null ? Pattern.compile(regexp, resolveRegexpFlags(params)) : null;
  boolean raw = params.getBool(TermsParams.TERMS_RAW, false);

  final AtomicReader indexReader = rb.req.getSearcher().getAtomicReader();
  Fields lfields = indexReader.fields();

  for (String field : fields) {
    NamedList<Integer> fieldTerms = new NamedList<Integer>();
    termsResult.add(field, fieldTerms);
    Terms terms = lfields == null ? null : lfields.terms(field);
    if (terms == null) {
      // no terms for this field
      continue;
    }
    FieldType ft = raw ? null : rb.req.getSchema().getFieldTypeNoEx(field);
    if (ft == null) ft = new StrField();
    // prefix must currently be text
    BytesRef prefixBytes = prefix == null ? null : new BytesRef(prefix);
    BytesRef upperBytes = null;
    if (upperStr != null) {
      upperBytes = new BytesRef();
      ft.readableToIndexed(upperStr, upperBytes);
    }
    BytesRef lowerBytes;
    if (lowerStr == null) {
      // If no lower bound was specified, use the prefix
      lowerBytes = prefixBytes;
    } else {
      lowerBytes = new BytesRef();
      if (raw) {
        // TODO: how to handle binary? perhaps we don't for "raw"... or if the field exists
        // perhaps we detect if the FieldType is non-character and expect hex if so?
        lowerBytes = new BytesRef(lowerStr);
      } else {
        lowerBytes = new BytesRef();
        ft.readableToIndexed(lowerStr, lowerBytes);
      }
    }
    TermsEnum termsEnum = terms.iterator(null);
    BytesRef term = null;
    if (lowerBytes != null) {
      if (termsEnum.seekCeil(lowerBytes, true) == TermsEnum.SeekStatus.END) {
        termsEnum = null;
      } else {
        term = termsEnum.term();
        // Only advance the enum if we are excluding the lower bound and the lower Term actually matches
        if (lowerIncl == false && term.equals(lowerBytes)) {
          term = termsEnum.next();
        }
      }
    } else {
      // position termsEnum on first term
      term = termsEnum.next();
    }
    int i = 0;
    BoundedTreeSet<CountPair<BytesRef, Integer>> queue =
        (sort ? new BoundedTreeSet<CountPair<BytesRef, Integer>>(limit) : null);
    CharsRef external = new CharsRef();
    while (term != null && (i < limit || sort)) {
      boolean externalized = false; // did we fill in "external" yet for this term?
      // stop if the prefix doesn't match
      if (prefixBytes != null && !StringHelper.startsWith(term, prefixBytes)) break;
      if (pattern != null) {
        // indexed text or external text?
        // TODO: support "raw" mode?
        ft.indexedToReadable(term, external);
        externalized = true;
        if (!pattern.matcher(external).matches()) {
          term = termsEnum.next();
          continue;
        }
      }
      if (upperBytes != null) {
        int upperCmp = term.compareTo(upperBytes);
        // if we are past the upper term, or equal to it (when not including the upper bound), then stop
        if (upperCmp > 0 || (upperCmp == 0 && !upperIncl)) break;
      }
      // This is a good term in the range. Check if mincount/maxcount conditions are satisfied.
      int docFreq = termsEnum.docFreq();
      if (docFreq >= freqmin && docFreq <= freqmax) {
        // add the term to the list
        if (sort) {
          queue.add(new CountPair<BytesRef, Integer>(BytesRef.deepCopyOf(term), docFreq));
        } else {
          // TODO: handle raw somehow
          if (!externalized) {
            ft.indexedToReadable(term, external);
          }
          fieldTerms.add(external.toString(), docFreq);
          i++;
        }
      }
      term = termsEnum.next();
    }
    if (sort) {
      for (CountPair<BytesRef, Integer> item : queue) {
        if (i >= limit) break;
        ft.indexedToReadable(item.key, external);
        fieldTerms.add(external.toString(), item.val);
        i++;
      }
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { if (!rb.doTerms) { return ResponseBuilder.STAGE_DONE; } if (rb.stage == ResponseBuilder.STAGE_EXECUTE_QUERY) { TermsHelper th = rb._termsHelper; if (th == null) { th = rb._termsHelper = new TermsHelper(); th.init(rb.req.getParams()); } ShardRequest sreq = createShardQuery(rb.req.getParams()); rb.addRequest(this, sreq); } if (rb.stage < ResponseBuilder.STAGE_EXECUTE_QUERY) { return ResponseBuilder.STAGE_EXECUTE_QUERY; } else { return ResponseBuilder.STAGE_DONE; } }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
private Map<String, ElevationObj> loadElevationMap(Config cfg) throws IOException { XPath xpath = XPathFactory.newInstance().newXPath(); Map<String, ElevationObj> map = new HashMap<String, ElevationObj>(); NodeList nodes = (NodeList) cfg.evaluate("elevate/query", XPathConstants.NODESET); for (int i = 0; i < nodes.getLength(); i++) { Node node = nodes.item(i); String qstr = DOMUtil.getAttr(node, "text", "missing query 'text'"); NodeList children = null; try { children = (NodeList) xpath.evaluate("doc", node, XPathConstants.NODESET); } catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "query requires '<doc .../>' child"); } ArrayList<String> include = new ArrayList<String>(); ArrayList<String> exclude = new ArrayList<String>(); for (int j = 0; j < children.getLength(); j++) { Node child = children.item(j); String id = DOMUtil.getAttr(child, "id", "missing 'id'"); String e = DOMUtil.getAttr(child, EXCLUDE, null); if (e != null) { if (Boolean.valueOf(e)) { exclude.add(id); continue; } } include.add(id); } ElevationObj elev = new ElevationObj(qstr, include, exclude); if (map.containsKey(elev.analyzed)) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Boosting query defined twice for query: '" + elev.text + "' (" + elev.analyzed + "')"); } map.put(elev.analyzed, elev); } return map; }
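loadElevationMap is a typical instance of the catch-and-wrap flow that dominates this codebase: a checked library exception (XPathExpressionException) is caught and replaced by the unchecked domain exception SolrException carrying an ErrorCode. A minimal sketch of the idiom follows; the class and method names are illustrative, not from the application, and note that, as in the code above, the caught exception is not attached as a cause, so the original stack trace is lost.

import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathExpressionException;

import org.apache.solr.common.SolrException;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

class ElevateConfigSketch {
  // Translate the checked XPath exception into the unchecked domain exception.
  NodeList childDocs(XPath xpath, Node node) {
    try {
      return (NodeList) xpath.evaluate("doc", node, XPathConstants.NODESET);
    } catch (XPathExpressionException e) {
      // 'e' is dropped, exactly as in the snippet above; passing it along
      // as a cause would preserve the XPath stack trace.
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "query requires '<doc .../>' child");
    }
  }
}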
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
void setTopQueryResults(IndexReader reader, String query, String[] ids, String[] ex) throws IOException { if (ids == null) { ids = new String[0]; } if (ex == null) { ex = new String[0]; } Map<String, ElevationObj> elev = elevationCache.get(reader); if (elev == null) { elev = new HashMap<String, ElevationObj>(); elevationCache.put(reader, elev); } ElevationObj obj = new ElevationObj(query, Arrays.asList(ids), Arrays.asList(ex)); elev.put(obj.analyzed, obj); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
String getAnalyzedQuery(String query) throws IOException { if (analyzer == null) { return query; } StringBuilder norm = new StringBuilder(); TokenStream tokens = analyzer.tokenStream("", new StringReader(query)); tokens.reset(); CharTermAttribute termAtt = tokens.addAttribute(CharTermAttribute.class); while (tokens.incrementToken()) { norm.append(termAtt.buffer(), 0, termAtt.length()); } tokens.end(); tokens.close(); return norm.toString(); }
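getAnalyzedQuery drives the full TokenStream lifecycle (reset, the incrementToken loop, end, close) but calls close() outside any finally block, so an IOException thrown mid-stream skips the cleanup. A hedged sketch of the same loop with exception-safe cleanup; the wrapper class and method name are ours.

import java.io.IOException;
import java.io.StringReader;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

class AnalyzedQuerySketch {
  String analyze(Analyzer analyzer, String query) throws IOException {
    StringBuilder norm = new StringBuilder();
    TokenStream tokens = analyzer.tokenStream("", new StringReader(query));
    try {
      tokens.reset();
      CharTermAttribute termAtt = tokens.addAttribute(CharTermAttribute.class);
      while (tokens.incrementToken()) {
        norm.append(termAtt.buffer(), 0, termAtt.length());
      }
      tokens.end();
    } finally {
      tokens.close(); // runs even if incrementToken() throws
    }
    return norm.toString();
  }
}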
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrParams params = req.getParams(); // A runtime param can skip if (!params.getBool(QueryElevationParams.ENABLE, true)) { return; } boolean exclusive = params.getBool(QueryElevationParams.EXCLUSIVE, false); // A runtime parameter can alter the config value for forceElevation boolean force = params.getBool(QueryElevationParams.FORCE_ELEVATION, forceElevation); boolean markExcludes = params.getBool(QueryElevationParams.MARK_EXCLUDES, false); Query query = rb.getQuery(); String qstr = rb.getQueryString(); if (query == null || qstr == null) { return; } qstr = getAnalyzedQuery(qstr); IndexReader reader = req.getSearcher().getIndexReader(); ElevationObj booster = null; try { booster = getElevationMap(reader, req.getCore()).get(qstr); } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex); } if (booster != null) { rb.req.getContext().put(BOOSTED, booster.ids); // Change the query to insert forced documents if (exclusive == true) { //we only want these results rb.setQuery(booster.include); } else { BooleanQuery newq = new BooleanQuery(true); newq.add(query, BooleanClause.Occur.SHOULD); newq.add(booster.include, BooleanClause.Occur.SHOULD); if (booster.exclude != null) { if (markExcludes == false) { for (TermQuery tq : booster.exclude) { newq.add(new BooleanClause(tq, BooleanClause.Occur.MUST_NOT)); } } else { //we are only going to mark items as excluded, not actually exclude them. This works //with the EditorialMarkerFactory rb.req.getContext().put(EXCLUDED, booster.excludeIds); for (TermQuery tq : booster.exclude) { newq.add(new BooleanClause(tq, BooleanClause.Occur.SHOULD)); } } } rb.setQuery(newq); } ElevationComparatorSource comparator = new ElevationComparatorSource(booster); // if the sort is 'score desc' use a custom sorting method to // insert documents in their proper place SortSpec sortSpec = rb.getSortSpec(); if (sortSpec.getSort() == null) { sortSpec.setSort(new Sort(new SortField[]{ new SortField("_elevate_", comparator, true), new SortField(null, SortField.Type.SCORE, false) })); } else { // Check if the sort is based on score boolean modify = false; SortField[] current = sortSpec.getSort().getSort(); ArrayList<SortField> sorts = new ArrayList<SortField>(current.length + 1); // Perhaps force it to always sort by score if (force && current[0].getType() != SortField.Type.SCORE) { sorts.add(new SortField("_elevate_", comparator, true)); modify = true; } for (SortField sf : current) { if (sf.getType() == SortField.Type.SCORE) { sorts.add(new SortField("_elevate_", comparator, !sf.getReverse())); modify = true; } sorts.add(sf); } if (modify) { sortSpec.setSort(new Sort(sorts.toArray(new SortField[sorts.size()]))); } } } // Add debugging information if (rb.isDebug()) { List<String> match = null; if (booster != null) { // Extract the elevated terms into a list match = new ArrayList<String>(booster.priority.size()); for (Object o : booster.include.clauses()) { TermQuery tq = (TermQuery) ((BooleanClause) o).getQuery(); match.add(tq.getTerm().text()); } } SimpleOrderedMap<Object> dbg = new SimpleOrderedMap<Object>(); dbg.add("q", qstr); dbg.add("match", match); if (rb.isDebugQuery()) { rb.addDebugInfo("queryBoosting", dbg); } } }
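Unlike the XPath catch in loadElevationMap, the catch (Exception ex) here wraps the failure with its cause attached, via the three-argument SolrException constructor, so the stack trace survives the translation to a 500. Reduced to its essentials in the sketch below; the Loader interface is a stand-in for getElevationMap(reader, req.getCore()).get(qstr).

import org.apache.solr.common.SolrException;

class ElevationLoadSketch {
  // Hypothetical stand-in for the elevation-map lookup in prepare().
  interface Loader { Object load() throws Exception; }

  Object loadOrServerError(Loader loader) {
    try {
      return loader.load();
    } catch (Exception ex) {
      // broad catch: any checked or unchecked failure becomes a 500,
      // with 'ex' kept as the cause
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Error loading elevation", ex);
    }
  }
}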
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { // Do nothing -- the real work is modifying the input query }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public FieldComparator<Integer> newComparator(String fieldname, final int numHits, int sortPos, boolean reversed) throws IOException { return new FieldComparator<Integer>() { private final int[] values = new int[numHits]; private int bottomVal; private TermsEnum termsEnum; private DocsEnum docsEnum; Set<String> seen = new HashSet<String>(elevations.ids.size()); @Override public int compare(int slot1, int slot2) { return values[slot1] - values[slot2]; // values will be small enough that there is no overflow concern } @Override public void setBottom(int slot) { bottomVal = values[slot]; } private int docVal(int doc) throws IOException { if (ordSet.size() > 0) { int slot = ordSet.find(doc); if (slot >= 0) { BytesRef id = termValues[slot]; Integer prio = elevations.priority.get(id); return prio == null ? 0 : prio.intValue(); } } return 0; } @Override public int compareBottom(int doc) throws IOException { return bottomVal - docVal(doc); } @Override public void copy(int slot, int doc) throws IOException { values[slot] = docVal(doc); } @Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { //convert the ids to Lucene doc ids, the ordSet and termValues needs to be the same size as the number of elevation docs we have ordSet.clear(); Fields fields = context.reader().fields(); if (fields == null) return this; Terms terms = fields.terms(idField); if (terms == null) return this; termsEnum = terms.iterator(termsEnum); BytesRef term = new BytesRef(); Bits liveDocs = context.reader().getLiveDocs(); for (String id : elevations.ids) { term.copyChars(id); if (seen.contains(id) == false && termsEnum.seekExact(term, false)) { docsEnum = termsEnum.docs(liveDocs, docsEnum, false); if (docsEnum != null) { int docId = docsEnum.nextDoc(); if (docId == DocIdSetIterator.NO_MORE_DOCS ) continue; // must have been deleted termValues[ordSet.put(docId)] = BytesRef.deepCopyOf(term); seen.add(id); assert docsEnum.nextDoc() == DocIdSetIterator.NO_MORE_DOCS; } } } return this; } @Override public Integer value(int slot) { return values[slot]; } @Override public int compareDocToValue(int doc, Integer valueObj) throws IOException { final int value = valueObj.intValue(); final int docValue = docVal(doc); return docValue - value; // values will be small enough that there is no overflow concern } }; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
private int docVal(int doc) throws IOException { if (ordSet.size() > 0) { int slot = ordSet.find(doc); if (slot >= 0) { BytesRef id = termValues[slot]; Integer prio = elevations.priority.get(id); return prio == null ? 0 : prio.intValue(); } } return 0; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public int compareBottom(int doc) throws IOException { return bottomVal - docVal(doc); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public void copy(int slot, int doc) throws IOException { values[slot] = docVal(doc); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { //convert the ids to Lucene doc ids, the ordSet and termValues needs to be the same size as the number of elevation docs we have ordSet.clear(); Fields fields = context.reader().fields(); if (fields == null) return this; Terms terms = fields.terms(idField); if (terms == null) return this; termsEnum = terms.iterator(termsEnum); BytesRef term = new BytesRef(); Bits liveDocs = context.reader().getLiveDocs(); for (String id : elevations.ids) { term.copyChars(id); if (seen.contains(id) == false && termsEnum.seekExact(term, false)) { docsEnum = termsEnum.docs(liveDocs, docsEnum, false); if (docsEnum != null) { int docId = docsEnum.nextDoc(); if (docId == DocIdSetIterator.NO_MORE_DOCS ) continue; // must have been deleted termValues[ordSet.put(docId)] = BytesRef.deepCopyOf(term); seen.add(id); assert docsEnum.nextDoc() == DocIdSetIterator.NO_MORE_DOCS; } } } return this; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
@Override public int compareDocToValue(int doc, Integer valueObj) throws IOException { final int value = valueObj.intValue(); final int docValue = docVal(doc); return docValue - value; // values will be small enough that there is no overflow concern }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
private Collection<Token> getTokens(String q, Analyzer analyzer) throws IOException { Collection<Token> result = new ArrayList<Token>(); assert analyzer != null; TokenStream ts = analyzer.tokenStream("", new StringReader(q)); ts.reset(); // TODO: support custom attributes CharTermAttribute termAtt = ts.addAttribute(CharTermAttribute.class); OffsetAttribute offsetAtt = ts.addAttribute(OffsetAttribute.class); TypeAttribute typeAtt = ts.addAttribute(TypeAttribute.class); FlagsAttribute flagsAtt = ts.addAttribute(FlagsAttribute.class); PayloadAttribute payloadAtt = ts.addAttribute(PayloadAttribute.class); PositionIncrementAttribute posIncAtt = ts.addAttribute(PositionIncrementAttribute.class); while (ts.incrementToken()){ Token token = new Token(); token.copyBuffer(termAtt.buffer(), 0, termAtt.length()); token.setOffset(offsetAtt.startOffset(), offsetAtt.endOffset()); token.setType(typeAtt.type()); token.setFlags(flagsAtt.getFlags()); token.setPayload(payloadAtt.getPayload()); token.setPositionIncrement(posIncAtt.getPositionIncrement()); result.add(token); } ts.end(); ts.close(); return result; }
// in core/src/java/org/apache/solr/handler/component/DebugComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { }
// in core/src/java/org/apache/solr/handler/component/PivotFacetHelper.java
public SimpleOrderedMap<List<NamedList<Object>>> process(ResponseBuilder rb, SolrParams params, String[] pivots) throws IOException { if (!rb.doFacets || pivots == null) return null; int minMatch = params.getInt( FacetParams.FACET_PIVOT_MINCOUNT, 1 ); SimpleOrderedMap<List<NamedList<Object>>> pivotResponse = new SimpleOrderedMap<List<NamedList<Object>>>(); for (String pivot : pivots) { String[] fields = pivot.split(","); // only support two levels for now if( fields.length < 2 ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Pivot Facet needs at least two fields: "+pivot ); } DocSet docs = rb.getResults().docSet; String field = fields[0]; String subField = fields[1]; Deque<String> fnames = new LinkedList<String>(); for( int i=fields.length-1; i>1; i-- ) { fnames.push( fields[i] ); } SimpleFacets sf = getFacetImplementation(rb.req, rb.getResults().docSet, rb.req.getParams()); NamedList<Integer> superFacets = sf.getTermCounts(field); pivotResponse.add(pivot, doPivots(superFacets, field, subField, fnames, rb, docs, minMatch)); } return pivotResponse; }
// in core/src/java/org/apache/solr/handler/component/PivotFacetHelper.java
protected List<NamedList<Object>> doPivots( NamedList<Integer> superFacets, String field, String subField, Deque<String> fnames, ResponseBuilder rb, DocSet docs, int minMatch ) throws IOException { SolrIndexSearcher searcher = rb.req.getSearcher(); // TODO: optimize to avoid converting to an external string and then having to convert back to internal below SchemaField sfield = searcher.getSchema().getField(field); FieldType ftype = sfield.getType(); String nextField = fnames.poll(); List<NamedList<Object>> values = new ArrayList<NamedList<Object>>( superFacets.size() ); for (Map.Entry<String, Integer> kv : superFacets) { // Only sub-facet if parent facet has positive count - still may not be any values for the sub-field though if (kv.getValue() >= minMatch ) { // don't reuse the same BytesRef each time since we will be constructing Term // objects that will most likely be cached. BytesRef termval = new BytesRef(); ftype.readableToIndexed(kv.getKey(), termval); SimpleOrderedMap<Object> pivot = new SimpleOrderedMap<Object>(); pivot.add( "field", field ); pivot.add( "value", ftype.toObject(sfield, termval) ); pivot.add( "count", kv.getValue() ); if( subField == null ) { values.add( pivot ); } else { Query query = new TermQuery(new Term(field, termval)); DocSet subset = searcher.getDocSet(query, docs); SimpleFacets sf = getFacetImplementation(rb.req, subset, rb.req.getParams()); NamedList<Integer> nl = sf.getTermCounts(subField); if (nl.size() >= minMatch ) { pivot.add( "pivot", doPivots( nl, subField, nextField, fnames, rb, subset, minMatch ) ); values.add( pivot ); // only add response if there are some counts } } } } // put the field back on the list fnames.push( nextField ); return values; }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } SolrQueryResponse rsp = rb.rsp; // Set field flags ReturnFields returnFields = new ReturnFields( req ); rsp.setReturnFields( returnFields ); int flags = 0; if (returnFields.wantsScore()) { flags |= SolrIndexSearcher.GET_SCORES; } rb.setFieldFlags( flags ); String defType = params.get(QueryParsing.DEFTYPE,QParserPlugin.DEFAULT_QTYPE); // get it from the response builder to give a different component a chance // to set it. String queryString = rb.getQueryString(); if (queryString == null) { // this is the normal way it's set. queryString = params.get( CommonParams.Q ); rb.setQueryString(queryString); } try { QParser parser = QParser.getParser(rb.getQueryString(), defType, req); Query q = parser.getQuery(); if (q == null) { // normalize a null query to a query that matches nothing q = new BooleanQuery(); } rb.setQuery( q ); rb.setSortSpec( parser.getSort(true) ); rb.setQparser(parser); rb.setScoreDoc(parser.getPaging()); String[] fqs = req.getParams().getParams(CommonParams.FQ); if (fqs!=null && fqs.length!=0) { List<Query> filters = rb.getFilters(); if (filters==null) { filters = new ArrayList<Query>(fqs.length); } for (String fq : fqs) { if (fq != null && fq.trim().length()!=0) { QParser fqp = QParser.getParser(fq, null, req); filters.add(fqp.getQuery()); } } // only set the filters if they are not empty otherwise // fq=&someotherParam= will trigger all docs filter for every request // if filter cache is disabled if (!filters.isEmpty()) { rb.setFilters( filters ); } } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } boolean grouping = params.getBool(GroupParams.GROUP, false); if (!grouping) { return; } SolrIndexSearcher.QueryCommand cmd = rb.getQueryCommand(); SolrIndexSearcher searcher = rb.req.getSearcher(); GroupingSpecification groupingSpec = new GroupingSpecification(); rb.setGroupingSpec(groupingSpec); //TODO: move weighting of sort Sort groupSort = searcher.weightSort(cmd.getSort()); if (groupSort == null) { groupSort = Sort.RELEVANCE; } // groupSort defaults to sort String groupSortStr = params.get(GroupParams.GROUP_SORT); //TODO: move weighting of sort Sort sortWithinGroup = groupSortStr == null ? groupSort : searcher.weightSort(QueryParsing.parseSort(groupSortStr, req)); if (sortWithinGroup == null) { sortWithinGroup = Sort.RELEVANCE; } groupingSpec.setSortWithinGroup(sortWithinGroup); groupingSpec.setGroupSort(groupSort); String formatStr = params.get(GroupParams.GROUP_FORMAT, Grouping.Format.grouped.name()); Grouping.Format responseFormat; try { responseFormat = Grouping.Format.valueOf(formatStr); } catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); } groupingSpec.setResponseFormat(responseFormat); groupingSpec.setFields(params.getParams(GroupParams.GROUP_FIELD)); groupingSpec.setQueries(params.getParams(GroupParams.GROUP_QUERY)); groupingSpec.setFunctions(params.getParams(GroupParams.GROUP_FUNC)); groupingSpec.setGroupOffset(params.getInt(GroupParams.GROUP_OFFSET, 0)); groupingSpec.setGroupLimit(params.getInt(GroupParams.GROUP_LIMIT, 1)); groupingSpec.setOffset(rb.getSortSpec().getOffset()); groupingSpec.setLimit(rb.getSortSpec().getCount()); groupingSpec.setIncludeGroupCount(params.getBool(GroupParams.GROUP_TOTAL_COUNT, false)); groupingSpec.setMain(params.getBool(GroupParams.GROUP_MAIN, false)); groupingSpec.setNeedScore((cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0); groupingSpec.setTruncateGroups(params.getBool(GroupParams.GROUP_TRUNCATE, false)); }
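prepare translates both ParseException and IllegalArgumentException into SolrException with BAD_REQUEST: failures caused by client input become HTTP 400, while SERVER_ERROR (500) stays reserved for internal faults. A condensed sketch of the enum-parameter half of that pattern; the Format enum is a stand-in for Grouping.Format, and the "group.format" literal for the GroupParams.GROUP_FORMAT constant.

import org.apache.solr.common.SolrException;

class ParamParseSketch {
  enum Format { grouped, simple } // stand-in for Grouping.Format

  // A client-supplied value that fails to parse becomes an HTTP 400,
  // mirroring the IllegalArgumentException handling in prepare() above.
  static Format parseFormat(String formatStr) {
    try {
      return Format.valueOf(formatStr);
    } catch (IllegalArgumentException e) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          String.format("Illegal %s parameter", "group.format"));
    }
  }
}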
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } SolrIndexSearcher searcher = req.getSearcher(); if (rb.getQueryCommand().getOffset() < 0) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "'start' parameter cannot be negative"); } // -1 as flag if not set. long timeAllowed = (long)params.getInt( CommonParams.TIME_ALLOWED, -1 ); // Optional: This could also be implemented by the top-level searcher sending // a filter that lists the ids... that would be transparent to // the request handler, but would be more expensive (and would preserve score // too if desired). String ids = params.get(ShardParams.IDS); if (ids != null) { SchemaField idField = req.getSchema().getUniqueKeyField(); List<String> idArr = StrUtils.splitSmart(ids, ",", true); int[] luceneIds = new int[idArr.size()]; int docs = 0; for (int i=0; i<idArr.size(); i++) { int id = req.getSearcher().getFirstMatch( new Term(idField.getName(), idField.getType().toInternal(idArr.get(i)))); if (id >= 0) luceneIds[docs++] = id; } DocListAndSet res = new DocListAndSet(); res.docList = new DocSlice(0, docs, luceneIds, null, docs, 0); if (rb.isNeedDocSet()) { // TODO: create a cache for this! List<Query> queries = new ArrayList<Query>(); queries.add(rb.getQuery()); List<Query> filters = rb.getFilters(); if (filters != null) queries.addAll(filters); res.docSet = searcher.getDocSet(queries); } rb.setResults(res); ResultContext ctx = new ResultContext(); ctx.docs = rb.getResults().docList; ctx.query = null; // anything? rsp.add("response", ctx); return; } SolrIndexSearcher.QueryCommand cmd = rb.getQueryCommand(); cmd.setTimeAllowed(timeAllowed); SolrIndexSearcher.QueryResult result = new SolrIndexSearcher.QueryResult(); // // grouping / field collapsing // GroupingSpecification groupingSpec = rb.getGroupingSpec(); if (groupingSpec != null) { try { boolean needScores = (cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0; if (params.getBool(GroupParams.GROUP_DISTRIBUTED_FIRST, false)) { CommandHandler.Builder topsGroupsActionBuilder = new CommandHandler.Builder() .setQueryCommand(cmd) .setNeedDocSet(false) // Order matters here .setIncludeHitCount(true) .setSearcher(searcher); for (String field : groupingSpec.getFields()) { topsGroupsActionBuilder.addCommandField(new SearchGroupsFieldCommand.Builder() .setField(searcher.getSchema().getField(field)) .setGroupSort(groupingSpec.getGroupSort()) .setTopNGroups(cmd.getOffset() + cmd.getLen()) .setIncludeGroupCount(groupingSpec.isIncludeGroupCount()) .build() ); } CommandHandler commandHandler = topsGroupsActionBuilder.build(); commandHandler.execute(); SearchGroupsResultTransformer serializer = new SearchGroupsResultTransformer(searcher); rsp.add("firstPhase", commandHandler.processResult(result, serializer)); rsp.add("totalHitCount", commandHandler.getTotalHitCount()); rb.setResult(result); return; } else if (params.getBool(GroupParams.GROUP_DISTRIBUTED_SECOND, false)) { CommandHandler.Builder secondPhaseBuilder = new CommandHandler.Builder() .setQueryCommand(cmd) .setTruncateGroups(groupingSpec.isTruncateGroups() && groupingSpec.getFields().length > 0) .setSearcher(searcher); for (String field : groupingSpec.getFields()) { String[] topGroupsParam = params.getParams(GroupParams.GROUP_DISTRIBUTED_TOPGROUPS_PREFIX + field); if (topGroupsParam == null) { topGroupsParam = new String[0]; } List<SearchGroup<BytesRef>> topGroups = new ArrayList<SearchGroup<BytesRef>>(topGroupsParam.length); for (String topGroup : topGroupsParam) { SearchGroup<BytesRef> searchGroup = new SearchGroup<BytesRef>(); if (!topGroup.equals(TopGroupsShardRequestFactory.GROUP_NULL_VALUE)) { searchGroup.groupValue = new BytesRef(searcher.getSchema().getField(field).getType().readableToIndexed(topGroup)); } topGroups.add(searchGroup); } secondPhaseBuilder.addCommandField( new TopGroupsFieldCommand.Builder() .setField(searcher.getSchema().getField(field)) .setGroupSort(groupingSpec.getGroupSort()) .setSortWithinGroup(groupingSpec.getSortWithinGroup()) .setFirstPhaseGroups(topGroups) .setMaxDocPerGroup(groupingSpec.getGroupOffset() + groupingSpec.getGroupLimit()) .setNeedScores(needScores) .setNeedMaxScore(needScores) .build() ); } for (String query : groupingSpec.getQueries()) { secondPhaseBuilder.addCommandField(new QueryCommand.Builder() .setDocsToCollect(groupingSpec.getOffset() + groupingSpec.getLimit()) .setSort(groupingSpec.getGroupSort()) .setQuery(query, rb.req) .setDocSet(searcher) .build() ); } CommandHandler commandHandler = secondPhaseBuilder.build(); commandHandler.execute(); TopGroupsResultTransformer serializer = new TopGroupsResultTransformer(rb); rsp.add("secondPhase", commandHandler.processResult(result, serializer)); rb.setResult(result); return; } int maxDocsPercentageToCache = params.getInt(GroupParams.GROUP_CACHE_PERCENTAGE, 0); boolean cacheSecondPassSearch = maxDocsPercentageToCache >= 1 && maxDocsPercentageToCache <= 100; Grouping.TotalCount defaultTotalCount = groupingSpec.isIncludeGroupCount() ? Grouping.TotalCount.grouped : Grouping.TotalCount.ungrouped; int limitDefault = cmd.getLen(); // this is normally from "rows" Grouping grouping = new Grouping(searcher, result, cmd, cacheSecondPassSearch, maxDocsPercentageToCache, groupingSpec.isMain()); grouping.setSort(groupingSpec.getGroupSort()) .setGroupSort(groupingSpec.getSortWithinGroup()) .setDefaultFormat(groupingSpec.getResponseFormat()) .setLimitDefault(limitDefault) .setDefaultTotalCount(defaultTotalCount) .setDocsPerGroupDefault(groupingSpec.getGroupLimit()) .setGroupOffsetDefault(groupingSpec.getGroupOffset()) .setGetGroupedDocSet(groupingSpec.isTruncateGroups()); if (groupingSpec.getFields() != null) { for (String field : groupingSpec.getFields()) { grouping.addFieldCommand(field, rb.req); } } if (groupingSpec.getFunctions() != null) { for (String groupByStr : groupingSpec.getFunctions()) { grouping.addFunctionCommand(groupByStr, rb.req); } } if (groupingSpec.getQueries() != null) { for (String groupByStr : groupingSpec.getQueries()) { grouping.addQueryCommand(groupByStr, rb.req); } } if (rb.doHighlights || rb.isDebug() || params.getBool(MoreLikeThisParams.MLT, false)) { // we need a single list of the returned docs cmd.setFlags(SolrIndexSearcher.GET_DOCLIST); } grouping.execute(); if (grouping.isSignalCacheWarning()) { rsp.add( "cacheWarning", String.format("Cache limit of %d percent relative to maxdoc has exceeded. Please increase cache size or disable caching.", maxDocsPercentageToCache) ); } rb.setResult(result); if (grouping.mainResult != null) { ResultContext ctx = new ResultContext(); ctx.docs = grouping.mainResult; ctx.query = null; // TODO? add the query? rsp.add("response", ctx); rsp.getToLog().add("hits", grouping.mainResult.matches()); } else if (!grouping.getCommands().isEmpty()) { // Can never be empty since grouping.execute() checks for this. rsp.add("grouped", result.groupedResults); rsp.getToLog().add("hits", grouping.getCommands().get(0).getMatches()); } return; } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } // normal search result searcher.search(result,cmd); rb.setResult( result ); ResultContext ctx = new ResultContext(); ctx.docs = rb.getResults().docList; ctx.query = rb.getQuery(); rsp.add("response", ctx); rsp.getToLog().add("hits", rb.getResults().docList.matches()); doFieldSortValues(rb, searcher); doPrefetch(rb); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
protected void doFieldSortValues(ResponseBuilder rb, SolrIndexSearcher searcher) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; final CharsRef spare = new CharsRef(); // The query cache doesn't currently store sort field values, and SolrIndexSearcher doesn't // currently have an option to return sort field values. Because of this, we // take the documents given and re-derive the sort values. boolean fsv = req.getParams().getBool(ResponseBuilder.FIELD_SORT_VALUES,false); if(fsv){ Sort sort = searcher.weightSort(rb.getSortSpec().getSort()); SortField[] sortFields = sort==null ? new SortField[]{SortField.FIELD_SCORE} : sort.getSort(); NamedList<Object[]> sortVals = new NamedList<Object[]>(); // order is important for the sort fields Field field = new StringField("dummy", ""); // a dummy Field IndexReaderContext topReaderContext = searcher.getTopReaderContext(); AtomicReaderContext[] leaves = topReaderContext.leaves(); AtomicReaderContext currentLeaf = null; if (leaves.length==1) { // if there is a single segment, use that subReader and avoid looking up each time currentLeaf = leaves[0]; leaves=null; } DocList docList = rb.getResults().docList; // sort ids from lowest to highest so we can access them in order int nDocs = docList.size(); long[] sortedIds = new long[nDocs]; DocIterator it = rb.getResults().docList.iterator(); for (int i=0; i<nDocs; i++) { sortedIds[i] = (((long)it.nextDoc()) << 32) | i; } Arrays.sort(sortedIds); for (SortField sortField: sortFields) { SortField.Type type = sortField.getType(); if (type==SortField.Type.SCORE || type==SortField.Type.DOC) continue; FieldComparator comparator = null; String fieldname = sortField.getField(); FieldType ft = fieldname==null ? null : req.getSchema().getFieldTypeNoEx(fieldname); Object[] vals = new Object[nDocs]; int lastIdx = -1; int idx = 0; for (long idAndPos : sortedIds) { int doc = (int)(idAndPos >>> 32); int position = (int)idAndPos; if (leaves != null) { idx = ReaderUtil.subIndex(doc, leaves); currentLeaf = leaves[idx]; if (idx != lastIdx) { // we switched segments. invalidate comparator. comparator = null; } } if (comparator == null) { comparator = sortField.getComparator(1,0); comparator = comparator.setNextReader(currentLeaf); } doc -= currentLeaf.docBase; // adjust for what segment this is in comparator.copy(0, doc); Object val = comparator.value(0); // Sortable float, double, int, long types all just use a string // comparator. For these, we need to put the type into a readable // format. One reason for this is that XML can't represent all // string values (or even all unicode code points). // indexedToReadable() should be a no-op and should // thus be harmless anyway (for all current ways anyway) if (val instanceof String) { field.setStringValue((String)val); val = ft.toObject(field); } // Must do the same conversion when sorting by a // String field in Lucene, which returns the terms // data as BytesRef: if (val instanceof BytesRef) { UnicodeUtil.UTF8toUTF16((BytesRef)val, spare); field.setStringValue(spare.toString()); val = ft.toObject(field); } vals[position] = val; } sortVals.add(fieldname, vals); } rsp.add("sort_values", sortVals); } }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
protected void doPrefetch(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; //pre-fetch returned documents if (!req.getParams().getBool(ShardParams.IS_SHARD,false) && rb.getResults().docList != null && rb.getResults().docList.size()<=50) { SolrPluginUtils.optimizePreFetchDocs(rb, rb.getResults().docList, rb.getQuery(), req, rsp); } }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { if (rb.grouping()) { return groupedDistributedProcess(rb); } else { return regularDistributedProcess(rb); } }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { if (rb.req.getParams().getBool(FacetParams.FACET,false)) { rb.setNeedDocSet( true ); rb.doFacets = true; } }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { if (rb.doFacets) { SolrParams params = rb.req.getParams(); SimpleFacets f = new SimpleFacets(rb.req, rb.getResults().docSet, params, rb ); NamedList<Object> counts = f.getFacetCounts(); String[] pivots = params.getParams( FacetParams.FACET_PIVOT ); if( pivots != null && pivots.length > 0 ) { NamedList v = pivotHelper.process(rb, params, pivots); if( v != null ) { counts.add( PIVOT_KEY, v ); } } // TODO ???? add this directly to the response, or to the builder? rb.rsp.add( "facet_counts", counts ); } }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { if (!rb.doFacets) { return ResponseBuilder.STAGE_DONE; } if (rb.stage == ResponseBuilder.STAGE_GET_FIELDS) { // overlap facet refinement requests (those shards that we need a count for // particular facet values from), where possible, with // the requests to get fields (because we know that is the // only other required phase). // We do this in distributedProcess so we can look at all of the // requests in the outgoing queue at once. for (int shardNum=0; shardNum<rb.shards.length; shardNum++) { List<String> refinements = null; for (DistribFieldFacet dff : rb._facetInfo.facets.values()) { if (!dff.needRefinements) continue; List<String> refList = dff._toRefine[shardNum]; if (refList == null || refList.size()==0) continue; String key = dff.getKey(); // reuse the same key that was used for the main facet String termsKey = key + "__terms"; String termsVal = StrUtils.join(refList, ','); String facetCommand; // add terms into the original facet.field command // do it via parameter reference to avoid another layer of encoding. String termsKeyEncoded = QueryParsing.encodeLocalParamVal(termsKey); if (dff.localParams != null) { facetCommand = commandPrefix+termsKeyEncoded + " " + dff.facetStr.substring(2); } else { facetCommand = commandPrefix+termsKeyEncoded+'}'+dff.field; } if (refinements == null) { refinements = new ArrayList<String>(); } refinements.add(facetCommand); refinements.add(termsKey); refinements.add(termsVal); } if (refinements == null) continue; String shard = rb.shards[shardNum]; ShardRequest refine = null; boolean newRequest = false; // try to find a request that is already going out to that shard. // If nshards becomes too great, we may want to move to hashing for better // scalability. for (ShardRequest sreq : rb.outgoing) { if ((sreq.purpose & ShardRequest.PURPOSE_GET_FIELDS)!=0 && sreq.shards != null && sreq.shards.length==1 && sreq.shards[0].equals(shard)) { refine = sreq; break; } } if (refine == null) { // we didn't find any other suitable requests going out to that shard, so // create one ourselves. newRequest = true; refine = new ShardRequest(); refine.shards = new String[]{rb.shards[shardNum]}; refine.params = new ModifiableSolrParams(rb.req.getParams()); // don't request any documents refine.params.remove(CommonParams.START); refine.params.set(CommonParams.ROWS,"0"); } refine.purpose |= ShardRequest.PURPOSE_REFINE_FACETS; refine.params.set(FacetParams.FACET, "true"); refine.params.remove(FacetParams.FACET_FIELD); refine.params.remove(FacetParams.FACET_QUERY); for (int i=0; i<refinements.size();) { String facetCommand=refinements.get(i++); String termsKey=refinements.get(i++); String termsVal=refinements.get(i++); refine.params.add(FacetParams.FACET_FIELD, facetCommand); refine.params.set(termsKey, termsVal); } if (newRequest) { rb.addRequest(this, refine); } } } return ResponseBuilder.STAGE_DONE; }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); rb.doHighlights = highlighter.isHighlightingEnabled(params); if(rb.doHighlights){ String hlq = params.get(HighlightParams.Q); if(hlq != null){ try { QParser parser = QParser.getParser(hlq, null, rb.req); rb.setHighlightQuery(parser.getHighlightQuery()); } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } } }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { if (rb.doHighlights) { SolrQueryRequest req = rb.req; SolrParams params = req.getParams(); String[] defaultHighlightFields; //TODO: get from builder by default? if (rb.getQparser() != null) { defaultHighlightFields = rb.getQparser().getDefaultHighlightFields(); } else { defaultHighlightFields = params.getParams(CommonParams.DF); } Query highlightQuery = rb.getHighlightQuery(); if(highlightQuery==null) { if (rb.getQparser() != null) { try { highlightQuery = rb.getQparser().getHighlightQuery(); rb.setHighlightQuery( highlightQuery ); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } else { highlightQuery = rb.getQuery(); rb.setHighlightQuery( highlightQuery ); } } if(highlightQuery != null) { boolean rewrite = !(Boolean.valueOf(params.get(HighlightParams.USE_PHRASE_HIGHLIGHTER, "true")) && Boolean.valueOf(params.get(HighlightParams.HIGHLIGHT_MULTI_TERM, "true"))); highlightQuery = rewrite ? highlightQuery.rewrite(req.getSearcher().getIndexReader()) : highlightQuery; } // No highlighting if there is no query -- consider q.alt="*:* if( highlightQuery != null ) { NamedList sumData = highlighter.doHighlighting( rb.getResults().docList, highlightQuery, req, defaultHighlightFields ); if(sumData != null) { // TODO ???? add this directly to the response? rb.rsp.add("highlighting", sumData); } } } }
// in core/src/java/org/apache/solr/handler/component/SearchComponent.java
public int distributedProcess(ResponseBuilder rb) throws IOException { return ResponseBuilder.STAGE_DONE; }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { if (rb.req.getParams().getBool(StatsParams.STATS,false)) { rb.setNeedDocSet( true ); rb.doStats = true; } }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { if (rb.doStats) { SolrParams params = rb.req.getParams(); SimpleStats s = new SimpleStats(rb.req, rb.getResults().docSet, params ); // TODO ???? add this directly to the response, or to the builder? rb.rsp.add( "stats", s.getStatsCounts() ); } }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { return ResponseBuilder.STAGE_DONE; }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
public NamedList<Object> getStatsCounts() throws IOException { NamedList<Object> res = new SimpleOrderedMap<Object>(); res.add("stats_fields", getStatsFields()); return res; }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
public NamedList<Object> getStatsFields() throws IOException { NamedList<Object> res = new SimpleOrderedMap<Object>(); String[] statsFs = params.getParams(StatsParams.STATS_FIELD); boolean isShard = params.getBool(ShardParams.IS_SHARD, false); if (null != statsFs) { for (String f : statsFs) { String[] facets = params.getFieldParams(f, StatsParams.STATS_FACET); if (facets == null) { facets = new String[0]; // make sure it is something... } SchemaField sf = searcher.getSchema().getField(f); FieldType ft = sf.getType(); NamedList<?> stv; // Currently, only UnInvertedField can deal with multi-part trie fields String prefix = TrieField.getMainValuePrefix(ft); if (sf.multiValued() || ft.multiValuedFieldCache() || prefix!=null) { //use UnInvertedField for multivalued fields UnInvertedField uif = UnInvertedField.getUnInvertedField(f, searcher); stv = uif.getStats(searcher, docs, facets).getStatsValues(); } else { stv = getFieldCacheStats(f, facets); } if (isShard == true || (Long) stv.get("count") > 0) { res.add(f, stv); } else { res.add(f, null); } } } return res; }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { // Set field flags ReturnFields returnFields = new ReturnFields( rb.req ); rb.rsp.setReturnFields( returnFields ); }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } String val = params.get("getVersions"); if (val != null) { processGetVersions(rb); return; } val = params.get("getUpdates"); if (val != null) { processGetUpdates(rb); return; } String id[] = params.getParams("id"); String ids[] = params.getParams("ids"); if (id == null && ids == null) { return; } String[] allIds = id==null ? new String[0] : id; if (ids != null) { List<String> lst = new ArrayList<String>(); for (String s : allIds) { lst.add(s); } for (String idList : ids) { lst.addAll( StrUtils.splitSmart(idList, ",", true) ); } allIds = lst.toArray(new String[lst.size()]); } SchemaField idField = req.getSchema().getUniqueKeyField(); FieldType fieldType = idField.getType(); SolrDocumentList docList = new SolrDocumentList(); UpdateLog ulog = req.getCore().getUpdateHandler().getUpdateLog(); RefCounted<SolrIndexSearcher> searcherHolder = null; DocTransformer transformer = rsp.getReturnFields().getTransformer(); if (transformer != null) { TransformContext context = new TransformContext(); context.req = req; transformer.setContext(context); } try { SolrIndexSearcher searcher = null; BytesRef idBytes = new BytesRef(); for (String idStr : allIds) { fieldType.readableToIndexed(idStr, idBytes); if (ulog != null) { Object o = ulog.lookup(idBytes); if (o != null) { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; assert entry.size() >= 3; int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK; switch (oper) { case UpdateLog.ADD: SolrDocument doc = toSolrDoc((SolrInputDocument)entry.get(entry.size()-1), req.getSchema()); if(transformer!=null) { transformer.transform(doc, -1); // unknown docID } docList.add(doc); break; case UpdateLog.DELETE: break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } continue; } } // didn't find it in the update log, so it should be in the newest searcher opened if (searcher == null) { searcherHolder = req.getCore().getRealtimeSearcher(); searcher = searcherHolder.get(); } // SolrCore.verbose("RealTimeGet using searcher ", searcher); int docid = searcher.getFirstMatch(new Term(idField.getName(), idBytes)); if (docid < 0) continue; Document luceneDocument = searcher.doc(docid); SolrDocument doc = toSolrDoc(luceneDocument, req.getSchema()); if( transformer != null ) { transformer.transform(doc, docid); } docList.add(doc); } } finally { if (searcherHolder != null) { searcherHolder.decref(); } } // if the client specified a single id=foo, then use "doc":{ // otherwise use a standard doclist if (ids == null && allIds.length <= 1) { // if the doc was not found, then use a value of null. rsp.add("doc", docList.size() > 0 ? docList.get(0) : null); } else { docList.setNumFound(docList.size()); rsp.add("response", docList); } }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public static SolrInputDocument getInputDocument(SolrCore core, BytesRef idBytes) throws IOException { SolrInputDocument sid = null; RefCounted<SolrIndexSearcher> searcherHolder = null; try { SolrIndexSearcher searcher = null; UpdateLog ulog = core.getUpdateHandler().getUpdateLog(); if (ulog != null) { Object o = ulog.lookup(idBytes); if (o != null) { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; assert entry.size() >= 3; int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK; switch (oper) { case UpdateLog.ADD: sid = (SolrInputDocument)entry.get(entry.size()-1); break; case UpdateLog.DELETE: return null; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } } } if (sid == null) { // didn't find it in the update log, so it should be in the newest searcher opened if (searcher == null) { searcherHolder = core.getRealtimeSearcher(); searcher = searcherHolder.get(); } // SolrCore.verbose("RealTimeGet using searcher ", searcher); SchemaField idField = core.getSchema().getUniqueKeyField(); int docid = searcher.getFirstMatch(new Term(idField.getName(), idBytes)); if (docid < 0) return null; Document luceneDocument = searcher.doc(docid); sid = toSolrInputDocument(luceneDocument, core.getSchema()); } } finally { if (searcherHolder != null) { searcherHolder.decref(); } } return sid; }
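Both process and getInputDocument follow the same resource discipline around the ref-counted realtime searcher: the holder is released with decref() in a finally block on every path, including early returns and exceptions. The pattern in isolation, as a sketch that uses only calls appearing in the snippets above (the wrapper class and method name are ours):

import java.io.IOException;

import org.apache.lucene.document.Document;
import org.apache.lucene.index.Term;
import org.apache.solr.core.SolrCore;
import org.apache.solr.search.SolrIndexSearcher;
import org.apache.solr.util.RefCounted;

class RealtimeLookupSketch {
  Document findDoc(SolrCore core, Term idTerm) throws IOException {
    RefCounted<SolrIndexSearcher> holder = core.getRealtimeSearcher();
    try {
      SolrIndexSearcher searcher = holder.get();
      int docid = searcher.getFirstMatch(idTerm);
      return docid < 0 ? null : searcher.doc(docid);
    } finally {
      holder.decref(); // released on every path: normal return, miss, or IOException
    }
  }
}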
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { if (rb.stage < ResponseBuilder.STAGE_GET_FIELDS) return ResponseBuilder.STAGE_GET_FIELDS; if (rb.stage == ResponseBuilder.STAGE_GET_FIELDS) { return createSubRequests(rb); } return ResponseBuilder.STAGE_DONE; }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public int createSubRequests(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); String id1[] = params.getParams("id"); String ids[] = params.getParams("ids"); if (id1 == null && ids == null) { return ResponseBuilder.STAGE_DONE; } List<String> allIds = new ArrayList<String>(); if (id1 != null) { for (String s : id1) { allIds.add(s); } } if (ids != null) { for (String s : ids) { allIds.addAll( StrUtils.splitSmart(s, ",", true) ); } } // TODO: handle collection=...? ZkController zkController = rb.req.getCore().getCoreDescriptor().getCoreContainer().getZkController(); // if shards=... then use that if (zkController != null && params.get("shards") == null) { SchemaField sf = rb.req.getSchema().getUniqueKeyField(); CloudDescriptor cloudDescriptor = rb.req.getCore().getCoreDescriptor().getCloudDescriptor(); String collection = cloudDescriptor.getCollectionName(); CloudState cloudState = zkController.getCloudState(); Map<String, List<String>> shardToId = new HashMap<String, List<String>>(); for (String id : allIds) { BytesRef br = new BytesRef(); sf.getType().readableToIndexed(id, br); int hash = Hash.murmurhash3_x86_32(br.bytes, br.offset, br.length, 0); String shard = cloudState.getShard(hash, collection); List<String> idsForShard = shardToId.get(shard); if (idsForShard == null) { idsForShard = new ArrayList<String>(2); shardToId.put(shard, idsForShard); } idsForShard.add(id); } for (Map.Entry<String,List<String>> entry : shardToId.entrySet()) { String shard = entry.getKey(); String shardIdList = StrUtils.join(entry.getValue(), ','); ShardRequest sreq = new ShardRequest(); sreq.purpose = 1; // sreq.shards = new String[]{shard}; // TODO: would be nice if this would work... sreq.shards = sliceToShards(rb, collection, shard); sreq.actualShards = sreq.shards; sreq.params = new ModifiableSolrParams(); sreq.params.set(ShardParams.SHARDS_QT,"/get"); // TODO: how to avoid hardcoding this and hit the same handler? sreq.params.set("distrib",false); sreq.params.set("ids", shardIdList); rb.addRequest(this, sreq); } } else { String shardIdList = StrUtils.join(allIds, ','); ShardRequest sreq = new ShardRequest(); sreq.purpose = 1; sreq.shards = null; // ALL sreq.actualShards = sreq.shards; sreq.params = new ModifiableSolrParams(); sreq.params.set(ShardParams.SHARDS_QT,"/get"); // TODO: how to avoid hardcoding this and hit the same handler? sreq.params.set("distrib",false); sreq.params.set("ids", shardIdList); rb.addRequest(this, sreq); } return ResponseBuilder.STAGE_DONE; }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public void processGetVersions(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } int nVersions = params.getInt("getVersions", -1); if (nVersions == -1) return; String sync = params.get("sync"); if (sync != null) { processSync(rb, nVersions, sync); return; } UpdateLog ulog = req.getCore().getUpdateHandler().getUpdateLog(); if (ulog == null) return; UpdateLog.RecentUpdates recentUpdates = ulog.getRecentUpdates(); try { rb.rsp.add("versions", recentUpdates.getVersions(nVersions)); } finally { recentUpdates.close(); // cache this somehow? } }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public void processGetUpdates(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } String versionsStr = params.get("getUpdates"); if (versionsStr == null) return; UpdateLog ulog = req.getCore().getUpdateHandler().getUpdateLog(); if (ulog == null) return; List<String> versions = StrUtils.splitSmart(versionsStr, ",", true); // TODO: get this from cache instead of rebuilding? UpdateLog.RecentUpdates recentUpdates = ulog.getRecentUpdates(); List<Object> updates = new ArrayList<Object>(versions.size()); long minVersion = Long.MAX_VALUE; try { for (String versionStr : versions) { long version = Long.parseLong(versionStr); try { Object o = recentUpdates.lookup(version); if (o == null) continue; if (version > 0) { minVersion = Math.min(minVersion, version); } // TODO: do any kind of validation here? updates.add(o); } catch (SolrException e) { log.warn("Exception reading log for updates", e); } catch (ClassCastException e) { log.warn("Exception reading log for updates", e); } } // Must return all delete-by-query commands that occur after the first add requested // since they may apply. updates.addAll( recentUpdates.getDeleteByQuery(minVersion)); rb.rsp.add("updates", updates); } finally { recentUpdates.close(); // cache this somehow? } }
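The inner try in processGetUpdates shows the per-entry log-and-continue style: a SolrException or ClassCastException while reading one update-log entry is logged at WARN and the loop moves on, so a single bad entry cannot fail the whole request. Its core, reduced to a sketch in which the RecentUpdates interface is a stand-in for UpdateLog.RecentUpdates:

import java.util.ArrayList;
import java.util.List;

import org.apache.solr.common.SolrException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

class UpdateLogReadSketch {
  static final Logger log = LoggerFactory.getLogger(UpdateLogReadSketch.class);

  // Hypothetical stand-in for UpdateLog.RecentUpdates.lookup(version).
  interface RecentUpdates { Object lookup(long version); }

  List<Object> collectUpdates(RecentUpdates recentUpdates, List<String> versions) {
    List<Object> updates = new ArrayList<Object>(versions.size());
    for (String versionStr : versions) {
      long version = Long.parseLong(versionStr);
      try {
        Object o = recentUpdates.lookup(version);
        if (o != null) updates.add(o);
      } catch (SolrException e) {
        // log-and-continue: skip this entry, keep serving the rest
        log.warn("Exception reading log for updates", e);
      } catch (ClassCastException e) {
        log.warn("Exception reading log for updates", e);
      }
    }
    return updates;
  }
}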
// in core/src/java/org/apache/solr/handler/component/MoreLikeThisComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { }
// in core/src/java/org/apache/solr/handler/component/MoreLikeThisComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { SolrParams p = rb.req.getParams(); if( p.getBool( MoreLikeThisParams.MLT, false ) ) { SolrIndexSearcher searcher = rb.req.getSearcher(); NamedList<DocList> sim = getMoreLikeThese( rb, searcher, rb.getResults().docList, rb.getFieldFlags() ); // TODO ???? add this directly to the response? rb.rsp.add( "moreLikeThis", sim ); } }
// in core/src/java/org/apache/solr/handler/component/MoreLikeThisComponent.java
NamedList<DocList> getMoreLikeThese( ResponseBuilder rb, SolrIndexSearcher searcher, DocList docs, int flags ) throws IOException { SolrParams p = rb.req.getParams(); IndexSchema schema = searcher.getSchema(); MoreLikeThisHandler.MoreLikeThisHelper mltHelper = new MoreLikeThisHandler.MoreLikeThisHelper( p, searcher ); NamedList<DocList> mlt = new SimpleOrderedMap<DocList>(); DocIterator iterator = docs.iterator(); SimpleOrderedMap<Object> dbg = null; if( rb.isDebug() ){ dbg = new SimpleOrderedMap<Object>(); } while( iterator.hasNext() ) { int id = iterator.nextDoc(); int rows = p.getInt( MoreLikeThisParams.DOC_COUNT, 5 ); DocListAndSet sim = mltHelper.getMoreLikeThis( id, 0, rows, null, null, flags ); String name = schema.printableUniqueKey( searcher.doc( id ) ); mlt.add(name, sim.docList); if( dbg != null ){ SimpleOrderedMap<Object> docDbg = new SimpleOrderedMap<Object>(); docDbg.add( "rawMLTQuery", mltHelper.getRawMLTQuery().toString() ); docDbg.add( "boostedMLTQuery", mltHelper.getBoostedMLTQuery().toString() ); docDbg.add( "realMLTQuery", mltHelper.getRealMLTQuery().toString() ); SimpleOrderedMap<Object> explains = new SimpleOrderedMap<Object>(); DocIterator mltIte = sim.docList.iterator(); while( mltIte.hasNext() ){ int mltid = mltIte.nextDoc(); String key = schema.printableUniqueKey( searcher.doc( mltid ) ); explains.add( key, searcher.explain( mltHelper.getRealMLTQuery(), mltid ) ); } docDbg.add( "explain", explains ); dbg.add( name, docDbg ); } } // add debug information if( dbg != null ){ rb.addDebugInfo( "moreLikeThis", dbg ); } return mlt; }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void process(ResponseBuilder rb) throws IOException { SolrParams params = rb.req.getParams(); if (!params.getBool(COMPONENT_NAME, false)) { return; } NamedList<Object> termVectors = new NamedList<Object>(); rb.rsp.add(TERM_VECTORS, termVectors); FieldOptions allFields = new FieldOptions(); //figure out what options we have, and try to get the appropriate vector allFields.termFreq = params.getBool(TermVectorParams.TF, false); allFields.positions = params.getBool(TermVectorParams.POSITIONS, false); allFields.offsets = params.getBool(TermVectorParams.OFFSETS, false); allFields.docFreq = params.getBool(TermVectorParams.DF, false); allFields.tfIdf = params.getBool(TermVectorParams.TF_IDF, false); //boolean cacheIdf = params.getBool(TermVectorParams.IDF, false); //short cut to all values. if (params.getBool(TermVectorParams.ALL, false)) { allFields.termFreq = true; allFields.positions = true; allFields.offsets = true; allFields.docFreq = true; allFields.tfIdf = true; } String fldLst = params.get(TermVectorParams.FIELDS); if (fldLst == null) { fldLst = params.get(CommonParams.FL); } //use this to validate our fields IndexSchema schema = rb.req.getSchema(); //Build up our per field mapping Map<String, FieldOptions> fieldOptions = new HashMap<String, FieldOptions>(); NamedList<List<String>> warnings = new NamedList<List<String>>(); List<String> noTV = new ArrayList<String>(); List<String> noPos = new ArrayList<String>(); List<String> noOff = new ArrayList<String>(); //we have specific fields to retrieve if (fldLst != null) { String [] fields = SolrPluginUtils.split(fldLst); for (String field : fields) { SchemaField sf = schema.getFieldOrNull(field); if (sf != null) { if (sf.storeTermVector()) { FieldOptions option = fieldOptions.get(field); if (option == null) { option = new FieldOptions(); option.fieldName = field; fieldOptions.put(field, option); } //get the per field mappings option.termFreq = params.getFieldBool(field, TermVectorParams.TF, allFields.termFreq); option.docFreq = params.getFieldBool(field, TermVectorParams.DF, allFields.docFreq); option.tfIdf = params.getFieldBool(field, TermVectorParams.TF_IDF, allFields.tfIdf); //Validate these are even an option option.positions = params.getFieldBool(field, TermVectorParams.POSITIONS, allFields.positions); if (option.positions && !sf.storeTermPositions()){ noPos.add(field); } option.offsets = params.getFieldBool(field, TermVectorParams.OFFSETS, allFields.offsets); if (option.offsets && !sf.storeTermOffsets()){ noOff.add(field); } } else {//field doesn't have term vectors noTV.add(field); } } else { //field doesn't exist throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "undefined field: " + field); } } } //else, deal with all fields boolean hasWarnings = false; if (!noTV.isEmpty()) { warnings.add("noTermVectors", noTV); hasWarnings = true; } if (!noPos.isEmpty()) { warnings.add("noPositions", noPos); hasWarnings = true; } if (!noOff.isEmpty()) { warnings.add("noOffsets", noOff); hasWarnings = true; } if (hasWarnings) { termVectors.add("warnings", warnings); } DocListAndSet listAndSet = rb.getResults(); List<Integer> docIds = getInts(params.getParams(TermVectorParams.DOC_IDS)); Iterator<Integer> iter; if (docIds != null && !docIds.isEmpty()) { iter = docIds.iterator(); } else { DocList list = listAndSet.docList; iter = list.iterator(); } SolrIndexSearcher searcher = rb.req.getSearcher(); IndexReader reader = searcher.getIndexReader(); //the TVMapper is a TermVectorMapper which can be used to optimize loading of Term Vectors SchemaField keyField = schema.getUniqueKeyField(); String uniqFieldName = null; if (keyField != null) { uniqFieldName = keyField.getName(); } //Only load the id field to get the uniqueKey of that //field final String finalUniqFieldName = uniqFieldName; final List<String> uniqValues = new ArrayList<String>(); // TODO: is this required to be single-valued? if so, we should STOP // once we find it... final StoredFieldVisitor getUniqValue = new StoredFieldVisitor() { @Override public void stringField(FieldInfo fieldInfo, String value) throws IOException { uniqValues.add(value); } @Override public void intField(FieldInfo fieldInfo, int value) throws IOException { uniqValues.add(Integer.toString(value)); } @Override public void longField(FieldInfo fieldInfo, long value) throws IOException { uniqValues.add(Long.toString(value)); } @Override public Status needsField(FieldInfo fieldInfo) throws IOException { return (fieldInfo.name.equals(finalUniqFieldName)) ? Status.YES : Status.NO; } }; TermsEnum termsEnum = null; while (iter.hasNext()) { Integer docId = iter.next(); NamedList<Object> docNL = new NamedList<Object>(); termVectors.add("doc-" + docId, docNL); if (keyField != null) { reader.document(docId, getUniqValue); String uniqVal = null; if (uniqValues.size() != 0) { uniqVal = uniqValues.get(0); uniqValues.clear(); docNL.add("uniqueKey", uniqVal); termVectors.add("uniqueKeyFieldName", uniqFieldName); } } if (!fieldOptions.isEmpty()) { for (Map.Entry<String, FieldOptions> entry : fieldOptions.entrySet()) { final String field = entry.getKey(); final Terms vector = reader.getTermVector(docId, field); if (vector != null) { termsEnum = vector.iterator(termsEnum); mapOneVector(docNL, entry.getValue(), reader, docId, vector.iterator(termsEnum), field); } } } else { // extract all fields final Fields vectors = reader.getTermVectors(docId); final FieldsEnum fieldsEnum = vectors.iterator(); String field; while((field = fieldsEnum.next()) != null) { Terms terms = fieldsEnum.terms(); if (terms != null) { termsEnum = terms.iterator(termsEnum); mapOneVector(docNL, allFields, reader, docId, termsEnum, field); } } } } }
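process validates its field list up front: an unknown field is rejected immediately with BAD_REQUEST, while weaker problems (no term vectors, positions, or offsets stored) are merely collected into a warnings list. A sketch of the fail-fast half, with an illustrative helper name of our own:

import org.apache.solr.common.SolrException;
import org.apache.solr.schema.IndexSchema;
import org.apache.solr.schema.SchemaField;

class FieldValidationSketch {
  // An unknown field is a client error: fail fast with a 400.
  SchemaField requireField(IndexSchema schema, String field) {
    SchemaField sf = schema.getFieldOrNull(field);
    if (sf == null) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "undefined field: " + field);
    }
    return sf;
  }
}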
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void stringField(FieldInfo fieldInfo, String value) throws IOException { uniqValues.add(value); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void intField(FieldInfo fieldInfo, int value) throws IOException { uniqValues.add(Integer.toString(value)); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void longField(FieldInfo fieldInfo, long value) throws IOException { uniqValues.add(Long.toString(value)); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public Status needsField(FieldInfo fieldInfo) throws IOException { return (fieldInfo.name.equals(finalUniqFieldName)) ? Status.YES : Status.NO; }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
private void mapOneVector(NamedList<Object> docNL, FieldOptions fieldOptions, IndexReader reader, int docID, TermsEnum termsEnum, String field) throws IOException { NamedList<Object> fieldNL = new NamedList<Object>(); docNL.add(field, fieldNL); BytesRef text; DocsAndPositionsEnum dpEnum = null; while((text = termsEnum.next()) != null) { String term = text.utf8ToString(); NamedList<Object> termInfo = new NamedList<Object>(); fieldNL.add(term, termInfo); final int freq = (int) termsEnum.totalTermFreq(); if (fieldOptions.termFreq == true) { termInfo.add("tf", freq); } dpEnum = termsEnum.docsAndPositions(null, dpEnum, fieldOptions.offsets); boolean useOffsets = fieldOptions.offsets; if (dpEnum == null) { useOffsets = false; dpEnum = termsEnum.docsAndPositions(null, dpEnum, false); } boolean usePositions = false; if (dpEnum != null) { dpEnum.nextDoc(); usePositions = fieldOptions.positions; } NamedList<Number> theOffsets = null; if (useOffsets) { theOffsets = new NamedList<Number>(); termInfo.add("offsets", theOffsets); } NamedList<Integer> positionsNL = null; if (usePositions || theOffsets != null) { for (int i = 0; i < freq; i++) { final int pos = dpEnum.nextPosition(); if (usePositions && pos >= 0) { if (positionsNL == null) { positionsNL = new NamedList<Integer>(); termInfo.add("positions", positionsNL); } positionsNL.add("position", pos); } if (theOffsets != null) { theOffsets.add("start", dpEnum.startOffset()); theOffsets.add("end", dpEnum.endOffset()); } } } if (fieldOptions.docFreq) { termInfo.add("df", getDocFreq(reader, field, text)); } if (fieldOptions.tfIdf) { double tfIdfVal = ((double) freq) / getDocFreq(reader, field, text); termInfo.add("tf-idf", tfIdfVal); } } }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public int distributedProcess(ResponseBuilder rb) throws IOException { int result = ResponseBuilder.STAGE_DONE; if (rb.stage == ResponseBuilder.STAGE_GET_FIELDS) { //Go ask each shard for its vectors // for each shard, collect the documents for that shard. HashMap<String, Collection<ShardDoc>> shardMap = new HashMap<String, Collection<ShardDoc>>(); for (ShardDoc sdoc : rb.resultIds.values()) { Collection<ShardDoc> shardDocs = shardMap.get(sdoc.shard); if (shardDocs == null) { shardDocs = new ArrayList<ShardDoc>(); shardMap.put(sdoc.shard, shardDocs); } shardDocs.add(sdoc); } // Now create a request for each shard to retrieve the stored fields for (Collection<ShardDoc> shardDocs : shardMap.values()) { ShardRequest sreq = new ShardRequest(); sreq.purpose = ShardRequest.PURPOSE_GET_FIELDS; sreq.shards = new String[]{shardDocs.iterator().next().shard}; sreq.params = new ModifiableSolrParams(); // add original params sreq.params.add(rb.req.getParams()); sreq.params.remove(CommonParams.Q);//remove the query ArrayList<String> ids = new ArrayList<String>(shardDocs.size()); for (ShardDoc shardDoc : shardDocs) { ids.add(shardDoc.id.toString()); } sreq.params.add(TermVectorParams.DOC_IDS, StrUtils.join(ids, ',')); rb.addRequest(this, sreq); } result = ResponseBuilder.STAGE_DONE; } return result; }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override public void prepare(ResponseBuilder rb) throws IOException { }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, KeeperException, InterruptedException { CoreContainer coreContainer = req.getCore().getCoreDescriptor().getCoreContainer(); if (coreContainer.isZooKeeperAware()) { showFromZooKeeper(req, rsp, coreContainer); } else { showFromFileSystem(req, rsp); } }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromFileSystem(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { File adminFile = null; final SolrResourceLoader loader = req.getCore().getResourceLoader(); File configdir = new File( loader.getConfigDir() ); if (!configdir.exists()) { // TODO: maybe we should just open it this way to start with? try { configdir = new File( loader.getClassLoader().getResource(loader.getConfigDir()).toURI() ); } catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); } } String fname = req.getParams().get("file", null); if( fname == null ) { adminFile = configdir; } else { fname = fname.replace( '\\', '/' ); // normalize slashes if( hiddenFiles.contains( fname.toUpperCase(Locale.ENGLISH) ) ) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access: "+fname ); } if( fname.indexOf( ".." ) >= 0 ) { throw new SolrException( ErrorCode.FORBIDDEN, "Invalid path: "+fname ); } adminFile = new File( configdir, fname ); } // Make sure the file exists, is readable and is not a hidden file if( !adminFile.exists() ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Can not find: "+adminFile.getName() + " ["+adminFile.getAbsolutePath()+"]" ); } if( !adminFile.canRead() || adminFile.isHidden() ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Can not show: "+adminFile.getName() + " ["+adminFile.getAbsolutePath()+"]" ); } // Show a directory listing if( adminFile.isDirectory() ) { int basePath = configdir.getAbsolutePath().length() + 1; NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>(); for( File f : adminFile.listFiles() ) { String path = f.getAbsolutePath().substring( basePath ); path = path.replace( '\\', '/' ); // normalize slashes if( hiddenFiles.contains( path.toUpperCase(Locale.ENGLISH) ) ) { continue; // don't show 'hidden' files } if( f.isHidden() || f.getName().startsWith( "." ) ) { continue; // skip hidden system files... } SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>(); files.add( path, fileInfo ); if( f.isDirectory() ) { fileInfo.add( "directory", true ); } else { // TODO? content type fileInfo.add( "size", f.length() ); } fileInfo.add( "modified", new Date( f.lastModified() ) ); } rsp.add( "files", files ); } else { // Include the file contents //The file logic depends on RawResponseWriter, so force its use. ModifiableSolrParams params = new ModifiableSolrParams( req.getParams() ); params.set( CommonParams.WT, "raw" ); req.setParams(params); ContentStreamBase content = new ContentStreamBase.FileStream( adminFile ); content.setContentType( req.getParams().get( USE_CONTENT_TYPE ) ); rsp.add(RawResponseWriter.CONTENT, content); } rsp.setHttpCaching(false); }
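showFromFileSystem rejects a request at three gates before reading anything: hidden names and ".." traversal are FORBIDDEN (403), while a file that passes the gates but is missing or unreadable is a BAD_REQUEST (400). A condensed sketch of that guard sequence, assuming a hiddenFiles set like the handler's (AdminFileGuard and resolveAdminFile are illustrative names):

    import java.io.File;
    import java.util.Locale;
    import java.util.Set;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class AdminFileGuard {
      static File resolveAdminFile(File configdir, String fname, Set<String> hiddenFiles) {
        fname = fname.replace('\\', '/'); // normalize slashes, as the handler does
        if (hiddenFiles.contains(fname.toUpperCase(Locale.ENGLISH))) {
          throw new SolrException(ErrorCode.FORBIDDEN, "Can not access: " + fname);
        }
        if (fname.indexOf("..") >= 0) {
          throw new SolrException(ErrorCode.FORBIDDEN, "Invalid path: " + fname);
        }
        File adminFile = new File(configdir, fname);
        if (!adminFile.exists() || !adminFile.canRead() || adminFile.isHidden()) {
          throw new SolrException(ErrorCode.BAD_REQUEST,
              "Can not show: " + adminFile.getAbsolutePath());
        }
        return adminFile;
      }
    }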
// in core/src/java/org/apache/solr/handler/admin/PropertiesRequestHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { Object props = null; String name = req.getParams().get( "name" ); if( name != null ) { NamedList<String> p = new SimpleOrderedMap<String>(); p.add( name, System.getProperty(name) ); props = p; } else { props = System.getProperties(); } rsp.add( "system.properties", props ); rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
private static SimpleOrderedMap<Object> getDocumentFieldsInfo( Document doc, int docId, IndexReader reader, IndexSchema schema ) throws IOException { final CharsRef spare = new CharsRef(); SimpleOrderedMap<Object> finfo = new SimpleOrderedMap<Object>(); for( Object o : doc.getFields() ) { Field field = (Field)o; SimpleOrderedMap<Object> f = new SimpleOrderedMap<Object>(); SchemaField sfield = schema.getFieldOrNull( field.name() ); FieldType ftype = (sfield==null)?null:sfield.getType(); f.add( "type", (ftype==null)?null:ftype.getTypeName() ); f.add( "schema", getFieldFlags( sfield ) ); f.add( "flags", getFieldFlags( field ) ); Term t = new Term(field.name(), ftype!=null ? ftype.storedToIndexed(field) : field.stringValue()); f.add( "value", (ftype==null)?null:ftype.toExternal( field ) ); // TODO: this really should be "stored" f.add( "internal", field.stringValue() ); // may be a binary number BytesRef bytes = field.binaryValue(); if (bytes != null) { f.add( "binary", Base64.byteArrayToBase64(bytes.bytes, bytes.offset, bytes.length)); } f.add( "boost", field.boost() ); f.add( "docFreq", t.text()==null ? 0 : reader.docFreq( t ) ); // this can be 0 for non-indexed fields // If we have a term vector, return that if( field.fieldType().storeTermVectors() ) { try { Terms v = reader.getTermVector( docId, field.name() ); if( v != null ) { SimpleOrderedMap<Integer> tfv = new SimpleOrderedMap<Integer>(); final TermsEnum termsEnum = v.iterator(null); BytesRef text; while((text = termsEnum.next()) != null) { final int freq = (int) termsEnum.totalTermFreq(); UnicodeUtil.UTF8toUTF16(text, spare); tfv.add(spare.toString(), freq); } f.add( "termVector", tfv ); } } catch( Exception ex ) { log.warn( "error writing term vector", ex ); } } finfo.add( field.name(), f ); } return finfo; }
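The try/catch around the term-vector block above is one of the catch-without-throw sites the statistics count: a failure on one field is logged at warn level and the loop keeps rendering the remaining fields. A minimal sketch of that log-and-continue policy (OptionalSection and its method are illustrative names):

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    final class OptionalSection {
      private static final Logger log = LoggerFactory.getLogger(OptionalSection.class);

      // Best-effort section: any failure is demoted to a warning so the
      // caller can still emit the rest of the response.
      static void addOptionalSection(Runnable section) {
        try {
          section.run();
        } catch (Exception ex) {
          log.warn("error writing term vector", ex); // same message the handler logs
        }
      }
    }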
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
private static Document getFirstLiveDoc(AtomicReader reader, String fieldName, Terms terms) throws IOException { DocsEnum docsEnum = null; TermsEnum termsEnum = terms.iterator(null); BytesRef text; // Deal with the chance that the first bunch of terms are in deleted documents. Is there a better way? for (int idx = 0; idx < 1000 && docsEnum == null; ++idx) { text = termsEnum.next(); if (text == null) { // Ran off the end of the terms enum without finding any live docs with that field in them. return null; } Term term = new Term(fieldName, text); docsEnum = reader.termDocsEnum(reader.getLiveDocs(), term.field(), new BytesRef(term.text()), false); if (docsEnum != null) { int docId; if ((docId = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { return reader.document(docId); } } } return null; }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
public static SimpleOrderedMap<Object> getIndexInfo(DirectoryReader reader, boolean detail) throws IOException { return getIndexInfo(reader); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
public static SimpleOrderedMap<Object> getIndexInfo(DirectoryReader reader) throws IOException { Directory dir = reader.directory(); SimpleOrderedMap<Object> indexInfo = new SimpleOrderedMap<Object>(); indexInfo.add("numDocs", reader.numDocs()); indexInfo.add("maxDoc", reader.maxDoc()); indexInfo.add("version", reader.getVersion()); // TODO? Is this different than: IndexReader.getCurrentVersion( dir )? indexInfo.add("segmentCount", reader.getSequentialSubReaders().length); indexInfo.add("current", reader.isCurrent() ); indexInfo.add("hasDeletions", reader.hasDeletions() ); indexInfo.add("directory", dir ); indexInfo.add("userData", reader.getIndexCommit().getUserData()); String s = reader.getIndexCommit().getUserData().get(SolrIndexWriter.COMMIT_TIME_MSEC_KEY); if (s != null) { indexInfo.add("lastModified", new Date(Long.parseLong(s))); } return indexInfo; }
// in core/src/java/org/apache/solr/handler/admin/ThreadDumpHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { SimpleOrderedMap<Object> system = new SimpleOrderedMap<Object>(); rsp.add( "system", system ); ThreadMXBean tmbean = ManagementFactory.getThreadMXBean(); // Thread Count SimpleOrderedMap<Object> nl = new SimpleOrderedMap<Object>(); nl.add( "current",tmbean.getThreadCount() ); nl.add( "peak", tmbean.getPeakThreadCount() ); nl.add( "daemon", tmbean.getDaemonThreadCount() ); system.add( "threadCount", nl ); // Deadlocks ThreadInfo[] tinfos; long[] tids = tmbean.findMonitorDeadlockedThreads(); if (tids != null) { tinfos = tmbean.getThreadInfo(tids, Integer.MAX_VALUE); NamedList<SimpleOrderedMap<Object>> lst = new NamedList<SimpleOrderedMap<Object>>(); for (ThreadInfo ti : tinfos) { if (ti != null) { lst.add( "thread", getThreadInfo( ti, tmbean ) ); } } system.add( "deadlocks", lst ); } // Now show all the threads.... tids = tmbean.getAllThreadIds(); tinfos = tmbean.getThreadInfo(tids, Integer.MAX_VALUE); NamedList<SimpleOrderedMap<Object>> lst = new NamedList<SimpleOrderedMap<Object>>(); for (ThreadInfo ti : tinfos) { if (ti != null) { lst.add( "thread", getThreadInfo( ti, tmbean ) ); } } system.add( "threadDump", lst ); rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/ThreadDumpHandler.java
private static SimpleOrderedMap<Object> getThreadInfo( ThreadInfo ti, ThreadMXBean tmbean ) throws IOException { SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>(); long tid = ti.getThreadId(); info.add( "id", tid ); info.add( "name", ti.getThreadName() ); info.add( "state", ti.getThreadState().toString() ); if (ti.getLockName() != null) { info.add( "lock", ti.getLockName() ); } if (ti.isSuspended()) { info.add( "suspended", true ); } if (ti.isInNative()) { info.add( "native", true ); } if (tmbean.isThreadCpuTimeSupported()) { info.add( "cpuTime", formatNanos(tmbean.getThreadCpuTime(tid)) ); info.add( "userTime", formatNanos(tmbean.getThreadUserTime(tid)) ); } if (ti.getLockOwnerName() != null) { SimpleOrderedMap<Object> owner = new SimpleOrderedMap<Object>(); owner.add( "name", ti.getLockOwnerName() ); owner.add( "id", ti.getLockOwnerId() ); } // Add the stack trace int i=0; String[] trace = new String[ti.getStackTrace().length]; for( StackTraceElement ste : ti.getStackTrace()) { trace[i++] = ste.toString(); } info.add( "stackTrace", trace ); return info; }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleMergeAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { SolrParams params = req.getParams(); String cname = params.required().get(CoreAdminParams.CORE); SolrCore core = coreContainer.getCore(cname); SolrQueryRequest wrappedReq = null; SolrCore[] sourceCores = null; RefCounted<SolrIndexSearcher>[] searchers = null; // stores readers created from indexDir param values DirectoryReader[] readersToBeClosed = null; Directory[] dirsToBeReleased = null; if (core != null) { try { String[] dirNames = params.getParams(CoreAdminParams.INDEX_DIR); if (dirNames == null || dirNames.length == 0) { String[] sources = params.getParams("srcCore"); if (sources == null || sources.length == 0) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "At least one indexDir or srcCore must be specified"); sourceCores = new SolrCore[sources.length]; for (int i = 0; i < sources.length; i++) { String source = sources[i]; SolrCore srcCore = coreContainer.getCore(source); if (srcCore == null) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core: " + source + " does not exist"); sourceCores[i] = srcCore; } } else { readersToBeClosed = new DirectoryReader[dirNames.length]; dirsToBeReleased = new Directory[dirNames.length]; DirectoryFactory dirFactory = core.getDirectoryFactory(); for (int i = 0; i < dirNames.length; i++) { Directory dir = dirFactory.get(dirNames[i], core.getSolrConfig().indexConfig.lockType); dirsToBeReleased[i] = dir; // TODO: why doesn't this use the IR factory? what is going on here? readersToBeClosed[i] = DirectoryReader.open(dir); } } DirectoryReader[] readers = null; if (readersToBeClosed != null) { readers = readersToBeClosed; } else { readers = new DirectoryReader[sourceCores.length]; searchers = new RefCounted[sourceCores.length]; for (int i = 0; i < sourceCores.length; i++) { SolrCore solrCore = sourceCores[i]; // record the searchers so that we can decref searchers[i] = solrCore.getSearcher(); readers[i] = searchers[i].get().getIndexReader(); } } UpdateRequestProcessorChain processorChain = core.getUpdateProcessingChain(params.get(UpdateParams.UPDATE_CHAIN)); wrappedReq = new LocalSolrQueryRequest(core, req.getParams()); UpdateRequestProcessor processor = processorChain.createProcessor(wrappedReq, rsp); processor.processMergeIndexes(new MergeIndexesCommand(readers, req)); } finally { if (searchers != null) { for (RefCounted<SolrIndexSearcher> searcher : searchers) { if (searcher != null) searcher.decref(); } } if (sourceCores != null) { for (SolrCore solrCore : sourceCores) { if (solrCore != null) solrCore.close(); } } if (readersToBeClosed != null) IOUtils.closeWhileHandlingException(readersToBeClosed); if (dirsToBeReleased != null) { for (Directory dir : dirsToBeReleased) { DirectoryFactory dirFactory = core.getDirectoryFactory(); dirFactory.release(dir); } } if (wrappedReq != null) wrappedReq.close(); core.close(); } } return coreContainer.isPersistent(); }
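handleMergeAction acquires up to four groups of resources (searchers, source cores, readers, directories) and releases each group in the finally block behind its own null check, so an exception thrown early in the try cannot leak anything acquired before it. A stripped-down sketch of that discipline, with plain Closeable standing in for the handler's decref()/close()/release() calls (the handler additionally suppresses close-time exceptions via IOUtils.closeWhileHandlingException):

    import java.io.Closeable;
    import java.io.IOException;

    final class MergeCleanup {
      static void runWithCleanup(Runnable work, Closeable[] readers, Closeable request)
          throws IOException {
        try {
          work.run();
        } finally {
          // each group is released independently; a group never acquired is null
          if (readers != null) {
            for (Closeable r : readers) {
              if (r != null) r.close();
            }
          }
          if (request != null) request.close();
        }
      }
    }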
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleRequestRecoveryAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { final SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); if (cname == null) { cname = ""; } SolrCore core = null; try { core = coreContainer.getCore(cname); if (core != null) { core.getUpdateHandler().getSolrCoreState().doRecovery(coreContainer, cname); } else { SolrException.log(log, "Could not find core to call recovery:" + cname); } } finally { // no recoveryStrat close for now if (core != null) { core.close(); } } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleWaitForStateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException { final SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); if (cname == null) { cname = ""; } String nodeName = params.get("nodeName"); String coreNodeName = params.get("coreNodeName"); String waitForState = params.get("state"); Boolean checkLive = params.getBool("checkLive"); int pauseFor = params.getInt("pauseFor", 0); String state = null; boolean live = false; int retry = 0; while (true) { SolrCore core = null; try { core = coreContainer.getCore(cname); if (core == null && retry == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "core not found:" + cname); } if (core != null) { // wait until we are sure the recovering node is ready // to accept updates CloudDescriptor cloudDescriptor = core.getCoreDescriptor() .getCloudDescriptor(); CloudState cloudState = coreContainer.getZkController() .getCloudState(); String collection = cloudDescriptor.getCollectionName(); Slice slice = cloudState.getSlice(collection, cloudDescriptor.getShardId()); if (slice != null) { ZkNodeProps nodeProps = slice.getShards().get(coreNodeName); if (nodeProps != null) { state = nodeProps.get(ZkStateReader.STATE_PROP); live = cloudState.liveNodesContain(nodeName); if (nodeProps != null && state.equals(waitForState)) { if (checkLive == null) { break; } else if (checkLive && live) { break; } else if (!checkLive && !live) { break; } } } } } if (retry++ == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "I was asked to wait on state " + waitForState + " for " + nodeName + " but I still do not see the request state. I see state: " + state + " live:" + live); } } finally { if (core != null) { core.close(); } } Thread.sleep(1000); } // small safety net for any updates that started with state that // kept it from sending the update to be buffered - // pause for a while to let any outstanding updates finish // System.out.println("I saw state:" + state + " sleep for " + pauseFor + // " live:" + live); Thread.sleep(pauseFor); // solrcloud_debug // try {; // LocalSolrQueryRequest r = new LocalSolrQueryRequest(core, new // ModifiableSolrParams()); // CommitUpdateCommand commitCmd = new CommitUpdateCommand(r, false); // commitCmd.softCommit = true; // core.getUpdateHandler().commit(commitCmd); // RefCounted<SolrIndexSearcher> searchHolder = // core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() // + " to replicate " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " gen:" + // core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + // core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleDistribUrlAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException, SolrServerException { // TODO: finish this and tests SolrParams params = req.getParams(); final ModifiableSolrParams newParams = new ModifiableSolrParams(params); newParams.remove("action"); SolrParams required = params.required(); final String subAction = required.get("subAction"); String collection = required.get("collection"); newParams.set(CoreAdminParams.ACTION, subAction); SolrCore core = req.getCore(); ZkController zkController = core.getCoreDescriptor().getCoreContainer() .getZkController(); CloudState cloudState = zkController.getCloudState(); Map<String,Slice> slices = cloudState.getCollectionStates().get(collection); for (Map.Entry<String,Slice> entry : slices.entrySet()) { Slice slice = entry.getValue(); Map<String,ZkNodeProps> shards = slice.getShards(); Set<Map.Entry<String,ZkNodeProps>> shardEntries = shards.entrySet(); for (Map.Entry<String,ZkNodeProps> shardEntry : shardEntries) { final ZkNodeProps node = shardEntry.getValue(); if (cloudState.liveNodesContain(node.get(ZkStateReader.NODE_NAME_PROP))) { newParams.set(CoreAdminParams.CORE, node.get(ZkStateReader.CORE_NAME_PROP)); String replica = node.get(ZkStateReader.BASE_URL_PROP); ShardRequest sreq = new ShardRequest(); newParams.set("qt", "/admin/cores"); sreq.purpose = 1; // TODO: this sucks if (replica.startsWith("http://")) replica = replica.substring(7); sreq.shards = new String[]{replica}; sreq.actualShards = sreq.shards; sreq.params = newParams; shardHandler.submit(sreq, replica, sreq.params); } } } ShardResponse srsp; do { srsp = shardHandler.takeCompletedOrError(); if (srsp != null) { Throwable e = srsp.getException(); if (e != null) { log.error("Error talking to shard: " + srsp.getShard(), e); } } } while(srsp != null); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected NamedList<Object> getCoreStatus(CoreContainer cores, String cname) throws IOException { NamedList<Object> info = new SimpleOrderedMap<Object>(); SolrCore core = cores.getCore(cname); if (core != null) { try { info.add("name", core.getName()); info.add("isDefaultCore", core.getName().equals(cores.getDefaultCoreName())); info.add("instanceDir", normalizePath(core.getResourceLoader().getInstanceDir())); info.add("dataDir", normalizePath(core.getDataDir())); info.add("config", core.getConfigResource()); info.add("schema", core.getSchemaResource()); info.add("startTime", new Date(core.getStartTime())); info.add("uptime", System.currentTimeMillis() - core.getStartTime()); RefCounted<SolrIndexSearcher> searcher = core.getSearcher(); try { SimpleOrderedMap<Object> indexInfo = LukeRequestHandler.getIndexInfo(searcher.get().getIndexReader()); long size = getIndexSize(core); indexInfo.add("sizeInBytes", size); indexInfo.add("size", NumberUtils.readableSize(size)); info.add("index", indexInfo); } finally { searcher.decref(); } } finally { core.close(); } } return info; }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
public void copyFiles(Collection<String> files, File destDir) throws IOException { for (String indexFile : files) { File source = new File(solrCore.getIndexDir(), indexFile); copyFile(source, new File(destDir, source.getName()), true); } }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
public void copyFile(File source, File destination, boolean preserveFileDate) throws IOException { // check source exists if (!source.exists()) { String message = "File " + source + " does not exist"; throw new FileNotFoundException(message); } // does the destination directory exist? if (destination.getParentFile() != null && !destination.getParentFile().exists()) { destination.getParentFile().mkdirs(); } // make sure we can write to destination if (destination.exists() && !destination.canWrite()) { String message = "Unable to open file " + destination + " for writing."; throw new IOException(message); } FileInputStream input = null; FileOutputStream output = null; try { input = new FileInputStream(source); output = new FileOutputStream(destination); int count = 0; int n = 0; int rcnt = 0; while (-1 != (n = input.read(buffer))) { output.write(buffer, 0, n); count += n; rcnt++; /*** // reserve every 4.6875 MB if (rcnt == 150) { rcnt = 0; delPolicy.setReserveDuration(indexCommit.getVersion(), reserveTime); } ***/ } } finally { try { IOUtils.closeQuietly(input); } finally { IOUtils.closeQuietly(output); } } if (source.length() != destination.length()) { String message = "Failed to copy full contents from " + source + " to " + destination; throw new IOException(message); } if (preserveFileDate) { // file copy should preserve file date destination.setLastModified(source.lastModified()); } }
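copyFile closes its streams in a nested try/finally so that a failure while closing the input cannot prevent the output from being closed, and it compares lengths afterwards, turning a silent short copy into an explicit IOException. A minimal sketch, assuming commons-io's IOUtils as in the original:

    import java.io.*;
    import org.apache.commons.io.IOUtils;

    final class SafeCopy {
      static void copy(File source, File destination) throws IOException {
        FileInputStream input = null;
        FileOutputStream output = null;
        try {
          input = new FileInputStream(source);
          output = new FileOutputStream(destination);
          byte[] buffer = new byte[8192];
          int n;
          while (-1 != (n = input.read(buffer))) {
            output.write(buffer, 0, n);
          }
        } finally {
          try {
            IOUtils.closeQuietly(input);   // never throws
          } finally {
            IOUtils.closeQuietly(output);  // runs even if the line above did
          }
        }
        // verify after the fact: a truncated copy becomes a real error
        if (source.length() != destination.length()) {
          throw new IOException("Failed to copy full contents from " + source + " to " + destination);
        }
      }
    }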
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadSolrSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); SolrSynonymParser parser = new SolrSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadWordnetSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); WordnetSynonymParser parser = new WordnetSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void reset(Reader input) throws IOException { try { super.reset(input); input = super.input; char[] buf = new char[32]; int len = input.read(buf); this.startOfs = correctOffset(0); this.endOfs = correctOffset(len); String v = new String(buf, 0, len); try { switch (type) { case INTEGER: ts.setIntValue(Integer.parseInt(v)); break; case FLOAT: ts.setFloatValue(Float.parseFloat(v)); break; case LONG: ts.setLongValue(Long.parseLong(v)); break; case DOUBLE: ts.setDoubleValue(Double.parseDouble(v)); break; case DATE: ts.setLongValue(dateField.parseMath(null, v).getTime()); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field"); } } catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); } }
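reset() above translates exceptions by blame: a NumberFormatException (malformed client input) becomes SolrException(BAD_REQUEST), while an IOException (internal failure) becomes SolrException(SERVER_ERROR). A minimal sketch of the client-input half (TrieParse and parseTrieValue are illustrative names):

    import org.apache.solr.common.SolrException;

    final class TrieParse {
      // Malformed numbers are the client's fault, so they map to 400;
      // in reset() an underlying IOException maps to SERVER_ERROR (500) instead.
      static long parseTrieValue(String v) {
        try {
          return Long.parseLong(v);
        } catch (NumberFormatException nfe) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Invalid Number: " + v);
        }
      }
    }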
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void close() throws IOException { super.close(); ts.close(); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void reset() throws IOException { super.reset(); ts.reset(); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public boolean incrementToken() throws IOException { if (ts.incrementToken()) { ofsAtt.setOffset(startOfs, endOfs); return true; } return false; }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void end() throws IOException { ts.end(); ofsAtt.setOffset(endOfs, endOfs); }
// in core/src/java/org/apache/solr/analysis/ReversedWildcardFilter.java
@Override public boolean incrementToken() throws IOException { if( save != null ) { // clearAttributes(); // not currently necessary restoreState(save); save = null; return true; } if (!input.incrementToken()) return false; // pass through zero-length terms int oldLen = termAtt.length(); if (oldLen ==0) return true; int origOffset = posAtt.getPositionIncrement(); if (withOriginal == true){ posAtt.setPositionIncrement(0); save = captureState(); } char [] buffer = termAtt.resizeBuffer(oldLen + 1); buffer[oldLen] = markerChar; reverse(buffer, 0, oldLen + 1); posAtt.setPositionIncrement(origOffset); termAtt.copyBuffer(buffer, 0, oldLen +1); return true; }
// in core/src/java/org/apache/solr/analysis/TokenizerChain.java
@Override protected void reset(Reader reader) throws IOException { // the tokenizers are currently reset by the indexing process, so only // the tokenizer needs to be reset. Reader r = initReader(reader); super.reset(r); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
public static void main(String[] args) throws IOException { Reader in = new LegacyHTMLStripCharFilter( CharReader.get(new InputStreamReader(System.in))); int ch; while ( (ch=in.read()) != -1 ) System.out.print((char)ch); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int next() throws IOException { int len = pushed.length(); if (len>0) { int ch = pushed.charAt(len-1); pushed.setLength(len-1); return ch; } numRead++; return input.read(); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int nextSkipWS() throws IOException { int ch=next(); while(isSpace(ch)) ch=next(); return ch; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int peek() throws IOException { int len = pushed.length(); if (len>0) { return pushed.charAt(len-1); } numRead++; int ch = input.read(); push(ch); return ch; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private void saveState() throws IOException { lastMark = numRead; input.mark(readAheadLimit); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private void restoreState() throws IOException { input.reset(); pushed.setLength(0); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readNumericEntity() throws IOException { // "&#" has already been read at this point int eaten = 2; // is this decimal, hex, or nothing at all. int ch = next(); int base=10; boolean invalid=false; sb.setLength(0); if (isDigit(ch)) { // decimal character entity sb.append((char)ch); for (int i=0; i<10; i++) { ch = next(); if (isDigit(ch)) { sb.append((char)ch); } else { break; } } } else if (ch=='x') { eaten++; // hex character entity base=16; sb.setLength(0); for (int i=0; i<10; i++) { ch = next(); if (isHex(ch)) { sb.append((char)ch); } else { break; } } } else { return MISMATCH; } // In older HTML, an entity may not have always been terminated // with a semicolon. We'll also treat EOF or whitespace as terminating // the entity. try { if (ch==';' || ch==-1) { // do not account for the eaten ";" due to the fact that we do output a char numWhitespace = sb.length() + eaten; return Integer.parseInt(sb.toString(), base); } // if whitespace terminated the entity, we need to return // that whitespace on the next call to read(). if (isSpace(ch)) { push(ch); numWhitespace = sb.length() + eaten; return Integer.parseInt(sb.toString(), base); } } catch (NumberFormatException e) { return MISMATCH; } // Not an entity... return MISMATCH; }
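readNumericEntity folds a NumberFormatException into the scanner's ordinary MISMATCH result instead of letting it propagate, which keeps the character-level state machine exception-free. A minimal sketch (the class is illustrative and the MISMATCH value merely stands in for the filter's constant):

    final class EntityParse {
      static final int MISMATCH = -2; // stand-in for the filter's sentinel

      // A parse failure takes the same return path as any other non-match,
      // so callers need no try/catch of their own.
      static int parseEntity(String digits, int base) {
        try {
          return Integer.parseInt(digits, base);
        } catch (NumberFormatException e) {
          return MISMATCH;
        }
      }
    }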
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readEntity() throws IOException { int ch = next(); if (ch=='#') return readNumericEntity(); //read an entity reference // for an entity reference, require the ';' for safety. // otherwise we may try and convert part of some company // names to an entity. "Alpha&Beta Corp" for instance. // // TODO: perhaps I should special case some of the // more common ones like &amp to make the ';' optional... sb.setLength(0); sb.append((char)ch); for (int i=0; i< safeReadAheadLimit; i++) { ch=next(); if (Character.isLetter(ch)) { sb.append((char)ch); } else { break; } } if (ch==';') { String entity=sb.toString(); Character entityChar = entityTable.get(entity); if (entityChar!=null) { numWhitespace = entity.length() + 1 ; return entityChar.charValue(); } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readBang(boolean inScript) throws IOException { // at this point, "<!" has been read int ret = readComment(inScript); if (ret==MATCH) return MATCH; if ((numRead - lastMark) < safeReadAheadLimit || peek() == '>' ) { int ch = next(); if (ch=='>') return MATCH; // if it starts with <! and isn't a comment, // simply read until ">" //since we did readComment already, it may be the case that we are already deep into the read ahead buffer //so, we may need to abort sooner while ((numRead - lastMark) < safeReadAheadLimit) { ch = next(); if (ch=='>') { return MATCH; } else if (ch<0) { return MISMATCH; } } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readComment(boolean inScript) throws IOException { // at this point "<!" has been read int ch = next(); if (ch!='-') { // not a comment push(ch); return MISMATCH; } ch = next(); if (ch!='-') { // not a comment push(ch); push('-'); return MISMATCH; } /*two extra calls to next() here, so make sure we don't read past our mark*/ while ((numRead - lastMark) < safeReadAheadLimit -3 ) { ch = next(); if (ch<0) return MISMATCH; if (ch=='-') { ch = next(); if (ch<0) return MISMATCH; if (ch!='-') { push(ch); continue; } ch = next(); if (ch<0) return MISMATCH; if (ch!='>') { push(ch); push('-'); continue; } return MATCH; } else if ((ch=='\'' || ch=='"') && inScript) { push(ch); int ret=readScriptString(); // if this wasn't a string, there's not much we can do // at this point without having a stack of stream states in // order to "undo" just the latest. } else if (ch=='<') { eatSSI(); } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readTag() throws IOException { // at this point '<' has already been read int ch = next(); if (!isAlpha(ch)) { push(ch); return MISMATCH; } sb.setLength(0); sb.append((char)ch); while((numRead - lastMark) < safeReadAheadLimit) { ch = next(); if (isIdChar(ch)) { sb.append((char)ch); } else if (ch=='/') { // Hmmm, a tag can close with "/>" as well as "/ >" // read end tag '/>' or '/ >', etc return nextSkipWS()=='>' ? MATCH : MISMATCH; } else { break; } } if (escapedTags!=null && escapedTags.contains(sb.toString())){ //if this is a reservedTag, then keep it return MISMATCH; } // After the tag id, there needs to be either whitespace or // '>' if ( !(ch=='>' || isSpace(ch)) ) { return MISMATCH; } if (ch!='>') { // process attributes while ((numRead - lastMark) < safeReadAheadLimit) { ch=next(); if (isSpace(ch)) { continue; } else if (isFirstIdChar(ch)) { push(ch); int ret = readAttr2(); if (ret==MISMATCH) return ret; } else if (ch=='/') { // read end tag '/>' or '/ >', etc return nextSkipWS()=='>' ? MATCH : MISMATCH; } else if (ch=='>') { break; } else { return MISMATCH; } } if ((numRead - lastMark) >= safeReadAheadLimit){ return MISMATCH;//exit out if we exceeded the buffer } } // We only get to this point after we have read the // entire tag. Now let's see if it's a special tag. String name=sb.toString(); if (name.equalsIgnoreCase("script") || name.equalsIgnoreCase("style")) { // The content of script and style elements is // CDATA in HTML 4 but PCDATA in XHTML. /* From HTML4: Although the STYLE and SCRIPT elements use CDATA for their data model, for these elements, CDATA must be handled differently by user agents. Markup and entities must be treated as raw text and passed to the application as is. The first occurrence of the character sequence "</" (end-tag open delimiter) is treated as terminating the end of the element's content. In valid documents, this would be the end tag for the element. */ // discard everything until endtag is hit (except // if it occurs in a comment. // reset the stream mark to here, since we know that we successfully matched // a tag, and if we can't find the end tag, this is where we will want // to roll back to. saveState(); pushed.setLength(0); return findEndTag(); } return MATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
int findEndTag() throws IOException { while ((numRead - lastMark) < safeReadAheadLimit) { int ch = next(); if (ch=='<') { ch = next(); // skip looking for end-tag in comments if (ch=='!') { int ret = readBang(true); if (ret==MATCH) continue; // yikes... what now? It wasn't a comment, but I can't get // back to the state I was at. Just continue from where I // am I guess... continue; } // did we match "</" if (ch!='/') { push(ch); continue; } int ret = readName(false); if (ret==MISMATCH) return MISMATCH; ch=nextSkipWS(); if (ch!='>') return MISMATCH; return MATCH; } else if (ch=='\'' || ch=='"') { // read javascript string to avoid a false match. push(ch); int ret = readScriptString(); // what to do about a non-match (non-terminated string?) // play it safe and index the rest of the data I guess... if (ret==MISMATCH) return MISMATCH; } else if (ch<0) { return MISMATCH; } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readScriptString() throws IOException { int quoteChar = next(); if (quoteChar!='\'' && quoteChar!='"') return MISMATCH; while((numRead - lastMark) < safeReadAheadLimit) { int ch = next(); if (ch==quoteChar) return MATCH; else if (ch=='\\') { ch=next(); } else if (ch<0) { return MISMATCH; } else if (ch=='<') { eatSSI(); } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readName(boolean checkEscaped) throws IOException { StringBuilder builder = (checkEscaped && escapedTags!=null) ? new StringBuilder() : null; int ch = next(); if (builder!=null) builder.append((char)ch); if (!isFirstIdChar(ch)) return MISMATCH; ch = next(); if (builder!=null) builder.append((char)ch); while(isIdChar(ch)) { ch=next(); if (builder!=null) builder.append((char)ch); } if (ch!=-1) { push(ch); } //strip off the trailing > if (builder!=null && escapedTags.contains(builder.substring(0, builder.length() - 1))){ return MISMATCH; } return MATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readAttr2() throws IOException { if ((numRead - lastMark < safeReadAheadLimit)) { int ch = next(); if (!isFirstIdChar(ch)) return MISMATCH; ch = next(); while(isIdChar(ch) && ((numRead - lastMark) < safeReadAheadLimit)){ ch=next(); } if (isSpace(ch)) ch = nextSkipWS(); // attributes may not have a value at all! // if (ch != '=') return MISMATCH; if (ch != '=') { push(ch); return MATCH; } int quoteChar = nextSkipWS(); if (quoteChar=='"' || quoteChar=='\'') { while ((numRead - lastMark) < safeReadAheadLimit) { ch = next(); if (ch<0) return MISMATCH; else if (ch=='<') { eatSSI(); } else if (ch==quoteChar) { return MATCH; //} else if (ch=='<') { // return MISMATCH; } } } else { // unquoted attribute while ((numRead - lastMark) < safeReadAheadLimit) { ch = next(); if (ch<0) return MISMATCH; else if (isSpace(ch)) { push(ch); return MATCH; } else if (ch=='>') { push(ch); return MATCH; } else if (ch=='<') { eatSSI(); } } } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int eatSSI() throws IOException { // at this point, only a "<" was read. // on a mismatch, push back the last char so that if it was // a quote that closes the attribute, it will be re-read and matched. int ch = next(); if (ch!='!') { push(ch); return MISMATCH; } ch=next(); if (ch!='-') { push(ch); return MISMATCH; } ch=next(); if (ch!='-') { push(ch); return MISMATCH; } ch=next(); if (ch!='#') { push(ch); return MISMATCH; } push('#'); push('-'); push('-'); return readComment(false); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
private int readProcessingInstruction() throws IOException { // "<?" has already been read while ((numRead - lastMark) < safeReadAheadLimit) { int ch = next(); if (ch=='?' && peek()=='>') { next(); return MATCH; } else if (ch==-1) { return MISMATCH; } } return MISMATCH; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
@Override public int read() throws IOException { // TODO: Do we ever want to preserve CDATA sections? // where do we have to worry about them? // <![ CDATA [ unescaped markup ]]> if (numWhitespace > 0){ numEaten += numWhitespace; addOffCorrectMap(numReturned, numEaten); numWhitespace = 0; } numReturned++; //do not limit this one by the READAHEAD while(true) { int lastNumRead = numRead; int ch = next(); switch (ch) { case '&': saveState(); ch = readEntity(); if (ch>=0) return ch; if (ch==MISMATCH) { restoreState(); return '&'; } break; case '<': saveState(); ch = next(); int ret = MISMATCH; if (ch=='!') { ret = readBang(false); } else if (ch=='/') { ret = readName(true); if (ret==MATCH) { ch=nextSkipWS(); ret= ch=='>' ? MATCH : MISMATCH; } } else if (isAlpha(ch)) { push(ch); ret = readTag(); } else if (ch=='?') { ret = readProcessingInstruction(); } // matched something to be discarded, so break // from this case and continue in the loop if (ret==MATCH) { //break;//was //return whitespace from numWhitespace = (numRead - lastNumRead) - 1;//tack on the -1 since we are returning a space right now return ' '; } // didn't match any HTML constructs, so roll back // the stream state and just return '<' restoreState(); return '<'; default: return ch; } } }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
@Override public int read(char cbuf[], int off, int len) throws IOException { int i=0; for (i=0; i<len; i++) { int ch = read(); if (ch==-1) break; cbuf[off++] = (char)ch; } if (i==0) { if (len==0) return 0; return -1; } return i; }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
@Override public void close() throws IOException { input.close(); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { String path = request.getPath(); if( path == null || !path.startsWith( "/" ) ) { path = "/select"; } // Check for cores action SolrCore core = coreContainer.getCore( coreName ); if( core == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName ); } SolrParams params = request.getParams(); if( params == null ) { params = new ModifiableSolrParams(); } // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } // Perhaps the path is to manage the cores if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) { handler = coreContainer.getMultiCoreHandler(); } } if( handler == null ) { core.close(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } SolrQueryRequest req = null; try { req = _parser.buildRequestFrom( core, params, request.getContentStreams() ); req.getContext().put( "path", path ); SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); core.execute( handler, req, rsp ); if( rsp.getException() != null ) { if(rsp.getException() instanceof SolrException) { throw rsp.getException(); } throw new SolrServerException( rsp.getException() ); } // Check if this should stream results if( request.getStreamingResponseCallback() != null ) { try { final StreamingResponseCallback callback = request.getStreamingResponseCallback(); BinaryResponseWriter.Resolver resolver = new BinaryResponseWriter.Resolver( req, rsp.getReturnFields()) { @Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { // write an empty list... SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); // This will transform writeResultsBody( ctx, codec ); } }; ByteArrayOutputStream out = new ByteArrayOutputStream(); new JavaBinCodec(resolver) { @Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); } @Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); } }.marshal(rsp.getValues(), out); InputStream in = new ByteArrayInputStream(out.toByteArray()); return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in); } catch (Exception ex) { throw new RuntimeException(ex); } } // Now write it out NamedList<Object> normalized = getParsedResponse(req, rsp); return normalized; } catch( IOException iox ) { throw iox; } catch( SolrException sx ) { throw sx; } catch( Exception ex ) { throw new SolrServerException( ex ); } finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); } }
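The catch ladder closing request() is a textbook catch-rethrow site: IOException and the domain SolrException pass through untouched, preserving type and error code, while anything else is wrapped in SolrServerException so callers face a single checked type. A condensed sketch of the same ladder (GuardedExecute and executeGuarded are illustrative names):

    import java.io.IOException;
    import java.util.concurrent.Callable;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.common.SolrException;

    final class GuardedExecute {
      static <T> T executeGuarded(Callable<T> body) throws SolrServerException, IOException {
        try {
          return body.call();
        } catch (IOException iox) {
          throw iox;                         // declared I/O failure: rethrow as-is
        } catch (SolrException sx) {
          throw sx;                          // domain exception: error code preserved
        } catch (Exception ex) {
          throw new SolrServerException(ex); // wrap anything unexpected
        }
      }
    }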
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { // write an empty list... SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); // This will transform writeResultsBody( ctx, codec ); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
@Override public void service(HttpServletRequest req, HttpServletResponse res) throws IOException { res.sendError(404, "Can not find: " + req.getRequestURI()); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void close() throws IOException { writer.flushBuffer(); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void indent() throws IOException { if (doIndent) indent(level); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void indent(int lev) throws IOException { writer.write(indentChars, 0, Math.min((lev<<1)+1, indentChars.length)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public final void writeVal(String name, Object val) throws IOException { // if there get to be enough types, perhaps hashing on the type // to get a handler might be faster (but types must be exact to do that...) // go in order of most common to least common if (val==null) { writeNull(name); } else if (val instanceof String) { writeStr(name, val.toString(), true); // micro-optimization... using toString() avoids a cast first } else if (val instanceof IndexableField) { IndexableField f = (IndexableField)val; SchemaField sf = schema.getFieldOrNull( f.name() ); if( sf != null ) { sf.getType().write(this, name, f); } else { writeStr(name, f.stringValue(), true); } } else if (val instanceof Number) { if (val instanceof Integer) { writeInt(name, val.toString()); } else if (val instanceof Long) { writeLong(name, val.toString()); } else if (val instanceof Float) { // we pass the float instead of using toString() because // it may need special formatting. same for double. writeFloat(name, ((Float)val).floatValue()); } else if (val instanceof Double) { writeDouble(name, ((Double)val).doubleValue()); } else if (val instanceof Short) { writeInt(name, val.toString()); } else if (val instanceof Byte) { writeInt(name, val.toString()); } else { // default... for debugging only writeStr(name, val.getClass().getName() + ':' + val.toString(), true); } } else if (val instanceof Boolean) { writeBool(name, val.toString()); } else if (val instanceof Date) { writeDate(name,(Date)val); } else if (val instanceof Document) { SolrDocument doc = toSolrDocument( (Document)val ); DocTransformer transformer = returnFields.getTransformer(); if( transformer != null ) { TransformContext context = new TransformContext(); context.req = req; transformer.setContext(context); transformer.transform(doc, -1); } writeSolrDocument(name, doc, returnFields, 0 ); } else if (val instanceof SolrDocument) { writeSolrDocument(name, (SolrDocument)val, returnFields, 0); } else if (val instanceof ResultContext) { // requires access to IndexReader writeDocuments(name, (ResultContext)val, returnFields); } else if (val instanceof DocList) { // Should not happen normally ResultContext ctx = new ResultContext(); ctx.docs = (DocList)val; writeDocuments(name, ctx, returnFields); // } // else if (val instanceof DocSet) { // how do we know what fields to read? // todo: have a DocList/DocSet wrapper that // restricts the fields to write...? } else if (val instanceof SolrDocumentList) { writeSolrDocumentList(name, (SolrDocumentList)val, returnFields); } else if (val instanceof Map) { writeMap(name, (Map)val, false, true); } else if (val instanceof NamedList) { writeNamedList(name, (NamedList)val); } else if (val instanceof Iterable) { writeArray(name,((Iterable)val).iterator()); } else if (val instanceof Object[]) { writeArray(name,(Object[])val); } else if (val instanceof Iterator) { writeArray(name,(Iterator)val); } else if (val instanceof byte[]) { byte[] arr = (byte[])val; writeByteArr(name, arr, 0, arr.length); } else if (val instanceof BytesRef) { BytesRef arr = (BytesRef)val; writeByteArr(name, arr.bytes, arr.offset, arr.length); } else { // default... for debugging only writeStr(name, val.getClass().getName() + ':' + val.toString(), true); } }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public final void writeSolrDocumentList(String name, SolrDocumentList docs, ReturnFields returnFields) throws IOException { writeStartDocumentList(name, docs.getStart(), docs.size(), docs.getNumFound(), docs.getMaxScore() ); for( int i=0; i<docs.size(); i++ ) { writeSolrDocument( null, docs.get(i), returnFields, i ); } writeEndDocumentList(); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public final void writeDocuments(String name, ResultContext res, ReturnFields fields ) throws IOException { DocList ids = res.docs; TransformContext context = new TransformContext(); context.query = res.query; context.wantsScores = fields.wantsScore() && ids.hasScores(); context.req = req; writeStartDocumentList(name, ids.offset(), ids.size(), ids.matches(), context.wantsScores ? new Float(ids.maxScore()) : null ); DocTransformer transformer = fields.getTransformer(); context.searcher = req.getSearcher(); context.iterator = ids.iterator(); if( transformer != null ) { transformer.setContext( context ); } int sz = ids.size(); Set<String> fnames = fields.getLuceneFieldNames(); for (int i=0; i<sz; i++) { int id = context.iterator.nextDoc(); Document doc = context.searcher.doc(id, fnames); SolrDocument sdoc = toSolrDocument( doc ); if( transformer != null ) { transformer.transform( sdoc, id); } writeSolrDocument( null, sdoc, returnFields, i ); } if( transformer != null ) { transformer.setContext( null ); } writeEndDocumentList(); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeArray(String name, Object[] val) throws IOException { writeArray(name, Arrays.asList(val).iterator()); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeInt(String name, int val) throws IOException { writeInt(name,Integer.toString(val)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeLong(String name, long val) throws IOException { writeLong(name,Long.toString(val)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeBool(String name, boolean val) throws IOException { writeBool(name,Boolean.toString(val)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeFloat(String name, float val) throws IOException { String s = Float.toString(val); // If it's not a normal number, write the value as a string instead. // The following test also handles NaN since comparisons are always false. if (val > Float.NEGATIVE_INFINITY && val < Float.POSITIVE_INFINITY) { writeFloat(name,s); } else { writeStr(name,s,false); } }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeDouble(String name, double val) throws IOException { String s = Double.toString(val); // If it's not a normal number, write the value as a string instead. // The following test also handles NaN since comparisons are always false. if (val > Double.NEGATIVE_INFINITY && val < Double.POSITIVE_INFINITY) { writeDouble(name,s); } else { writeStr(name,s,false); } }
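The single range test in writeFloat and writeDouble above excludes NaN as well as both infinities, because every comparison involving NaN evaluates to false. A tiny demonstration:

    public final class FiniteCheck {
      static boolean isNormalNumber(double val) {
        // NaN fails this test just like the two infinities do
        return val > Double.NEGATIVE_INFINITY && val < Double.POSITIVE_INFINITY;
      }

      public static void main(String[] args) {
        System.out.println(isNormalNumber(1.5));                      // true
        System.out.println(isNormalNumber(Double.NaN));               // false
        System.out.println(isNormalNumber(Double.POSITIVE_INFINITY)); // false
      }
    }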
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeDate(String name, Date val) throws IOException { writeDate(name, DateField.formatExternal(val)); }
// in core/src/java/org/apache/solr/response/TextResponseWriter.java
public void writeByteArr(String name, byte[] buf, int offset, int len) throws IOException { writeStr(name, Base64.byteArrayToBase64(buf, offset, len), false); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { JSONWriter w = new JSONWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeResponse() throws IOException { if(wrapperFunction!=null) { writer.write(wrapperFunction + "("); } Boolean omitHeader = req.getParams().getBool(CommonParams.OMIT_HEADER); if(omitHeader != null && omitHeader) rsp.getValues().remove("responseHeader"); writeNamedList(null, rsp.getValues()); if(wrapperFunction!=null) { writer.write(')'); } if (doIndent) writer.write('\n'); // ending with a newline looks much better from the command line }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeKey(String fname, boolean needsEscaping) throws IOException { writeStr(null, fname, needsEscaping); writer.write(':'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsMapMangled(String name, NamedList val) throws IOException { int sz = val.size(); writeMapOpener(sz); incLevel(); // In JSON objects (maps) we can't have null keys or duplicates... // map null to "" and append a qualifier to duplicates. // // a=123,a=456 will be mapped to {a=1,a__1=456} // Disad: this is ambiguous since a real key could be called a__1 // // Another possible mapping could aggregate multiple keys to an array: // a=123,a=456 maps to a=[123,456] // Disad: this is ambiguous with a real single value that happens to be an array // // Both of these mappings have ambiguities. HashMap<String,Integer> repeats = new HashMap<String,Integer>(4); boolean first=true; for (int i=0; i<sz; i++) { String key = val.getName(i); if (key==null) key=""; if (first) { first=false; repeats.put(key,0); } else { writeMapSeparator(); Integer repeatCount = repeats.get(key); if (repeatCount==null) { repeats.put(key,0); } else { String newKey = key; int newCount = repeatCount; do { // avoid generated key clashing with a real key newKey = key + ' ' + (++newCount); repeatCount = repeats.get(newKey); } while (repeatCount != null); repeats.put(key,newCount); key = newKey; } } indent(); writeKey(key, true); writeVal(key,val.getVal(i)); } decLevel(); writeMapCloser(); }
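Note that the comment in the method appears to predate the code: duplicate keys are not mangled to a__1 but to the key plus a space and a counter (key + ' ' + (++newCount)), so a=123,a=456 comes out as {"a":123,"a 1":2}-style output, i.e. {"a":123,"a 1":456}. A standalone sketch of the probing loop (illustrative only):

    import java.util.*;

    public class KeyMangleSketch {
      public static void main(String[] args) {
        String[] keys = {"a", "a", "b"};   // keys as they appear in the NamedList
        Map<String,Integer> repeats = new HashMap<String,Integer>();
        for (String key : keys) {
          Integer repeatCount = repeats.get(key);
          if (repeatCount == null) {
            repeats.put(key, 0);           // first occurrence keeps its key
          } else {
            String newKey = key;
            int newCount = repeatCount;
            do {                           // probe until the generated key is unused
              newKey = key + ' ' + (++newCount);
              repeatCount = repeats.get(newKey);
            } while (repeatCount != null);
            repeats.put(key, newCount);
            key = newKey;
          }
          System.out.println(key);         // prints: a, then "a 1", then b
        }
      }
    }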
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsMapWithDups(String name, NamedList val) throws IOException { int sz = val.size(); writeMapOpener(sz); incLevel(); for (int i=0; i<sz; i++) { if (i!=0) { writeMapSeparator(); } String key = val.getName(i); if (key==null) key=""; indent(); writeKey(key, true); writeVal(key,val.getVal(i)); } decLevel(); writeMapCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsArrMap(String name, NamedList val) throws IOException { int sz = val.size(); indent(); writeArrayOpener(sz); incLevel(); boolean first=true; for (int i=0; i<sz; i++) { String key = val.getName(i); if (first) { first=false; } else { writeArraySeparator(); } indent(); if (key==null) { writeVal(null,val.getVal(i)); } else { writeMapOpener(1); writeKey(key, true); writeVal(key,val.getVal(i)); writeMapCloser(); } } decLevel(); writeArrayCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsArrArr(String name, NamedList val) throws IOException { int sz = val.size(); indent(); writeArrayOpener(sz); incLevel(); boolean first=true; for (int i=0; i<sz; i++) { String key = val.getName(i); if (first) { first=false; } else { writeArraySeparator(); } indent(); /*** if key is null, just write value??? if (key==null) { writeVal(null,val.getVal(i)); } else { ***/ writeArrayOpener(1); incLevel(); if (key==null) { writeNull(null); } else { writeStr(null, key, true); } writeArraySeparator(); writeVal(key,val.getVal(i)); decLevel(); writeArrayCloser(); } decLevel(); writeArrayCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected void writeNamedListAsFlat(String name, NamedList val) throws IOException { int sz = val.size(); writeArrayOpener(sz); incLevel(); for (int i=0; i<sz; i++) { if (i!=0) { writeArraySeparator(); } String key = val.getName(i); indent(); if (key==null) { writeNull(null); } else { writeStr(null, key, true); } writeArraySeparator(); writeVal(key, val.getVal(i)); } decLevel(); writeArrayCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeNamedList(String name, NamedList val) throws IOException { if (val instanceof SimpleOrderedMap) { writeNamedListAsMapWithDups(name,val); } else if (namedListStyle==JSON_NL_FLAT) { writeNamedListAsFlat(name,val); } else if (namedListStyle==JSON_NL_MAP){ writeNamedListAsMapWithDups(name,val); } else if (namedListStyle==JSON_NL_ARROFARR) { writeNamedListAsArrArr(name,val); } else if (namedListStyle==JSON_NL_ARROFMAP) { writeNamedListAsArrMap(name,val); } }
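The namedListStyle dispatch above corresponds to Solr's json.nl request parameter (a SimpleOrderedMap is always written as a plain map, whatever the style). For a NamedList such as [foo=1, bar=2], the four styles render roughly as:

    json.nl=map     {"foo":1,"bar":2}        (writeNamedListAsMapWithDups)
    json.nl=flat    ["foo",1,"bar",2]        (writeNamedListAsFlat)
    json.nl=arrarr  [["foo",1],["bar",2]]    (writeNamedListAsArrArr)
    json.nl=arrmap  [{"foo":1},{"bar":2}]    (writeNamedListAsArrMap)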
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeSolrDocument(String name, SolrDocument doc, ReturnFields returnFields, int idx) throws IOException { if( idx > 0 ) { writeArraySeparator(); } indent(); writeMapOpener(doc.size()); incLevel(); boolean first=true; for (String fname : doc.getFieldNames()) { if (!returnFields.wantsField(fname)) { continue; } if (first) { first=false; } else { writeMapSeparator(); } indent(); writeKey(fname, true); Object val = doc.getFieldValue(fname); if (val instanceof Collection) { writeVal(fname, val); } else { // if multivalued field, write single value as an array SchemaField sf = schema.getFieldOrNull(fname); if (sf != null && sf.multiValued()) { writeArrayOpener(-1); // no trivial way to determine array size writeVal(fname, val); writeArrayCloser(); } else { writeVal(fname, val); } } } decLevel(); writeMapCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeStartDocumentList(String name, long start, int size, long numFound, Float maxScore) throws IOException { writeMapOpener((maxScore==null) ? 3 : 4); incLevel(); writeKey("numFound",false); writeLong(null,numFound); writeMapSeparator(); writeKey("start",false); writeLong(null,start); if (maxScore!=null) { writeMapSeparator(); writeKey("maxScore",false); writeFloat(null,maxScore); } writeMapSeparator(); // indent(); writeKey("docs",false); writeArrayOpener(size); incLevel(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeEndDocumentList() throws IOException { decLevel(); writeArrayCloser(); decLevel(); indent(); writeMapCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeMapOpener(int size) throws IOException, IllegalArgumentException { writer.write('{'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeMapSeparator() throws IOException { writer.write(','); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeMapCloser() throws IOException { writer.write('}'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeArrayOpener(int size) throws IOException, IllegalArgumentException { writer.write('['); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeArraySeparator() throws IOException { writer.write(','); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeArrayCloser() throws IOException { writer.write(']'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { // it might be more efficient to use a stringbuilder or write substrings // if writing chars to the stream is slow. if (needsEscaping) { /* http://www.ietf.org/internet-drafts/draft-crockford-jsonorg-json-04.txt All Unicode characters may be placed within the quotation marks except for the characters which must be escaped: quotation mark, reverse solidus, and the control characters (U+0000 through U+001F). */ writer.write('"'); for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); if ((ch > '#' && ch != '\\' && ch < '\u2028') || ch == ' ') { // fast path writer.write(ch); continue; } switch(ch) { case '"': case '\\': writer.write('\\'); writer.write(ch); break; case '\r': writer.write('\\'); writer.write('r'); break; case '\n': writer.write('\\'); writer.write('n'); break; case '\t': writer.write('\\'); writer.write('t'); break; case '\b': writer.write('\\'); writer.write('b'); break; case '\f': writer.write('\\'); writer.write('f'); break; case '\u2028': // fallthrough case '\u2029': unicodeEscape(writer,ch); break; // case '/': default: { if (ch <= 0x1F) { unicodeEscape(writer,ch); } else { writer.write(ch); } } } } writer.write('"'); } else { writer.write('"'); writer.write(val); writer.write('"'); } }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeMap(String name, Map val, boolean excludeOuter, boolean isFirstVal) throws IOException { if (!excludeOuter) { writeMapOpener(val.size()); incLevel(); isFirstVal=true; } boolean doIndent = excludeOuter || val.size() > 1; for (Map.Entry entry : (Set<Map.Entry>)val.entrySet()) { Object e = entry.getKey(); String k = e==null ? "" : e.toString(); Object v = entry.getValue(); if (isFirstVal) { isFirstVal=false; } else { writeMapSeparator(); } if (doIndent) indent(); writeKey(k,true); writeVal(k,v); } if (!excludeOuter) { decLevel(); writeMapCloser(); } }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeArray(String name, Iterator val) throws IOException { writeArrayOpener(-1); // no trivial way to determine array size incLevel(); boolean first=true; while( val.hasNext() ) { if( !first ) indent(); writeVal(null, val.next()); if( val.hasNext() ) { writeArraySeparator(); } first=false; } decLevel(); writeArrayCloser(); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeNull(String name) throws IOException { writer.write("null"); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeInt(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeLong(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeBool(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeFloat(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeDouble(String name, String val) throws IOException { writer.write(val); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeDate(String name, String val) throws IOException { writeStr(name, val, false); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
protected static void unicodeEscape(Appendable out, int ch) throws IOException { out.append('\\'); out.append('u'); out.append(hexdigits[(ch>>>12) ]); out.append(hexdigits[(ch>>>8) & 0xf]); out.append(hexdigits[(ch>>>4) & 0xf]); out.append(hexdigits[(ch) & 0xf]); }
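unicodeEscape emits a six-character \uXXXX escape by peeling off one hex nibble at a time, high to low. A self-contained sketch (assuming hexdigits is the usual lowercase hex alphabet, which the indexing implies):

    public class UnicodeEscapeSketch {
      private static final char[] hexdigits = "0123456789abcdef".toCharArray(); // assumed alphabet

      static void unicodeEscape(Appendable out, int ch) throws java.io.IOException {
        out.append('\\').append('u');
        out.append(hexdigits[(ch >>> 12)]);      // highest nibble first
        out.append(hexdigits[(ch >>> 8) & 0xf]);
        out.append(hexdigits[(ch >>> 4) & 0xf]);
        out.append(hexdigits[ch & 0xf]);
      }

      public static void main(String[] args) throws java.io.IOException {
        StringBuilder sb = new StringBuilder();
        unicodeEscape(sb, '\u2028');   // line separator: legal in JSON, fatal in JavaScript eval
        unicodeEscape(sb, 0x1f);       // a control character that JSON requires escaping
        System.out.println(sb);        // prints \u2028\u001f
      }
    }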
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeFloat(String name, float val) throws IOException { if (Float.isNaN(val)) { writer.write(getNaN()); } else if (Float.isInfinite(val)) { if (val < 0.0f) writer.write('-'); writer.write(getInf()); } else { writeFloat(name, Float.toString(val)); } }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
@Override public void writeDouble(String name, double val) throws IOException { if (Double.isNaN(val)) { writer.write(getNaN()); } else if (Double.isInfinite(val)) { if (val < 0.0) writer.write('-'); writer.write(getInf()); } else { writeDouble(name, Double.toString(val)); } }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { PHPWriter w = new PHPWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
@Override public void writeNamedList(String name, NamedList val) throws IOException { writeNamedListAsMapMangled(name,val); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
@Override public void writeMapOpener(int size) throws IOException { writer.write("array("); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
@Override public void writeMapCloser() throws IOException { writer.write(')'); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
@Override public void writeArrayOpener(int size) throws IOException { writer.write("array("); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
@Override public void writeArrayCloser() throws IOException { writer.write(')'); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
@Override public void writeNull(String name) throws IOException { writer.write("null"); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
@Override protected void writeKey(String fname, boolean needsEscaping) throws IOException { writeStr(null, fname, needsEscaping); writer.write('='); writer.write('>'); }
// in core/src/java/org/apache/solr/response/PHPResponseWriter.java
@Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { if (needsEscaping) { writer.write('\''); for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); switch (ch) { case '\'': case '\\': writer.write('\\'); writer.write(ch); break; default: writer.write(ch); } } writer.write('\''); } else { writer.write('\''); writer.write(val); writer.write('\''); } }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { CSVWriter w = new CSVWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void freeze() throws IOException { if (cw.size() > 0) { flush(); result = cw.getInternalBuf(); resultLen = cw.size(); } else { result = buf; resultLen = pos; } }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void writeResponse() throws IOException { SolrParams params = req.getParams(); strategy = new CSVStrategy(',', '"', CSVStrategy.COMMENTS_DISABLED, CSVStrategy.ESCAPE_DISABLED, false, false, false, true); CSVStrategy strat = strategy; String sep = params.get(CSV_SEPARATOR); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } String nl = params.get(CSV_NEWLINE); if (nl!=null) { if (nl.length()==0) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid newline:'"+nl+"'"); strat.setPrinterNewline(nl); } String encapsulator = params.get(CSV_ENCAPSULATOR); String escape = params.get(CSV_ESCAPE); if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); } if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); if (encapsulator == null) { strat.setEncapsulator( CSVStrategy.ENCAPSULATOR_DISABLED); } } if (strat.getEscape() == '\\') { // If the escape is the standard backslash, then also enable // unicode escapes (it's harmless since 'u' would not otherwise // be escaped. strat.setUnicodeEscapeInterpretation(true); } printer = new CSVPrinter(writer, strategy); CSVStrategy mvStrategy = new CSVStrategy(strategy.getDelimiter(), CSVStrategy.ENCAPSULATOR_DISABLED, CSVStrategy.COMMENTS_DISABLED, '\\', false, false, false, false); strat = mvStrategy; sep = params.get(MV_SEPARATOR); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } encapsulator = params.get(MV_ENCAPSULATOR); escape = params.get(MV_ESCAPE); if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); if (escape == null) { strat.setEscape(CSVStrategy.ESCAPE_DISABLED); } } escape = params.get(MV_ESCAPE); if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); // encapsulator will already be disabled if it wasn't specified } Collection<String> fields = returnFields.getLuceneFieldNames(); Object responseObj = rsp.getValues().get("response"); boolean returnOnlyStored = false; if (fields==null) { if (responseObj instanceof SolrDocumentList) { // get the list of fields from the SolrDocumentList fields = new LinkedHashSet<String>(); for (SolrDocument sdoc: (SolrDocumentList)responseObj) { fields.addAll(sdoc.getFieldNames()); } } else { // get the list of fields from the index fields = req.getSearcher().getFieldNames(); } if (returnFields.wantsScore()) { fields.add("score"); } else { fields.remove("score"); } returnOnlyStored = true; } CSVSharedBufPrinter csvPrinterMV = new CSVSharedBufPrinter(mvWriter, mvStrategy); for (String field : fields) { if (!returnFields.wantsField(field)) { continue; } if (field.equals("score")) { CSVField csvField = new CSVField(); csvField.name = "score"; csvFields.put("score", csvField); continue; } SchemaField sf = schema.getFieldOrNull(field); if (sf == null) { FieldType ft = new StrField(); sf = new SchemaField(field, ft); } // Return only 
// stored fields, unless an explicit field list is specified
if (returnOnlyStored && sf != null && !sf.stored()) { continue; } // check for per-field overrides sep = params.get("f." + field + '.' + CSV_SEPARATOR); encapsulator = params.get("f." + field + '.' + CSV_ENCAPSULATOR); escape = params.get("f." + field + '.' + CSV_ESCAPE); CSVSharedBufPrinter csvPrinter = csvPrinterMV; if (sep != null || encapsulator != null || escape != null) { // create a new strategy + printer if there were any per-field overrides strat = (CSVStrategy)mvStrategy.clone(); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); if (escape == null) { strat.setEscape(CSVStrategy.ESCAPE_DISABLED); } } if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); if (encapsulator == null) { strat.setEncapsulator(CSVStrategy.ENCAPSULATOR_DISABLED); } } csvPrinter = new CSVSharedBufPrinter(mvWriter, strat); } CSVField csvField = new CSVField(); csvField.name = field; csvField.sf = sf; csvField.mvPrinter = csvPrinter; csvFields.put(field, csvField); } NullValue = params.get(CSV_NULL, ""); if (params.getBool(CSV_HEADER, true)) { for (CSVField csvField : csvFields.values()) { printer.print(csvField.name); } printer.println(); } if (responseObj instanceof ResultContext ) { writeDocuments(null, (ResultContext)responseObj, returnFields ); } else if (responseObj instanceof DocList) { ResultContext ctx = new ResultContext(); ctx.docs = (DocList)responseObj; writeDocuments(null, ctx, returnFields ); } else if (responseObj instanceof SolrDocumentList) { writeSolrDocumentList(null, (SolrDocumentList)responseObj, returnFields ); } }
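Every format knob validated above is a request parameter, and the "f." + field + "." prefix seen in the per-field lookup lets individual fields override the multi-value sub-format. Assuming the conventional csv.* parameter names behind the CSV_* constants, a request might look like:

    /select?q=*:*&wt=csv&csv.separator=%09&csv.null=N/A&f.tags.csv.separator=|

(tags is a hypothetical multivalued field; %09 is a URL-encoded tab.)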
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void close() throws IOException { if (printer != null) printer.flush(); super.close(); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeNamedList(String name, NamedList val) throws IOException { }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void writeStartDocumentList(String name, long start, int size, long numFound, Float maxScore) throws IOException { // nothing }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void writeEndDocumentList() throws IOException { // nothing }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeSolrDocument(String name, SolrDocument doc, ReturnFields returnFields, int idx ) throws IOException { if (tmpList == null) { tmpList = new ArrayList(1); tmpList.add(null); } for (CSVField csvField : csvFields.values()) { Object val = doc.getFieldValue(csvField.name); int nVals = val instanceof Collection ? ((Collection)val).size() : (val==null ? 0 : 1); if (nVals == 0) { writeNull(csvField.name); continue; } if ((csvField.sf != null && csvField.sf.multiValued()) || nVals > 1) { Collection values; // normalize to a collection if (val instanceof Collection) { values = (Collection)val; } else { tmpList.set(0, val); values = tmpList; } mvWriter.reset(); csvField.mvPrinter.reset(); // switch the printer to use the multi-valued one CSVPrinter tmp = printer; printer = csvField.mvPrinter; for (Object fval : values) { writeVal(csvField.name, fval); } printer = tmp; // restore the original printer mvWriter.freeze(); printer.print(mvWriter.getFrozenBuf(), 0, mvWriter.getFrozenSize(), true); } else { // normalize to first value if (val instanceof Collection) { Collection values = (Collection)val; val = values.iterator().next(); } writeVal(csvField.name, val); } } printer.println(); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { printer.print(val, needsEscaping); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeMap(String name, Map val, boolean excludeOuter, boolean isFirstVal) throws IOException { }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeArray(String name, Iterator val) throws IOException { }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeNull(String name) throws IOException { printer.print(NullValue); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeInt(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeLong(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeBool(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeFloat(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeDouble(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeDate(String name, Date val) throws IOException { StringBuilder sb = new StringBuilder(25); cal = DateUtil.formatDate(val, cal, sb); writeDate(name, sb.toString()); }
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
@Override public void writeDate(String name, String val) throws IOException { printer.print(val, false); }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { PythonWriter w = new PythonWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
@Override public void writeNull(String name) throws IOException { writer.write("None"); }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
@Override public void writeBool(String name, boolean val) throws IOException { writer.write(val ? "True" : "False"); }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
@Override public void writeBool(String name, String val) throws IOException { writeBool(name,val.charAt(0)=='t'); }
// in core/src/java/org/apache/solr/response/PythonResponseWriter.java
@Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { if (!needsEscaping) { writer.write('\''); writer.write(val); writer.write('\''); return; } // use python unicode strings... // python doesn't tolerate newlines in strings in its eval(), so we must escape them. StringBuilder sb = new StringBuilder(val.length()); boolean needUnicode=false; for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); switch(ch) { case '\'': case '\\': sb.append('\\'); sb.append(ch); break; case '\r': sb.append("\\r"); break; case '\n': sb.append("\\n"); break; case '\t': sb.append("\\t"); break; default: // we don't strictly have to escape these chars, but it will probably increase // portability to stick to visible ascii if (ch<' ' || ch>127) { unicodeEscape(sb, ch); needUnicode=true; } else { sb.append(ch); } } } if (needUnicode) { writer.write('u'); } writer.write('\''); writer.append(sb); writer.write('\''); }
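A worked example of those escaping rules (assuming unicodeEscape produces lowercase \uXXXX sequences): any character outside visible ASCII forces the u prefix, and newlines become \n so the result survives Python's eval():

    writeStr(null, "café\n", true)  ->  u'caf\u00e9\n'
    writeStr(null, "hello", true)   ->  'hello'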
// in core/src/java/org/apache/solr/response/RubyResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { RubyWriter w = new RubyWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/RubyResponseWriter.java
@Override public void writeNull(String name) throws IOException { writer.write("nil"); }
// in core/src/java/org/apache/solr/response/RubyResponseWriter.java
@Override protected void writeKey(String fname, boolean needsEscaping) throws IOException { writeStr(null, fname, needsEscaping); writer.write('='); writer.write('>'); }
// in core/src/java/org/apache/solr/response/RubyResponseWriter.java
@Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { // Ruby doesn't do unicode escapes... so let the servlet container write raw UTF-8 // bytes into the string. // // Use single quoted strings for safety since no evaluation is done within them. // Also, there are very few escapes recognized in a single quoted string, so // only escape the backslash and single quote. writer.write('\''); if (needsEscaping) { for (int i=0; i<val.length(); i++) { char ch = val.charAt(i); if (ch=='\'' || ch=='\\') { writer.write('\\'); } writer.write(ch); } } else { writer.write(val); } writer.write('\''); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { final Transformer t = getTransformer(request); // capture the output of the XMLWriter final CharArrayWriter w = new CharArrayWriter(); XMLWriter.writeResponse(w,request,response); // and write transformed result to our writer final Reader r = new BufferedReader(new CharArrayReader(w.toCharArray())); final StreamSource source = new StreamSource(r); final StreamResult result = new StreamResult(writer); try { t.transform(source, result); } catch(TransformerException te) { final IOException ioe = new IOException("XSLT transformation error"); ioe.initCause(te); throw ioe; } }
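The catch block above uses the pre-Java 6 wrap-and-rethrow idiom: IOException only gained a (String, Throwable) constructor in Java 6, so the cause is attached via initCause. On newer JDKs the same translation collapses to a single constructor call; a minimal equivalent sketch:

    import java.io.IOException;
    import javax.xml.transform.*;
    import javax.xml.transform.stream.*;

    class XsltTranslateSketch {
      // Java 6+ equivalent of the initCause() pattern above (illustrative only)
      static void transformOrWrap(Transformer t, StreamSource src, StreamResult dst) throws IOException {
        try {
          t.transform(src, dst);
        } catch (TransformerException te) {
          throw new IOException("XSLT transformation error", te);
        }
      }
    }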
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
protected Transformer getTransformer(SolrQueryRequest request) throws IOException { final String xslt = request.getParams().get(CommonParams.TR,null); if(xslt==null) { throw new IOException("'" + CommonParams.TR + "' request parameter is required to use the XSLTResponseWriter"); } // not the cleanest way to achieve this SolrConfig solrConfig = request.getCore().getSolrConfig(); // no need to synchronize access to context, right? // Nothing else happens with it at the same time final Map<Object,Object> ctx = request.getContext(); Transformer result = (Transformer)ctx.get(CONTEXT_TRANSFORMER_KEY); if(result==null) { result = TransformerProvider.instance.getTransformer(solrConfig, xslt,xsltCacheLifetimeSeconds.intValue()); result.setErrorListener(xmllog); ctx.put(CONTEXT_TRANSFORMER_KEY,result); } return result; }
// in core/src/java/org/apache/solr/response/XMLWriter.java
public static void writeResponse(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { XMLWriter xmlWriter = null; try { xmlWriter = new XMLWriter(writer, req, rsp); xmlWriter.writeResponse(); } finally { xmlWriter.close(); } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
public void writeResponse() throws IOException { writer.write(XML_START1); String stylesheet = req.getParams().get("stylesheet"); if (stylesheet != null && stylesheet.length() > 0) { writer.write(XML_STYLESHEET); XML.escapeAttributeValue(stylesheet, writer); writer.write(XML_STYLESHEET_END); } /*** String noSchema = req.getParams().get("noSchema"); // todo - change when schema becomes available? if (false && noSchema == null) writer.write(XML_START2_SCHEMA); else writer.write(XML_START2_NOSCHEMA); ***/ writer.write(XML_START2_NOSCHEMA); // dump response values NamedList<?> lst = rsp.getValues(); Boolean omitHeader = req.getParams().getBool(CommonParams.OMIT_HEADER); if(omitHeader != null && omitHeader) lst.remove("responseHeader"); int sz = lst.size(); int start=0; for (int i=start; i<sz; i++) { writeVal(lst.getName(i),lst.getVal(i)); } writer.write("\n</response>\n"); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
private void writeAttr(String name, String val) throws IOException { writeAttr(name, val, true); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
public void writeAttr(String name, String val, boolean escape) throws IOException{ if (val != null) { writer.write(' '); writer.write(name); writer.write("=\""); if(escape){ XML.escapeAttributeValue(val, writer); } else { writer.write(val); } writer.write('"'); } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
void startTag(String tag, String name, boolean closeTag) throws IOException { if (doIndent) indent(); writer.write('<'); writer.write(tag); if (name!=null) { writeAttr("name", name); if (closeTag) { writer.write("/>"); } else { writer.write(">"); } } else { if (closeTag) { writer.write("/>"); } else { writer.write('>'); } } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeStartDocumentList(String name, long start, int size, long numFound, Float maxScore) throws IOException { if (doIndent) indent(); writer.write("<result"); writeAttr("name",name); writeAttr("numFound",Long.toString(numFound)); writeAttr("start",Long.toString(start)); if(maxScore!=null) { writeAttr("maxScore",Float.toString(maxScore)); } writer.write(">"); incLevel(); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeSolrDocument(String name, SolrDocument doc, ReturnFields returnFields, int idx ) throws IOException { startTag("doc", name, false); incLevel(); for (String fname : doc.getFieldNames()) { if (!returnFields.wantsField(fname)) { continue; } Object val = doc.getFieldValue(fname); if( "_explain_".equals( fname ) ) { System.out.println( val ); } writeVal(fname, val); } decLevel(); writer.write("</doc>"); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeEndDocumentList() throws IOException { decLevel(); if (doIndent) indent(); writer.write("</result>"); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeNamedList(String name, NamedList val) throws IOException { int sz = val.size(); startTag("lst", name, sz<=0); incLevel(); for (int i=0; i<sz; i++) { writeVal(val.getName(i),val.getVal(i)); } decLevel(); if (sz > 0) { if (doIndent) indent(); writer.write("</lst>"); } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeMap(String name, Map map, boolean excludeOuter, boolean isFirstVal) throws IOException { int sz = map.size(); if (!excludeOuter) { startTag("lst", name, sz<=0); incLevel(); } for (Map.Entry entry : (Set<Map.Entry>)map.entrySet()) { Object k = entry.getKey(); Object v = entry.getValue(); // if (sz<indentThreshold) indent(); writeVal( null == k ? null : k.toString(), v); } if (!excludeOuter) { decLevel(); if (sz > 0) { if (doIndent) indent(); writer.write("</lst>"); } } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeArray(String name, Object[] val) throws IOException { writeArray(name, Arrays.asList(val).iterator()); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeArray(String name, Iterator iter) throws IOException { if( iter.hasNext() ) { startTag("arr", name, false ); incLevel(); while( iter.hasNext() ) { writeVal(null, iter.next()); } decLevel(); if (doIndent) indent(); writer.write("</arr>"); } else { startTag("arr", name, true ); } }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeNull(String name) throws IOException { writePrim("null",name,"",false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeStr(String name, String val, boolean escape) throws IOException { writePrim("str",name,val,escape); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeInt(String name, String val) throws IOException { writePrim("int",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeLong(String name, String val) throws IOException { writePrim("long",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeBool(String name, String val) throws IOException { writePrim("bool",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeFloat(String name, String val) throws IOException { writePrim("float",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeFloat(String name, float val) throws IOException { writeFloat(name,Float.toString(val)); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeDouble(String name, String val) throws IOException { writePrim("double",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeDouble(String name, double val) throws IOException { writeDouble(name,Double.toString(val)); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
@Override public void writeDate(String name, String val) throws IOException { writePrim("date",name,val,false); }
// in core/src/java/org/apache/solr/response/XMLWriter.java
private void writePrim(String tag, String name, String val, boolean escape) throws IOException { int contentLen = val==null ? 0 : val.length(); startTag(tag, name, contentLen==0); if (contentLen==0) return; if (escape) { XML.escapeCharData(val,writer); } else { writer.write(val,0,contentLen); } writer.write('<'); writer.write('/'); writer.write(tag); writer.write('>'); }
// in core/src/java/org/apache/solr/response/XMLResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { XMLWriter w = new XMLWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/transform/DocTransformers.java
@Override public void transform(SolrDocument doc, int docid) throws IOException { for( DocTransformer a : children ) { a.transform( doc, docid); } }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public void write(OutputStream out, SolrQueryRequest req, SolrQueryResponse response) throws IOException { Resolver resolver = new Resolver(req, response.getReturnFields()); Boolean omitHeader = req.getParams().getBool(CommonParams.OMIT_HEADER); if (omitHeader != null && omitHeader) response.getValues().remove("responseHeader"); JavaBinCodec codec = new JavaBinCodec(resolver); codec.marshal(response.getValues(), out); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { throw new RuntimeException("This is a binary writer , Cannot write to a characterstream"); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public Object resolve(Object o, JavaBinCodec codec) throws IOException { if (o instanceof ResultContext) { writeResults((ResultContext) o, codec); return null; // null means we completely handled it } if (o instanceof DocList) { ResultContext ctx = new ResultContext(); ctx.docs = (DocList) o; writeResults(ctx, codec); return null; // null means we completely handled it } if( o instanceof IndexableField ) { if(schema == null) schema = solrQueryRequest.getSchema(); IndexableField f = (IndexableField)o; SchemaField sf = schema.getFieldOrNull(f.name()); try { o = getValue(sf, f); } catch (Exception e) { LOG.warn("Error reading a field : " + o, e); } } if (o instanceof SolrDocument) { // Remove any fields that were not requested. // This typically happens when distributed search adds // extra fields to an internal request SolrDocument doc = (SolrDocument)o; Iterator<Map.Entry<String, Object>> i = doc.iterator(); while ( i.hasNext() ) { String fname = i.next().getKey(); if ( !returnFields.wantsField( fname ) ) { i.remove(); } } return doc; } return o; }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
protected void writeResultsBody( ResultContext res, JavaBinCodec codec ) throws IOException { DocList ids = res.docs; int sz = ids.size(); codec.writeTag(JavaBinCodec.ARR, sz); if(searcher == null) searcher = solrQueryRequest.getSearcher(); if(schema == null) schema = solrQueryRequest.getSchema(); DocTransformer transformer = returnFields.getTransformer(); TransformContext context = new TransformContext(); context.query = res.query; context.wantsScores = returnFields.wantsScore() && ids.hasScores(); context.req = solrQueryRequest; context.searcher = searcher; if( transformer != null ) { transformer.setContext( context ); } Set<String> fnames = returnFields.getLuceneFieldNames(); context.iterator = ids.iterator(); for (int i = 0; i < sz; i++) { int id = context.iterator.nextDoc(); Document doc = searcher.doc(id, fnames); SolrDocument sdoc = getDoc(doc); if( transformer != null ) { transformer.transform(sdoc, id); } codec.writeSolrDocument(sdoc); } if( transformer != null ) { transformer.setContext( null ); } }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { codec.writeTag(JavaBinCodec.SOLRDOCLST); boolean wantsScores = returnFields.wantsScore() && ctx.docs.hasScores(); List l = new ArrayList(3); l.add((long) ctx.docs.matches()); l.add((long) ctx.docs.offset()); Float maxScore = null; if (wantsScores) { maxScore = ctx.docs.maxScore(); } l.add(maxScore); codec.writeArray(l); // this is a separate function so that streaming responses can use just that part writeResultsBody( ctx, codec ); }
// in core/src/java/org/apache/solr/response/RawResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { Object obj = response.getValues().get( CONTENT ); if( obj != null && (obj instanceof ContentStream ) ) { // copy the contents to the writer... ContentStream content = (ContentStream)obj; Reader reader = content.getReader(); try { IOUtils.copy( reader, writer ); } finally { reader.close(); } } else { getBaseWriter( request ).write( writer, request, response ); } }
// in core/src/java/org/apache/solr/response/RawResponseWriter.java
public void write(OutputStream out, SolrQueryRequest request, SolrQueryResponse response) throws IOException { Object obj = response.getValues().get( CONTENT ); if( obj != null && (obj instanceof ContentStream ) ) { // copy the contents to the writer... ContentStream content = (ContentStream)obj; java.io.InputStream in = content.getStream(); try { IOUtils.copy( in, out ); } finally { in.close(); } } else { //getBaseWriter( request ).write( writer, request, response ); throw new IOException("did not find a CONTENT object"); } }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
public void write(Writer writer, SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { PHPSerializedWriter w = new PHPSerializedWriter(writer, req, rsp); try { w.writeResponse(); } finally { w.close(); } }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeResponse() throws IOException { Boolean omitHeader = req.getParams().getBool(CommonParams.OMIT_HEADER); if(omitHeader != null && omitHeader) rsp.getValues().remove("responseHeader"); writeNamedList(null, rsp.getValues()); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeNamedList(String name, NamedList val) throws IOException { writeNamedListAsMapMangled(name,val); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
public void writeStartDocumentList(String name, long start, int size, long numFound, Float maxScore) throws IOException { writeMapOpener((maxScore==null) ? 3 : 4); writeKey("numFound",false); writeLong(null,numFound); writeKey("start",false); writeLong(null,start); if (maxScore!=null) { writeKey("maxScore",false); writeFloat(null,maxScore); } writeKey("docs",false); writeArrayOpener(size); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
public void writeEndDocumentList() throws IOException { writeArrayCloser(); // doc list writeMapCloser(); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeSolrDocument(String name, SolrDocument doc, ReturnFields returnFields, int idx) throws IOException { writeKey(idx, false); LinkedHashMap <String,Object> single = new LinkedHashMap<String, Object>(); LinkedHashMap <String,Object> multi = new LinkedHashMap<String, Object>(); for (String fname : doc.getFieldNames()) { if(!returnFields.wantsField(fname)){ continue; } Object val = doc.getFieldValue(fname); if (val instanceof Collection) { multi.put(fname, val); }else{ single.put(fname, val); } } writeMapOpener(single.size() + multi.size()); for(String fname: single.keySet()){ Object val = single.get(fname); writeKey(fname, true); writeVal(fname, val); } for(String fname: multi.keySet()){ writeKey(fname, true); Object val = multi.get(fname); if (!(val instanceof Collection)) { // should never be reached if multivalued fields are stored as a Collection // so I'm assuming a size of 1 just to wrap the single value writeArrayOpener(1); writeVal(fname, val); writeArrayCloser(); }else{ writeVal(fname, val); } } writeMapCloser(); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeArray(String name, Object[] val) throws IOException { writeMapOpener(val.length); for(int i=0; i < val.length; i++) { writeKey(i, false); writeVal(String.valueOf(i), val[i]); } writeMapCloser(); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeArray(String name, Iterator val) throws IOException { ArrayList vals = new ArrayList(); while( val.hasNext() ) { vals.add(val.next()); } writeArray(name, vals.toArray()); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeMapOpener(int size) throws IOException, IllegalArgumentException { // negative size value indicates that something has gone wrong if (size < 0) { throw new IllegalArgumentException("Map size must not be negative"); } writer.write("a:"+size+":{"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeMapSeparator() throws IOException { /* NOOP */ }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeMapCloser() throws IOException { writer.write('}'); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeArrayOpener(int size) throws IOException, IllegalArgumentException { // negative size value indicates that something has gone wrong if (size < 0) { throw new IllegalArgumentException("Array size must not be negative"); } writer.write("a:"+size+":{"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeArraySeparator() throws IOException { /* NOOP */ }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeArrayCloser() throws IOException { writer.write('}'); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeNull(String name) throws IOException { writer.write("N;"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override protected void writeKey(String fname, boolean needsEscaping) throws IOException { writeStr(null, fname, needsEscaping); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
void writeKey(int val, boolean needsEscaping) throws IOException { writeInt(null, String.valueOf(val)); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeBool(String name, boolean val) throws IOException { writer.write(val ? "b:1;" : "b:0;"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeBool(String name, String val) throws IOException { writeBool(name, val.charAt(0) == 't'); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeInt(String name, String val) throws IOException { writer.write("i:"+val+";"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeLong(String name, String val) throws IOException { writeInt(name,val); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeFloat(String name, String val) throws IOException { writeDouble(name,val); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeDouble(String name, String val) throws IOException { writer.write("d:"+val+";"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeStr(String name, String val, boolean needsEscaping) throws IOException { // serialized PHP strings don't need to be escaped at all, however the // string size reported needs to be the number of bytes rather than chars. UnicodeUtil.UTF16toUTF8(val, 0, val.length(), utf8); int nBytes = utf8.length; writer.write("s:"); writer.write(Integer.toString(nBytes)); writer.write(":\""); writer.write(val); writer.write("\";"); }
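The scalar writers above emit PHP's serialize() format (b:1;, i:42;, d:1.5;, a:n:{...}), and strings are length-prefixed by UTF-8 byte count rather than char count, which is why the method converts through UTF16toUTF8 first. A sketch of the same byte counting using only the JDK (illustrative; the writer itself uses Lucene's UnicodeUtil):

    import java.nio.charset.StandardCharsets;

    public class PhpStrSketch {
      public static void main(String[] args) {
        String val = "héllo";  // 5 chars, but 6 UTF-8 bytes (é encodes as two)
        int nBytes = val.getBytes(StandardCharsets.UTF_8).length;
        System.out.println("s:" + nBytes + ":\"" + val + "\";");  // prints s:6:"héllo";
      }
    }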
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
NamedList<Integer> getFacetCounts(Executor executor) throws IOException { CompletionService<SegFacet> completionService = new ExecutorCompletionService<SegFacet>(executor); // reuse the translation logic to go from top level set to per-segment set baseSet = docs.getTopFilter(); final AtomicReaderContext[] leaves = searcher.getTopReaderContext().leaves(); // The list of pending tasks that aren't immediately submitted // TODO: Is there a completion service, or a delegating executor that can // limit the number of concurrent tasks submitted to a bigger executor? LinkedList<Callable<SegFacet>> pending = new LinkedList<Callable<SegFacet>>(); int threads = nThreads <= 0 ? Integer.MAX_VALUE : nThreads; for (int i=0; i<leaves.length; i++) { final SegFacet segFacet = new SegFacet(leaves[i]); Callable<SegFacet> task = new Callable<SegFacet>() { public SegFacet call() throws Exception { segFacet.countTerms(); return segFacet; } }; // TODO: if limiting threads, submit by largest segment first? if (--threads >= 0) { completionService.submit(task); } else { pending.add(task); } } // now merge the per-segment results PriorityQueue<SegFacet> queue = new PriorityQueue<SegFacet>(leaves.length) { @Override protected boolean lessThan(SegFacet a, SegFacet b) { return a.tempBR.compareTo(b.tempBR) < 0; } }; boolean hasMissingCount=false; int missingCount=0; for (int i=0; i<leaves.length; i++) { SegFacet seg = null; try { Future<SegFacet> future = completionService.take(); seg = future.get(); if (!pending.isEmpty()) { completionService.submit(pending.removeFirst()); } } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } } if (seg.startTermIndex < seg.endTermIndex) { if (seg.startTermIndex==0) { hasMissingCount=true; missingCount += seg.counts[0]; seg.pos = 1; } else { seg.pos = seg.startTermIndex; } if (seg.pos < seg.endTermIndex) { seg.tenum = seg.si.getTermsEnum(); seg.tenum.seekExact(seg.pos); seg.tempBR = seg.tenum.term(); queue.add(seg); } } } FacetCollector collector; if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) { collector = new CountSortedFacetCollector(offset, limit, mincount); } else { collector = new IndexSortedFacetCollector(offset, limit, mincount); } BytesRef val = new BytesRef(); while (queue.size() > 0) { SegFacet seg = queue.top(); // make a shallow copy val.bytes = seg.tempBR.bytes; val.offset = seg.tempBR.offset; val.length = seg.tempBR.length; int count = 0; do { count += seg.counts[seg.pos - seg.startTermIndex]; // TODO: OPTIMIZATION... // if mincount>0 then seg.pos++ can skip ahead to the next non-zero entry. 
seg.pos++; if (seg.pos >= seg.endTermIndex) { queue.pop(); seg = queue.top(); } else { seg.tempBR = seg.tenum.next(); seg = queue.updateTop(); } } while (seg != null && val.compareTo(seg.tempBR) == 0); boolean stop = collector.collect(val, count); if (stop) break; } NamedList<Integer> res = collector.getFacetCounts(); // convert labels to readable form FieldType ft = searcher.getSchema().getFieldType(fieldName); int sz = res.size(); for (int i=0; i<sz; i++) { res.setName(i, ft.indexedToReadable(res.getName(i))); } if (missing) { if (!hasMissingCount) { missingCount = SimpleFacets.getFieldMissingCount(searcher,docs,fieldName); } res.add(null, missingCount); } return res; }
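getFacetCounts merges the per-segment, term-ordered count arrays with a Lucene PriorityQueue: the segment holding the smallest current term sits on top, equal terms across segments are drained and their counts summed, and each cursor is advanced and re-heaped. The same k-way merge expressed with only JDK types, as a hedged sketch:

    import java.util.*;

    public class SegmentMergeSketch {
      static class Cursor { // one per segment: a sorted (term, count) iterator plus its current entry
        final Iterator<Map.Entry<String,Integer>> it;
        Map.Entry<String,Integer> current;
        Cursor(Iterator<Map.Entry<String,Integer>> it) { this.it = it; this.current = it.next(); }
        boolean advance() { if (!it.hasNext()) return false; current = it.next(); return true; }
      }

      public static void main(String[] args) {
        List<Iterator<Map.Entry<String,Integer>>> segments = Arrays.asList(
            new TreeMap<>(Map.of("apple", 2, "pear", 1)).entrySet().iterator(),
            new TreeMap<>(Map.of("apple", 3, "kiwi", 4)).entrySet().iterator());

        // queue ordered by current term, like the lessThan() override above
        PriorityQueue<Cursor> pq = new PriorityQueue<>(
            Comparator.comparing((Cursor c) -> c.current.getKey()));
        for (Iterator<Map.Entry<String,Integer>> it : segments) {
          if (it.hasNext()) pq.add(new Cursor(it));
        }
        while (!pq.isEmpty()) {
          Cursor top = pq.poll();
          String term = top.current.getKey();
          int count = top.current.getValue();
          if (top.advance()) pq.add(top);
          while (!pq.isEmpty() && pq.peek().current.getKey().equals(term)) {
            Cursor same = pq.poll();            // another segment holds the same term
            count += same.current.getValue();
            if (same.advance()) pq.add(same);
          }
          System.out.println(term + "=" + count); // apple=5, kiwi=4, pear=1
        }
      }
    }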
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
void countTerms() throws IOException { si = FieldCache.DEFAULT.getTermsIndex(context.reader(), fieldName); // SolrCore.log.info("reader= " + reader + " FC=" + System.identityHashCode(si)); if (prefix!=null) { BytesRef prefixRef = new BytesRef(prefix); startTermIndex = si.binarySearchLookup(prefixRef, tempBR); if (startTermIndex<0) startTermIndex=-startTermIndex-1; prefixRef.append(UnicodeUtil.BIG_TERM); // TODO: we could constrain the lower endpoint if we had a binarySearch method that allowed passing start/end endTermIndex = si.binarySearchLookup(prefixRef, tempBR); assert endTermIndex < 0; endTermIndex = -endTermIndex-1; } else { startTermIndex=0; endTermIndex=si.numOrd(); } final int nTerms=endTermIndex-startTermIndex; if (nTerms>0) { // count collection array only needs to be as big as the number of terms we are // going to collect counts for. final int[] counts = this.counts = new int[nTerms]; DocIdSet idSet = baseSet.getDocIdSet(context, null); // this set only includes live docs DocIdSetIterator iter = idSet.iterator(); PackedInts.Reader ordReader = si.getDocToOrd(); int doc; final Object arr; if (ordReader.hasArray()) { arr = ordReader.getArray(); } else { arr = null; } if (arr instanceof int[]) { int[] ords = (int[]) arr; if (prefix==null) { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { counts[ords[doc]]++; } } else { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { int term = ords[doc]; int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } else if (arr instanceof short[]) { short[] ords = (short[]) arr; if (prefix==null) { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { counts[ords[doc] & 0xffff]++; } } else { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { int term = ords[doc] & 0xffff; int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } else if (arr instanceof byte[]) { byte[] ords = (byte[]) arr; if (prefix==null) { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { counts[ords[doc] & 0xff]++; } } else { while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { int term = ords[doc] & 0xff; int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } else { if (prefix==null) { // specialized version when collecting counts for all terms while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { counts[si.getOrd(doc)]++; } } else { // version that adjusts term numbers because we aren't collecting the full range while ((doc = iter.nextDoc()) < DocIdSetIterator.NO_MORE_DOCS) { int term = si.getOrd(doc); int arrIdx = term-startTermIndex; if (arrIdx>=0 && arrIdx<nTerms) counts[arrIdx]++; } } } } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void parseParams(String type, String param) throws ParseException, IOException { localParams = QueryParsing.getLocalParams(param, req.getParams()); base = docs; facetValue = param; key = param; threads = -1; if (localParams == null) return; // remove local params unless it's a query if (type != FacetParams.FACET_QUERY) { // TODO Cut over to an Enum here facetValue = localParams.get(CommonParams.VALUE); } // reset set the default key now that localParams have been removed key = facetValue; // allow explicit set of the key key = localParams.get(CommonParams.OUTPUT_KEY, key); String threadStr = localParams.get(CommonParams.THREADS); if (threadStr != null) { threads = Integer.parseInt(threadStr); } // figure out if we need a new base DocSet String excludeStr = localParams.get(CommonParams.EXCLUDE); if (excludeStr == null) return; Map<?,?> tagMap = (Map<?,?>)req.getContext().get("tags"); if (tagMap != null && rb != null) { List<String> excludeTagList = StrUtils.splitSmart(excludeStr,','); IdentityHashMap<Query,Boolean> excludeSet = new IdentityHashMap<Query,Boolean>(); for (String excludeTag : excludeTagList) { Object olst = tagMap.get(excludeTag); // tagMap has entries of List<String,List<QParser>>, but subject to change in the future if (!(olst instanceof Collection)) continue; for (Object o : (Collection<?>)olst) { if (!(o instanceof QParser)) continue; QParser qp = (QParser)o; excludeSet.put(qp.getQuery(), Boolean.TRUE); } } if (excludeSet.size() == 0) return; List<Query> qlist = new ArrayList<Query>(); // add the base query if (!excludeSet.containsKey(rb.getQuery())) { qlist.add(rb.getQuery()); } // add the filters if (rb.getFilters() != null) { for (Query q : rb.getFilters()) { if (!excludeSet.containsKey(q)) { qlist.add(q); } } } // get the new base docset for this facet DocSet base = searcher.getDocSet(qlist); if (rb.grouping() && rb.getGroupingSpec().isTruncateGroups()) { Grouping grouping = new Grouping(searcher, null, rb.getQueryCommand(), false, 0, false); if (rb.getGroupingSpec().getFields().length > 0) { grouping.addFieldCommand(rb.getGroupingSpec().getFields()[0], req); } else if (rb.getGroupingSpec().getFunctions().length > 0) { grouping.addFunctionCommand(rb.getGroupingSpec().getFunctions()[0], req); } else { this.base = base; return; } AbstractAllGroupHeadsCollector allGroupHeadsCollector = grouping.getCommands().get(0).createAllGroupCollector(); searcher.search(new MatchAllDocsQuery(), base.getTopFilter(), allGroupHeadsCollector); int maxDoc = searcher.maxDoc(); FixedBitSet fixedBitSet = allGroupHeadsCollector.retrieveGroupHeads(maxDoc); long[] bits = fixedBitSet.getBits(); this.base = new BitDocSet(new OpenBitSet(bits, bits.length)); } else { this.base = base; } } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getFacetQueryCounts() throws IOException,ParseException { NamedList<Integer> res = new SimpleOrderedMap<Integer>(); /* Ignore CommonParams.DF - could have init param facet.query assuming * the schema default with query param DF intended to only affect Q. * If user doesn't want schema default for facet.query, they should be * explicit. */ // SolrQueryParser qp = searcher.getSchema().getSolrQueryParser(null); String[] facetQs = params.getParams(FacetParams.FACET_QUERY); if (null != facetQs && 0 != facetQs.length) { for (String q : facetQs) { parseParams(FacetParams.FACET_QUERY, q); // TODO: slight optimization would prevent double-parsing of any localParams Query qobj = QParser.getParser(q, null, req).getQuery(); res.add(key, searcher.numDocs(qobj, base)); } } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getTermCounts(String field) throws IOException {
  int offset = params.getFieldInt(field, FacetParams.FACET_OFFSET, 0);
  int limit = params.getFieldInt(field, FacetParams.FACET_LIMIT, 100);
  if (limit == 0) return new NamedList<Integer>();
  Integer mincount = params.getFieldInt(field, FacetParams.FACET_MINCOUNT);
  if (mincount == null) {
    Boolean zeros = params.getFieldBool(field, FacetParams.FACET_ZEROS);
    // mincount = (zeros!=null && zeros) ? 0 : 1;
    mincount = (zeros != null && !zeros) ? 1 : 0;
    // current default is to include zeros.
  }
  boolean missing = params.getFieldBool(field, FacetParams.FACET_MISSING, false);
  // default to sorting if there is a limit.
  String sort = params.getFieldParam(field, FacetParams.FACET_SORT,
                                     limit > 0 ? FacetParams.FACET_SORT_COUNT : FacetParams.FACET_SORT_INDEX);
  String prefix = params.getFieldParam(field, FacetParams.FACET_PREFIX);

  NamedList<Integer> counts;
  SchemaField sf = searcher.getSchema().getField(field);
  FieldType ft = sf.getType();

  // determine what type of faceting method to use
  String method = params.getFieldParam(field, FacetParams.FACET_METHOD);
  boolean enumMethod = FacetParams.FACET_METHOD_enum.equals(method);

  // TODO: default to per-segment or not?
  boolean per_segment = FacetParams.FACET_METHOD_fcs.equals(method);

  if (method == null && ft instanceof BoolField) {
    // Always use filters for booleans... we know the number of values is very small.
    enumMethod = true;
  }
  boolean multiToken = sf.multiValued() || ft.multiValuedFieldCache();
  if (TrieField.getMainValuePrefix(ft) != null) {
    // A TrieField with multiple parts indexed per value... currently only
    // UnInvertedField can handle this case, so force its use.
    enumMethod = false;
    multiToken = true;
  }

  if (params.getFieldBool(field, GroupParams.GROUP_FACET, false)) {
    counts = getGroupedCounts(searcher, base, field, multiToken, offset, limit, mincount, missing, sort, prefix);
  } else {
    // unless the enum method is explicitly specified, use a counting method.
    if (enumMethod) {
      counts = getFacetTermEnumCounts(searcher, base, field, offset, limit, mincount, missing, sort, prefix);
    } else {
      if (multiToken) {
        UnInvertedField uif = UnInvertedField.getUnInvertedField(field, searcher);
        counts = uif.getCounts(searcher, base, offset, limit, mincount, missing, sort, prefix);
      } else {
        // TODO: future logic could use filters instead of the fieldcache if
        // the number of terms in the field is small enough.
        if (per_segment) {
          PerSegmentSingleValuedFaceting ps = new PerSegmentSingleValuedFaceting(searcher, base, field, offset, limit, mincount, missing, sort, prefix);
          Executor executor = threads == 0 ? directExecutor : facetExecutor;
          ps.setNumThreads(threads);
          counts = ps.getFacetCounts(executor);
        } else {
          counts = getFieldCacheCounts(searcher, base, field, offset, limit, mincount, missing, sort, prefix);
        }
      }
    }
  }

  return counts;
}
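The method-selection logic above maps onto the facet.method request parameter. A hedged illustration (parameter values are the real constants behind FACET_METHOD_enum and FACET_METHOD_fcs; the field name is made up, and the parenthetical notes summarize the dispatch in the code, not official documentation):

    facet=true
    facet.field=category
    facet.method=enum   (term enumeration + filterCache; forced for BoolField)
    facet.method=fc     (field cache / UnInvertedField path; the usual default)
    facet.method=fcs    (per-segment single-valued faceting, optionally threaded)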
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getGroupedCounts(SolrIndexSearcher searcher, DocSet base, String field, boolean multiToken, int offset, int limit, int mincount, boolean missing, String sort, String prefix) throws IOException {
  GroupingSpecification groupingSpecification = rb.getGroupingSpec();
  String groupField = groupingSpecification != null ? groupingSpecification.getFields()[0] : null;
  if (groupField == null) {
    throw new SolrException(
        SolrException.ErrorCode.BAD_REQUEST,
        "Specify the group.field as parameter or local parameter"
    );
  }

  BytesRef prefixBR = prefix != null ? new BytesRef(prefix) : null;
  TermGroupFacetCollector collector = TermGroupFacetCollector.createTermGroupFacetCollector(groupField, field, multiToken, prefixBR, 128);
  searcher.search(new MatchAllDocsQuery(), base.getTopFilter(), collector);
  boolean orderByCount = sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY);
  TermGroupFacetCollector.GroupedFacetResult result = collector.mergeSegmentResults(offset + limit, mincount, orderByCount);

  CharsRef charsRef = new CharsRef();
  FieldType facetFieldType = searcher.getSchema().getFieldType(field);
  NamedList<Integer> facetCounts = new NamedList<Integer>();
  List<TermGroupFacetCollector.FacetEntry> scopedEntries = result.getFacetEntries(offset, limit);
  for (TermGroupFacetCollector.FacetEntry facetEntry : scopedEntries) {
    facetFieldType.indexedToReadable(facetEntry.getValue(), charsRef);
    facetCounts.add(charsRef.toString(), facetEntry.getCount());
  }

  if (missing) {
    facetCounts.add(null, result.getTotalMissingCount());
  }

  return facetCounts;
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetFieldCounts() throws IOException, ParseException {

  NamedList<Object> res = new SimpleOrderedMap<Object>();
  String[] facetFs = params.getParams(FacetParams.FACET_FIELD);
  if (null != facetFs) {
    for (String f : facetFs) {
      parseParams(FacetParams.FACET_FIELD, f);
      String termList = localParams == null ? null : localParams.get(CommonParams.TERMS);
      if (termList != null) {
        res.add(key, getListedTermCounts(facetValue, termList));
      } else {
        res.add(key, getTermCounts(facetValue));
      }
    }
  }
  return res;
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
private NamedList<Integer> getListedTermCounts(String field, String termList) throws IOException {
  FieldType ft = searcher.getSchema().getFieldType(field);
  List<String> terms = StrUtils.splitSmart(termList, ",", true);
  NamedList<Integer> res = new NamedList<Integer>();
  for (String term : terms) {
    String internal = ft.toInternal(term);
    int count = searcher.numDocs(new TermQuery(new Term(field, internal)), base);
    res.add(term, count);
  }
  return res;
}
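This branch is reached when a facet.field carries the terms local param (CommonParams.TERMS), restricting counting to an explicit list instead of enumerating the whole field. A hedged request example (the field and values are made up):

    facet=true
    facet.field={!terms=red,green,blue}color

Each listed term becomes one TermQuery intersected with the base DocSet, so the result preserves the requested order rather than count or index order.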
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public static int getFieldMissingCount(SolrIndexSearcher searcher, DocSet docs, String fieldName) throws IOException {
  // docs matching an open-ended range query are exactly the docs with a value;
  // the missing count is the base set minus those.
  DocSet hasVal = searcher.getDocSet(new TermRangeQuery(fieldName, null, null, false, false));
  return docs.andNotSize(hasVal);
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public static NamedList<Integer> getFieldCacheCounts(SolrIndexSearcher searcher, DocSet docs, String fieldName, int offset, int limit, int mincount, boolean missing, String sort, String prefix) throws IOException {
  // TODO: If the number of terms is high compared to docs.size(), and zeros==false,
  // we should use an alternate strategy to avoid
  // 1) creating another huge int[] for the counts
  // 2) looping over that huge int[] looking for the rare non-zeros.
  //
  // Yet another variation: if docs.size() is small and termvectors are stored,
  // then use them instead of the FieldCache.
  //
  // TODO: this function is too big and could use some refactoring, but
  // we also need a facet cache, and refactoring of SimpleFacets instead of
  // trying to pass all the various params around.

  FieldType ft = searcher.getSchema().getFieldType(fieldName);
  NamedList<Integer> res = new NamedList<Integer>();

  FieldCache.DocTermsIndex si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), fieldName);

  final BytesRef prefixRef;
  if (prefix == null) {
    prefixRef = null;
  } else if (prefix.length() == 0) {
    prefix = null;
    prefixRef = null;
  } else {
    prefixRef = new BytesRef(prefix);
  }

  final BytesRef br = new BytesRef();

  int startTermIndex, endTermIndex;
  if (prefix != null) {
    startTermIndex = si.binarySearchLookup(prefixRef, br);
    if (startTermIndex < 0) startTermIndex = -startTermIndex - 1;
    prefixRef.append(UnicodeUtil.BIG_TERM);
    endTermIndex = si.binarySearchLookup(prefixRef, br);
    assert endTermIndex < 0;
    endTermIndex = -endTermIndex - 1;
  } else {
    startTermIndex = 0;
    endTermIndex = si.numOrd();
  }

  final int nTerms = endTermIndex - startTermIndex;
  int missingCount = -1;
  final CharsRef charsRef = new CharsRef(10);
  if (nTerms > 0 && docs.size() >= mincount) {

    // count collection array only needs to be as big as the number of terms we are
    // going to collect counts for.
    final int[] counts = new int[nTerms];

    DocIterator iter = docs.iterator();

    PackedInts.Reader ordReader = si.getDocToOrd();
    final Object arr;
    if (ordReader.hasArray()) {
      arr = ordReader.getArray();
    } else {
      arr = null;
    }

    if (arr instanceof int[]) {
      int[] ords = (int[]) arr;
      if (prefix == null) {
        while (iter.hasNext()) {
          counts[ords[iter.nextDoc()]]++;
        }
      } else {
        while (iter.hasNext()) {
          int term = ords[iter.nextDoc()];
          int arrIdx = term - startTermIndex;
          if (arrIdx >= 0 && arrIdx < nTerms) counts[arrIdx]++;
        }
      }
    } else if (arr instanceof short[]) {
      short[] ords = (short[]) arr;
      if (prefix == null) {
        while (iter.hasNext()) {
          counts[ords[iter.nextDoc()] & 0xffff]++;
        }
      } else {
        while (iter.hasNext()) {
          int term = ords[iter.nextDoc()] & 0xffff;
          int arrIdx = term - startTermIndex;
          if (arrIdx >= 0 && arrIdx < nTerms) counts[arrIdx]++;
        }
      }
    } else if (arr instanceof byte[]) {
      byte[] ords = (byte[]) arr;
      if (prefix == null) {
        while (iter.hasNext()) {
          counts[ords[iter.nextDoc()] & 0xff]++;
        }
      } else {
        while (iter.hasNext()) {
          int term = ords[iter.nextDoc()] & 0xff;
          int arrIdx = term - startTermIndex;
          if (arrIdx >= 0 && arrIdx < nTerms) counts[arrIdx]++;
        }
      }
    } else {
      while (iter.hasNext()) {
        int term = si.getOrd(iter.nextDoc());
        int arrIdx = term - startTermIndex;
        if (arrIdx >= 0 && arrIdx < nTerms) counts[arrIdx]++;
      }
    }

    if (startTermIndex == 0) {
      missingCount = counts[0];
    }

    // IDEA: we could also maintain a count of "other"... everything that fell outside
    // of the top 'N'

    int off = offset;
    int lim = limit >= 0 ? limit : Integer.MAX_VALUE;

    if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) {
      int maxsize = limit > 0 ? offset + limit : Integer.MAX_VALUE - 1;
      maxsize = Math.min(maxsize, nTerms);
      LongPriorityQueue queue = new LongPriorityQueue(Math.min(maxsize, 1000), maxsize, Long.MIN_VALUE);

      int min = mincount - 1;  // the smallest value in the top 'N' values
      for (int i = (startTermIndex == 0) ? 1 : 0; i < nTerms; i++) {
        int c = counts[i];
        if (c > min) {
          // NOTE: we use c>min rather than c>=min as an optimization because we are going in
          // index order, so we already know that the keys are ordered.  This can be very
          // important if a lot of the counts are repeated (like zero counts would be).

          // smaller term numbers sort higher, so subtract the term number instead
          long pair = (((long) c) << 32) + (Integer.MAX_VALUE - i);
          boolean displaced = queue.insert(pair);
          if (displaced) min = (int) (queue.top() >>> 32);
        }
      }

      // if we are deep paging, we don't have to order the highest "offset" counts.
      int collectCount = Math.max(0, queue.size() - off);
      assert collectCount <= lim;

      // the start and end indexes of our list "sorted" (starting with the highest value)
      int sortedIdxStart = queue.size() - (collectCount - 1);
      int sortedIdxEnd = queue.size() + 1;
      final long[] sorted = queue.sort(collectCount);

      for (int i = sortedIdxStart; i < sortedIdxEnd; i++) {
        long pair = sorted[i];
        int c = (int) (pair >>> 32);
        int tnum = Integer.MAX_VALUE - (int) pair;
        ft.indexedToReadable(si.lookup(startTermIndex + tnum, br), charsRef);
        res.add(charsRef.toString(), c);
      }

    } else {
      // add results in index order
      int i = (startTermIndex == 0) ? 1 : 0;
      if (mincount <= 0) {
        // if mincount<=0, then we won't discard any terms and we know exactly
        // where to start.
        i += off;
        off = 0;
      }

      for (; i < nTerms; i++) {
        int c = counts[i];
        if (c < mincount || --off >= 0) continue;
        if (--lim < 0) break;
        ft.indexedToReadable(si.lookup(startTermIndex + i, br), charsRef);
        res.add(charsRef.toString(), c);
      }
    }
  }

  if (missing) {
    if (missingCount < 0) {
      missingCount = getFieldMissingCount(searcher, docs, fieldName);
    }
    res.add(null, missingCount);
  }

  return res;
}
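Both this method and UnInvertedField.getCounts below rank terms with the same trick: the count and the term index are packed into a single long so a primitive priority queue orders by count descending with ties broken by ascending term index. A minimal self-contained sketch of the packing, using java.util.PriorityQueue as a stand-in for Solr's LongPriorityQueue (the counts are made up; this assumes non-negative counts, as in faceting):

    import java.util.PriorityQueue;

    public class PackedPairDemo {
      public static void main(String[] args) {
        int[] counts = {5, 9, 5, 2};  // counts[i] = count for term index i

        // Max-heap on the packed value: high 32 bits = count,
        // low 32 bits = Integer.MAX_VALUE - termIndex, so for equal
        // counts the smaller term index packs to the larger long.
        PriorityQueue<Long> queue = new PriorityQueue<Long>((a, b) -> Long.compare(b, a));
        for (int i = 0; i < counts.length; i++) {
          queue.add((((long) counts[i]) << 32) + (Integer.MAX_VALUE - i));
        }

        while (!queue.isEmpty()) {
          long pair = queue.poll();
          int c = (int) (pair >>> 32);              // unpack the count
          int tnum = Integer.MAX_VALUE - (int) pair; // unpack the term index
          System.out.println("term " + tnum + " count " + c);
          // prints: term 1 count 9, term 0 count 5, term 2 count 5, term 3 count 2
        }
      }
    }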
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getFacetTermEnumCounts(SolrIndexSearcher searcher, DocSet docs, String field, int offset, int limit, int mincount, boolean missing, String sort, String prefix) throws IOException {

  /* :TODO: potential optimization...
   * cache the Terms with the highest docFreq and try them first
   * don't enum if we get our max from them
   */

  // Minimum term docFreq in order to use the filterCache for that term.
  int minDfFilterCache = params.getFieldInt(field, FacetParams.FACET_ENUM_CACHE_MINDF, 0);

  // make sure we have a set that is fast for random access, if we will use it for that
  DocSet fastForRandomSet = docs;
  if (minDfFilterCache > 0 && docs instanceof SortedIntDocSet) {
    SortedIntDocSet sset = (SortedIntDocSet) docs;
    fastForRandomSet = new HashDocSet(sset.getDocs(), 0, sset.size());
  }

  IndexSchema schema = searcher.getSchema();
  AtomicReader r = searcher.getAtomicReader();
  FieldType ft = schema.getFieldType(field);

  boolean sortByCount = sort.equals("count") || sort.equals("true");
  final int maxsize = limit >= 0 ? offset + limit : Integer.MAX_VALUE - 1;
  final BoundedTreeSet<CountPair<BytesRef,Integer>> queue = sortByCount ? new BoundedTreeSet<CountPair<BytesRef,Integer>>(maxsize) : null;
  final NamedList<Integer> res = new NamedList<Integer>();

  int min = mincount - 1;  // the smallest value in the top 'N' values
  int off = offset;
  int lim = limit >= 0 ? limit : Integer.MAX_VALUE;

  BytesRef startTermBytes = null;
  if (prefix != null) {
    String indexedPrefix = ft.toInternal(prefix);
    startTermBytes = new BytesRef(indexedPrefix);
  }

  Fields fields = r.fields();
  Terms terms = fields == null ? null : fields.terms(field);
  TermsEnum termsEnum = null;
  SolrIndexSearcher.DocsEnumState deState = null;
  BytesRef term = null;
  if (terms != null) {
    termsEnum = terms.iterator(null);

    // TODO: OPT: if seek(ord) is supported for this termsEnum, then we could use it for
    // facet.offset when sorting by index order.

    if (startTermBytes != null) {
      if (termsEnum.seekCeil(startTermBytes, true) == TermsEnum.SeekStatus.END) {
        termsEnum = null;
      } else {
        term = termsEnum.term();
      }
    } else {
      // position termsEnum on first term
      term = termsEnum.next();
    }
  }

  DocsEnum docsEnum = null;
  CharsRef charsRef = new CharsRef(10);

  if (docs.size() >= mincount) {
    while (term != null) {

      if (startTermBytes != null && !StringHelper.startsWith(term, startTermBytes))
        break;

      int df = termsEnum.docFreq();

      // If we are sorting, we can use df>min (rather than >=) since we
      // are going in index order.  For certain term distributions this can
      // make a large difference (for example, many terms with df=1).
      if (df > 0 && df > min) {
        int c;

        if (df >= minDfFilterCache) {
          // use the filter cache

          if (deState == null) {
            deState = new SolrIndexSearcher.DocsEnumState();
            deState.fieldName = field;
            deState.liveDocs = r.getLiveDocs();
            deState.termsEnum = termsEnum;
            deState.docsEnum = docsEnum;
          }

          c = searcher.numDocs(docs, deState);

          docsEnum = deState.docsEnum;
        } else {
          // iterate over TermDocs to calculate the intersection

          // TODO: specialize when base docset is a bitset or hash set (skipDocs)?  or does it matter for this?
          // TODO: do this per-segment for better efficiency (MultiDocsEnum just uses base class impl)
          // TODO: would passing deleted docs lead to better efficiency over checking the fastForRandomSet?
          docsEnum = termsEnum.docs(null, docsEnum, false);
          c = 0;

          if (docsEnum instanceof MultiDocsEnum) {
            MultiDocsEnum.EnumWithSlice[] subs = ((MultiDocsEnum) docsEnum).getSubs();
            int numSubs = ((MultiDocsEnum) docsEnum).getNumSubs();
            for (int subindex = 0; subindex < numSubs; subindex++) {
              MultiDocsEnum.EnumWithSlice sub = subs[subindex];
              if (sub.docsEnum == null) continue;
              int base = sub.slice.start;
              int docid;
              while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) {
                if (fastForRandomSet.exists(docid + base)) c++;
              }
            }
          } else {
            int docid;
            while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) {
              if (fastForRandomSet.exists(docid)) c++;
            }
          }
        }

        if (sortByCount) {
          if (c > min) {
            BytesRef termCopy = BytesRef.deepCopyOf(term);
            queue.add(new CountPair<BytesRef,Integer>(termCopy, c));
            if (queue.size() >= maxsize) min = queue.last().val;
          }
        } else {
          if (c >= mincount && --off < 0) {
            if (--lim < 0) break;
            ft.indexedToReadable(term, charsRef);
            res.add(charsRef.toString(), c);
          }
        }
      }

      term = termsEnum.next();
    }
  }

  if (sortByCount) {
    for (CountPair<BytesRef,Integer> p : queue) {
      if (--off >= 0) continue;
      if (--lim < 0) break;
      ft.indexedToReadable(p.key, charsRef);
      res.add(charsRef.toString(), p.val);
    }
  }

  if (missing) {
    res.add(null, getFieldMissingCount(searcher, docs, field));
  }

  return res;
}
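The minDfFilterCache cutoff corresponds to the facet.enum.cache.minDf request parameter: terms whose docFreq reaches the threshold are counted through the filterCache, while rarer terms iterate postings against fastForRandomSet. A hedged per-field example (the parameter name is real; the field and value are illustrative):

    facet=true
    facet.field=category
    facet.method=enum
    f.category.facet.enum.cache.minDf=30   (only use cached filters for terms matching at least 30 docs)

The tradeoff is filterCache churn versus per-term postings scans, which is why the default of 0 caches everything.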
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated
public NamedList<Object> getFacetDateCounts() throws IOException, ParseException {

  final NamedList<Object> resOuter = new SimpleOrderedMap<Object>();
  final String[] fields = params.getParams(FacetParams.FACET_DATE);

  if (null == fields || 0 == fields.length) return resOuter;

  for (String f : fields) {
    getFacetDateCounts(f, resOuter);
  }

  return resOuter;
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated
public void getFacetDateCounts(String dateFacet, NamedList<Object> resOuter) throws IOException, ParseException {

  final IndexSchema schema = searcher.getSchema();

  parseParams(FacetParams.FACET_DATE, dateFacet);
  String f = facetValue;

  final NamedList<Object> resInner = new SimpleOrderedMap<Object>();
  resOuter.add(key, resInner);
  final SchemaField sf = schema.getField(f);
  if (!(sf.getType() instanceof DateField)) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                            "Can not date facet on a field which is not a DateField: " + f);
  }
  final DateField ft = (DateField) sf.getType();
  final String startS = required.getFieldParam(f, FacetParams.FACET_DATE_START);
  final Date start;
  try {
    start = ft.parseMath(null, startS);
  } catch (SolrException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                            "date facet 'start' is not a valid Date string: " + startS, e);
  }
  final String endS = required.getFieldParam(f, FacetParams.FACET_DATE_END);
  Date end;  // not final, hardend may change this
  try {
    end = ft.parseMath(null, endS);
  } catch (SolrException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                            "date facet 'end' is not a valid Date string: " + endS, e);
  }
  if (end.before(start)) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                            "date facet 'end' comes before 'start': " + endS + " < " + startS);
  }

  final String gap = required.getFieldParam(f, FacetParams.FACET_DATE_GAP);
  final DateMathParser dmp = new DateMathParser();

  final int minCount = params.getFieldInt(f, FacetParams.FACET_MINCOUNT, 0);

  String[] iStrs = params.getFieldParams(f, FacetParams.FACET_DATE_INCLUDE);
  // Legacy support for default of [lower,upper,edge] for date faceting
  // this is not handled by FacetRangeInclude.parseParam because
  // range faceting has different defaults
  final EnumSet<FacetRangeInclude> include = (null == iStrs || 0 == iStrs.length)
      ? EnumSet.of(FacetRangeInclude.LOWER, FacetRangeInclude.UPPER, FacetRangeInclude.EDGE)
      : FacetRangeInclude.parseParam(iStrs);

  try {
    Date low = start;
    while (low.before(end)) {
      dmp.setNow(low);
      String label = ft.toExternal(low);

      Date high = dmp.parseMath(gap);
      if (end.before(high)) {
        if (params.getFieldBool(f, FacetParams.FACET_DATE_HARD_END, false)) {
          high = end;
        } else {
          end = high;
        }
      }
      if (high.before(low)) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                                "date facet infinite loop (is gap negative?)");
      }
      final boolean includeLower =
          (include.contains(FacetRangeInclude.LOWER) ||
           (include.contains(FacetRangeInclude.EDGE) && low.equals(start)));
      final boolean includeUpper =
          (include.contains(FacetRangeInclude.UPPER) ||
           (include.contains(FacetRangeInclude.EDGE) && high.equals(end)));

      final int count = rangeCount(sf, low, high, includeLower, includeUpper);
      if (count >= minCount) {
        resInner.add(label, count);
      }
      low = high;
    }
  } catch (java.text.ParseException e) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                            "date facet 'gap' is not a valid Date Math string: " + gap, e);
  }

  // explicitly return the gap and end so all the counts
  // (including before/after/between) are meaningful - even if mincount
  // has removed the neighboring ranges
  resInner.add("gap", gap);
  resInner.add("start", start);
  resInner.add("end", end);

  final String[] othersP = params.getFieldParams(f, FacetParams.FACET_DATE_OTHER);
  if (null != othersP && 0 < othersP.length) {
    final Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class);

    for (final String o : othersP) {
      others.add(FacetRangeOther.get(o));
    }

    // no matter what other values are listed, we don't do
    // anything if "none" is specified.
    if (!others.contains(FacetRangeOther.NONE)) {
      boolean all = others.contains(FacetRangeOther.ALL);

      if (all || others.contains(FacetRangeOther.BEFORE)) {
        // include upper bound if "outer" or if first gap doesn't already include it
        resInner.add(FacetRangeOther.BEFORE.toString(),
                     rangeCount(sf, null, start, false,
                                (include.contains(FacetRangeInclude.OUTER) ||
                                 (!(include.contains(FacetRangeInclude.LOWER) ||
                                    include.contains(FacetRangeInclude.EDGE))))));
      }
      if (all || others.contains(FacetRangeOther.AFTER)) {
        // include lower bound if "outer" or if last gap doesn't already include it
        resInner.add(FacetRangeOther.AFTER.toString(),
                     rangeCount(sf, end, null,
                                (include.contains(FacetRangeInclude.OUTER) ||
                                 (!(include.contains(FacetRangeInclude.UPPER) ||
                                    include.contains(FacetRangeInclude.EDGE)))),
                                false));
      }
      if (all || others.contains(FacetRangeOther.BETWEEN)) {
        resInner.add(FacetRangeOther.BETWEEN.toString(),
                     rangeCount(sf, start, end,
                                (include.contains(FacetRangeInclude.LOWER) ||
                                 include.contains(FacetRangeInclude.EDGE)),
                                (include.contains(FacetRangeInclude.UPPER) ||
                                 include.contains(FacetRangeInclude.EDGE))));
      }
    }
  }
}
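The loop above walks low from facet.date.start toward facet.date.end, applying the date-math gap with DateMathParser at each step. A hedged request example (these are the real facet.date.* parameter names this deprecated path reads; the field is made up):

    facet=true
    facet.date=pubdate
    facet.date.start=NOW/YEAR-5YEARS
    facet.date.end=NOW/YEAR
    facet.date.gap=+1YEAR
    facet.date.hardend=true   (clip the final bucket at 'end' instead of extending 'end')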
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetRangeCounts() throws IOException, ParseException {

  final NamedList<Object> resOuter = new SimpleOrderedMap<Object>();
  final String[] fields = params.getParams(FacetParams.FACET_RANGE);

  if (null == fields || 0 == fields.length) return resOuter;

  for (String f : fields) {
    getFacetRangeCounts(f, resOuter);
  }

  return resOuter;
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void getFacetRangeCounts(String facetRange, NamedList<Object> resOuter) throws IOException, ParseException {

  final IndexSchema schema = searcher.getSchema();

  parseParams(FacetParams.FACET_RANGE, facetRange);
  String f = facetValue;

  final SchemaField sf = schema.getField(f);
  final FieldType ft = sf.getType();

  RangeEndpointCalculator<?> calc = null;

  if (ft instanceof TrieField) {
    final TrieField trie = (TrieField) ft;

    switch (trie.getType()) {
      case FLOAT:
        calc = new FloatRangeEndpointCalculator(sf);
        break;
      case DOUBLE:
        calc = new DoubleRangeEndpointCalculator(sf);
        break;
      case INTEGER:
        calc = new IntegerRangeEndpointCalculator(sf);
        break;
      case LONG:
        calc = new LongRangeEndpointCalculator(sf);
        break;
      default:
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                                "Unable to range facet on trie field of unexpected type: " + f);
    }
  } else if (ft instanceof DateField) {
    calc = new DateRangeEndpointCalculator(sf, null);
  } else if (ft instanceof SortableIntField) {
    calc = new IntegerRangeEndpointCalculator(sf);
  } else if (ft instanceof SortableLongField) {
    calc = new LongRangeEndpointCalculator(sf);
  } else if (ft instanceof SortableFloatField) {
    calc = new FloatRangeEndpointCalculator(sf);
  } else if (ft instanceof SortableDoubleField) {
    calc = new DoubleRangeEndpointCalculator(sf);
  } else {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                            "Unable to range facet on field:" + sf);
  }

  resOuter.add(key, getFacetRangeCounts(sf, calc));
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
private <T extends Comparable<T>> NamedList getFacetRangeCounts(final SchemaField sf, final RangeEndpointCalculator<T> calc) throws IOException {

  final String f = sf.getName();
  final NamedList<Object> res = new SimpleOrderedMap<Object>();
  final NamedList<Integer> counts = new NamedList<Integer>();
  res.add("counts", counts);

  final T start = calc.getValue(required.getFieldParam(f, FacetParams.FACET_RANGE_START));
  // not final, hardend may change this
  T end = calc.getValue(required.getFieldParam(f, FacetParams.FACET_RANGE_END));
  if (end.compareTo(start) < 0) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                            "range facet 'end' comes before 'start': " + end + " < " + start);
  }

  final String gap = required.getFieldParam(f, FacetParams.FACET_RANGE_GAP);
  // explicitly return the gap.  compute this early so we are more
  // likely to catch parse errors before attempting math
  res.add("gap", calc.getGap(gap));

  final int minCount = params.getFieldInt(f, FacetParams.FACET_MINCOUNT, 0);

  final EnumSet<FacetRangeInclude> include = FacetRangeInclude.parseParam(params.getFieldParams(f, FacetParams.FACET_RANGE_INCLUDE));

  T low = start;

  while (low.compareTo(end) < 0) {
    T high = calc.addGap(low, gap);
    if (end.compareTo(high) < 0) {
      if (params.getFieldBool(f, FacetParams.FACET_RANGE_HARD_END, false)) {
        high = end;
      } else {
        end = high;
      }
    }
    if (high.compareTo(low) < 0) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
                              "range facet infinite loop (is gap negative? did the math overflow?)");
    }

    final boolean includeLower =
        (include.contains(FacetRangeInclude.LOWER) ||
         (include.contains(FacetRangeInclude.EDGE) && 0 == low.compareTo(start)));
    final boolean includeUpper =
        (include.contains(FacetRangeInclude.UPPER) ||
         (include.contains(FacetRangeInclude.EDGE) && 0 == high.compareTo(end)));

    final String lowS = calc.formatValue(low);
    final String highS = calc.formatValue(high);

    final int count = rangeCount(sf, lowS, highS, includeLower, includeUpper);
    if (count >= minCount) {
      counts.add(lowS, count);
    }

    low = high;
  }

  // explicitly return the start and end so all the counts
  // (including before/after/between) are meaningful - even if mincount
  // has removed the neighboring ranges
  res.add("start", start);
  res.add("end", end);

  final String[] othersP = params.getFieldParams(f, FacetParams.FACET_RANGE_OTHER);
  if (null != othersP && 0 < othersP.length) {
    Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class);

    for (final String o : othersP) {
      others.add(FacetRangeOther.get(o));
    }

    // no matter what other values are listed, we don't do
    // anything if "none" is specified.
    if (!others.contains(FacetRangeOther.NONE)) {
      boolean all = others.contains(FacetRangeOther.ALL);
      final String startS = calc.formatValue(start);
      final String endS = calc.formatValue(end);

      if (all || others.contains(FacetRangeOther.BEFORE)) {
        // include upper bound if "outer" or if first gap doesn't already include it
        res.add(FacetRangeOther.BEFORE.toString(),
                rangeCount(sf, null, startS, false,
                           (include.contains(FacetRangeInclude.OUTER) ||
                            (!(include.contains(FacetRangeInclude.LOWER) ||
                               include.contains(FacetRangeInclude.EDGE))))));
      }
      if (all || others.contains(FacetRangeOther.AFTER)) {
        // include lower bound if "outer" or if last gap doesn't already include it
        res.add(FacetRangeOther.AFTER.toString(),
                rangeCount(sf, endS, null,
                           (include.contains(FacetRangeInclude.OUTER) ||
                            (!(include.contains(FacetRangeInclude.UPPER) ||
                               include.contains(FacetRangeInclude.EDGE)))),
                           false));
      }
      if (all || others.contains(FacetRangeOther.BETWEEN)) {
        res.add(FacetRangeOther.BETWEEN.toString(),
                rangeCount(sf, startS, endS,
                           (include.contains(FacetRangeInclude.LOWER) ||
                            include.contains(FacetRangeInclude.EDGE)),
                           (include.contains(FacetRangeInclude.UPPER) ||
                            include.contains(FacetRangeInclude.EDGE))));
      }
    }
  }
  return res;
}
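A worked example of the bucket walk, tracing the loop above for an assumed integer field named price:

    facet.range=price
    facet.range.start=0
    facet.range.end=10
    facet.range.gap=3
    -> buckets: [0,3) [3,6) [6,9) [9,12)   (end is silently extended to 12)

    same request plus facet.range.hardend=true
    -> buckets: [0,3) [3,6) [6,9) [9,10)   (the last bucket is clipped at end)

The "end" entry in the response reflects whichever value was actually used, which is why the code returns it explicitly.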
// in core/src/java/org/apache/solr/request/SimpleFacets.java
protected int rangeCount(SchemaField sf, String low, String high, boolean iLow, boolean iHigh) throws IOException {
  Query rangeQ = sf.getType().getRangeQuery(null, sf, low, high, iLow, iHigh);
  return searcher.numDocs(rangeQ, base);
}
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated
protected int rangeCount(SchemaField sf, Date low, Date high, boolean iLow, boolean iHigh) throws IOException {
  Query rangeQ = ((DateField)(sf.getType())).getRangeQuery(null, sf, low, high, iLow, iHigh);
  return searcher.numDocs(rangeQ, base);
}
// in core/src/java/org/apache/solr/request/UnInvertedField.java
@Override
protected void visitTerm(TermsEnum te, int termNum) throws IOException {

  if (termNum >= maxTermCounts.length) {
    // resize by doubling - for very large number of unique terms, expanding
    // by 4K and resultant GC will dominate uninvert times.  Resize at end if material
    int[] newMaxTermCounts = new int[maxTermCounts.length * 2];
    System.arraycopy(maxTermCounts, 0, newMaxTermCounts, 0, termNum);
    maxTermCounts = newMaxTermCounts;
  }

  final BytesRef term = te.term();

  if (te.docFreq() > maxTermDocFreq) {
    TopTerm topTerm = new TopTerm();
    topTerm.term = BytesRef.deepCopyOf(term);
    topTerm.termNum = termNum;
    bigTerms.put(topTerm.termNum, topTerm);

    if (deState == null) {
      deState = new SolrIndexSearcher.DocsEnumState();
      deState.fieldName = field;
      // deState.termsEnum = te.tenum;
      deState.termsEnum = te;  // TODO: check for MultiTermsEnum in SolrIndexSearcher could now fail?
      deState.docsEnum = docsEnum;
      deState.minSetSizeCached = maxTermDocFreq;
    }
    docsEnum = deState.docsEnum;
    DocSet set = searcher.getDocSet(deState);
    maxTermCounts[termNum] = set.size();
  }
}
// in core/src/java/org/apache/solr/request/UnInvertedField.java
public NamedList<Integer> getCounts(SolrIndexSearcher searcher, DocSet baseDocs, int offset, int limit, Integer mincount, boolean missing, String sort, String prefix) throws IOException {
  use.incrementAndGet();

  FieldType ft = searcher.getSchema().getFieldType(field);

  NamedList<Integer> res = new NamedList<Integer>();  // order is important

  DocSet docs = baseDocs;
  int baseSize = docs.size();
  int maxDoc = searcher.maxDoc();

  //System.out.println("GET COUNTS field=" + field + " baseSize=" + baseSize + " minCount=" + mincount + " maxDoc=" + maxDoc + " numTermsInField=" + numTermsInField);

  if (baseSize >= mincount) {

    final int[] index = this.index;
    // tricky: we add one more element than we need because we will reuse this array later
    // for ordering term ords before converting to term labels.
    final int[] counts = new int[numTermsInField + 1];

    //
    // If there is a prefix, find its start and end term numbers
    //
    int startTerm = 0;
    int endTerm = numTermsInField;  // one past the end

    TermsEnum te = getOrdTermsEnum(searcher.getAtomicReader());
    if (te != null && prefix != null && prefix.length() > 0) {
      final BytesRef prefixBr = new BytesRef(prefix);
      if (te.seekCeil(prefixBr, true) == TermsEnum.SeekStatus.END) {
        startTerm = numTermsInField;
      } else {
        startTerm = (int) te.ord();
      }
      prefixBr.append(UnicodeUtil.BIG_TERM);
      if (te.seekCeil(prefixBr, true) == TermsEnum.SeekStatus.END) {
        endTerm = numTermsInField;
      } else {
        endTerm = (int) te.ord();
      }
    }

    /***********
    // Alternative 2: get the docSet of the prefix (could take a while) and
    // then do the intersection with the baseDocSet first.
    if (prefix != null && prefix.length() > 0) {
      docs = searcher.getDocSet(new ConstantScorePrefixQuery(new Term(field, ft.toInternal(prefix))), docs);
      // The issue with this method are problems of returning 0 counts for terms w/o
      // the prefix.  We can't just filter out those terms later because it may
      // mean that we didn't collect enough terms in the queue (in the sorted case).
    }
    ***********/

    boolean doNegative = baseSize > maxDoc >> 1 && termInstances > 0
        && startTerm == 0 && endTerm == numTermsInField
        && docs instanceof BitDocSet;

    if (doNegative) {
      OpenBitSet bs = (OpenBitSet) ((BitDocSet) docs).getBits().clone();
      bs.flip(0, maxDoc);
      // TODO: when iterator across negative elements is available, use that
      // instead of creating a new bitset and inverting.
      docs = new BitDocSet(bs, maxDoc - baseSize);
      // simply negating will mean that we have deleted docs in the set.
      // that should be OK, as their entries in our table should be empty.
      //System.out.println("  NEG");
    }

    // For the biggest terms, do straight set intersections
    for (TopTerm tt : bigTerms.values()) {
      //System.out.println("  do big termNum=" + tt.termNum + " term=" + tt.term.utf8ToString());
      // TODO: counts could be deferred if sorted==false
      if (tt.termNum >= startTerm && tt.termNum < endTerm) {
        counts[tt.termNum] = searcher.numDocs(new TermQuery(new Term(field, tt.term)), docs);
        //System.out.println("    count=" + counts[tt.termNum]);
      } else {
        //System.out.println("SKIP term=" + tt.termNum);
      }
    }

    // TODO: we could short-circuit counting altogether for sorted faceting
    // where we already have enough terms from the bigTerms

    // TODO: we could shrink the size of the collection array, and
    // additionally break when the termNumber got above endTerm, but
    // it would require two extra conditionals in the inner loop (although
    // they would be predictable for the non-prefix case).
    // Perhaps a different copy of the code would be warranted.

    if (termInstances > 0) {
      DocIterator iter = docs.iterator();
      while (iter.hasNext()) {
        int doc = iter.nextDoc();
        //System.out.println("iter doc=" + doc);
        int code = index[doc];

        if ((code & 0xff) == 1) {
          //System.out.println("  ptr");
          int pos = code >>> 8;
          int whichArray = (doc >>> 16) & 0xff;
          byte[] arr = tnums[whichArray];
          int tnum = 0;
          for (;;) {
            int delta = 0;
            for (;;) {
              byte b = arr[pos++];
              delta = (delta << 7) | (b & 0x7f);
              if ((b & 0x80) == 0) break;
            }
            if (delta == 0) break;
            tnum += delta - TNUM_OFFSET;
            //System.out.println("    tnum=" + tnum);
            counts[tnum]++;
          }
        } else {
          //System.out.println("  inlined");
          int tnum = 0;
          int delta = 0;
          for (;;) {
            delta = (delta << 7) | (code & 0x7f);
            if ((code & 0x80) == 0) {
              if (delta == 0) break;
              tnum += delta - TNUM_OFFSET;
              //System.out.println("    tnum=" + tnum);
              counts[tnum]++;
              delta = 0;
            }
            code >>>= 8;
          }
        }
      }
    }

    final CharsRef charsRef = new CharsRef();

    int off = offset;
    int lim = limit >= 0 ? limit : Integer.MAX_VALUE;

    if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) {
      int maxsize = limit > 0 ? offset + limit : Integer.MAX_VALUE - 1;
      maxsize = Math.min(maxsize, numTermsInField);
      LongPriorityQueue queue = new LongPriorityQueue(Math.min(maxsize, 1000), maxsize, Long.MIN_VALUE);

      int min = mincount - 1;  // the smallest value in the top 'N' values
      //System.out.println("START=" + startTerm + " END=" + endTerm);
      for (int i = startTerm; i < endTerm; i++) {
        int c = doNegative ? maxTermCounts[i] - counts[i] : counts[i];
        if (c > min) {
          // NOTE: we use c>min rather than c>=min as an optimization because we are going in
          // index order, so we already know that the keys are ordered.  This can be very
          // important if a lot of the counts are repeated (like zero counts would be).

          // smaller term numbers sort higher, so subtract the term number instead
          long pair = (((long) c) << 32) + (Integer.MAX_VALUE - i);
          boolean displaced = queue.insert(pair);
          if (displaced) min = (int) (queue.top() >>> 32);
        }
      }

      // now select the right page from the results

      // if we are deep paging, we don't have to order the highest "offset" counts.
      int collectCount = Math.max(0, queue.size() - off);
      assert collectCount <= lim;

      // the start and end indexes of our list "sorted" (starting with the highest value)
      int sortedIdxStart = queue.size() - (collectCount - 1);
      int sortedIdxEnd = queue.size() + 1;
      final long[] sorted = queue.sort(collectCount);

      final int[] indirect = counts;  // reuse the counts array for the index into the tnums array
      assert indirect.length >= sortedIdxEnd;

      for (int i = sortedIdxStart; i < sortedIdxEnd; i++) {
        long pair = sorted[i];
        int c = (int) (pair >>> 32);
        int tnum = Integer.MAX_VALUE - (int) pair;

        indirect[i] = i;   // store the index for indirect sorting
        sorted[i] = tnum;  // reuse the "sorted" array to store the term numbers for indirect sorting

        // add a null label for now... we'll fill it in later.
        res.add(null, c);
      }

      // now sort the indexes by the term numbers
      PrimUtils.sort(sortedIdxStart, sortedIdxEnd, indirect, new PrimUtils.IntComparator() {
        @Override
        public int compare(int a, int b) {
          return (int) sorted[a] - (int) sorted[b];
        }

        @Override
        public boolean lessThan(int a, int b) {
          return sorted[a] < sorted[b];
        }

        @Override
        public boolean equals(int a, int b) {
          return sorted[a] == sorted[b];
        }
      });

      // convert the term numbers to term values and set
      // as the label
      //System.out.println("sortStart=" + sortedIdxStart + " end=" + sortedIdxEnd);
      for (int i = sortedIdxStart; i < sortedIdxEnd; i++) {
        int idx = indirect[i];
        int tnum = (int) sorted[idx];
        final String label = getReadableValue(getTermValue(te, tnum), ft, charsRef);
        //System.out.println("  label=" + label);
        res.setName(idx - sortedIdxStart, label);
      }

    } else {
      // add results in index order
      int i = startTerm;
      if (mincount <= 0) {
        // if mincount<=0, then we won't discard any terms and we know exactly
        // where to start.
        i = startTerm + off;
        off = 0;
      }

      for (; i < endTerm; i++) {
        int c = doNegative ? maxTermCounts[i] - counts[i] : counts[i];
        if (c < mincount || --off >= 0) continue;
        if (--lim < 0) break;

        final String label = getReadableValue(getTermValue(te, i), ft, charsRef);
        res.add(label, c);
      }
    }
  }

  if (missing) {
    res.add(null, SimpleFacets.getFieldMissingCount(searcher, baseDocs, field));
  }

  return res;
}
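The per-document counting loop above decodes each document's term-number list, stored either inlined in the index[] int or in a byte array: deltas between consecutive term numbers are written 7 bits per byte (high bit meaning "more bytes follow"), shifted by TNUM_OFFSET so that a stored zero can terminate the list. A minimal standalone sketch of the byte-array case, assuming TNUM_OFFSET is 2 as in UnInvertedField (the input bytes are made up):

    public class TermNumDecodeDemo {
      static final int TNUM_OFFSET = 2;

      // Decode a zero-terminated list of variable-length deltas, as in
      // UnInvertedField: a set high bit means "more bytes in this delta".
      static void decode(byte[] arr, int pos) {
        int tnum = 0;
        for (;;) {
          int delta = 0;
          for (;;) {
            byte b = arr[pos++];
            delta = (delta << 7) | (b & 0x7f);
            if ((b & 0x80) == 0) break;   // last byte of this delta
          }
          if (delta == 0) break;          // a zero delta terminates the list
          tnum += delta - TNUM_OFFSET;    // deltas are stored shifted by TNUM_OFFSET
          System.out.println("term number " + tnum);
        }
      }

      public static void main(String[] args) {
        // term numbers 3 and 10: stored deltas 3+2=5 and 7+2=9, then terminator 0
        decode(new byte[]{5, 9, 0}, 0);   // prints 3 then 10
      }
    }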
// in core/src/java/org/apache/solr/request/UnInvertedField.java
public StatsValues getStats(SolrIndexSearcher searcher, DocSet baseDocs, String[] facet) throws IOException {
  // this function is ripped off nearly wholesale from the getCounts function to use
  // for multiValued fields within the StatsComponent.  may be useful to find common
  // functionality between the two and refactor the code somewhat
  use.incrementAndGet();

  SchemaField sf = searcher.getSchema().getField(field);
  // FieldType ft = sf.getType();

  StatsValues allstats = StatsValuesFactory.createStatsValues(sf);

  DocSet docs = baseDocs;
  int baseSize = docs.size();
  int maxDoc = searcher.maxDoc();

  if (baseSize <= 0) return allstats;

  DocSet missing = docs.andNot(searcher.getDocSet(new TermRangeQuery(field, null, null, false, false)));

  int i = 0;
  final FieldFacetStats[] finfo = new FieldFacetStats[facet.length];
  // Initialize facetstats, if facets have been passed in
  FieldCache.DocTermsIndex si;
  for (String f : facet) {
    SchemaField facet_sf = searcher.getSchema().getField(f);
    try {
      si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), f);
    } catch (IOException e) {
      throw new RuntimeException("failed to open field cache for: " + f, e);
    }
    finfo[i] = new FieldFacetStats(f, si, sf, facet_sf, numTermsInField);
    i++;
  }

  final int[] index = this.index;
  // keep track of the number of times we see each word in the field for all the documents in the docset
  final int[] counts = new int[numTermsInField];

  TermsEnum te = getOrdTermsEnum(searcher.getAtomicReader());

  boolean doNegative = false;
  if (finfo.length == 0) {
    // if we're collecting statistics with a facet field, can't do inverted counting
    doNegative = baseSize > maxDoc >> 1 && termInstances > 0 && docs instanceof BitDocSet;
  }

  if (doNegative) {
    OpenBitSet bs = (OpenBitSet) ((BitDocSet) docs).getBits().clone();
    bs.flip(0, maxDoc);
    // TODO: when iterator across negative elements is available, use that
    // instead of creating a new bitset and inverting.
    docs = new BitDocSet(bs, maxDoc - baseSize);
    // simply negating will mean that we have deleted docs in the set.
    // that should be OK, as their entries in our table should be empty.
  }

  // For the biggest terms, do straight set intersections
  for (TopTerm tt : bigTerms.values()) {
    // TODO: counts could be deferred if sorted==false
    if (tt.termNum >= 0 && tt.termNum < numTermsInField) {
      final Term t = new Term(field, tt.term);
      if (finfo.length == 0) {
        counts[tt.termNum] = searcher.numDocs(new TermQuery(t), docs);
      } else {
        // COULD BE VERY SLOW
        // if we're collecting stats for facet fields, we need to iterate on all matching documents
        DocSet bigTermDocSet = searcher.getDocSet(new TermQuery(t)).intersection(docs);
        DocIterator iter = bigTermDocSet.iterator();
        while (iter.hasNext()) {
          int doc = iter.nextDoc();
          counts[tt.termNum]++;
          for (FieldFacetStats f : finfo) {
            f.facetTermNum(doc, tt.termNum);
          }
        }
      }
    }
  }

  if (termInstances > 0) {
    DocIterator iter = docs.iterator();
    while (iter.hasNext()) {
      int doc = iter.nextDoc();
      int code = index[doc];

      if ((code & 0xff) == 1) {
        int pos = code >>> 8;
        int whichArray = (doc >>> 16) & 0xff;
        byte[] arr = tnums[whichArray];
        int tnum = 0;
        for (;;) {
          int delta = 0;
          for (;;) {
            byte b = arr[pos++];
            delta = (delta << 7) | (b & 0x7f);
            if ((b & 0x80) == 0) break;
          }
          if (delta == 0) break;
          tnum += delta - TNUM_OFFSET;
          counts[tnum]++;
          for (FieldFacetStats f : finfo) {
            f.facetTermNum(doc, tnum);
          }
        }
      } else {
        int tnum = 0;
        int delta = 0;
        for (;;) {
          delta = (delta << 7) | (code & 0x7f);
          if ((code & 0x80) == 0) {
            if (delta == 0) break;
            tnum += delta - TNUM_OFFSET;
            counts[tnum]++;
            for (FieldFacetStats f : finfo) {
              f.facetTermNum(doc, tnum);
            }
            delta = 0;
          }
          code >>>= 8;
        }
      }
    }
  }

  // add results in index order
  for (i = 0; i < numTermsInField; i++) {
    int c = doNegative ? maxTermCounts[i] - counts[i] : counts[i];
    if (c == 0) continue;
    BytesRef value = getTermValue(te, i);
    allstats.accumulate(value, c);
    // as we've parsed the termnum into a value, let's also accumulate fieldfacet statistics
    for (FieldFacetStats f : finfo) {
      f.accumulateTermNum(i, value);
    }
  }

  int c = missing.size();
  allstats.addMissing(c);

  if (finfo.length > 0) {
    for (FieldFacetStats f : finfo) {
      Map<String, StatsValues> facetStatsValues = f.facetStatsValues;
      FieldType facetType = searcher.getSchema().getFieldType(f.name);
      for (Map.Entry<String,StatsValues> entry : facetStatsValues.entrySet()) {
        String termLabel = entry.getKey();
        int missingCount = searcher.numDocs(new TermQuery(new Term(f.name, facetType.toInternal(termLabel))), missing);
        entry.getValue().addMissing(missingCount);
      }
      allstats.addFacet(f.name, facetStatsValues);
    }
  }

  return allstats;
}
// in core/src/java/org/apache/solr/request/UnInvertedField.java
BytesRef getTermValue(TermsEnum te, int termNum) throws IOException {
  //System.out.println("getTermValue termNum=" + termNum + " this=" + this + " numTerms=" + numTermsInField);
  if (bigTerms.size() > 0) {
    // see if the term is one of our big terms.
    TopTerm tt = bigTerms.get(termNum);
    if (tt != null) {
      //System.out.println("  return big " + tt.term);
      return tt.term;
    }
  }

  return lookupTerm(te, termNum);
}
// in core/src/java/org/apache/solr/request/UnInvertedField.java
public static UnInvertedField getUnInvertedField(String field, SolrIndexSearcher searcher) throws IOException {
  SolrCache<String,UnInvertedField> cache = searcher.getFieldValueCache();
  if (cache == null) {
    return new UnInvertedField(field, searcher);
  }

  UnInvertedField uif = cache.get(field);
  if (uif == null) {
    synchronized (cache) {
      uif = cache.get(field);
      if (uif == null) {
        uif = new UnInvertedField(field, searcher);
        cache.put(field, uif);
      }
    }
  }

  return uif;
}
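The lookup above is the classic check/lock/re-check idiom: the first unsynchronized get keeps the common case lock-free, and the second get inside the synchronized block ensures that two threads racing on the same field do not both build the expensive uninverted structure. A generic sketch of the same idiom with hypothetical names (ConcurrentHashMap stands in for the thread-safe SolrCache):

    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;
    import java.util.function.Function;

    class PerKeyCache<K, V> {
      // a thread-safe map plays the role of SolrCache here
      private final ConcurrentMap<K, V> cache = new ConcurrentHashMap<K, V>();

      V getOrBuild(K key, Function<K, V> expensiveBuild) {
        V v = cache.get(key);            // fast path: no lock
        if (v == null) {
          synchronized (cache) {         // slow path: one builder at a time
            v = cache.get(key);          // re-check: another thread may have won the race
            if (v == null) {
              v = expensiveBuild.apply(key);
              cache.put(key, v);
            }
          }
        }
        return v;
      }
    }

The re-check matters because the build can take seconds on a large index; without it, concurrent first requests for the same field would each uninvert it.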
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException {
  if (abortErrorMessage != null) {
    ((HttpServletResponse) response).sendError(500, abortErrorMessage);
    return;
  }

  if (this.cores == null) {
    ((HttpServletResponse) response).sendError(403, "Server is shutting down");
    return;
  }
  CoreContainer cores = this.cores;
  SolrCore core = null;
  SolrQueryRequest solrReq = null;

  if (request instanceof HttpServletRequest) {
    HttpServletRequest req = (HttpServletRequest) request;
    HttpServletResponse resp = (HttpServletResponse) response;
    SolrRequestHandler handler = null;
    String corename = "";
    try {
      // put the core container in request attribute
      req.setAttribute("org.apache.solr.CoreContainer", cores);
      String path = req.getServletPath();
      if (req.getPathInfo() != null) {
        // this lets you handle /update/commit when /update is a servlet
        path += req.getPathInfo();
      }
      if (pathPrefix != null && path.startsWith(pathPrefix)) {
        path = path.substring(pathPrefix.length());
      }
      // check for management path
      String alternate = cores.getManagementPath();
      if (alternate != null && path.startsWith(alternate)) {
        path = path.substring(0, alternate.length());
      }
      // unused feature ?
      int idx = path.indexOf(':');
      if (idx > 0) {
        // save the portion after the ':' for a 'handler' path parameter
        path = path.substring(0, idx);
      }

      // Check for the core admin page
      if (path.equals(cores.getAdminPath())) {
        handler = cores.getMultiCoreHandler();
        solrReq = adminRequestParser.parse(null, path, req);
        handleAdminRequest(req, response, handler, solrReq);
        return;
      } else {
        // otherwise, we should find a core from the path
        idx = path.indexOf("/", 1);
        if (idx > 1) {
          // try to get the corename as a request parameter first
          corename = path.substring(1, idx);
          core = cores.getCore(corename);
          if (core != null) {
            path = path.substring(idx);
          }
        }
        if (core == null) {
          if (!cores.isZooKeeperAware()) {
            core = cores.getCore("");
          }
        }
      }

      if (core == null && cores.isZooKeeperAware()) {
        // we couldn't find the core - let's make sure a collection was not specified instead
        core = getCoreByCollection(cores, corename, path);

        if (core != null) {
          // we found a core, update the path
          path = path.substring(idx);
        } else {
          // try the default core
          core = cores.getCore("");
        }

        // TODO: if we couldn't find it locally, look on other nodes
      }

      // With a valid core...
      if (core != null) {
        final SolrConfig config = core.getSolrConfig();
        // get or create/cache the parser for the core
        SolrRequestParsers parser = null;
        parser = parsers.get(config);
        if (parser == null) {
          parser = new SolrRequestParsers(config);
          parsers.put(config, parser);
        }

        // Determine the handler from the url path if not set
        // (we might already have selected the cores handler)
        if (handler == null && path.length() > 1) {  // don't match "" or "/" as valid path
          handler = core.getRequestHandler(path);
          // no handler yet but allowed to handle select; let's check
          if (handler == null && parser.isHandleSelect()) {
            if ("/select".equals(path) || "/select/".equals(path)) {
              solrReq = parser.parse(core, path, req);
              String qt = solrReq.getParams().get(CommonParams.QT);
              handler = core.getRequestHandler(qt);
              if (handler == null) {
                throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unknown handler: " + qt);
              }
              if (qt != null && qt.startsWith("/") && (handler instanceof ContentStreamHandlerBase)) {
                // For security reasons it's a bad idea to allow a leading '/', ex: /select?qt=/update  see SOLR-3161
                // There was no restriction from Solr 1.4 thru 3.5 and it's not supported for update handlers.
                throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid query type.  Do not use /select to access: " + qt);
              }
            }
          }
        }

        // With a valid handler and a valid core...
        if (handler != null) {
          // if not a /select, create the request
          if (solrReq == null) {
            solrReq = parser.parse(core, path, req);
          }

          final Method reqMethod = Method.getMethod(req.getMethod());
          HttpCacheHeaderUtil.setCacheControlHeader(config, resp, reqMethod);
          // unless we have been explicitly told not to, do cache validation
          // if we fail cache validation, execute the query
          if (config.getHttpCachingConfig().isNever304() ||
              !HttpCacheHeaderUtil.doCacheHeaderValidation(solrReq, req, reqMethod, resp)) {
            SolrQueryResponse solrRsp = new SolrQueryResponse();
            /* even for HEAD requests, we need to execute the handler to
             * ensure we don't get an error (and to make sure the correct
             * QueryResponseWriter is selected and we get the correct
             * Content-Type)
             */
            SolrRequestInfo.setRequestInfo(new SolrRequestInfo(solrReq, solrRsp));
            this.execute(req, handler, solrReq, solrRsp);
            HttpCacheHeaderUtil.checkHttpCachingVeto(solrRsp, resp, reqMethod);
            // add info to http headers
            // TODO: See SOLR-232 and SOLR-267.
            /*
            try {
              NamedList solrRspHeader = solrRsp.getResponseHeader();
              for (int i = 0; i < solrRspHeader.size(); i++) {
                ((javax.servlet.http.HttpServletResponse) response).addHeader(("Solr-" + solrRspHeader.getName(i)), String.valueOf(solrRspHeader.getVal(i)));
              }
            } catch (ClassCastException cce) {
              log.log(Level.WARNING, "exception adding response header log information", cce);
            }
            */
            QueryResponseWriter responseWriter = core.getQueryResponseWriter(solrReq);
            writeResponse(solrRsp, response, responseWriter, solrReq, reqMethod);
          }
          return;  // we are done with a valid handler
        }
      }
      log.debug("no handler or core retrieved for " + path + ", follow through...");
    } catch (Throwable ex) {
      sendError(core, solrReq, request, (HttpServletResponse) response, ex);
      return;
    } finally {
      if (solrReq != null) {
        solrReq.close();
      }
      if (core != null) {
        core.close();
      }
      SolrRequestInfo.clearRequestInfo();
    }
  }

  // Otherwise let the webapp handle the request
  chain.doFilter(request, response);
}
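Concretely, the path handling above resolves request URLs roughly in this order (a hedged illustration; the core and handler names are made up, and the admin path shown is only a typical configuration):

    /solr/admin/cores?action=STATUS  -> matches cores.getAdminPath(), dispatched to the multi-core admin handler
    /solr/core1/select?q=*:*         -> '/core1' is stripped as the core name, '/select' selects the handler
    /solr/select?q=*:*&qt=debug      -> no core prefix: default core, handler chosen via the qt parameter

The qt-based dispatch is the legacy path that the SOLR-3161 check above constrains: a leading '/' in qt is rejected for ContentStreamHandlerBase handlers so /select cannot be used to reach update handlers.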
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
private void handleAdminRequest(HttpServletRequest req, ServletResponse response, SolrRequestHandler handler, SolrQueryRequest solrReq) throws IOException {
  SolrQueryResponse solrResp = new SolrQueryResponse();
  final NamedList<Object> responseHeader = new SimpleOrderedMap<Object>();
  solrResp.add("responseHeader", responseHeader);
  NamedList toLog = solrResp.getToLog();
  toLog.add("webapp", req.getContextPath());
  toLog.add("path", solrReq.getContext().get("path"));
  toLog.add("params", "{" + solrReq.getParamString() + "}");
  handler.handleRequest(solrReq, solrResp);
  SolrCore.setResponseHeaderValues(handler, solrReq, solrResp);
  StringBuilder sb = new StringBuilder();
  for (int i = 0; i < toLog.size(); i++) {
    String name = toLog.getName(i);
    Object val = toLog.getVal(i);
    sb.append(name).append("=").append(val).append(" ");
  }
  QueryResponseWriter respWriter = SolrCore.DEFAULT_RESPONSE_WRITERS.get(solrReq.getParams().get(CommonParams.WT));
  if (respWriter == null) respWriter = SolrCore.DEFAULT_RESPONSE_WRITERS.get("standard");
  writeResponse(solrResp, response, respWriter, solrReq, Method.getMethod(req.getMethod()));
}
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
private void writeResponse(SolrQueryResponse solrRsp, ServletResponse response, QueryResponseWriter responseWriter, SolrQueryRequest solrReq, Method reqMethod) throws IOException {
  // Now write it out
  final String ct = responseWriter.getContentType(solrReq, solrRsp);
  // don't call setContentType on null
  if (null != ct) response.setContentType(ct);

  if (solrRsp.getException() != null) {
    NamedList info = new SimpleOrderedMap();
    int code = getErrorInfo(solrRsp.getException(), info);
    solrRsp.add("error", info);
    ((HttpServletResponse) response).setStatus(code);
  }

  if (Method.HEAD != reqMethod) {
    if (responseWriter instanceof BinaryQueryResponseWriter) {
      BinaryQueryResponseWriter binWriter = (BinaryQueryResponseWriter) responseWriter;
      binWriter.write(response.getOutputStream(), solrReq, solrRsp);
    } else {
      String charset = ContentStreamBase.getCharsetFromContentType(ct);
      Writer out = (charset == null || charset.equalsIgnoreCase("UTF-8"))
          ? new OutputStreamWriter(response.getOutputStream(), UTF8)
          : new OutputStreamWriter(response.getOutputStream(), charset);
      out = new FastWriter(out);
      responseWriter.write(out, solrReq, solrRsp);
      out.flush();
    }
  }
  // else: http HEAD request, nothing to write out, waited this long just to get ContentType
}
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
protected void sendError(SolrCore core, SolrQueryRequest req, ServletRequest request, HttpServletResponse response, Throwable ex) throws IOException {
  try {
    SolrQueryResponse solrResp = new SolrQueryResponse();
    if (ex instanceof Exception) {
      solrResp.setException((Exception) ex);
    } else {
      solrResp.setException(new RuntimeException(ex));
    }
    if (core == null) {
      core = cores.getCore("");  // default core
    }
    if (req == null) {
      req = new SolrQueryRequestBase(core, new ServletSolrParams(request)) {};
    }
    QueryResponseWriter writer = core.getQueryResponseWriter(req);
    writeResponse(solrResp, response, writer, req, Method.GET);
  } catch (Throwable t) {
    // This error really does not matter
    SimpleOrderedMap info = new SimpleOrderedMap();
    int code = getErrorInfo(ex, info);
    response.sendError(code, info.toString());
  }
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override
public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException {
  response.setCharacterEncoding("UTF-8");
  response.setContentType("application/json");

  // This attribute is set by the SolrDispatchFilter
  CoreContainer cores = (CoreContainer) request.getAttribute("org.apache.solr.CoreContainer");

  String path = request.getParameter("path");
  String addr = request.getParameter("addr");

  if (addr != null && addr.length() == 0) {
    addr = null;
  }

  String detailS = request.getParameter("detail");
  boolean detail = detailS != null && detailS.equals("true");

  String dumpS = request.getParameter("dump");
  boolean dump = dumpS != null && dumpS.equals("true");

  PrintWriter out = response.getWriter();

  ZKPrinter printer = new ZKPrinter(response, out, cores.getZkController(), addr);
  printer.detail = detail;
  printer.dump = dump;

  try {
    printer.print(path);
  } finally {
    printer.close();
  }
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override
public void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException {
  doGet(request, response);
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
void print(String path) throws IOException {
  if (zkClient == null) {
    return;
  }

  // normalize path
  if (path == null) {
    path = "/";
  } else {
    path = path.trim();  // String is immutable; trim() returns a new string that must be assigned
    if (path.length() == 0) {
      path = "/";
    }
  }

  if (path.endsWith("/") && path.length() > 1) {
    path = path.substring(0, path.length() - 1);
  }

  int idx = path.lastIndexOf('/');
  String parent = idx >= 0 ? path.substring(0, idx) : path;
  if (parent.length() == 0) {
    parent = "/";
  }

  CharArr chars = new CharArr();
  JSONWriter json = new JSONWriter(chars, 2);
  json.startObject();

  if (detail) {
    if (!printZnode(json, path)) {
      return;
    }
    json.writeValueSeparator();
  }

  json.writeString("tree");
  json.writeNameSeparator();
  json.startArray();
  if (!printTree(json, path)) {
    return;  // there was an error
  }
  json.endArray();
  json.endObject();
  out.println(chars.toString());
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
boolean printTree(JSONWriter json, String path) throws IOException {
  String label = path;
  if (!fullpath) {
    int idx = path.lastIndexOf('/');
    label = idx > 0 ? path.substring(idx + 1) : path;
  }
  json.startObject();
  //writeKeyValue(json, "data", label, true );
  json.writeString("data");
  json.writeNameSeparator();

  json.startObject();
  writeKeyValue(json, "title", label, true);
  json.writeValueSeparator();
  json.writeString("attr");
  json.writeNameSeparator();
  json.startObject();
  writeKeyValue(json, "href", "zookeeper?detail=true&path=" + URLEncoder.encode(path, "UTF-8"), true);
  json.endObject();
  json.endObject();

  Stat stat = new Stat();
  try {
    // Trickily, the call to zkClient.getData fills in the stat variable
    byte[] data = zkClient.getData(path, null, stat, true);

    if (stat.getEphemeralOwner() != 0) {
      writeKeyValue(json, "ephemeral", true, false);
      writeKeyValue(json, "version", stat.getVersion(), false);
    }

    if (dump) {
      json.writeValueSeparator();
      printZnode(json, path);
    }

    /*
    if (stat.getNumChildren() != 0) {
      writeKeyValue(json, "children_count", stat.getNumChildren(), false );
      out.println(", \"children_count\" : \"" + stat.getNumChildren() + "\"");
    }
    */

    //if (stat.getDataLength() != 0)
    if (data != null) {
      String str = new BytesRef(data).utf8ToString();
      //?? writeKeyValue(json, "content", str, false );
      // Does nothing now, but on the assumption this will be used later we'll leave it in.  If it comes out
      // the catches below need to be restructured.
    }
  } catch (IllegalArgumentException e) {
    // path doesn't exist (must have been removed)
    writeKeyValue(json, "warning", "(path gone)", false);
  } catch (KeeperException e) {
    writeKeyValue(json, "warning", e.toString(), false);
    log.warn("Keeper Exception", e);
  } catch (InterruptedException e) {
    writeKeyValue(json, "warning", e.toString(), false);
    log.warn("InterruptedException", e);
  }

  if (stat.getNumChildren() > 0) {
    json.writeValueSeparator();
    if (indent) {
      json.indent();
    }
    json.writeString("children");
    json.writeNameSeparator();
    json.startArray();

    try {
      List<String> children = zkClient.getChildren(path, null, true);
      java.util.Collections.sort(children);

      boolean first = true;
      for (String child : children) {
        if (!first) {
          json.writeValueSeparator();
        }

        String childPath = path + (path.endsWith("/") ? "" : "/") + child;
        if (!printTree(json, childPath)) {
          return false;
        }
        first = false;
      }
    } catch (KeeperException e) {
      writeError(500, e.toString());
      return false;
    } catch (InterruptedException e) {
      writeError(500, e.toString());
      return false;
    } catch (IllegalArgumentException e) {
      // path doesn't exist (must have been removed)
      json.writeString("(children gone)");
    }

    json.endArray();
  }

  json.endObject();
  return true;
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
boolean printZnode(JSONWriter json, String path) throws IOException {
  try {
    Stat stat = new Stat();
    // Trickily, the call to zkClient.getData fills in the stat variable
    byte[] data = zkClient.getData(path, null, stat, true);

    json.writeString("znode");
    json.writeNameSeparator();
    json.startObject();

    writeKeyValue(json, "path", path, true);

    json.writeValueSeparator();
    json.writeString("prop");
    json.writeNameSeparator();
    json.startObject();
    writeKeyValue(json, "version", stat.getVersion(), true);
    writeKeyValue(json, "aversion", stat.getAversion(), false);
    writeKeyValue(json, "children_count", stat.getNumChildren(), false);
    writeKeyValue(json, "ctime", time(stat.getCtime()), false);
    writeKeyValue(json, "cversion", stat.getCversion(), false);
    writeKeyValue(json, "czxid", stat.getCzxid(), false);
    writeKeyValue(json, "dataLength", stat.getDataLength(), false);
    writeKeyValue(json, "ephemeralOwner", stat.getEphemeralOwner(), false);
    writeKeyValue(json, "mtime", time(stat.getMtime()), false);
    writeKeyValue(json, "mzxid", stat.getMzxid(), false);
    writeKeyValue(json, "pzxid", stat.getPzxid(), false);
    json.endObject();

    if (data != null) {
      writeKeyValue(json, "data", new BytesRef(data).utf8ToString(), false);
    }
    json.endObject();
  } catch (KeeperException e) {
    writeError(500, e.toString());
    return false;
  } catch (InterruptedException e) {
    writeError(500, e.toString());
    return false;
  }
  return true;
}
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public InputStream getStream() throws IOException { return req.getInputStream(); }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public InputStream getStream() throws IOException { return item.getInputStream(); }
// in core/src/java/org/apache/solr/servlet/LoadAdminUiServlet.java
Override public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException { response.setCharacterEncoding("UTF-8"); response.setContentType("text/html"); PrintWriter out = response.getWriter(); InputStream in = getServletContext().getResourceAsStream("/admin.html"); if(in != null) { try { // This attribute is set by the SolrDispatchFilter CoreContainer cores = (CoreContainer) request.getAttribute("org.apache.solr.CoreContainer"); String html = IOUtils.toString(in, "UTF-8"); String[] search = new String[] { "${contextPath}", "${adminPath}" }; String[] replace = new String[] { StringEscapeUtils.escapeJavaScript(request.getContextPath()), StringEscapeUtils.escapeJavaScript(cores.getAdminPath()) }; out.println( StringUtils.replaceEach(html, search, replace) ); } finally { IOUtils.closeQuietly(in); } } else { out.println("solr"); } }
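The doGet above uses the classic pre-Java-7 cleanup idiom: close the stream in a finally block via IOUtils.closeQuietly, so a failed close can neither mask an exception from the body nor add one of its own. On Java 7 and later, try-with-resources gives the same guarantee while surfacing close failures as suppressed exceptions instead of discarding them; a minimal sketch (not the servlet code itself):

    import java.io.ByteArrayInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Hedged sketch: try-with-resources as an alternative to
    // try { ... } finally { IOUtils.closeQuietly(in); }
    public class ResourceCopy {
      static String readAll(InputStream in) throws IOException {
        try (InputStream s = in) {        // closed automatically on all paths
          StringBuilder sb = new StringBuilder();
          int b;
          while ((b = s.read()) != -1) {
            sb.append((char) b);          // byte-as-char is fine for a sketch
          }
          return sb.toString();
        }                                 // a failing close() is attached as a
      }                                   // suppressed exception, not swallowed

      public static void main(String[] args) throws IOException {
        System.out.println(readAll(new ByteArrayInputStream("solr".getBytes())));
      }
    }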
// in core/src/java/org/apache/solr/servlet/LoadAdminUiServlet.java
Override public void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException { doGet(request, response); }
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void doGet(HttpServletRequest req, HttpServletResponse res) throws ServletException,IOException { res.setStatus(code); res.setHeader("Location", destination); }
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void doPost(HttpServletRequest req, HttpServletResponse res) throws ServletException,IOException { doGet(req,res); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static void sendNotModified(HttpServletResponse res) throws IOException { res.setStatus(HttpServletResponse.SC_NOT_MODIFIED); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static void sendPreconditionFailed(HttpServletResponse res) throws IOException { res.setStatus(HttpServletResponse.SC_PRECONDITION_FAILED); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static boolean doCacheHeaderValidation(final SolrQueryRequest solrReq, final HttpServletRequest req, final Method reqMethod, final HttpServletResponse resp) throws IOException { if (Method.POST==reqMethod || Method.OTHER==reqMethod) { return false; } final long lastMod = HttpCacheHeaderUtil.calcLastModified(solrReq); final String etag = HttpCacheHeaderUtil.calcEtag(solrReq); resp.setDateHeader("Last-Modified", lastMod); resp.setHeader("ETag", etag); if (checkETagValidators(req, resp, reqMethod, etag)) { return true; } if (checkLastModValidators(req, resp, lastMod)) { return true; } return false; }
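The contract of doCacheHeaderValidation is worth spelling out: a true return means a 304 or 412 response has already been committed by one of the send* helpers, so the caller must stop processing. A hedged sketch of a caller honoring that contract (the surrounding class and method are invented, not the actual dispatch filter):

    import java.io.IOException;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.solr.request.SolrQueryRequest;
    import org.apache.solr.servlet.cache.HttpCacheHeaderUtil;
    import org.apache.solr.servlet.cache.Method;

    public class CacheAwareHandler {
      void serve(SolrQueryRequest solrReq, HttpServletRequest req,
                 Method reqMethod, HttpServletResponse resp) throws IOException {
        if (HttpCacheHeaderUtil.doCacheHeaderValidation(solrReq, req, reqMethod, resp)) {
          return; // 304 Not Modified or 412 Precondition Failed already sent
        }
        // ... otherwise build and write the full response ...
      }
    }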
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static boolean checkLastModValidators(final HttpServletRequest req, final HttpServletResponse resp, final long lastMod) throws IOException { try { // Check If-Modified-Since first because it is the header HTTP clients most commonly use final long modifiedSince = req.getDateHeader("If-Modified-Since"); if (modifiedSince != -1L && lastMod <= modifiedSince) { // Send a "not-modified" sendNotModified(resp); return true; } final long unmodifiedSince = req.getDateHeader("If-Unmodified-Since"); if (unmodifiedSince != -1L && lastMod > unmodifiedSince) { // Send a "precondition failed" sendPreconditionFailed(resp); return true; } } catch (IllegalArgumentException iae) { // one of our date headers was not formatted properly, ignore it /* NOOP */ } return false; }
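The empty catch above is deliberate: the Servlet API specifies that getDateHeader throws IllegalArgumentException when a client sends a date it cannot parse, and this code chooses to treat a malformed header exactly like an absent one. A self-contained sketch of that equivalence (the helper name is invented):

    import javax.servlet.http.HttpServletRequest;

    public class DateHeaderProbe {
      // hypothetical helper: -1 for "absent", and also -1 for "malformed"
      static long modifiedSince(HttpServletRequest req) {
        try {
          return req.getDateHeader("If-Modified-Since"); // -1 if header absent
        } catch (IllegalArgumentException ignored) {
          return -1L; // unparseable date: behave as if the header were absent
        }
      }
    }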
// in core/src/java/org/apache/solr/schema/BCDIntField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeInt(name,toExternal(f)); }
// in core/src/java/org/apache/solr/schema/SortableIntField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String sval = f.stringValue(); writer.writeInt(name, NumberUtils.SortableStr2int(sval,0,sval.length())); }
// in core/src/java/org/apache/solr/schema/SortableIntField.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final int def = defVal; return new DocTermsIndexDocValues(this, readerContext, field) { private final BytesRef spare = new BytesRef(); @Override protected String toTerm(String readableValue) { return NumberUtils.int2sortableStr(readableValue); } @Override public float floatVal(int doc) { return (float)intVal(doc); } @Override public int intVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? def : NumberUtils.SortableStr2int(termsIndex.lookup(ord, spare),0,3); } @Override public long longVal(int doc) { return (long)intVal(doc); } @Override public double doubleVal(int doc) { return (double)intVal(doc); } @Override public String strVal(int doc) { return Integer.toString(intVal(doc)); } @Override public String toString(int doc) { return description() + '=' + intVal(doc); } @Override public Object objectVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? null : NumberUtils.SortableStr2int(termsIndex.lookup(ord, spare)); } @Override public ValueFiller getValueFiller() { return new ValueFiller() { private final MutableValueInt mval = new MutableValueInt(); @Override public MutableValue getValue() { return mval; } @Override public void fillValue(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { mval.value = def; mval.exists = false; } else { mval.value = NumberUtils.SortableStr2int(termsIndex.lookup(ord, spare),0,3); mval.exists = true; } } }; } }; }
// in core/src/java/org/apache/solr/schema/StrField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), true); }
// in core/src/java/org/apache/solr/schema/DoubleField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String s = f.stringValue(); // these values may be from a legacy lucene index, which may // not be properly formatted in some output formats, or may // incorrectly have a zero length. if (s.length()==0) { // zero length value means someone mistakenly indexed the value // instead of simply leaving it out. Write a null value instead of a numeric. writer.writeNull(name); return; } try { double val = Double.parseDouble(s); writer.writeDouble(name, val); } catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); } }
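This parse-or-degrade idiom recurs verbatim in IntField, FloatField, ByteField, ShortField and LongField below: a NumberFormatException from a legacy index is never surfaced to the client; the raw string is written instead so nothing is lost. A standalone sketch of the pattern (the string rendering is illustrative only, not TextResponseWriter):

    // Hedged sketch of the lenient numeric-write pattern shared by the *Field
    // classes in this section.
    public class LenientNumberWriter {
      static String render(String name, String s) {
        if (s.isEmpty()) {
          return name + ": null";          // mistakenly indexed empty value
        }
        try {
          return name + ": " + Double.parseDouble(s);  // normal path
        } catch (NumberFormatException e) {
          // degrade to a string: nothing is lost, the client gets no parse error
          return name + ": \"" + s + "\"";
        }
      }

      public static void main(String[] args) {
        System.out.println(render("price", "12.5"));  // price: 12.5
        System.out.println(render("price", "oops"));  // price: "oops"
        System.out.println(render("price", ""));      // price: null
      }
    }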
// in core/src/java/org/apache/solr/schema/RandomSortField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { }
// in core/src/java/org/apache/solr/schema/RandomSortField.java
Override public FieldComparator<Integer> newComparator(final String fieldname, final int numHits, int sortPos, boolean reversed) throws IOException { return new FieldComparator<Integer>() { int seed; private final int[] values = new int[numHits]; int bottomVal; @Override public int compare(int slot1, int slot2) { return values[slot1] - values[slot2]; // values will be positive... no overflow possible. } @Override public void setBottom(int slot) { bottomVal = values[slot]; } @Override public int compareBottom(int doc) throws IOException { return bottomVal - hash(doc+seed); } @Override public void copy(int slot, int doc) throws IOException { values[slot] = hash(doc+seed); } @Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { seed = getSeed(fieldname, context); return this; } @Override public Integer value(int slot) { return values[slot]; } @Override public int compareDocToValue(int doc, Integer valueObj) { // values will be positive... no overflow possible. return hash(doc+seed) - valueObj.intValue(); } }; }
// in core/src/java/org/apache/solr/schema/RandomSortField.java
Override public int compareBottom(int doc) throws IOException { return bottomVal - hash(doc+seed); }
// in core/src/java/org/apache/solr/schema/RandomSortField.java
Override public void copy(int slot, int doc) throws IOException { values[slot] = hash(doc+seed); }
// in core/src/java/org/apache/solr/schema/RandomSortField.java
Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { seed = getSeed(fieldname, context); return this; }
// in core/src/java/org/apache/solr/schema/RandomSortField.java
Override public FunctionValues getValues(Map context, final AtomicReaderContext readerContext) throws IOException { return new IntDocValues(this) { private final int seed = getSeed(field, readerContext); @Override public int intVal(int doc) { return hash(doc+seed); } }; }
// in core/src/java/org/apache/solr/schema/DateField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeDate(name, toExternal(f)); }
// in core/src/java/org/apache/solr/schema/DateField.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { return new DocTermsIndexDocValues(this, readerContext, field) { @Override protected String toTerm(String readableValue) { // needed for frange queries to work properly return ft.toInternal(readableValue); } @Override public float floatVal(int doc) { return (float)intVal(doc); } @Override public int intVal(int doc) { int ord=termsIndex.getOrd(doc); return ord; } @Override public long longVal(int doc) { return (long)intVal(doc); } @Override public double doubleVal(int doc) { return (double)intVal(doc); } @Override public String strVal(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { return null; } else { final BytesRef br = termsIndex.lookup(ord, spare); return ft.indexedToReadable(br, spareChars).toString(); } } @Override public Object objectVal(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { return null; } else { final BytesRef br = termsIndex.lookup(ord, new BytesRef()); return ft.toObject(null, br); } } @Override public String toString(int doc) { return description() + '=' + intVal(doc); } }; }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
public void write(XMLWriter xmlWriter, String name, IndexableField field) throws IOException { xmlWriter.writeStr(name, field.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
Override public void write(TextResponseWriter writer, String name, IndexableField field) throws IOException { writer.writeStr(name, field.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
public FunctionValues getValues(Map context, AtomicReaderContext reader) throws IOException { final FunctionValues amounts = amountValues.getValues(context, reader); final FunctionValues currencies = currencyValues.getValues(context, reader); return new FunctionValues() { private final int MAX_CURRENCIES_TO_CACHE = 256; private final int[] fractionDigitCache = new int[MAX_CURRENCIES_TO_CACHE]; private final String[] currencyOrdToCurrencyCache = new String[MAX_CURRENCIES_TO_CACHE]; private final double[] exchangeRateCache = new double[MAX_CURRENCIES_TO_CACHE]; private int targetFractionDigits = -1; private int targetCurrencyOrd = -1; private boolean initializedCache; private String getDocCurrencyCode(int doc, int currencyOrd) { if (currencyOrd < MAX_CURRENCIES_TO_CACHE) { String currency = currencyOrdToCurrencyCache[currencyOrd]; if (currency == null) { currencyOrdToCurrencyCache[currencyOrd] = currency = currencies.strVal(doc); } if (currency == null) { currency = defaultCurrency; } if (targetCurrencyOrd == -1 && currency.equals(targetCurrencyCode)) { targetCurrencyOrd = currencyOrd; } return currency; } else { return currencies.strVal(doc); } } public long longVal(int doc) { if (!initializedCache) { for (int i = 0; i < fractionDigitCache.length; i++) { fractionDigitCache[i] = -1; } initializedCache = true; } long amount = amounts.longVal(doc); int currencyOrd = currencies.ordVal(doc); if (currencyOrd == targetCurrencyOrd) { return amount; } double exchangeRate; int sourceFractionDigits; if (targetFractionDigits == -1) { targetFractionDigits = Currency.getInstance(targetCurrencyCode).getDefaultFractionDigits(); } if (currencyOrd < MAX_CURRENCIES_TO_CACHE) { exchangeRate = exchangeRateCache[currencyOrd]; if (exchangeRate <= 0.0) { String sourceCurrencyCode = getDocCurrencyCode(doc, currencyOrd); exchangeRate = exchangeRateCache[currencyOrd] = provider.getExchangeRate(sourceCurrencyCode, targetCurrencyCode); } sourceFractionDigits = fractionDigitCache[currencyOrd]; if (sourceFractionDigits == -1) { String sourceCurrencyCode = getDocCurrencyCode(doc, currencyOrd); sourceFractionDigits = fractionDigitCache[currencyOrd] = Currency.getInstance(sourceCurrencyCode).getDefaultFractionDigits(); } } else { String sourceCurrencyCode = getDocCurrencyCode(doc, currencyOrd); exchangeRate = provider.getExchangeRate(sourceCurrencyCode, targetCurrencyCode); sourceFractionDigits = Currency.getInstance(sourceCurrencyCode).getDefaultFractionDigits(); } return CurrencyValue.convertAmount(exchangeRate, sourceFractionDigits, amount, targetFractionDigits); } public int intVal(int doc) { return (int) longVal(doc); } public double doubleVal(int doc) { return (double) longVal(doc); } public float floatVal(int doc) { return (float) longVal(doc); } public String strVal(int doc) { return Long.toString(longVal(doc)); } public String toString(int doc) { return name() + '(' + amounts.toString(doc) + ',' + currencies.toString(doc) + ')'; } }; }
// in core/src/java/org/apache/solr/schema/SortableFloatField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String sval = f.stringValue(); writer.writeFloat(name, NumberUtils.SortableStr2float(sval)); }
// in core/src/java/org/apache/solr/schema/SortableFloatField.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final float def = defVal; return new DocTermsIndexDocValues(this, readerContext, field) { private final BytesRef spare = new BytesRef(); @Override protected String toTerm(String readableValue) { return NumberUtils.float2sortableStr(readableValue); } @Override public float floatVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? def : NumberUtils.SortableStr2float(termsIndex.lookup(ord, spare)); } @Override public int intVal(int doc) { return (int)floatVal(doc); } @Override public long longVal(int doc) { return (long)floatVal(doc); } @Override public double doubleVal(int doc) { return (double)floatVal(doc); } @Override public String strVal(int doc) { return Float.toString(floatVal(doc)); } @Override public String toString(int doc) { return description() + '=' + floatVal(doc); } @Override public Object objectVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? null : NumberUtils.SortableStr2float(termsIndex.lookup(ord, spare)); } @Override public ValueFiller getValueFiller() { return new ValueFiller() { private final MutableValueFloat mval = new MutableValueFloat(); @Override public MutableValue getValue() { return mval; } @Override public void fillValue(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { mval.value = def; mval.exists = false; } else { mval.value = NumberUtils.SortableStr2float(termsIndex.lookup(ord, spare)); mval.exists = true; } } }; } }; }
// in core/src/java/org/apache/solr/schema/IntField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String s = f.stringValue(); // these values may be from a legacy lucene index, which may // not be properly formatted in some output formats, or may // incorrectly have a zero length. if (s.length()==0) { // zero length value means someone mistakenly indexed the value // instead of simply leaving it out. Write a null value instead of a numeric. writer.writeNull(name); return; } try { int val = Integer.parseInt(s); writer.writeInt(name, val); } catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); } }
// in core/src/java/org/apache/solr/schema/FloatField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String s = f.stringValue(); // these values may be from a legacy lucene index, which may // not be properly formatted in some output formats, or may // incorrectly have a zero length. if (s.length()==0) { // zero length value means someone mistakenly indexed the value // instead of simply leaving it out. Write a null value instead of a numeric. writer.writeNull(name); return; } try { float fval = Float.parseFloat(s); writer.writeFloat(name, fval); } catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeVal(name, toObject(f)); }
// in core/src/java/org/apache/solr/schema/BoolField.java
Override public TokenStreamComponents createComponents(String fieldName, Reader reader) { Tokenizer tokenizer = new Tokenizer(reader) { final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class); boolean done = false; @Override public void reset(Reader input) throws IOException { done = false; super.reset(input); } @Override public boolean incrementToken() throws IOException { clearAttributes(); if (done) return false; done = true; int ch = input.read(); if (ch==-1) return false; termAtt.copyBuffer( ((ch=='t' || ch=='T' || ch=='1') ? TRUE_TOKEN : FALSE_TOKEN) ,0,1); return true; } }; return new TokenStreamComponents(tokenizer); }
// in core/src/java/org/apache/solr/schema/BoolField.java
Override public void reset(Reader input) throws IOException { done = false; super.reset(input); }
// in core/src/java/org/apache/solr/schema/BoolField.java
Override public boolean incrementToken() throws IOException { clearAttributes(); if (done) return false; done = true; int ch = input.read(); if (ch==-1) return false; termAtt.copyBuffer( ((ch=='t' || ch=='T' || ch=='1') ? TRUE_TOKEN : FALSE_TOKEN) ,0,1); return true; }
// in core/src/java/org/apache/solr/schema/BoolField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeBool(name, f.stringValue().charAt(0) == 'T'); }
// in core/src/java/org/apache/solr/schema/BoolField.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FieldCache.DocTermsIndex sindex = FieldCache.DEFAULT.getTermsIndex(readerContext.reader(), field); // figure out what ord maps to true int nord = sindex.numOrd(); BytesRef br = new BytesRef(); int tord = -1; for (int i=1; i<nord; i++) { sindex.lookup(i, br); if (br.length==1 && br.bytes[br.offset]=='T') { tord = i; break; } } final int trueOrd = tord; return new BoolDocValues(this) { @Override public boolean boolVal(int doc) { return sindex.getOrd(doc) == trueOrd; } @Override public boolean exists(int doc) { return sindex.getOrd(doc) != 0; } @Override public ValueFiller getValueFiller() { return new ValueFiller() { private final MutableValueBool mval = new MutableValueBool(); @Override public MutableValue getValue() { return mval; } @Override public void fillValue(int doc) { int ord = sindex.getOrd(doc); mval.value = (ord == trueOrd); mval.exists = (ord != 0); } }; } }; }
// in core/src/java/org/apache/solr/schema/BinaryField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, toBase64String(toObject(f)), false); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public Query rewrite(IndexReader reader) throws IOException { return bboxQuery != null ? bboxQuery.rewrite(reader) : this; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public float getValueForNormalization() throws IOException { queryWeight = getBoost(); return queryWeight * queryWeight; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public Scorer scorer(AtomicReaderContext context, boolean scoreDocsInOrder, boolean topScorer, Bits acceptDocs) throws IOException { return new SpatialScorer(context, acceptDocs, this, queryWeight); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public Explanation explain(AtomicReaderContext context, int doc) throws IOException { return ((SpatialScorer)scorer(context, true, true, context.reader().getLiveDocs())).explain(doc); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public int nextDoc() throws IOException { for(;;) { ++doc; if (doc>=maxDoc) { return doc=NO_MORE_DOCS; } if (acceptDocs != null && !acceptDocs.get(doc)) continue; if (!match()) continue; return doc; } }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public int advance(int target) throws IOException { // this will work even if target==NO_MORE_DOCS doc=target-1; return nextDoc(); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public float score() throws IOException { double dist = (doc == lastDistDoc) ? lastDist : dist(latVals.doubleVal(doc), lonVals.doubleVal(doc)); return (float)(dist * qWeight); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
public Explanation explain(int doc) throws IOException { advance(doc); boolean matched = this.doc == doc; this.doc = doc; float sc = matched ? score() : 0; double dist = dist(latVals.doubleVal(doc), lonVals.doubleVal(doc)); String description = SpatialDistanceQuery.this.toString(); Explanation result = new ComplexExplanation (this.doc == doc, sc, description + " product of:"); // result.addDetail(new Explanation((float)dist, "hsin("+latVals.explain(doc)+","+lonVals.explain(doc))); result.addDetail(new Explanation((float)dist, "hsin("+latVals.doubleVal(doc)+","+lonVals.doubleVal(doc))); result.addDetail(new Explanation(getBoost(), "boost")); result.addDetail(new Explanation(weight.queryNorm,"queryNorm")); return result; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public void collect(int doc) throws IOException { spatialScorer.doc = doc; if (spatialScorer.match()) delegate.collect(doc); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public void setNextReader(AtomicReaderContext context) throws IOException { maxdoc = context.reader().maxDoc(); spatialScorer = new SpatialScorer(context, null, weight, 1.0f); super.setNextReader(context); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public Weight createWeight(IndexSearcher searcher) throws IOException { // if we were supposed to use bboxQuery, then we should have been rewritten using that query assert bboxQuery == null; return new SpatialWeight(searcher); }
// in core/src/java/org/apache/solr/schema/ByteField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String s = f.stringValue(); // these values may be from a legacy lucene index, which may // not be properly formatted in some output formats, or may // incorrectly have a zero length. if (s.length()==0) { // zero length value means someone mistakenly indexed the value // instead of simply leaving it out. Write a null value instead of a numeric. writer.writeNull(name); return; } try { byte val = Byte.parseByte(s); writer.writeInt(name, val); } catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); } }
// in core/src/java/org/apache/solr/schema/PointType.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/UUIDField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), false); }
// in core/src/java/org/apache/solr/schema/SimplePreAnalyzedParser.java
Override public ParseResult parse(Reader reader, AttributeSource parent) throws IOException { ParseResult res = new ParseResult(); StringBuilder sb = new StringBuilder(); char[] buf = new char[128]; int cnt; while ((cnt = reader.read(buf)) > 0) { sb.append(buf, 0, cnt); } String val = sb.toString(); // empty string - accept even without version number if (val.length() == 0) { return res; } // first consume the version int idx = val.indexOf(' '); if (idx == -1) { throw new IOException("Missing VERSION token"); } String version = val.substring(0, idx); if (!VERSION.equals(version)) { throw new IOException("Unknown VERSION " + version); } val = val.substring(idx + 1); // then consume the optional stored part int tsStart = 0; boolean hasStored = false; StringBuilder storedBuf = new StringBuilder(); if (val.charAt(0) == '=') { hasStored = true; if (val.length() > 1) { for (int i = 1; i < val.length(); i++) { char c = val.charAt(i); if (c == '\\') { if (i < val.length() - 1) { c = val.charAt(++i); if (c == '=') { // we recognize only \= escape in the stored part storedBuf.append('='); } else { storedBuf.append('\\'); storedBuf.append(c); continue; } } else { storedBuf.append(c); continue; } } else if (c == '=') { // end of stored text tsStart = i + 1; break; } else { storedBuf.append(c); } } if (tsStart == 0) { // missing end-of-stored marker throw new IOException("Missing end marker of stored part"); } } else { throw new IOException("Unexpected end of stored field"); } } if (hasStored) { res.str = storedBuf.toString(); } Tok tok = new Tok(); StringBuilder attName = new StringBuilder(); StringBuilder attVal = new StringBuilder(); // parser state S s = S.UNDEF; int lastPos = 0; for (int i = tsStart; i < val.length(); i++) { char c = val.charAt(i); if (c == ' ') { // collect leftovers switch (s) { case VALUE : if (attVal.length() == 0) { throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute."); } if (attName.length() > 0) { tok.attr.put(attName.toString(), attVal.toString()); } break; case NAME: // attr name without a value ? 
if (attName.length() > 0) { throw new IOException("Unexpected character '" + c + "' at position " + i + " - missing attribute value."); } else { // accept missing att name and value } break; case TOKEN: case UNDEF: // do nothing, advance to next token } attName.setLength(0); attVal.setLength(0); if (!tok.isEmpty() || s == S.NAME) { AttributeSource.State state = createState(parent, tok, lastPos); if (state != null) res.states.add(state.clone()); } // reset tok s = S.UNDEF; tok.reset(); // skip lastPos++; continue; } StringBuilder tgt = null; switch (s) { case TOKEN: tgt = tok.token; break; case NAME: tgt = attName; break; case VALUE: tgt = attVal; break; case UNDEF: tgt = tok.token; s = S.TOKEN; } if (c == '\\') { if (s == S.TOKEN) lastPos++; if (i >= val.length() - 1) { // end tgt.append(c); continue; } else { c = val.charAt(++i); switch (c) { case '\\' : case '=' : case ',' : case ' ' : tgt.append(c); break; case 'n': tgt.append('\n'); break; case 'r': tgt.append('\r'); break; case 't': tgt.append('\t'); break; default: tgt.append('\\'); tgt.append(c); lastPos++; } } } else { // state switch if (c == ',') { if (s == S.TOKEN) { s = S.NAME; } else if (s == S.VALUE) { // end of value, start of next attr if (attVal.length() == 0) { throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute."); } if (attName.length() > 0 && attVal.length() > 0) { tok.attr.put(attName.toString(), attVal.toString()); } // reset attName.setLength(0); attVal.setLength(0); s = S.NAME; } else { throw new IOException("Unexpected character '" + c + "' at position " + i + " - missing attribute value."); } } else if (c == '=') { if (s == S.NAME) { s = S.VALUE; } else { throw new IOException("Unexpected character '" + c + "' at position " + i + " - empty value of attribute."); } } else { tgt.append(c); if (s == S.TOKEN) lastPos++; } } } // collect leftovers if (!tok.isEmpty() || s == S.NAME || s == S.VALUE) { // remaining attrib? if (s == S.VALUE) { if (attName.length() > 0 && attVal.length() > 0) { tok.attr.put(attName.toString(), attVal.toString()); } } AttributeSource.State state = createState(parent, tok, lastPos); if (state != null) res.states.add(state.clone()); } return res; }
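SimplePreAnalyzedParser.parse above is strictly fail-fast: every syntax problem, from a missing VERSION token to an empty attribute value, is reported as a checked IOException whose message carries the offending position, so a malformed pre-analyzed value aborts immediately instead of producing a half-built token stream (the exception later propagates out of PreAnalyzedField's reset, shown further down). A distilled sketch of the style:

    import java.io.IOException;

    // Hedged sketch of the fail-fast parse style: a checked IOException with a
    // descriptive message, thrown at the first sign of malformed input.
    public class TinyParser {
      static String readVersion(String val) throws IOException {
        int idx = val.indexOf(' ');
        if (idx == -1) {
          throw new IOException("Missing VERSION token");
        }
        return val.substring(0, idx);
      }

      public static void main(String[] args) throws IOException {
        System.out.println(readVersion("1 =stored="));  // prints 1
        readVersion("1");                               // throws IOException
      }
    }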
// in core/src/java/org/apache/solr/schema/SimplePreAnalyzedParser.java
public String toFormattedString(Field f) throws IOException { StringBuilder sb = new StringBuilder(); sb.append(VERSION + " "); if (f.fieldType().stored()) { String s = f.stringValue(); if (s != null) { // encode the equals sign s = s.replaceAll("=", "\\="); sb.append('='); sb.append(s); sb.append('='); } } TokenStream ts = f.tokenStreamValue(); if (ts != null) { StringBuilder tok = new StringBuilder(); boolean next = false; while (ts.incrementToken()) { if (next) { sb.append(' '); } else { next = true; } tok.setLength(0); Iterator<Class<? extends Attribute>> it = ts.getAttributeClassesIterator(); String cTerm = null; String tTerm = null; while (it.hasNext()) { Class<? extends Attribute> cl = it.next(); if (!ts.hasAttribute(cl)) { continue; } Attribute att = ts.getAttribute(cl); if (cl.isAssignableFrom(CharTermAttribute.class)) { CharTermAttribute catt = (CharTermAttribute)att; cTerm = escape(catt.buffer(), catt.length()); } else if (cl.isAssignableFrom(TermToBytesRefAttribute.class)) { TermToBytesRefAttribute tatt = (TermToBytesRefAttribute)att; char[] tTermChars = tatt.getBytesRef().utf8ToString().toCharArray(); tTerm = escape(tTermChars, tTermChars.length); } else { if (tok.length() > 0) tok.append(','); if (cl.isAssignableFrom(FlagsAttribute.class)) { tok.append("f=" + Integer.toHexString(((FlagsAttribute)att).getFlags())); } else if (cl.isAssignableFrom(OffsetAttribute.class)) { tok.append("s=" + ((OffsetAttribute)att).startOffset() + ",e=" + ((OffsetAttribute)att).endOffset()); } else if (cl.isAssignableFrom(PayloadAttribute.class)) { Payload p = ((PayloadAttribute)att).getPayload(); if (p != null && p.length() > 0) { tok.append("p=" + bytesToHex(p.getData(), p.getOffset(), p.length())); } else if (tok.length() > 0) { tok.setLength(tok.length() - 1); // remove the last comma } } else if (cl.isAssignableFrom(PositionIncrementAttribute.class)) { tok.append("i=" + ((PositionIncrementAttribute)att).getPositionIncrement()); } else if (cl.isAssignableFrom(TypeAttribute.class)) { tok.append("y=" + escape(((TypeAttribute)att).type())); } else { tok.append(cl.getName() + "=" + escape(att.toString())); } } } String term = null; if (cTerm != null) { term = cTerm; } else { term = tTerm; } if (term != null && term.length() > 0) { if (tok.length() > 0) { tok.insert(0, term + ","); } else { tok.insert(0, term); } } sb.append(tok); } } return sb.toString(); }
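One subtlety in toFormattedString above: replaceAll treats its second argument as a regex replacement string, in which a backslash merely escapes the following character, so "\\=" (the two characters \=) collapses to a bare = and the stored value's equals signs end up unescaped. A literal String.replace produces the intended backslash; a minimal demonstration:

    public class EscapeDemo {
      public static void main(String[] args) {
        String s = "a=b";
        // regex replacement: "\=" collapses to "=", so nothing is escaped
        System.out.println(s.replaceAll("=", "\\="));  // a=b
        // literal replacement: produces the intended escaped form
        System.out.println(s.replace("=", "\\="));     // a\=b
      }
    }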
// in core/src/java/org/apache/solr/schema/LongField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String s = f.stringValue(); // these values may be from a legacy lucene index, which may // not be properly formatted in some output formats, or may // incorrectly have a zero length. if (s.length()==0) { // zero length value means someone mistakenly indexed the value // instead of simply leaving it out. Write a null value instead of a numeric. writer.writeNull(name); return; } try { long val = Long.parseLong(s); writer.writeLong(name, val); } catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); } }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, toExternal(f), false); }
// in core/src/java/org/apache/solr/schema/ExternalFileField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/schema/ShortField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String s = f.stringValue(); // these values may be from a legacy lucene index, which may // not be properly formatted in some output formats, or may // incorrectly have a zero length. if (s.length()==0) { // zero length value means someone mistakenly indexed the value // instead of simply leaving it out. Write a null value instead of a numeric. writer.writeNull(name); return; } try { short val = Short.parseShort(s); writer.writeInt(name, val); } catch (NumberFormatException e){ // can't parse - write out the contents as a string so nothing is lost and // clients don't get a parse error. writer.writeStr(name, s, true); } }
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void write(TextResponseWriter writer, String name, IndexableField val) throws IOException { // name is passed in because it may be null if name should not be used. type.write(writer,name,val); }
// in core/src/java/org/apache/solr/schema/StrFieldSource.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { return new DocTermsIndexDocValues(this, readerContext, field) { @Override protected String toTerm(String readableValue) { return readableValue; } @Override public int ordVal(int doc) { return termsIndex.getOrd(doc); } @Override public int numOrd() { return termsIndex.numOrd(); } @Override public Object objectVal(int doc) { return strVal(doc); } @Override public String toString(int doc) { return description() + '=' + strVal(doc); } }; }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), true); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
public String toFormattedString(Field f) throws IOException { return parser.toFormattedString(f); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
public final boolean incrementToken() throws IOException { // lazy init the iterator if (it == null) { it = cachedStates.iterator(); } if (!it.hasNext()) { return false; } AttributeSource.State state = (State) it.next(); restoreState(state.clone()); return true; }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
Override public void reset(Reader input) throws IOException { super.reset(input); cachedStates.clear(); stringValue = null; binaryValue = null; ParseResult res = parser.parse(input, this); if (res != null) { stringValue = res.str; binaryValue = res.bin; if (res.states != null) { cachedStates.addAll(res.states); } } }
// in core/src/java/org/apache/solr/schema/TextField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), true); }
// in core/src/java/org/apache/solr/schema/TrieDateField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { wrappedField.write(writer, name, f); }
// in core/src/java/org/apache/solr/schema/SortableLongField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String sval = f.stringValue(); writer.writeLong(name, NumberUtils.SortableStr2long(sval,0,sval.length())); }
// in core/src/java/org/apache/solr/schema/SortableLongField.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final long def = defVal; return new DocTermsIndexDocValues(this, readerContext, field) { private final BytesRef spare = new BytesRef(); @Override protected String toTerm(String readableValue) { return NumberUtils.long2sortableStr(readableValue); } @Override public float floatVal(int doc) { return (float)longVal(doc); } @Override public int intVal(int doc) { return (int)longVal(doc); } @Override public long longVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? def : NumberUtils.SortableStr2long(termsIndex.lookup(ord, spare),0,5); } @Override public double doubleVal(int doc) { return (double)longVal(doc); } @Override public String strVal(int doc) { return Long.toString(longVal(doc)); } @Override public Object objectVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? null : NumberUtils.SortableStr2long(termsIndex.lookup(ord, spare)); } @Override public String toString(int doc) { return description() + '=' + longVal(doc); } @Override public ValueFiller getValueFiller() { return new ValueFiller() { private final MutableValueLong mval = new MutableValueLong(); @Override public MutableValue getValue() { return mval; } @Override public void fillValue(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { mval.value = def; mval.exists = false; } else { mval.value = NumberUtils.SortableStr2long(termsIndex.lookup(ord, spare),0,5); mval.exists = true; } } }; } }; }
// in core/src/java/org/apache/solr/schema/SortableDoubleField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { String sval = f.stringValue(); writer.writeDouble(name, NumberUtils.SortableStr2double(sval)); }
// in core/src/java/org/apache/solr/schema/SortableDoubleField.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final double def = defVal; return new DocTermsIndexDocValues(this, readerContext, field) { private final BytesRef spare = new BytesRef(); @Override protected String toTerm(String readableValue) { return NumberUtils.double2sortableStr(readableValue); } @Override public float floatVal(int doc) { return (float)doubleVal(doc); } @Override public int intVal(int doc) { return (int)doubleVal(doc); } @Override public long longVal(int doc) { return (long)doubleVal(doc); } @Override public double doubleVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? def : NumberUtils.SortableStr2double(termsIndex.lookup(ord, spare)); } @Override public String strVal(int doc) { return Double.toString(doubleVal(doc)); } @Override public Object objectVal(int doc) { int ord=termsIndex.getOrd(doc); return ord==0 ? null : NumberUtils.SortableStr2double(termsIndex.lookup(ord, spare)); } @Override public String toString(int doc) { return description() + '=' + doubleVal(doc); } @Override public ValueFiller getValueFiller() { return new ValueFiller() { private final MutableValueDouble mval = new MutableValueDouble(); @Override public MutableValue getValue() { return mval; } @Override public void fillValue(int doc) { int ord=termsIndex.getOrd(doc); if (ord == 0) { mval.value = def; mval.exists = false; } else { mval.value = NumberUtils.SortableStr2double(termsIndex.lookup(ord, spare)); mval.exists = true; } } }; } }; }
// in core/src/java/org/apache/solr/schema/CollationField.java
Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { writer.writeStr(name, f.stringValue(), true); }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
Override public String toFormattedString(Field f) throws IOException { Map<String,Object> map = new HashMap<String,Object>(); map.put(VERSION_KEY, VERSION); if (f.fieldType().stored()) { String stringValue = f.stringValue(); if (stringValue != null) { map.put(STRING_KEY, stringValue); } BytesRef binaryValue = f.binaryValue(); if (binaryValue != null) { map.put(BINARY_KEY, Base64.byteArrayToBase64(binaryValue.bytes, binaryValue.offset, binaryValue.length)); } } TokenStream ts = f.tokenStreamValue(); if (ts != null) { List<Map<String,Object>> tokens = new LinkedList<Map<String,Object>>(); while (ts.incrementToken()) { Iterator<Class<? extends Attribute>> it = ts.getAttributeClassesIterator(); String cTerm = null; String tTerm = null; Map<String,Object> tok = new TreeMap<String,Object>(); while (it.hasNext()) { Class<? extends Attribute> cl = it.next(); if (!ts.hasAttribute(cl)) { continue; } Attribute att = ts.getAttribute(cl); if (cl.isAssignableFrom(CharTermAttribute.class)) { CharTermAttribute catt = (CharTermAttribute)att; cTerm = new String(catt.buffer(), 0, catt.length()); } else if (cl.isAssignableFrom(TermToBytesRefAttribute.class)) { TermToBytesRefAttribute tatt = (TermToBytesRefAttribute)att; tTerm = tatt.getBytesRef().utf8ToString(); } else { if (cl.isAssignableFrom(FlagsAttribute.class)) { tok.put(FLAGS_KEY, Integer.toHexString(((FlagsAttribute)att).getFlags())); } else if (cl.isAssignableFrom(OffsetAttribute.class)) { tok.put(OFFSET_START_KEY, ((OffsetAttribute)att).startOffset()); tok.put(OFFSET_END_KEY, ((OffsetAttribute)att).endOffset()); } else if (cl.isAssignableFrom(PayloadAttribute.class)) { Payload p = ((PayloadAttribute)att).getPayload(); if (p != null && p.length() > 0) { tok.put(PAYLOAD_KEY, Base64.byteArrayToBase64(p.getData(), p.getOffset(), p.length())); } } else if (cl.isAssignableFrom(PositionIncrementAttribute.class)) { tok.put(POSINCR_KEY, ((PositionIncrementAttribute)att).getPositionIncrement()); } else if (cl.isAssignableFrom(TypeAttribute.class)) { tok.put(TYPE_KEY, ((TypeAttribute)att).type()); } else { tok.put(cl.getName(), att.toString()); } } } String term = null; if (cTerm != null) { term = cTerm; } else { term = tTerm; } if (term != null && term.length() > 0) { tok.put(TOKEN_KEY, term); } tokens.add(tok); } map.put(TOKENS_KEY, tokens); } return JSONUtil.toJSON(map, -1); }
// in core/src/java/org/apache/solr/schema/FieldType.java
Override public TokenStreamComponents createComponents(String fieldName, Reader reader) { Tokenizer ts = new Tokenizer(reader) { final char[] cbuf = new char[maxChars]; final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class); final OffsetAttribute offsetAtt = addAttribute(OffsetAttribute.class); @Override public boolean incrementToken() throws IOException { clearAttributes(); int n = input.read(cbuf,0,maxChars); if (n<=0) return false; String s = toInternal(new String(cbuf,0,n)); termAtt.setEmpty().append(s); offsetAtt.setOffset(correctOffset(0),correctOffset(n)); return true; } }; return new TokenStreamComponents(ts); }
// in core/src/java/org/apache/solr/schema/FieldType.java
Override public boolean incrementToken() throws IOException { clearAttributes(); int n = input.read(cbuf,0,maxChars); if (n<=0) return false; String s = toInternal(new String(cbuf,0,n)); termAtt.setEmpty().append(s); offsetAtt.setOffset(correctOffset(0),correctOffset(n)); return true; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String[][] getAllValues() throws IOException { ArrayList records = new ArrayList(); String[] values; String[][] ret = null; while ((values = getLine()) != null) { records.add(values); } if (records.size() > 0) { ret = new String[records.size()][]; records.toArray(ret); } return ret; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String nextValue() throws IOException { Token tkn = nextToken(); String ret = null; switch (tkn.type) { case TT_TOKEN: case TT_EORECORD: ret = tkn.content.toString(); break; case TT_EOF: ret = null; break; case TT_INVALID: default: // no token available, or the parse failed throw new IOException( "(line " + getLineNumber() + ") invalid parse sequence"); // unreachable: break; } return ret; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
public String[] getLine() throws IOException { String[] ret = EMPTY_STRING_ARRAY; record.clear(); while (true) { reusableToken.reset(); nextToken(reusableToken); switch (reusableToken.type) { case TT_TOKEN: record.add(reusableToken.content.toString()); break; case TT_EORECORD: record.add(reusableToken.content.toString()); break; case TT_EOF: if (reusableToken.isReady) { record.add(reusableToken.content.toString()); } else { ret = null; } break; case TT_INVALID: default: // error: throw IOException throw new IOException("(line " + getLineNumber() + ") invalid parse sequence"); // unreachable: break; } if (reusableToken.type != TT_TOKEN) { break; } } if (!record.isEmpty()) { ret = (String[]) record.toArray(new String[record.size()]); } return ret; }
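The CSVParser entry points above keep two outcomes distinct: a clean end of input is reported in-band (null from getLine and nextValue, or a null array from getAllValues), while malformed input raises an IOException whose message carries the line number. A hedged usage sketch of that contract:

    import java.io.IOException;
    import java.io.StringReader;
    import org.apache.solr.internal.csv.CSVParser;

    public class CsvRead {
      static int countRecords(String csv) throws IOException {
        CSVParser parser = new CSVParser(new StringReader(csv));
        int n = 0;
        while (parser.getLine() != null) {  // null means clean EOF;
          n++;                              // malformed input throws instead,
        }                                   // with "(line N) ..." in the message
        return n;
      }

      public static void main(String[] args) throws IOException {
        System.out.println(countRecords("a,b\nc,d\n"));  // 2
      }
    }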
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected Token nextToken() throws IOException { return nextToken(new Token()); }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected Token nextToken(Token tkn) throws IOException { wsBuf.clear(); // reuse // get the last read char (required for empty line detection) int lastChar = in.readAgain(); // read the next char and set eol /* note: unfortunately isEndOfLine may consume a character silently. * this has no effect outside of the method. so a simple workaround * is to call 'readAgain' on the stream... * uh: might use objects instead of base-types (jdk1.5 autoboxing!) */ int c = in.read(); boolean eol = isEndOfLine(c); c = in.readAgain(); // empty line detection: eol AND (last char was EOL or beginning) while (strategy.getIgnoreEmptyLines() && eol && (lastChar == '\n' || lastChar == ExtendedBufferedReader.UNDEFINED) && !isEndOfFile(lastChar)) { // go one char ahead ... lastChar = c; c = in.read(); eol = isEndOfLine(c); c = in.readAgain(); // reached end of file without any content (empty line at the end) if (isEndOfFile(c)) { tkn.type = TT_EOF; return tkn; } } // did we reach eof during the last iteration already? TT_EOF if (isEndOfFile(lastChar) || (lastChar != strategy.getDelimiter() && isEndOfFile(c))) { tkn.type = TT_EOF; return tkn; } // important: make sure a new char gets consumed in each iteration while (!tkn.isReady && tkn.type != TT_EOF) { // ignore whitespaces at beginning of a token while (strategy.getIgnoreLeadingWhitespaces() && isWhitespace(c) && !eol) { wsBuf.append((char) c); c = in.read(); eol = isEndOfLine(c); } // ok, start of token reached: comment, encapsulated, or token if (c == strategy.getCommentStart()) { // ignore everything till end of line and continue (incr linecount) in.readLine(); tkn = nextToken(tkn.reset()); } else if (c == strategy.getDelimiter()) { // empty token return TT_TOKEN("") tkn.type = TT_TOKEN; tkn.isReady = true; } else if (eol) { // empty token return TT_EORECORD("") //noop: tkn.content.append(""); tkn.type = TT_EORECORD; tkn.isReady = true; } else if (c == strategy.getEncapsulator()) { // consume encapsulated token encapsulatedTokenLexer(tkn, c); } else if (isEndOfFile(c)) { // end of file return TT_EOF() //noop: tkn.content.append(""); tkn.type = TT_EOF; tkn.isReady = true; } else { // next token must be a simple token // add removed blanks when not ignoring whitespace chars... if (!strategy.getIgnoreLeadingWhitespaces()) { tkn.content.append(wsBuf); } simpleTokenLexer(tkn, c); } } return tkn; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private Token simpleTokenLexer(Token tkn, int c) throws IOException { for (;;) { if (isEndOfLine(c)) { // end of record tkn.type = TT_EORECORD; tkn.isReady = true; break; } else if (isEndOfFile(c)) { // end of file tkn.type = TT_EOF; tkn.isReady = true; break; } else if (c == strategy.getDelimiter()) { // end of token tkn.type = TT_TOKEN; tkn.isReady = true; break; } else if (c == '\\' && strategy.getUnicodeEscapeInterpretation() && in.lookAhead() == 'u') { // interpret unicode escaped chars (like \u0070 -> p) tkn.content.append((char) unicodeEscapeLexer(c)); } else if (c == strategy.getEscape()) { tkn.content.append((char)readEscape(c)); } else { tkn.content.append((char) c); } c = in.read(); } if (strategy.getIgnoreTrailingWhitespaces()) { tkn.content.trimTrailingWhitespace(); } return tkn; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private Token encapsulatedTokenLexer(Token tkn, int c) throws IOException { // save current line int startLineNumber = getLineNumber(); // ignore the given delimiter // assert c == delimiter; for (;;) { c = in.read(); if (c == '\\' && strategy.getUnicodeEscapeInterpretation() && in.lookAhead()=='u') { tkn.content.append((char) unicodeEscapeLexer(c)); } else if (c == strategy.getEscape()) { tkn.content.append((char)readEscape(c)); } else if (c == strategy.getEncapsulator()) { if (in.lookAhead() == strategy.getEncapsulator()) { // double or escaped encapsulator -> add single encapsulator to token c = in.read(); tkn.content.append((char) c); } else { // token finish mark (encapsulator) reached: ignore whitespace till delimiter for (;;) { c = in.read(); if (c == strategy.getDelimiter()) { tkn.type = TT_TOKEN; tkn.isReady = true; return tkn; } else if (isEndOfFile(c)) { tkn.type = TT_EOF; tkn.isReady = true; return tkn; } else if (isEndOfLine(c)) { // ok, end of token reached tkn.type = TT_EORECORD; tkn.isReady = true; return tkn; } else if (!isWhitespace(c)) { // error invalid char between token and next delimiter throw new IOException( "(line " + getLineNumber() + ") invalid char between encapsulated token and delimiter" ); } } } } else if (isEndOfFile(c)) { // error condition (end of file before end of token) throw new IOException( "(startline " + startLineNumber + ") eof reached before encapsulated token finished" ); } else { // consume character tkn.content.append((char) c); } } }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected int unicodeEscapeLexer(int c) throws IOException { int ret = 0; // ignore 'u' (assume c==\ now) and read 4 hex digits c = in.read(); code.clear(); try { for (int i = 0; i < 4; i++) { c = in.read(); if (isEndOfFile(c) || isEndOfLine(c)) { throw new NumberFormatException("number too short"); } code.append((char) c); } ret = Integer.parseInt(code.toString(), 16); } catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); } return ret; }
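unicodeEscapeLexer above translates the low-level NumberFormatException into the parser's own failure type, IOException, but flattens the cause into the message via e.toString(). Since Java 6, IOException(String, Throwable) allows the same translation while preserving the original stack trace; a hedged variant:

    import java.io.IOException;

    public class EscapeLexer {
      // same catch-and-translate shape as unicodeEscapeLexer, but with the
      // NumberFormatException chained as the cause rather than stringified
      static int parseHex(String code, int lineNumber) throws IOException {
        try {
          return Integer.parseInt(code, 16);
        } catch (NumberFormatException e) {
          throw new IOException("(line " + lineNumber
              + ") Wrong unicode escape sequence found '" + code + "'", e);
        }
      }

      public static void main(String[] args) throws IOException {
        System.out.println(parseHex("0070", 1));  // 112, i.e. 'p'
      }
    }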
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private int readEscape(int c) throws IOException { // assume c is the escape char (normally a backslash) c = in.read(); int out; switch (c) { case 'r': out='\r'; break; case 'n': out='\n'; break; case 't': out='\t'; break; case 'b': out='\b'; break; case 'f': out='\f'; break; default : out=c; } return out; }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
private boolean isEndOfLine(int c) throws IOException { // check if we have \r\n... if (c == '\r') { if (in.lookAhead() == '\n') { // note: does not change c outside of this method !! c = in.read(); } } return (c == '\n'); }
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
public static String[][] parse(String s) throws IOException { if (s == null) { throw new IllegalArgumentException("Null argument not allowed."); } String[][] result = (new CSVParser(new StringReader(s))).getAllValues(); if (result == null) { // since CSVStrategy ignores empty lines an empty array is returned // (i.e. not "result = new String[][] {{""}};") result = EMPTY_DOUBLE_STRING_ARRAY; } return result; }
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
public static String[] parseLine(String s) throws IOException { if (s == null) { throw new IllegalArgumentException("Null argument not allowed."); } // uh,jh: make sure that parseLine("").length == 0 if (s.length() == 0) { return EMPTY_STRING_ARRAY; } return (new CSVParser(new StringReader(s))).getLine(); }
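Both CSVUtils entry points draw the same line: a null argument is a programming error and gets an unchecked IllegalArgumentException, while an empty string is legitimate input and yields an empty array (deliberately not {{""}}). A small usage sketch:

    import org.apache.solr.internal.csv.CSVUtils;

    public class CsvUtilsDemo {
      public static void main(String[] args) throws Exception {
        System.out.println(CSVUtils.parseLine("").length);  // 0, not [""]
        try {
          CSVUtils.parseLine(null);                         // precondition violated
        } catch (IllegalArgumentException expected) {
          System.out.println("rejected: " + expected.getMessage());
        }
      }
    }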
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void println() throws IOException { out.write(strategy.getPrinterNewline()); newLine = true; }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void flush() throws IOException { out.flush(); }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void println(String[] values) throws IOException { for (int i = 0; i < values.length; i++) { print(values[i]); } println(); }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void printlnComment(String comment) throws IOException { if(this.strategy.isCommentingDisabled()) { return; } if (!newLine) { println(); } out.write(this.strategy.getCommentStart()); out.write(' '); for (int i = 0; i < comment.length(); i++) { char c = comment.charAt(i); switch (c) { case '\r' : if (i + 1 < comment.length() && comment.charAt(i + 1) == '\n') { i++; } // break intentionally excluded. case '\n' : println(); out.write(this.strategy.getCommentStart()); out.write(' '); break; default : out.write(c); break; } } println(); }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void print(char[] value, int offset, int len, boolean checkForEscape) throws IOException { if (!checkForEscape) { printSep(); out.write(value, offset, len); return; } if (strategy.getEncapsulator() != CSVStrategy.ENCAPSULATOR_DISABLED) { printAndEncapsulate(value, offset, len); } else if (strategy.getEscape() != CSVStrategy.ESCAPE_DISABLED) { printAndEscape(value, offset, len); } else { printSep(); out.write(value, offset, len); } }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
void printSep() throws IOException { if (newLine) { newLine = false; } else { out.write(this.strategy.getDelimiter()); } }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
void printAndEscape(char[] value, int offset, int len) throws IOException { int start = offset; int pos = offset; int end = offset + len; printSep(); char delim = this.strategy.getDelimiter(); char escape = this.strategy.getEscape(); while (pos < end) { char c = value[pos]; if (c == '\r' || c=='\n' || c==delim || c==escape) { // write out segment up until this char int l = pos-start; if (l>0) { out.write(value, start, l); } if (c=='\n') c='n'; else if (c=='\r') c='r'; out.write(escape); out.write(c); start = pos+1; // start on the current char after this one } pos++; } // write last segment int l = pos-start; if (l>0) { out.write(value, start, l); } }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
void printAndEncapsulate(char[] value, int offset, int len) throws IOException { boolean first = newLine; // is this the first value on this line? boolean quote = false; int start = offset; int pos = offset; int end = offset + len; printSep(); char delim = this.strategy.getDelimiter(); char encapsulator = this.strategy.getEncapsulator(); if (len <= 0) { // always quote an empty token that is the first // on the line, as it may be the only thing on the // line. If it were not quoted in that case, // an empty line has no tokens. if (first) { quote = true; } } else { char c = value[pos]; // Hmmm, where did this rule come from? if (first && (c < '0' || (c > '9' && c < 'A') || (c > 'Z' && c < 'a') || (c > 'z'))) { quote = true; // } else if (c == ' ' || c == '\f' || c == '\t') { } else if (c <= '#') { // Some other chars at the start of a value caused the parser to fail, so for now // encapsulate if we start in anything less than '#'. We are being conservative // by including the default comment char too. quote = true; } else { while (pos < end) { c = value[pos]; if (c=='\n' || c=='\r' || c==encapsulator || c==delim) { quote = true; break; } pos++; } if (!quote) { pos = end-1; c = value[pos]; // if (c == ' ' || c == '\f' || c == '\t') { // Some other chars at the end caused the parser to fail, so for now // encapsulate if we end in anything less than ' ' if (c <= ' ') { quote = true; } } } } if (!quote) { // no encapsulation needed - write out the original value out.write(value, offset, len); return; } // we hit something that needed encapsulation out.write(encapsulator); // Pick up where we left off: pos should be positioned on the first character that caused // the need for encapsulation. while (pos<end) { char c = value[pos]; if (c==encapsulator) { // write out the chunk up until this point // add 1 to the length to write out the encapsulator also out.write(value, start, pos-start+1); // put the next starting position on the encapsulator so we will // write it out again with the next string (effectively doubling it) start = pos; } pos++; } // write the last segment out.write(value, start, pos-start); out.write(encapsulator); }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void print(String value, boolean checkForEscape) throws IOException { if (!checkForEscape) { // write directly from string printSep(); out.write(value); return; } if (buf.length < value.length()) { buf = new char[value.length()]; } value.getChars(0, value.length(), buf, 0); print(buf, 0, value.length(), checkForEscape); }
// in core/src/java/org/apache/solr/internal/csv/CSVPrinter.java
public void print(String value) throws IOException { print(value, true); }
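Together these overloads implement CSVPrinter's escaping policy: printSep emits the delimiter between values, printAndEscape backslash-escapes delimiter/escape/newline characters (mapping '\n' to 'n' and '\r' to 'r'), and printAndEncapsulate falls back to quoting, doubling any embedded encapsulator. A minimal usage sketch; the two-argument constructor and println() are assumptions about this internal class, not confirmed by the listing above:

    StringWriter sw = new StringWriter();
    // assumed constructor: (Writer, CSVStrategy)
    CSVPrinter printer = new CSVPrinter(sw, CSVStrategy.DEFAULT_STRATEGY);
    printer.print("plain");   // written as-is
    printer.print("a,b");     // contains the delimiter -> quoted as "a,b"
    printer.println();        // assumed record terminator
    // sw.toString() -> plain,"a,b" plus a record terminator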
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public int read() throws IOException { // initialize the lookahead if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } lastChar = lookaheadChar; if (super.ready()) { lookaheadChar = super.read(); } else { lookaheadChar = UNDEFINED; } if (lastChar == '\n') { lineCounter++; } return lastChar; }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public int read(char[] buf, int off, int len) throws IOException { // do not claim if len == 0 if (len == 0) { return 0; } // init lookahead, but do not block !! if (lookaheadChar == UNDEFINED) { if (ready()) { lookaheadChar = super.read(); } else { return -1; } } // 'first read of underlying stream' if (lookaheadChar == -1) { return -1; } // continue until the lookaheadChar would block int cOff = off; while (len > 0 && ready()) { if (lookaheadChar == -1) { // eof stream reached, do not continue return cOff - off; } else { buf[cOff++] = (char) lookaheadChar; if (lookaheadChar == '\n') { lineCounter++; } lastChar = lookaheadChar; lookaheadChar = super.read(); len--; } } return cOff - off; }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public String readUntil(char c) throws IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } line.clear(); // reuse while (lookaheadChar != c && lookaheadChar != END_OF_STREAM) { line.append((char) lookaheadChar); if (lookaheadChar == '\n') { lineCounter++; } lastChar = lookaheadChar; lookaheadChar = super.read(); } return line.toString(); }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public String readLine() throws IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } line.clear(); //reuse // return null if end of stream has been reached if (lookaheadChar == END_OF_STREAM) { return null; } // do we have a line termination already char laChar = (char) lookaheadChar; if (laChar == '\n' || laChar == '\r') { lastChar = lookaheadChar; lookaheadChar = super.read(); // ignore '\r\n' as well if ((char) lookaheadChar == '\n') { lastChar = lookaheadChar; lookaheadChar = super.read(); } lineCounter++; return line.toString(); } // create the rest-of-line return and update the lookahead line.append(laChar); String restOfLine = super.readLine(); // TODO involves copying lastChar = lookaheadChar; lookaheadChar = super.read(); if (restOfLine != null) { line.append(restOfLine); } lineCounter++; return line.toString(); }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skip(long n) throws IllegalArgumentException, IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } // illegal argument if (n < 0) { throw new IllegalArgumentException("negative argument not supported"); } // no skipping if (n == 0 || lookaheadChar == END_OF_STREAM) { return 0; } // skip and reread the lookahead-char long skipped = 0; if (n > 1) { skipped = super.skip(n - 1); } lookaheadChar = super.read(); // FIXME: we should check the skipped sequence for line-terminations... lineCounter = Integer.MIN_VALUE; return skipped + 1; }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skipUntil(char c) throws IllegalArgumentException, IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } long counter = 0; while (lookaheadChar != c && lookaheadChar != END_OF_STREAM) { if (lookaheadChar == '\n') { lineCounter++; } lookaheadChar = super.read(); counter++; } return counter; }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public int lookAhead() throws IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } return lookaheadChar; }
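Every method above pivots on the one-character lookahead: lookaheadChar always holds the next unconsumed character, so lookAhead() can peek without advancing the stream. A sketch of the peek-then-consume pattern (direct instantiation of this internal class is assumed for illustration):

    ExtendedBufferedReader in = new ExtendedBufferedReader(new StringReader("a,b\nc"));
    String first = in.readUntil(',');  // "a" -- stops on, but does not consume, the ','
    in.read();                         // consume the delimiter itself
    int peek = in.lookAhead();         // 'b'; stream position unchanged
    String rest = in.readLine();       // "b"; line counter advances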
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public final int docFreq(Term term) throws IOException { return reader.docFreq(term); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void close() throws IOException { if (debug) { if (cachingEnabled) { StringBuilder sb = new StringBuilder(); sb.append("Closing ").append(name); for (SolrCache cache : cacheList) { sb.append("\n\t"); sb.append(cache); } log.debug(sb.toString()); } else { if (debug) log.debug("Closing " + name); } } core.getInfoRegistry().remove(name); // super.close(); // can't use super.close() since it just calls reader.close() and that may only be called once // per reader (even if incRef() was previously called). if (closeReader) reader.decRef(); for (SolrCache cache : cacheList) { cache.close(); } directoryFactory.release(getIndexReader().directory()); // do this at the end so it only gets done if there are no exceptions numCloses.incrementAndGet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public static void initRegenerators(SolrConfig solrConfig) { if (solrConfig.fieldValueCacheConfig != null && solrConfig.fieldValueCacheConfig.getRegenerator() == null) { solrConfig.fieldValueCacheConfig.setRegenerator( new CacheRegenerator() { public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException { if (oldVal instanceof UnInvertedField) { UnInvertedField.getUnInvertedField((String)oldKey, newSearcher); } return true; } } ); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException { if (oldVal instanceof UnInvertedField) { UnInvertedField.getUnInvertedField((String)oldKey, newSearcher); } return true; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException { newSearcher.cacheDocSet((Query)oldKey, null, false); return true; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException { QueryResultKey key = (QueryResultKey)oldKey; int nDocs=1; // request 1 doc and let caching round up to the next window size... // unless the window size is <=1, in which case we will pick // the minimum of the number of documents requested last time and // a reasonable number such as 40. // TODO: make more configurable later... if (queryResultWindowSize<=1) { DocList oldList = (DocList)oldVal; int oldnDocs = oldList.offset() + oldList.size(); // 40 has factors of 2,4,5,10,20 nDocs = Math.min(oldnDocs,40); } int flags=NO_CHECK_QCACHE | key.nc_flags; QueryCommand qc = new QueryCommand(); qc.setQuery(key.query) .setFilterList(key.filters) .setSort(key.sort) .setLen(nDocs) .setSupersetMaxDoc(nDocs) .setFlags(flags); QueryResult qr = new QueryResult(); newSearcher.getDocListC(qr,qc); return true; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public QueryResult search(QueryResult qr, QueryCommand cmd) throws IOException { getDocListC(qr,cmd); return qr; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void binaryField(FieldInfo fieldInfo, byte[] value, int offset, int length) throws IOException { doc.add(new StoredField(fieldInfo.name, value)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void stringField(FieldInfo fieldInfo, String value) throws IOException { final FieldType ft = new FieldType(TextField.TYPE_STORED); ft.setStoreTermVectors(fieldInfo.hasVectors()); ft.setIndexed(fieldInfo.isIndexed()); ft.setOmitNorms(fieldInfo.omitsNorms()); ft.setIndexOptions(fieldInfo.getIndexOptions()); doc.add(new Field(fieldInfo.name, value, ft)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public Document doc(int i) throws IOException { return doc(i, (Set<String>)null); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void doc(int n, StoredFieldVisitor visitor) throws IOException { getIndexReader().document(n, visitor); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public Document doc(int i, Set<String> fields) throws IOException { Document d; if (documentCache != null) { d = documentCache.get(i); if (d!=null) return d; } if(!enableLazyFieldLoading || fields == null) { d = getIndexReader().document(i); } else { final SetNonLazyFieldSelector visitor = new SetNonLazyFieldSelector(fields, getIndexReader(), i); getIndexReader().document(i, visitor); d = visitor.doc; } if (documentCache != null) { documentCache.put(i, d); } return d; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void readDocs(Document[] docs, DocList ids) throws IOException { readDocs(docs, ids, null); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void readDocs(Document[] docs, DocList ids, Set<String> fields) throws IOException { DocIterator iter = ids.iterator(); for (int i=0; i<docs.length; i++) { docs[i] = doc(iter.nextDoc(), fields); } }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public Sort weightSort(Sort sort) throws IOException { return (sort != null) ? sort.rewrite(this) : null; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public int getFirstMatch(Term t) throws IOException { Fields fields = atomicReader.fields(); if (fields == null) return -1; Terms terms = fields.terms(t.field()); if (terms == null) return -1; BytesRef termBytes = t.bytes(); final TermsEnum termsEnum = terms.iterator(null); if (!termsEnum.seekExact(termBytes, false)) { return -1; } DocsEnum docs = termsEnum.docs(atomicReader.getLiveDocs(), null, false); if (docs == null) return -1; int id = docs.nextDoc(); return id == DocIdSetIterator.NO_MORE_DOCS ? -1 : id; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public long lookupId(BytesRef idBytes) throws IOException { String field = schema.getUniqueKeyField().getName(); final AtomicReaderContext[] leaves = leafContexts; for (int i=0; i<leaves.length; i++) { final AtomicReaderContext leaf = leaves[i]; final AtomicReader reader = leaf.reader(); final Fields fields = reader.fields(); if (fields == null) continue; final Bits liveDocs = reader.getLiveDocs(); final DocsEnum docs = reader.termDocsEnum(liveDocs, field, idBytes, false); if (docs == null) continue; int id = docs.nextDoc(); if (id == DocIdSetIterator.NO_MORE_DOCS) continue; assert docs.nextDoc() == DocIdSetIterator.NO_MORE_DOCS; return (((long)i) << 32) | id; } return -1; }
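lookupId packs its answer into one long: the leaf (segment) index in the upper 32 bits, the segment-local docid in the lower 32. A caller would unpack it along these lines (illustrative sketch; leafContexts stands in for the searcher state shown above):

    long v = lookupId(idBytes);
    if (v != -1) {
      int leafIndex = (int) (v >>> 32);  // which segment matched
      int segDocId  = (int) v;           // docid within that segment
      int globalId  = leafContexts[leafIndex].docBase + segDocId;
    }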
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void cacheDocSet(Query query, DocSet optionalAnswer, boolean mustCache) throws IOException { // Even if the cache is null, still compute the DocSet as it may serve to warm the Lucene // or OS disk cache. if (optionalAnswer != null) { if (filterCache!=null) { filterCache.put(query,optionalAnswer); } return; } // Throw away the result, relying on the fact that getDocSet // will currently always cache what it found. If getDocSet() starts // using heuristics about what to cache, and mustCache==true, (or if we // want this method to start using heuristics too) then // this needs to change. getDocSet(query); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocSet getDocSet(Query query) throws IOException { if (query instanceof ExtendedQuery) { ExtendedQuery eq = (ExtendedQuery)query; if (!eq.getCache()) { if (query instanceof WrappedQuery) { query = ((WrappedQuery)query).getWrappedQuery(); } query = QueryUtils.makeQueryable(query); return getDocSetNC(query, null); } } // Get the absolute value (positive version) of this query. If we // get back the same reference, we know it's positive. Query absQ = QueryUtils.getAbs(query); boolean positive = query==absQ; if (filterCache != null) { DocSet absAnswer = filterCache.get(absQ); if (absAnswer!=null) { if (positive) return absAnswer; else return getPositiveDocSet(matchAllDocsQuery).andNot(absAnswer); } } DocSet absAnswer = getDocSetNC(absQ, null); DocSet answer = positive ? absAnswer : getPositiveDocSet(matchAllDocsQuery).andNot(absAnswer); if (filterCache != null) { // cache negative queries as positive filterCache.put(absQ, absAnswer); } return answer; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
DocSet getPositiveDocSet(Query q) throws IOException { DocSet answer; if (filterCache != null) { answer = filterCache.get(q); if (answer!=null) return answer; } answer = getDocSetNC(q,null); if (filterCache != null) filterCache.put( q,answer); return answer; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocSet getDocSet(List<Query> queries) throws IOException { ProcessedFilter pf = getProcessedFilter(null, queries); if (pf.answer != null) return pf.answer; DocSetCollector setCollector = new DocSetCollector(maxDoc()>>6, maxDoc()); Collector collector = setCollector; if (pf.postFilter != null) { pf.postFilter.setLastDelegate(collector); collector = pf.postFilter; } final AtomicReaderContext[] leaves = leafContexts; for (int i=0; i<leaves.length; i++) { final AtomicReaderContext leaf = leaves[i]; final AtomicReader reader = leaf.reader(); final Bits liveDocs = reader.getLiveDocs(); // TODO: the filter may already only have liveDocs... DocIdSet idSet = null; if (pf.filter != null) { idSet = pf.filter.getDocIdSet(leaf, liveDocs); if (idSet == null) continue; } DocIdSetIterator idIter = null; if (idSet != null) { idIter = idSet.iterator(); if (idIter == null) continue; } collector.setNextReader(leaf); int max = reader.maxDoc(); if (idIter == null) { for (int docid = 0; docid<max; docid++) { if (liveDocs != null && !liveDocs.get(docid)) continue; collector.collect(docid); } } else { for (int docid = -1; (docid = idIter.advance(docid+1)) < max; ) { collector.collect(docid); } } } return setCollector.getDocSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public ProcessedFilter getProcessedFilter(DocSet setFilter, List<Query> queries) throws IOException { ProcessedFilter pf = new ProcessedFilter(); if (queries==null || queries.size()==0) { if (setFilter != null) pf.filter = setFilter.getTopFilter(); return pf; } DocSet answer=null; boolean[] neg = new boolean[queries.size()+1]; DocSet[] sets = new DocSet[queries.size()+1]; List<Query> notCached = null; List<Query> postFilters = null; int end = 0; int smallestIndex = -1; if (setFilter != null) { answer = sets[end++] = setFilter; smallestIndex = end; } int smallestCount = Integer.MAX_VALUE; for (Query q : queries) { if (q instanceof ExtendedQuery) { ExtendedQuery eq = (ExtendedQuery)q; if (!eq.getCache()) { if (eq.getCost() >= 100 && eq instanceof PostFilter) { if (postFilters == null) postFilters = new ArrayList<Query>(sets.length-end); postFilters.add(q); } else { if (notCached == null) notCached = new ArrayList<Query>(sets.length-end); notCached.add(q); } continue; } } Query posQuery = QueryUtils.getAbs(q); sets[end] = getPositiveDocSet(posQuery); // Negative query if absolute value different from original if (q==posQuery) { neg[end] = false; // keep track of the smallest positive set. // This optimization is only worth it if size() is cached, which it would // be if we don't do any set operations. int sz = sets[end].size(); if (sz<smallestCount) { smallestCount=sz; smallestIndex=end; answer = sets[end]; } } else { neg[end] = true; } end++; } // Are all of our normal cached filters negative? if (end > 0 && answer==null) { answer = getPositiveDocSet(matchAllDocsQuery); } // do negative queries first to shrink set size for (int i=0; i<end; i++) { if (neg[i]) answer = answer.andNot(sets[i]); } for (int i=0; i<end; i++) { if (!neg[i] && i!=smallestIndex) answer = answer.intersection(sets[i]); } if (notCached != null) { Collections.sort(notCached, sortByCost); List<Weight> weights = new ArrayList<Weight>(notCached.size()); for (Query q : notCached) { Query qq = QueryUtils.makeQueryable(q); weights.add(createNormalizedWeight(qq)); } pf.filter = new FilterImpl(answer, weights); } else { if (postFilters == null) { if (answer == null) { answer = getPositiveDocSet(matchAllDocsQuery); } // "answer" is the only part of the filter, so set it. pf.answer = answer; } if (answer != null) { pf.filter = answer.getTopFilter(); } } if (postFilters != null) { Collections.sort(postFilters, sortByCost); for (int i=postFilters.size()-1; i>=0; i--) { DelegatingCollector prev = pf.postFilter; pf.postFilter = ((PostFilter)postFilters.get(i)).getFilterCollector(this); if (prev != null) pf.postFilter.setDelegate(prev); } } return pf; }
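The eq.getCost() >= 100 test is what routes a non-cached PostFilter out of the main Lucene query and into the DelegatingCollector chain built at the end of this method. On the request side this corresponds to Solr's cache/cost local params; for example, a function range query (whose post-filter collector appears later in this sheet under FunctionRangeQuery.java) is forced into post-filtering with:

    fq={!frange l=0 u=5 cache=false cost=150}log(popularity)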
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocSet getDocSet(DocsEnumState deState) throws IOException { int largestPossible = deState.termsEnum.docFreq(); boolean useCache = filterCache != null && largestPossible >= deState.minSetSizeCached; TermQuery key = null; if (useCache) { key = new TermQuery(new Term(deState.fieldName, BytesRef.deepCopyOf(deState.termsEnum.term()))); DocSet result = filterCache.get(key); if (result != null) return result; } int smallSetSize = maxDoc()>>6; int scratchSize = Math.min(smallSetSize, largestPossible); if (deState.scratch == null || deState.scratch.length < scratchSize) deState.scratch = new int[scratchSize]; final int[] docs = deState.scratch; int upto = 0; int bitsSet = 0; OpenBitSet obs = null; DocsEnum docsEnum = deState.termsEnum.docs(deState.liveDocs, deState.docsEnum, false); if (deState.docsEnum == null) { deState.docsEnum = docsEnum; } if (docsEnum instanceof MultiDocsEnum) { MultiDocsEnum.EnumWithSlice[] subs = ((MultiDocsEnum)docsEnum).getSubs(); int numSubs = ((MultiDocsEnum)docsEnum).getNumSubs(); for (int subindex = 0; subindex<numSubs; subindex++) { MultiDocsEnum.EnumWithSlice sub = subs[subindex]; if (sub.docsEnum == null) continue; int base = sub.slice.start; int docid; if (largestPossible > docs.length) { if (obs == null) obs = new OpenBitSet(maxDoc()); while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { obs.fastSet(docid + base); bitsSet++; } } else { while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { docs[upto++] = docid + base; } } } } else { int docid; if (largestPossible > docs.length) { if (obs == null) obs = new OpenBitSet(maxDoc()); while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { obs.fastSet(docid); bitsSet++; } } else { while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { docs[upto++] = docid; } } } DocSet result; if (obs != null) { for (int i=0; i<upto; i++) { obs.fastSet(docs[i]); } bitsSet += upto; result = new BitDocSet(obs, bitsSet); } else { result = upto==0 ? DocSet.EMPTY : new SortedIntDocSet(Arrays.copyOf(docs, upto)); } if (useCache) { filterCache.put(key, result); } return result; }
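The smallSetSize = maxDoc()>>6 cutoff is the space trade-off between the two DocSet representations: a sorted int array costs 32 bits per hit while a bitset costs one bit per indexed document, so capping the array at maxDoc/64 entries keeps it to at most half the bitset's footprint. For example, at maxDoc = 1,000,000 the cutoff is 15,625 hits: the int array then tops out near 61 KB, versus roughly 122 KB for an OpenBitSet over the whole index.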
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
protected DocSet getDocSetNC(Query query, DocSet filter) throws IOException { DocSetCollector collector = new DocSetCollector(maxDoc()>>6, maxDoc()); if (filter==null) { if (query instanceof TermQuery) { Term t = ((TermQuery)query).getTerm(); final AtomicReaderContext[] leaves = leafContexts; for (int i=0; i<leaves.length; i++) { final AtomicReaderContext leaf = leaves[i]; final AtomicReader reader = leaf.reader(); collector.setNextReader(leaf); Fields fields = reader.fields(); Terms terms = fields.terms(t.field()); BytesRef termBytes = t.bytes(); Bits liveDocs = reader.getLiveDocs(); DocsEnum docsEnum = null; if (terms != null) { final TermsEnum termsEnum = terms.iterator(null); if (termsEnum.seekExact(termBytes, false)) { docsEnum = termsEnum.docs(liveDocs, null, false); } } if (docsEnum != null) { int docid; while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { collector.collect(docid); } } } } else { super.search(query,null,collector); } return collector.getDocSet(); } else { Filter luceneFilter = filter.getTopFilter(); super.search(query, luceneFilter, collector); return collector.getDocSet(); } }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocSet getDocSet(Query query, DocSet filter) throws IOException { if (filter==null) return getDocSet(query); if (query instanceof ExtendedQuery) { ExtendedQuery eq = (ExtendedQuery)query; if (!eq.getCache()) { if (query instanceof WrappedQuery) { query = ((WrappedQuery)query).getWrappedQuery(); } query = QueryUtils.makeQueryable(query); return getDocSetNC(query, filter); } } // Negative query if absolute value different from original Query absQ = QueryUtils.getAbs(query); boolean positive = absQ==query; DocSet first; if (filterCache != null) { first = filterCache.get(absQ); if (first==null) { first = getDocSetNC(absQ,null); filterCache.put(absQ,first); } return positive ? first.intersection(filter) : filter.andNot(first); } // If there isn't a cache, then do a single filtered query if positive. return positive ? getDocSetNC(absQ,filter) : filter.andNot(getPositiveDocSet(absQ)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocList getDocList(Query query, Query filter, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filter) .setSort(lsort) .setOffset(offset) .setLen(len); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocList(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocList getDocList(Query query, List<Query> filterList, Sort lsort, int offset, int len, int flags) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filterList) .setSort(lsort) .setOffset(offset) .setLen(len) .setFlags(flags); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocList(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
private void getDocListC(QueryResult qr, QueryCommand cmd) throws IOException { DocListAndSet out = new DocListAndSet(); qr.setDocListAndSet(out); QueryResultKey key=null; int maxDocRequested = cmd.getOffset() + cmd.getLen(); // check for overflow, and check for # docs in index if (maxDocRequested < 0 || maxDocRequested > maxDoc()) maxDocRequested = maxDoc(); int supersetMaxDoc= maxDocRequested; DocList superset = null; int flags = cmd.getFlags(); Query q = cmd.getQuery(); if (q instanceof ExtendedQuery) { ExtendedQuery eq = (ExtendedQuery)q; if (!eq.getCache()) { flags |= (NO_CHECK_QCACHE | NO_SET_QCACHE | NO_CHECK_FILTERCACHE); } } // we can try and look up the complete query in the cache. // we can't do that if filter!=null though (we don't want to // do hashCode() and equals() for a big DocSet). if (queryResultCache != null && cmd.getFilter()==null && (flags & (NO_CHECK_QCACHE|NO_SET_QCACHE)) != ((NO_CHECK_QCACHE|NO_SET_QCACHE))) { // all of the current flags can be reused during warming, // so set all of them on the cache key. key = new QueryResultKey(q, cmd.getFilterList(), cmd.getSort(), flags); if ((flags & NO_CHECK_QCACHE)==0) { superset = queryResultCache.get(key); if (superset != null) { // check that the cache entry has scores recorded if we need them if ((flags & GET_SCORES)==0 || superset.hasScores()) { // NOTE: subset() returns null if the DocList has fewer docs than // requested out.docList = superset.subset(cmd.getOffset(),cmd.getLen()); } } if (out.docList != null) { // found the docList in the cache... now check if we need the docset too. // OPT: possible future optimization - if the doclist contains all the matches, // use it to make the docset instead of rerunning the query. if (out.docSet==null && ((flags & GET_DOCSET)!=0) ) { if (cmd.getFilterList()==null) { out.docSet = getDocSet(cmd.getQuery()); } else { List<Query> newList = new ArrayList<Query>(cmd.getFilterList().size()+1); newList.add(cmd.getQuery()); newList.addAll(cmd.getFilterList()); out.docSet = getDocSet(newList); } } return; } } // If we are going to generate the result, bump up to the // next resultWindowSize for better caching. if ((flags & NO_SET_QCACHE) == 0) { // handle 0 special case as well as avoid idiv in the common case. if (maxDocRequested < queryResultWindowSize) { supersetMaxDoc=queryResultWindowSize; } else { supersetMaxDoc = ((maxDocRequested -1)/queryResultWindowSize + 1)*queryResultWindowSize; if (supersetMaxDoc < 0) supersetMaxDoc=maxDocRequested; } } else { key = null; // we won't be caching the result } } // OK, so now we need to generate an answer. // One way to do that would be to check if we have an unordered list // of results for the base query. If so, we can apply the filters and then // sort by the resulting set. This can only be used if: // - the sort doesn't contain score // - we don't want score returned. // check if we should try and use the filter cache boolean useFilterCache=false; if ((flags & (GET_SCORES|NO_CHECK_FILTERCACHE))==0 && useFilterForSortedQuery && cmd.getSort() != null && filterCache != null) { useFilterCache=true; SortField[] sfields = cmd.getSort().getSort(); for (SortField sf : sfields) { if (sf.getType() == SortField.Type.SCORE) { useFilterCache=false; break; } } } // disable useFilterCache optimization temporarily if (useFilterCache) { // now actually use the filter cache. // for large filters that match few documents, this may be // slower than simply re-executing the query. 
if (out.docSet == null) { out.docSet = getDocSet(cmd.getQuery(),cmd.getFilter()); DocSet bigFilt = getDocSet(cmd.getFilterList()); if (bigFilt != null) out.docSet = out.docSet.intersection(bigFilt); } // todo: there could be a sortDocSet that could take a list of // the filters instead of anding them first... // perhaps there should be a multi-docset-iterator superset = sortDocSet(out.docSet,cmd.getSort(),supersetMaxDoc); out.docList = superset.subset(cmd.getOffset(),cmd.getLen()); } else { // do it the normal way... cmd.setSupersetMaxDoc(supersetMaxDoc); if ((flags & GET_DOCSET)!=0) { // this currently conflates returning the docset for the base query vs // the base query and all filters. DocSet qDocSet = getDocListAndSetNC(qr,cmd); // cache the docSet matching the query w/o filtering if (qDocSet!=null && filterCache!=null && !qr.isPartialResults()) filterCache.put(cmd.getQuery(),qDocSet); } else { getDocListNC(qr,cmd); //Parameters: cmd.getQuery(),theFilt,cmd.getSort(),0,supersetMaxDoc,cmd.getFlags(),cmd.getTimeAllowed(),responseHeader); } superset = out.docList; out.docList = superset.subset(cmd.getOffset(),cmd.getLen()); } // lastly, put the superset in the cache if the size is less than or equal // to queryResultMaxDocsCached if (key != null && superset.size() <= queryResultMaxDocsCached && !qr.isPartialResults()) { queryResultCache.put(key, superset); } }
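The supersetMaxDoc computation in getDocListC is a ceiling to the next multiple of queryResultWindowSize, i.e. ((n-1)/w + 1)*w. For example, with a window size of 20 and a request where offset+len = 25, the searcher generates and caches a 40-doc superset, so a follow-up request for the next page (offset 20, len 20) is answered from the queryResultCache via superset.subset().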
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
private void getDocListNC(QueryResult qr,QueryCommand cmd) throws IOException { final long timeAllowed = cmd.getTimeAllowed(); int len = cmd.getSupersetMaxDoc(); int last = len; if (last < 0 || last > maxDoc()) last=maxDoc(); final int lastDocRequested = last; int nDocsReturned; int totalHits; float maxScore; int[] ids; float[] scores; boolean needScores = (cmd.getFlags() & GET_SCORES) != 0; Query query = QueryUtils.makeQueryable(cmd.getQuery()); ProcessedFilter pf = getProcessedFilter(cmd.getFilter(), cmd.getFilterList()); final Filter luceneFilter = pf.filter; // handle zero case... if (lastDocRequested<=0) { final float[] topscore = new float[] { Float.NEGATIVE_INFINITY }; final int[] numHits = new int[1]; Collector collector; if (!needScores) { collector = new Collector () { @Override public void setScorer(Scorer scorer) throws IOException { } @Override public void collect(int doc) throws IOException { numHits[0]++; } @Override public void setNextReader(AtomicReaderContext context) throws IOException { } @Override public boolean acceptsDocsOutOfOrder() { return true; } }; } else { collector = new Collector() { Scorer scorer; @Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; } @Override public void collect(int doc) throws IOException { numHits[0]++; float score = scorer.score(); if (score > topscore[0]) topscore[0]=score; } @Override public void setNextReader(AtomicReaderContext context) throws IOException { } @Override public boolean acceptsDocsOutOfOrder() { return true; } }; } if( timeAllowed > 0 ) { collector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), timeAllowed); } if (pf.postFilter != null) { pf.postFilter.setLastDelegate(collector); collector = pf.postFilter; } try { super.search(query, luceneFilter, collector); } catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); } nDocsReturned=0; ids = new int[nDocsReturned]; scores = new float[nDocsReturned]; totalHits = numHits[0]; maxScore = totalHits>0 ? topscore[0] : 0.0f; } else { TopDocsCollector topCollector; if (cmd.getSort() == null) { if(cmd.getScoreDoc() != null) { topCollector = TopScoreDocCollector.create(len, cmd.getScoreDoc(), true); //create the Collector with InOrderPagingCollector } else { topCollector = TopScoreDocCollector.create(len, true); } } else { topCollector = TopFieldCollector.create(weightSort(cmd.getSort()), len, false, needScores, needScores, true); } Collector collector = topCollector; if( timeAllowed > 0 ) { collector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), timeAllowed); } if (pf.postFilter != null) { pf.postFilter.setLastDelegate(collector); collector = pf.postFilter; } try { super.search(query, luceneFilter, collector); } catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); } totalHits = topCollector.getTotalHits(); TopDocs topDocs = topCollector.topDocs(0, len); maxScore = totalHits>0 ? topDocs.getMaxScore() : 0.0f; nDocsReturned = topDocs.scoreDocs.length; ids = new int[nDocsReturned]; scores = (cmd.getFlags()&GET_SCORES)!=0 ? 
new float[nDocsReturned] : null; for (int i=0; i<nDocsReturned; i++) { ScoreDoc scoreDoc = topDocs.scoreDocs[i]; ids[i] = scoreDoc.doc; if (scores != null) scores[i] = scoreDoc.score; } } int sliceLen = Math.min(lastDocRequested,nDocsReturned); if (sliceLen < 0) sliceLen=0; qr.setDocList(new DocSlice(0,sliceLen,ids,scores,totalHits,maxScore)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setScorer(Scorer scorer) throws IOException { }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void collect(int doc) throws IOException { numHits[0]++; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void collect(int doc) throws IOException { numHits[0]++; float score = scorer.score(); if (score > topscore[0]) topscore[0]=score; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
private DocSet getDocListAndSetNC(QueryResult qr,QueryCommand cmd) throws IOException { int len = cmd.getSupersetMaxDoc(); int last = len; if (last < 0 || last > maxDoc()) last=maxDoc(); final int lastDocRequested = last; int nDocsReturned; int totalHits; float maxScore; int[] ids; float[] scores; DocSet set; boolean needScores = (cmd.getFlags() & GET_SCORES) != 0; int maxDoc = maxDoc(); int smallSetSize = maxDoc>>6; ProcessedFilter pf = getProcessedFilter(cmd.getFilter(), cmd.getFilterList()); final Filter luceneFilter = pf.filter; Query query = QueryUtils.makeQueryable(cmd.getQuery()); final long timeAllowed = cmd.getTimeAllowed(); // handle zero case... if (lastDocRequested<=0) { final float[] topscore = new float[] { Float.NEGATIVE_INFINITY }; Collector collector; DocSetCollector setCollector; if (!needScores) { collector = setCollector = new DocSetCollector(smallSetSize, maxDoc); } else { collector = setCollector = new DocSetDelegateCollector(smallSetSize, maxDoc, new Collector() { Scorer scorer; @Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; } @Override public void collect(int doc) throws IOException { float score = scorer.score(); if (score > topscore[0]) topscore[0]=score; } @Override public void setNextReader(AtomicReaderContext context) throws IOException { } @Override public boolean acceptsDocsOutOfOrder() { return false; } }); } if( timeAllowed > 0 ) { collector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), timeAllowed); } if (pf.postFilter != null) { pf.postFilter.setLastDelegate(collector); collector = pf.postFilter; } try { super.search(query, luceneFilter, collector); } catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); } set = setCollector.getDocSet(); nDocsReturned = 0; ids = new int[nDocsReturned]; scores = new float[nDocsReturned]; totalHits = set.size(); maxScore = totalHits>0 ? topscore[0] : 0.0f; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void collect(int doc) throws IOException { float score = scorer.score(); if (score > topscore[0]) topscore[0]=score; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocList getDocList(Query query, DocSet filter, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilter(filter) .setSort(lsort) .setOffset(offset) .setLen(len); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocList(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, Query filter, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filter) .setSort(lsort) .setOffset(offset) .setLen(len) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, Query filter, Sort lsort, int offset, int len, int flags) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filter) .setSort(lsort) .setOffset(offset) .setLen(len) .setFlags(flags) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, List<Query> filterList, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filterList) .setSort(lsort) .setOffset(offset) .setLen(len) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, List<Query> filterList, Sort lsort, int offset, int len, int flags) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilterList(filterList) .setSort(lsort) .setOffset(offset) .setLen(len) .setFlags(flags) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, DocSet filter, Sort lsort, int offset, int len) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilter(filter) .setSort(lsort) .setOffset(offset) .setLen(len) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public DocListAndSet getDocListAndSet(Query query, DocSet filter, Sort lsort, int offset, int len, int flags) throws IOException { QueryCommand qc = new QueryCommand(); qc.setQuery(query) .setFilter(filter) .setSort(lsort) .setOffset(offset) .setLen(len) .setFlags(flags) .setNeedDocSet(true); QueryResult qr = new QueryResult(); search(qr,qc); return qr.getDocListAndSet(); }
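All of these getDocList/getDocListAndSet overloads are thin builders over QueryCommand that delegate to search(qr, qc); they differ only in which fields they set. A hedged usage sketch (construction of query and filterQuery is assumed, not shown here):

    DocListAndSet res = searcher.getDocListAndSet(query, filterQuery, Sort.RELEVANCE, 0, 10);
    DocIterator it = res.docList.iterator();
    while (it.hasNext()) {
      int docid = it.nextDoc();          // internal Lucene docid
      Document d = searcher.doc(docid);  // served through the documentCache shown earlier
    }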
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
protected DocList sortDocSet(DocSet set, Sort sort, int nDocs) throws IOException { if (nDocs == 0) { // SOLR-2923 return new DocSlice(0, 0, new int[0], null, 0, 0f); } // bit of a hack to tell if a set is sorted - do it better in the future. boolean inOrder = set instanceof BitDocSet || set instanceof SortedIntDocSet; TopDocsCollector topCollector = TopFieldCollector.create(weightSort(sort), nDocs, false, false, false, inOrder); DocIterator iter = set.iterator(); int base=0; int end=0; int readerIndex = 0; while (iter.hasNext()) { int doc = iter.nextDoc(); while (doc>=end) { AtomicReaderContext leaf = leafContexts[readerIndex++]; base = leaf.docBase; end = base + leaf.reader().maxDoc(); topCollector.setNextReader(leaf); // we should never need to set the scorer given the settings for the collector } topCollector.collect(doc-base); } TopDocs topDocs = topCollector.topDocs(0, nDocs); int nDocsReturned = topDocs.scoreDocs.length; int[] ids = new int[nDocsReturned]; for (int i=0; i<nDocsReturned; i++) { ScoreDoc scoreDoc = topDocs.scoreDocs[i]; ids[i] = scoreDoc.doc; } return new DocSlice(0,nDocsReturned,ids,null,topDocs.totalHits,0.0f); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public int numDocs(Query a, DocSet b) throws IOException { // Negative query if absolute value different from original Query absQ = QueryUtils.getAbs(a); DocSet positiveA = getPositiveDocSet(absQ); return a==absQ ? b.intersectionSize(positiveA) : b.andNotSize(positiveA); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public int numDocs(DocSet a, DocsEnumState deState) throws IOException { // Negative query if absolute value different from original return a.intersectionSize(getDocSet(deState)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public int numDocs(Query a, Query b) throws IOException { Query absA = QueryUtils.getAbs(a); Query absB = QueryUtils.getAbs(b); DocSet positiveA = getPositiveDocSet(absA); DocSet positiveB = getPositiveDocSet(absB); // Negative query if absolute value different from original if (a==absA) { if (b==absB) return positiveA.intersectionSize(positiveB); return positiveA.andNotSize(positiveB); } if (b==absB) return positiveB.andNotSize(positiveA); // if both negative, we need to create a temp DocSet since we // don't have a counting method that takes three. DocSet all = getPositiveDocSet(matchAllDocsQuery); // -a -b == *:*.andNot(a).andNotSize(b) == *.*.andNotSize(a.union(b)) // we use the last form since the intermediate DocSet should normally be smaller. return all.andNotSize(positiveA.union(positiveB)); }
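The double-negative branch is the set identity from the comment written out: |!A ∩ !B| = |U| - |A ∪ B|. One union plus one andNotSize therefore answers the count, and the union is preferred as the intermediate set because a ∪ b is normally far smaller than the complement *:*.andNot(a).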
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public Document[] readDocs(DocList ids) throws IOException { Document[] docs = new Document[ids.size()]; readDocs(docs,ids); return docs; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public void warm(SolrIndexSearcher old) throws IOException { // Make sure this is first! filters can help queryResults execute! long warmingStartTime = System.currentTimeMillis(); // warm the caches in order... ModifiableSolrParams params = new ModifiableSolrParams(); params.add("warming","true"); for (int i=0; i<cacheList.length; i++) { if (debug) log.debug("autowarming " + this + " from " + old + "\n\t" + old.cacheList[i]); SolrQueryRequest req = new LocalSolrQueryRequest(core,params) { @Override public SolrIndexSearcher getSearcher() { return SolrIndexSearcher.this; } @Override public void close() { } }; SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); try { this.cacheList[i].warm(this, old.cacheList[i]); } finally { try { req.close(); } finally { SolrRequestInfo.clearRequestInfo(); } } if (debug) log.debug("autowarming result for " + this + "\n\t" + this.cacheList[i]); } warmupTime = System.currentTimeMillis() - warmingStartTime; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public Explanation explain(Query query, int doc) throws IOException { return super.explain(QueryUtils.makeQueryable(query), doc); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public DocIdSet getDocIdSet(AtomicReaderContext context, Bits acceptDocs) throws IOException { DocIdSet sub = topFilter == null ? null : topFilter.getDocIdSet(context, acceptDocs); if (weights.size() == 0) return sub; return new FilterSet(sub, context); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public DocIdSetIterator iterator() throws IOException { List<DocIdSetIterator> iterators = new ArrayList<DocIdSetIterator>(weights.size()+1); if (docIdSet != null) { DocIdSetIterator iter = docIdSet.iterator(); if (iter == null) return null; iterators.add(iter); } for (Weight w : weights) { Scorer scorer = w.scorer(context, true, false, context.reader().getLiveDocs()); if (scorer == null) return null; iterators.add(scorer); } if (iterators.size()==0) return null; if (iterators.size()==1) return iterators.get(0); if (iterators.size()==2) return new DualFilterIterator(iterators.get(0), iterators.get(1)); return new FilterIterator(iterators.toArray(new DocIdSetIterator[iterators.size()])); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public Bits bits() throws IOException { return null; // don't use random access }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
private int doNext(int doc) throws IOException { int which=0; // index of the iterator with the highest id int i=1; outer: for(;;) { for (; i<iterators.length; i++) { if (i == which) continue; DocIdSetIterator iter = iterators[i]; int next = iter.advance(doc); if (next != doc) { doc = next; which = i; i = 0; continue outer; } } return doc; } }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public int nextDoc() throws IOException { return doNext(first.nextDoc()); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public int advance(int target) throws IOException { return doNext(first.advance(target)); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public int nextDoc() throws IOException { int doc = a.nextDoc(); for(;;) { int other = b.advance(doc); if (other == doc) return doc; doc = a.advance(other); if (other == doc) return doc; } }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
@Override public int advance(int target) throws IOException { int doc = a.advance(target); for(;;) { int other = b.advance(doc); if (other == doc) return doc; doc = a.advance(other); if (other == doc) return doc; } }
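Both DualFilterIterator methods are the classic two-way leapfrog intersection: each side advances to the other's current candidate until both land on the same docid. Worked example with a = {2,5,9} and b = {3,5,8}: a.nextDoc() yields 2, b.advance(2) jumps to 3, a.advance(3) jumps to 5, and b.advance(5) lands on 5, so 5 is returned as the next match. doNext() in FilterIterator generalizes the same scheme to N iterators.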
// in core/src/java/org/apache/solr/search/LFUCache.java
public void warm(SolrIndexSearcher searcher, SolrCache old) throws IOException { if (regenerator == null) return; long warmingStartTime = System.currentTimeMillis(); LFUCache other = (LFUCache) old; // warm entries if (autowarmCount != 0) { int sz = other.size(); if (autowarmCount != -1) sz = Math.min(sz, autowarmCount); Map items = other.cache.getMostUsedItems(sz); Map.Entry[] itemsArr = new Map.Entry[items.size()]; int counter = 0; for (Object mapEntry : items.entrySet()) { itemsArr[counter++] = (Map.Entry) mapEntry; } for (int i = itemsArr.length - 1; i >= 0; i--) { try { boolean continueRegen = regenerator.regenerateItem(searcher, this, old, itemsArr[i].getKey(), itemsArr[i].getValue()); if (!continueRegen) break; } catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); } } } warmupTime = System.currentTimeMillis() - warmingStartTime; }
// in core/src/java/org/apache/solr/search/DelegatingCollector.java
@Override public void setScorer(Scorer scorer) throws IOException { this.scorer = scorer; delegate.setScorer(scorer); }
// in core/src/java/org/apache/solr/search/DelegatingCollector.java
@Override public void collect(int doc) throws IOException { delegate.collect(doc); }
// in core/src/java/org/apache/solr/search/DelegatingCollector.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { this.context = context; this.docBase = context.docBase; delegate.setNextReader(context); }
// in core/src/java/org/apache/solr/search/Grouping.java
public void execute() throws IOException { if (commands.isEmpty()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Specify at least one field, function or query to group by."); } DocListAndSet out = new DocListAndSet(); qr.setDocListAndSet(out); SolrIndexSearcher.ProcessedFilter pf = searcher.getProcessedFilter(cmd.getFilter(), cmd.getFilterList()); final Filter luceneFilter = pf.filter; maxDoc = searcher.maxDoc(); needScores = (cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0; boolean cacheScores = false; // NOTE: Change this when groupSort can be specified per group if (!needScores && !commands.isEmpty()) { if (commands.get(0).groupSort == null) { cacheScores = true; } else { for (SortField field : commands.get(0).groupSort.getSort()) { if (field.getType() == SortField.Type.SCORE) { cacheScores = true; break; } } } } else if (needScores) { cacheScores = needScores; } getDocSet = (cmd.getFlags() & SolrIndexSearcher.GET_DOCSET) != 0; getDocList = (cmd.getFlags() & SolrIndexSearcher.GET_DOCLIST) != 0; query = QueryUtils.makeQueryable(cmd.getQuery()); for (Command cmd : commands) { cmd.prepare(); } AbstractAllGroupHeadsCollector<?> allGroupHeadsCollector = null; List<Collector> collectors = new ArrayList<Collector>(commands.size()); for (Command cmd : commands) { Collector collector = cmd.createFirstPassCollector(); if (collector != null) { collectors.add(collector); } if (getGroupedDocSet && allGroupHeadsCollector == null) { collectors.add(allGroupHeadsCollector = cmd.createAllGroupCollector()); } } Collector allCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()])); DocSetCollector setCollector = null; if (getDocSet && allGroupHeadsCollector == null) { setCollector = new DocSetDelegateCollector(maxDoc >> 6, maxDoc, allCollectors); allCollectors = setCollector; } CachingCollector cachedCollector = null; if (cacheSecondPassSearch && allCollectors != null) { int maxDocsToCache = (int) Math.round(maxDoc * (maxDocsPercentageToCache / 100.0d)); // Only makes sense to cache if we cache more than zero. // Maybe we should have a minimum and a maximum, that defines the window we would like caching for. 
if (maxDocsToCache > 0) { allCollectors = cachedCollector = CachingCollector.create(allCollectors, cacheScores, maxDocsToCache); } } if (pf.postFilter != null) { pf.postFilter.setLastDelegate(allCollectors); allCollectors = pf.postFilter; } if (allCollectors != null) { searchWithTimeLimiter(luceneFilter, allCollectors); } if (getGroupedDocSet && allGroupHeadsCollector != null) { FixedBitSet fixedBitSet = allGroupHeadsCollector.retrieveGroupHeads(maxDoc); long[] bits = fixedBitSet.getBits(); OpenBitSet openBitSet = new OpenBitSet(bits, bits.length); qr.setDocSet(new BitDocSet(openBitSet)); } else if (getDocSet) { qr.setDocSet(setCollector.getDocSet()); } collectors.clear(); for (Command cmd : commands) { Collector collector = cmd.createSecondPassCollector(); if (collector != null) collectors.add(collector); } if (!collectors.isEmpty()) { Collector secondPhaseCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()])); if (collectors.size() > 0) { if (cachedCollector != null) { if (cachedCollector.isCached()) { cachedCollector.replay(secondPhaseCollectors); } else { signalCacheWarning = true; logger.warn(String.format("The grouping cache is active, but not used because it exceeded the max cache limit of %d percent", maxDocsPercentageToCache)); logger.warn("Please increase cache size or disable group caching."); searchWithTimeLimiter(luceneFilter, secondPhaseCollectors); } } else { if (pf.postFilter != null) { pf.postFilter.setLastDelegate(secondPhaseCollectors); secondPhaseCollectors = pf.postFilter; } searchWithTimeLimiter(luceneFilter, secondPhaseCollectors); } } } for (Command cmd : commands) { cmd.finish(); } qr.groupedResults = grouped; if (getDocList) { int sz = idSet.size(); int[] ids = new int[sz]; int idx = 0; for (int val : idSet) { ids[idx++] = val; } qr.setDocList(new DocSlice(0, sz, ids, null, maxMatches, maxScore)); } }
// in core/src/java/org/apache/solr/search/Grouping.java
private void searchWithTimeLimiter(final Filter luceneFilter, Collector collector) throws IOException { if (cmd.getTimeAllowed() > 0) { if (timeLimitingCollector == null) { timeLimitingCollector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), cmd.getTimeAllowed()); } else { /* * This is so the same timer can be used for grouping's multiple phases. * We don't want to create a new TimeLimitingCollector for each phase because that would * reset the timer for each phase. If time runs out during the first phase, the * second phase should timeout quickly. */ timeLimitingCollector.setCollector(collector); } collector = timeLimitingCollector; } try { searcher.search(query, luceneFilter, collector); } catch (TimeLimitingCollector.TimeExceededException x) { logger.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); } }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createSecondPassCollector() throws IOException { return null; }
// in core/src/java/org/apache/solr/search/Grouping.java
public AbstractAllGroupHeadsCollector<?> createAllGroupCollector() throws IOException { return null; }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void prepare() throws IOException { actualGroupsToFind = getMax(offset, numGroups, maxDoc); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createFirstPassCollector() throws IOException { // Ok we don't want groups, but do want a total count if (actualGroupsToFind <= 0) { fallBackCollector = new TotalHitCountCollector(); return fallBackCollector; } sort = sort == null ? Sort.RELEVANCE : sort; firstPass = new TermFirstPassGroupingCollector(groupBy, sort, actualGroupsToFind); return firstPass; }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createSecondPassCollector() throws IOException { if (actualGroupsToFind <= 0) { allGroupsCollector = new TermAllGroupsCollector(groupBy); return totalCount == TotalCount.grouped ? allGroupsCollector : null; } topGroups = format == Format.grouped ? firstPass.getTopGroups(offset, false) : firstPass.getTopGroups(0, false); if (topGroups == null) { if (totalCount == TotalCount.grouped) { allGroupsCollector = new TermAllGroupsCollector(groupBy); fallBackCollector = new TotalHitCountCollector(); return MultiCollector.wrap(allGroupsCollector, fallBackCollector); } else { fallBackCollector = new TotalHitCountCollector(); return fallBackCollector; } } int groupedDocsToCollect = getMax(groupOffset, docsPerGroup, maxDoc); groupedDocsToCollect = Math.max(groupedDocsToCollect, 1); secondPass = new TermSecondPassGroupingCollector( groupBy, topGroups, sort, groupSort, groupedDocsToCollect, needScores, needScores, false ); if (totalCount == TotalCount.grouped) { allGroupsCollector = new TermAllGroupsCollector(groupBy); return MultiCollector.wrap(secondPass, allGroupsCollector); } else { return secondPass; } }
// in core/src/java/org/apache/solr/search/Grouping.java
@Override public AbstractAllGroupHeadsCollector<?> createAllGroupCollector() throws IOException { Sort sortWithinGroup = groupSort != null ? groupSort : new Sort(); return TermAllGroupHeadsCollector.create(groupBy, sortWithinGroup); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void finish() throws IOException { result = secondPass != null ? secondPass.getTopGroups(0) : null; if (main) { mainResult = createSimpleResponse(); return; } NamedList groupResult = commonResponse(); if (format == Format.simple) { groupResult.add("doclist", createSimpleResponse()); return; } List groupList = new ArrayList(); groupResult.add("groups", groupList); // grouped={ key={ groups=[ if (result == null) { return; } // handle case of rows=0 if (numGroups == 0) return; for (GroupDocs<BytesRef> group : result.groups) { NamedList nl = new SimpleOrderedMap(); groupList.add(nl); // grouped={ key={ groups=[ { // To keep the response format compatible with trunk. // In trunk MutableValue can convert an indexed value to its native type. E.g. string to int // The only option I currently see is to use the FieldType for this if (group.groupValue != null) { SchemaField schemaField = searcher.getSchema().getField(groupBy); FieldType fieldType = schemaField.getType(); String readableValue = fieldType.indexedToReadable(group.groupValue.utf8ToString()); IndexableField field = schemaField.createField(readableValue, 0.0f); nl.add("groupValue", fieldType.toObject(field)); } else { nl.add("groupValue", null); } addDocList(nl, group); } }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void prepare() throws IOException { actualGroupsToFind = getMax(offset, numGroups, maxDoc); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createFirstPassCollector() throws IOException { DocSet groupFilt = searcher.getDocSet(query); topCollector = newCollector(groupSort, needScores); collector = new FilterCollector(groupFilt, topCollector); return collector; }
// in core/src/java/org/apache/solr/search/Grouping.java
TopDocsCollector newCollector(Sort sort, boolean needScores) throws IOException { int groupDocsToCollect = getMax(groupOffset, docsPerGroup, maxDoc); if (sort == null || sort == Sort.RELEVANCE) { return TopScoreDocCollector.create(groupDocsToCollect, true); } else { return TopFieldCollector.create(searcher.weightSort(sort), groupDocsToCollect, false, needScores, needScores, true); } }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void finish() throws IOException { TopDocsCollector topDocsCollector = (TopDocsCollector) collector.getDelegate(); TopDocs topDocs = topDocsCollector.topDocs(); GroupDocs<String> groupDocs = new GroupDocs<String>(topDocs.getMaxScore(), topDocs.totalHits, topDocs.scoreDocs, query.toString(), null); if (main) { mainResult = getDocList(groupDocs); } else { NamedList rsp = commonResponse(); addDocList(rsp, groupDocs); } }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void prepare() throws IOException { Map context = ValueSource.newContext(searcher); groupBy.createWeight(context, searcher); actualGroupsToFind = getMax(offset, numGroups, maxDoc); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createFirstPassCollector() throws IOException { // Ok we don't want groups, but do want a total count if (actualGroupsToFind <= 0) { fallBackCollector = new TotalHitCountCollector(); return fallBackCollector; } sort = sort == null ? Sort.RELEVANCE : sort; firstPass = new FunctionFirstPassGroupingCollector(groupBy, context, searcher.weightSort(sort), actualGroupsToFind); return firstPass; }
// in core/src/java/org/apache/solr/search/Grouping.java
protected Collector createSecondPassCollector() throws IOException { if (actualGroupsToFind <= 0) { allGroupsCollector = new FunctionAllGroupsCollector(groupBy, context); return totalCount == TotalCount.grouped ? allGroupsCollector : null; } topGroups = format == Format.grouped ? firstPass.getTopGroups(offset, false) : firstPass.getTopGroups(0, false); if (topGroups == null) { if (totalCount == TotalCount.grouped) { allGroupsCollector = new FunctionAllGroupsCollector(groupBy, context); fallBackCollector = new TotalHitCountCollector(); return MultiCollector.wrap(allGroupsCollector, fallBackCollector); } else { fallBackCollector = new TotalHitCountCollector(); return fallBackCollector; } } int groupedDocsToCollect = getMax(groupOffset, docsPerGroup, maxDoc); groupedDocsToCollect = Math.max(groupedDocsToCollect, 1); secondPass = new FunctionSecondPassGroupingCollector( topGroups, sort, groupSort, groupedDocsToCollect, needScores, needScores, false, groupBy, context ); if (totalCount == TotalCount.grouped) { allGroupsCollector = new FunctionAllGroupsCollector(groupBy, context); return MultiCollector.wrap(secondPass, allGroupsCollector); } else { return secondPass; } }
// in core/src/java/org/apache/solr/search/Grouping.java
@Override public AbstractAllGroupHeadsCollector<?> createAllGroupCollector() throws IOException { Sort sortWithinGroup = groupSort != null ? groupSort : new Sort(); return new FunctionAllGroupHeadsCollector(groupBy, context, sortWithinGroup); }
// in core/src/java/org/apache/solr/search/Grouping.java
protected void finish() throws IOException { result = secondPass != null ? secondPass.getTopGroups(0) : null; if (main) { mainResult = createSimpleResponse(); return; } NamedList groupResult = commonResponse(); if (format == Format.simple) { groupResult.add("doclist", createSimpleResponse()); return; } List groupList = new ArrayList(); groupResult.add("groups", groupList); // grouped={ key={ groups=[ if (result == null) { return; } // handle case of rows=0 if (numGroups == 0) return; for (GroupDocs<MutableValue> group : result.groups) { NamedList nl = new SimpleOrderedMap(); groupList.add(nl); // grouped={ key={ groups=[ { nl.add("groupValue", group.groupValue.toObject()); addDocList(nl, group); } }
// in core/src/java/org/apache/solr/search/FunctionRangeQuery.java
@Override public void collect(int doc) throws IOException { if (doc<maxdoc && scorer.matches(doc)) { delegate.collect(doc); } }
// in core/src/java/org/apache/solr/search/FunctionRangeQuery.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { maxdoc = context.reader().maxDoc(); FunctionValues dv = rangeFilt.getValueSource().getValues(fcontext, context); scorer = dv.getRangeScorer(context.reader(), rangeFilt.getLowerVal(), rangeFilt.getUpperVal(), rangeFilt.isIncludeLower(), rangeFilt.isIncludeUpper()); super.setNextReader(context); }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public Filter getTopFilter() { final OpenBitSet bs = bits; // TODO: if cardinality isn't cached, do a quick measure of sparseness // and return null from bits() if too sparse. return new Filter() { @Override public DocIdSet getDocIdSet(final AtomicReaderContext context, final Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); if (context.isTopLevel) { return BitsFilteredDocIdSet.wrap(bs, acceptDocs); } final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { return new Bits() { @Override public boolean get(int index) { return bs.fastGet(index + base); } @Override public int length() { return maxDoc; } }; } }, acceptDocs2); } }; }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public DocIdSet getDocIdSet(final AtomicReaderContext context, final Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); if (context.isTopLevel) { return BitsFilteredDocIdSet.wrap(bs, acceptDocs); } final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { return new Bits() { @Override public boolean get(int index) { return bs.fastGet(index + base); } @Override public int length() { return maxDoc; } }; } }, acceptDocs2); }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; }
// in core/src/java/org/apache/solr/search/BitDocSet.java
@Override public Bits bits() throws IOException { return new Bits() { @Override public boolean get(int index) { return bs.fastGet(index + base); } @Override public int length() { return maxDoc; } }; }
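The BitDocSet snippets above all share one trick: a top-level filter iterates a global bit set but reports segment-relative ids by offsetting with the segment's docBase. A minimal standalone sketch of that mapping, using java.util.BitSet as a stand-in for Lucene's OpenBitSet (all names here are illustrative, not Solr's):

import java.util.BitSet;

// Illustrative sketch, not Solr code: maps global doc ids in a bit set to
// segment-relative ids, the way BitDocSet's per-segment iterator does.
public class SegmentBitIteratorSketch {
  static final int NO_MORE_DOCS = Integer.MAX_VALUE;

  private final BitSet bits;    // global doc id space
  private final int base;       // the segment's docBase
  private final int max;        // one past the last global doc id of the segment
  private int pos;              // global cursor
  private int adjustedDoc = -1; // segment-relative doc id last returned

  SegmentBitIteratorSketch(BitSet bits, int base, int maxDoc) {
    this.bits = bits;
    this.base = base;
    this.max = base + maxDoc;
    this.pos = base - 1;
  }

  int nextDoc() {
    pos = bits.nextSetBit(pos + 1); // returns -1 when no further bit is set
    return adjustedDoc = (pos >= 0 && pos < max) ? pos - base : NO_MORE_DOCS;
  }

  int advance(int target) {
    if (target == NO_MORE_DOCS) return adjustedDoc = NO_MORE_DOCS;
    pos = bits.nextSetBit(target + base);
    return adjustedDoc = (pos >= 0 && pos < max) ? pos - base : NO_MORE_DOCS;
  }

  public static void main(String[] args) {
    BitSet bs = new BitSet();
    bs.set(3); bs.set(10); bs.set(12);
    // segment covering global ids [10, 15)
    SegmentBitIteratorSketch it = new SegmentBitIteratorSketch(bs, 10, 5);
    int d;
    while ((d = it.nextDoc()) != NO_MORE_DOCS) System.out.println(d); // prints 0 then 2
  }
}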
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public Query rewrite(IndexReader reader) throws IOException { return this; }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public float getValueForNormalization() throws IOException { queryWeight = getBoost(); return queryWeight * queryWeight; }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public Scorer scorer(AtomicReaderContext context, boolean scoreDocsInOrder, boolean topScorer, Bits acceptDocs) throws IOException { return new ConstantScorer(context, this, queryWeight, acceptDocs); }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public Explanation explain(AtomicReaderContext context, int doc) throws IOException { ConstantScorer cs = new ConstantScorer(context, this, queryWeight, context.reader().getLiveDocs()); boolean exists = cs.docIdSetIterator.advance(doc) == doc; ComplexExplanation result = new ComplexExplanation(); if (exists) { result.setDescription("ConstantScoreQuery(" + filter + "), product of:"); result.setValue(queryWeight); result.setMatch(Boolean.TRUE); result.addDetail(new Explanation(getBoost(), "boost")); result.addDetail(new Explanation(queryNorm,"queryNorm")); } else { result.setDescription("ConstantScoreQuery(" + filter + ") doesn't match id " + doc); result.setValue(0); result.setMatch(Boolean.FALSE); } return result; }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public int nextDoc() throws IOException { return docIdSetIterator.nextDoc(); }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public float score() throws IOException { return theScore; }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public int advance(int target) throws IOException { return docIdSetIterator.advance(target); }
// in core/src/java/org/apache/solr/search/LRUCache.java
public void warm(SolrIndexSearcher searcher, SolrCache<K,V> old) throws IOException { if (regenerator==null) return; long warmingStartTime = System.currentTimeMillis(); LRUCache<K,V> other = (LRUCache<K,V>)old; // warm entries if (isAutowarmingOn()) { Object[] keys,vals = null; // Don't do the autowarming in the synchronized block, just pull out the keys and values. synchronized (other.map) { int sz = autowarm.getWarmCount(other.map.size()); keys = new Object[sz]; vals = new Object[sz]; Iterator<Map.Entry<K, V>> iter = other.map.entrySet().iterator(); // iteration goes from oldest (least recently used) to most recently used, // so we need to skip over the oldest entries. int skip = other.map.size() - sz; for (int i=0; i<skip; i++) iter.next(); for (int i=0; i<sz; i++) { Map.Entry<K,V> entry = iter.next(); keys[i]=entry.getKey(); vals[i]=entry.getValue(); } } // autowarm from the oldest to the newest entries so that the ordering will be // correct in the new cache. for (int i=0; i<keys.length; i++) { try { boolean continueRegen = regenerator.regenerateItem(searcher, this, old, keys[i], vals[i]); if (!continueRegen) break; } catch (Throwable e) { SolrException.log(log,"Error during auto-warming of key:" + keys[i], e); } } } warmupTime = System.currentTimeMillis() - warmingStartTime; }
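LRUCache.warm snapshots the most recently used entries inside the synchronized block and performs the (potentially expensive) regeneration outside it, catching Throwable per item so a single failing key cannot abort warming. A condensed sketch of that pattern, assuming a plain LinkedHashMap and a hypothetical Regenerator callback (neither is the Solr API):

import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

// Condensed sketch of the autowarming pattern (assumed names, not Solr's):
// snapshot the newest entries under the lock, regenerate outside it, and
// treat per-item failures as log-and-continue events.
public class WarmSketch {
  interface Regenerator {
    boolean regenerateItem(Object key, Object val) throws Exception;
  }

  static void warm(LinkedHashMap<Object, Object> old, int warmCount, Regenerator regen) {
    Object[] keys, vals;
    synchronized (old) {
      int sz = Math.min(warmCount, old.size());
      keys = new Object[sz];
      vals = new Object[sz];
      Iterator<Map.Entry<Object, Object>> iter = old.entrySet().iterator();
      for (int i = 0; i < old.size() - sz; i++) iter.next(); // skip the oldest entries
      for (int i = 0; i < sz; i++) {
        Map.Entry<Object, Object> e = iter.next();
        keys[i] = e.getKey();
        vals[i] = e.getValue();
      }
    }
    // regenerate outside the lock, oldest first, so LRU order is rebuilt correctly
    for (int i = 0; i < keys.length; i++) {
      try {
        if (!regen.regenerateItem(keys[i], vals[i])) break; // regenerator may stop early
      } catch (Throwable e) {
        System.err.println("Error during auto-warming of key:" + keys[i]); // log, don't abort
      }
    }
  }
}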
// in core/src/java/org/apache/solr/search/QueryParsing.java
static FieldType writeFieldName(String name, IndexSchema schema, Appendable out, int flags) throws IOException { FieldType ft = null; ft = schema.getFieldTypeNoEx(name); out.append(name); if (ft == null) { out.append("(UNKNOWN FIELD " + name + ')'); } out.append(':'); return ft; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
static void writeFieldVal(String val, FieldType ft, Appendable out, int flags) throws IOException { if (ft != null) { try { out.append(ft.indexedToReadable(val)); } catch (Exception e) { out.append("EXCEPTION(val="); out.append(val); out.append(")"); } } else { out.append(val); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
static void writeFieldVal(BytesRef val, FieldType ft, Appendable out, int flags) throws IOException { if (ft != null) { try { CharsRef readable = new CharsRef(); ft.indexedToReadable(val, readable); out.append(readable); } catch (Exception e) { out.append("EXCEPTION(val="); out.append(val.utf8ToString()); out.append(")"); } } else { out.append(val.utf8ToString()); } }
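Both writeFieldVal overloads deliberately swallow conversion failures and emit an EXCEPTION(val=...) marker instead, so a malformed stored value cannot break query debugging output. A self-contained sketch of the same fallback, with a hypothetical toReadable converter standing in for FieldType.indexedToReadable:

import java.io.IOException;

// Sketch of the EXCEPTION(val=...) fallback; toReadable is hypothetical.
public class SafeFieldWriter {
  static String toReadable(String indexed) {
    if (indexed.isEmpty()) throw new IllegalArgumentException("empty value");
    return indexed.toUpperCase();
  }

  static void writeFieldVal(String val, Appendable out) throws IOException {
    try {
      out.append(toReadable(val));
    } catch (Exception e) {
      // swallow the conversion failure and emit a marker instead
      out.append("EXCEPTION(val=").append(val).append(")");
    }
  }

  public static void main(String[] args) throws IOException {
    StringBuilder sb = new StringBuilder();
    writeFieldVal("abc", sb);
    sb.append(' ');
    writeFieldVal("", sb);
    System.out.println(sb); // ABC EXCEPTION(val=)
  }
}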
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static void toString(Query query, IndexSchema schema, Appendable out, int flags) throws IOException { boolean writeBoost = true; if (query instanceof TermQuery) { TermQuery q = (TermQuery) query; Term t = q.getTerm(); FieldType ft = writeFieldName(t.field(), schema, out, flags); writeFieldVal(t.bytes(), ft, out, flags); } else if (query instanceof TermRangeQuery) { TermRangeQuery q = (TermRangeQuery) query; String fname = q.getField(); FieldType ft = writeFieldName(fname, schema, out, flags); out.append(q.includesLower() ? '[' : '{'); BytesRef lt = q.getLowerTerm(); BytesRef ut = q.getUpperTerm(); if (lt == null) { out.append('*'); } else { writeFieldVal(lt, ft, out, flags); } out.append(" TO "); if (ut == null) { out.append('*'); } else { writeFieldVal(ut, ft, out, flags); } out.append(q.includesUpper() ? ']' : '}'); } else if (query instanceof NumericRangeQuery) { NumericRangeQuery q = (NumericRangeQuery) query; String fname = q.getField(); FieldType ft = writeFieldName(fname, schema, out, flags); out.append(q.includesMin() ? '[' : '{'); Number lt = q.getMin(); Number ut = q.getMax(); if (lt == null) { out.append('*'); } else { out.append(lt.toString()); } out.append(" TO "); if (ut == null) { out.append('*'); } else { out.append(ut.toString()); } out.append(q.includesMax() ? ']' : '}'); } else if (query instanceof BooleanQuery) { BooleanQuery q = (BooleanQuery) query; boolean needParens = false; if (q.getBoost() != 1.0 || q.getMinimumNumberShouldMatch() != 0 || q.isCoordDisabled()) { needParens = true; } if (needParens) { out.append('('); } boolean first = true; for (BooleanClause c : q.clauses()) { if (!first) { out.append(' '); } else { first = false; } if (c.isProhibited()) { out.append('-'); } else if (c.isRequired()) { out.append('+'); } Query subQuery = c.getQuery(); boolean wrapQuery = false; // TODO: may need to put parens around other types // of queries too, depending on future syntax. if (subQuery instanceof BooleanQuery) { wrapQuery = true; } if (wrapQuery) { out.append('('); } toString(subQuery, schema, out, flags); if (wrapQuery) { out.append(')'); } } if (needParens) { out.append(')'); } if (q.getMinimumNumberShouldMatch() > 0) { out.append('~'); out.append(Integer.toString(q.getMinimumNumberShouldMatch())); } if (q.isCoordDisabled()) { out.append("/no_coord"); } } else if (query instanceof PrefixQuery) { PrefixQuery q = (PrefixQuery) query; Term prefix = q.getPrefix(); FieldType ft = writeFieldName(prefix.field(), schema, out, flags); out.append(prefix.text()); out.append('*'); } else if (query instanceof WildcardQuery) { out.append(query.toString()); writeBoost = false; } else if (query instanceof FuzzyQuery) { out.append(query.toString()); writeBoost = false; } else if (query instanceof ConstantScoreQuery) { out.append(query.toString()); writeBoost = false; } else { out.append(query.getClass().getSimpleName() + '(' + query.toString() + ')'); writeBoost = false; } if (writeBoost && query.getBoost() != 1.0f) { out.append("^"); out.append(Float.toString(query.getBoost())); } }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/TopGroupsFieldCommand.java
public List<Collector> create() throws IOException { if (firstPhaseGroups.isEmpty()) { return Collections.emptyList(); } List<Collector> collectors = new ArrayList<Collector>(); secondPassCollector = new TermSecondPassGroupingCollector( field.getName(), firstPhaseGroups, groupSort, sortWithinGroup, maxDocPerGroup, needScores, needMaxScore, true ); collectors.add(secondPassCollector); return collectors; }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/SearchGroupsFieldCommand.java
public List<Collector> create() throws IOException { List<Collector> collectors = new ArrayList<Collector>(); if (topNGroups > 0) { firstPassGroupingCollector = new TermFirstPassGroupingCollector(field.getName(), groupSort, topNGroups); collectors.add(firstPassGroupingCollector); } if (includeGroupCount) { allGroupsCollector = new TermAllGroupsCollector(field.getName()); collectors.add(allGroupsCollector); } return collectors; }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/QueryCommand.java
public Builder setDocSet(SolrIndexSearcher searcher) throws IOException { return setDocSet(searcher.getDocSet(query)); }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/QueryCommand.java
public List<Collector> create() throws IOException { if (sort == null || sort == Sort.RELEVANCE) { collector = TopScoreDocCollector.create(docsToCollect, true); } else { collector = TopFieldCollector.create(sort, docsToCollect, true, needScores, needScores, true); } filterCollector = new FilterCollector(docSet, collector); return Arrays.asList((Collector) filterCollector); }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/SearchGroupsResultTransformer.java
public NamedList transform(List<Command> data) throws IOException { NamedList<NamedList> result = new NamedList<NamedList>(); for (Command command : data) { final NamedList<Object> commandResult = new NamedList<Object>(); if (SearchGroupsFieldCommand.class.isInstance(command)) { SearchGroupsFieldCommand fieldCommand = (SearchGroupsFieldCommand) command; Pair<Integer, Collection<SearchGroup<BytesRef>>> pair = fieldCommand.result(); Integer groupedCount = pair.getA(); Collection<SearchGroup<BytesRef>> searchGroups = pair.getB(); if (searchGroups != null) { commandResult.add("topGroups", serializeSearchGroup(searchGroups, fieldCommand.getGroupSort())); } if (groupedCount != null) { commandResult.add("groupCount", groupedCount); } } else { continue; } result.add(command.getKey(), commandResult); } return result; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/SearchGroupsResultTransformer.java
public Map<String, Pair<Integer, Collection<SearchGroup<BytesRef>>>> transformToNative(NamedList<NamedList> shardResponse, Sort groupSort, Sort sortWithinGroup, String shard) throws IOException { Map<String, Pair<Integer, Collection<SearchGroup<BytesRef>>>> result = new HashMap<String, Pair<Integer, Collection<SearchGroup<BytesRef>>>>(); for (Map.Entry<String, NamedList> command : shardResponse) { List<SearchGroup<BytesRef>> searchGroups = new ArrayList<SearchGroup<BytesRef>>(); NamedList topGroupsAndGroupCount = command.getValue(); @SuppressWarnings("unchecked") NamedList<List<Comparable>> rawSearchGroups = (NamedList<List<Comparable>>) topGroupsAndGroupCount.get("topGroups"); if (rawSearchGroups != null) { for (Map.Entry<String, List<Comparable>> rawSearchGroup : rawSearchGroups){ SearchGroup<BytesRef> searchGroup = new SearchGroup<BytesRef>(); searchGroup.groupValue = rawSearchGroup.getKey() != null ? new BytesRef(rawSearchGroup.getKey()) : null; searchGroup.sortValues = rawSearchGroup.getValue().toArray(new Comparable[rawSearchGroup.getValue().size()]); searchGroups.add(searchGroup); } } Integer groupCount = (Integer) topGroupsAndGroupCount.get("groupCount"); result.put(command.getKey(), new Pair<Integer, Collection<SearchGroup<BytesRef>>>(groupCount, searchGroups)); } return result; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/TopGroupsResultTransformer.java
public NamedList transform(List<Command> data) throws IOException { NamedList<NamedList> result = new NamedList<NamedList>(); for (Command command : data) { NamedList commandResult; if (TopGroupsFieldCommand.class.isInstance(command)) { TopGroupsFieldCommand fieldCommand = (TopGroupsFieldCommand) command; SchemaField groupField = rb.req.getSearcher().getSchema().getField(fieldCommand.getKey()); commandResult = serializeTopGroups(fieldCommand.result(), groupField); } else if (QueryCommand.class.isInstance(command)) { QueryCommand queryCommand = (QueryCommand) command; commandResult = serializeTopDocs(queryCommand.result()); } else { commandResult = null; } result.add(command.getKey(), commandResult); } return result; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/TopGroupsResultTransformer.java
protected NamedList serializeTopGroups(TopGroups<BytesRef> data, SchemaField groupField) throws IOException { NamedList<Object> result = new NamedList<Object>(); result.add("totalGroupedHitCount", data.totalGroupedHitCount); result.add("totalHitCount", data.totalHitCount); if (data.totalGroupCount != null) { result.add("totalGroupCount", data.totalGroupCount); } CharsRef spare = new CharsRef(); SchemaField uniqueField = rb.req.getSearcher().getSchema().getUniqueKeyField(); for (GroupDocs<BytesRef> searchGroup : data.groups) { NamedList<Object> groupResult = new NamedList<Object>(); groupResult.add("totalHits", searchGroup.totalHits); if (!Float.isNaN(searchGroup.maxScore)) { groupResult.add("maxScore", searchGroup.maxScore); } List<NamedList<Object>> documents = new ArrayList<NamedList<Object>>(); for (int i = 0; i < searchGroup.scoreDocs.length; i++) { NamedList<Object> document = new NamedList<Object>(); documents.add(document); Document doc = retrieveDocument(uniqueField, searchGroup.scoreDocs[i].doc); document.add("id", uniqueField.getType().toExternal(doc.getField(uniqueField.getName()))); if (!Float.isNaN(searchGroup.scoreDocs[i].score)) { document.add("score", searchGroup.scoreDocs[i].score); } if (!(searchGroup.scoreDocs[i] instanceof FieldDoc)) { continue; } FieldDoc fieldDoc = (FieldDoc) searchGroup.scoreDocs[i]; Object[] convertedSortValues = new Object[fieldDoc.fields.length]; for (int j = 0; j < fieldDoc.fields.length; j++) { Object sortValue = fieldDoc.fields[j]; Sort sortWithinGroup = rb.getGroupingSpec().getSortWithinGroup(); SchemaField field = sortWithinGroup.getSort()[j].getField() != null ? rb.req.getSearcher().getSchema().getFieldOrNull(sortWithinGroup.getSort()[j].getField()) : null; if (field != null) { FieldType fieldType = field.getType(); if (sortValue instanceof BytesRef) { UnicodeUtil.UTF8toUTF16((BytesRef)sortValue, spare); String indexedValue = spare.toString(); sortValue = fieldType.toObject(field.createField(fieldType.indexedToReadable(indexedValue), 0.0f)); } else if (sortValue instanceof String) { sortValue = fieldType.toObject(field.createField(fieldType.indexedToReadable((String) sortValue), 0.0f)); } } convertedSortValues[j] = sortValue; } document.add("sortValues", convertedSortValues); } groupResult.add("documents", documents); String groupValue = searchGroup.groupValue != null ? groupField.getType().indexedToReadable(searchGroup.groupValue.utf8ToString()): null; result.add(groupValue, groupResult); } return result; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/TopGroupsResultTransformer.java
protected NamedList serializeTopDocs(QueryCommandResult result) throws IOException { NamedList<Object> queryResult = new NamedList<Object>(); queryResult.add("matches", result.getMatches()); queryResult.add("totalHits", result.getTopDocs().totalHits); if (rb.getGroupingSpec().isNeedScore()) { queryResult.add("maxScore", result.getTopDocs().getMaxScore()); } List<NamedList> documents = new ArrayList<NamedList>(); queryResult.add("documents", documents); SchemaField uniqueField = rb.req.getSearcher().getSchema().getUniqueKeyField(); CharsRef spare = new CharsRef(); for (ScoreDoc scoreDoc : result.getTopDocs().scoreDocs) { NamedList<Object> document = new NamedList<Object>(); documents.add(document); Document doc = retrieveDocument(uniqueField, scoreDoc.doc); document.add("id", uniqueField.getType().toExternal(doc.getField(uniqueField.getName()))); if (rb.getGroupingSpec().isNeedScore()) { document.add("score", scoreDoc.score); } if (!FieldDoc.class.isInstance(scoreDoc)) { continue; } FieldDoc fieldDoc = (FieldDoc) scoreDoc; Object[] convertedSortValues = new Object[fieldDoc.fields.length]; for (int j = 0; j < fieldDoc.fields.length; j++) { Object sortValue = fieldDoc.fields[j]; Sort groupSort = rb.getGroupingSpec().getGroupSort(); SchemaField field = groupSort.getSort()[j].getField() != null ? rb.req.getSearcher().getSchema().getFieldOrNull(groupSort.getSort()[j].getField()) : null; if (field != null) { FieldType fieldType = field.getType(); if (sortValue instanceof BytesRef) { UnicodeUtil.UTF8toUTF16((BytesRef)sortValue, spare); String indexedValue = spare.toString(); sortValue = fieldType.toObject(field.createField(fieldType.indexedToReadable(indexedValue), 0.0f)); } else if (sortValue instanceof String) { sortValue = fieldType.toObject(field.createField(fieldType.indexedToReadable((String) sortValue), 0.0f)); } } convertedSortValues[j] = sortValue; } document.add("sortValues", convertedSortValues); } return queryResult; }
// in core/src/java/org/apache/solr/search/grouping/distributed/shardresultserializer/TopGroupsResultTransformer.java
private Document retrieveDocument(final SchemaField uniqueField, int doc) throws IOException { DocumentStoredFieldVisitor visitor = new DocumentStoredFieldVisitor(uniqueField.getName()); rb.req.getSearcher().doc(doc, visitor); return visitor.getDocument(); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
private DocSet computeGroupedDocSet(Query query, Filter luceneFilter, List<Collector> collectors) throws IOException { Command firstCommand = commands.get(0); AbstractAllGroupHeadsCollector termAllGroupHeadsCollector = TermAllGroupHeadsCollector.create(firstCommand.getKey(), firstCommand.getSortWithinGroup()); if (collectors.isEmpty()) { searchWithTimeLimiter(query, luceneFilter, termAllGroupHeadsCollector); } else { collectors.add(termAllGroupHeadsCollector); searchWithTimeLimiter(query, luceneFilter, MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()]))); } int maxDoc = searcher.maxDoc(); long[] bits = termAllGroupHeadsCollector.retrieveGroupHeads(maxDoc).getBits(); return new BitDocSet(new OpenBitSet(bits, bits.length)); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
private DocSet computeDocSet(Query query, Filter luceneFilter, List<Collector> collectors) throws IOException { int maxDoc = searcher.maxDoc(); DocSetCollector docSetCollector; if (collectors.isEmpty()) { docSetCollector = new DocSetCollector(maxDoc >> 6, maxDoc); } else { Collector wrappedCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()])); docSetCollector = new DocSetDelegateCollector(maxDoc >> 6, maxDoc, wrappedCollectors); } searchWithTimeLimiter(query, luceneFilter, docSetCollector); return docSetCollector.getDocSet(); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
private void searchWithTimeLimiter(final Query query, final Filter luceneFilter, Collector collector) throws IOException { if (queryCommand.getTimeAllowed() > 0 ) { collector = new TimeLimitingCollector(collector, TimeLimitingCollector.getGlobalCounter(), queryCommand.getTimeAllowed()); } TotalHitCountCollector hitCountCollector = new TotalHitCountCollector(); if (includeHitCount) { collector = MultiCollector.wrap(collector, hitCountCollector); } try { searcher.search(query, luceneFilter, collector); } catch (TimeLimitingCollector.TimeExceededException x) { partialResults = true; logger.warn( "Query: " + query + "; " + x.getMessage() ); } if (includeHitCount) { totalHitCount = hitCountCollector.getTotalHits(); } }
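searchWithTimeLimiter is the one place in CommandHandler where an exception is converted into state: TimeExceededException is caught, logged, and turned into a partialResults flag rather than propagated. A stripped-down sketch of that control flow (the exception class below is a stand-in, not Lucene's nested TimeExceededException):

// Stripped-down control-flow sketch in plain Java.
public class TimeLimitedSearchSketch {
  static class TimeExceededException extends RuntimeException {}

  // Runs the search; returns true when results are partial due to a timeout.
  static boolean searchWithTimeLimiter(Runnable search) {
    try {
      search.run();
      return false; // completed within the time budget
    } catch (TimeExceededException x) {
      System.err.println("Query timed out; keeping partial results"); // log and continue
      return true;  // caller sets partialResults = true
    }
  }

  public static void main(String[] args) {
    System.out.println(searchWithTimeLimiter(() -> {}));                                      // false
    System.out.println(searchWithTimeLimiter(() -> { throw new TimeExceededException(); })); // true
  }
}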
// in core/src/java/org/apache/solr/search/grouping/collector/FilterCollector.java
public void setScorer(Scorer scorer) throws IOException { delegate.setScorer(scorer); }
// in core/src/java/org/apache/solr/search/grouping/collector/FilterCollector.java
public void collect(int doc) throws IOException { matches++; if (filter.exists(doc + docBase)) { delegate.collect(doc); } }
// in core/src/java/org/apache/solr/search/grouping/collector/FilterCollector.java
public void setNextReader(AtomicReaderContext context) throws IOException { this.docBase = context.docBase; delegate.setNextReader(context); }
// in core/src/java/org/apache/solr/search/DocSetDelegateCollector.java
@Override public void collect(int doc) throws IOException { collector.collect(doc); doc += base; // optimistically collect the first docs in an array // in case the total number will be small enough to represent // as a small set like SortedIntDocSet instead... // Storing in this array will be quicker to convert // than scanning through a potentially huge bit vector. // FUTURE: when search methods all start returning docs in order, maybe // we could have a ListDocSet() and use the collected array directly. if (pos < scratch.length) { scratch[pos]=doc; } else { // this conditional could be removed if BitSet was preallocated, but that // would take up more memory, and add more GC time... if (bits==null) bits = new OpenBitSet(maxDoc); bits.fastSet(doc); } pos++; }
// in core/src/java/org/apache/solr/search/DocSetDelegateCollector.java
@Override public void setScorer(Scorer scorer) throws IOException { collector.setScorer(scorer); }
// in core/src/java/org/apache/solr/search/DocSetDelegateCollector.java
@Override public void setNextReader(AtomicReaderContext context) throws IOException { collector.setNextReader(context); this.base = context.docBase; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
@Override public Query rewrite(IndexReader reader) throws IOException { // don't rewrite the subQuery return this; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public Weight createWeight(IndexSearcher searcher) throws IOException { return new JoinQueryWeight((SolrIndexSearcher)searcher); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
@Override public void close() throws IOException { ref.decref(); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
@Override public void close() throws IOException { fromCore.close(); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
@Override public float getValueForNormalization() throws IOException { queryWeight = getBoost(); return queryWeight * queryWeight; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
@Override public Scorer scorer(AtomicReaderContext context, boolean scoreDocsInOrder, boolean topScorer, Bits acceptDocs) throws IOException { if (filter == null) { boolean debug = rb != null && rb.isDebug(); long start = debug ? System.currentTimeMillis() : 0; resultSet = getDocSet(); long end = debug ? System.currentTimeMillis() : 0; if (debug) { SimpleOrderedMap<Object> dbg = new SimpleOrderedMap<Object>(); dbg.add("time", (end-start)); dbg.add("fromSetSize", fromSetSize); // the input dbg.add("toSetSize", resultSet.size()); // the output dbg.add("fromTermCount", fromTermCount); dbg.add("fromTermTotalDf", fromTermTotalDf); dbg.add("fromTermDirectCount", fromTermDirectCount); dbg.add("fromTermHits", fromTermHits); dbg.add("fromTermHitsTotalDf", fromTermHitsTotalDf); dbg.add("toTermHits", toTermHits); dbg.add("toTermHitsTotalDf", toTermHitsTotalDf); dbg.add("toTermDirectCount", toTermDirectCount); dbg.add("smallSetsDeferred", smallSetsDeferred); dbg.add("toSetDocsAdded", resultListDocs); // TODO: perhaps synchronize addDebug in the future... rb.addDebug(dbg, "join", JoinQuery.this.toString()); } filter = resultSet.getTopFilter(); } // Although this set only includes live docs, other filters can be pushed down to queries. DocIdSet readerSet = filter.getDocIdSet(context, acceptDocs); if (readerSet == null) readerSet=DocIdSet.EMPTY_DOCIDSET; return new JoinScorer(this, readerSet.iterator(), getBoost()); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public DocSet getDocSet() throws IOException { OpenBitSet resultBits = null; // minimum docFreq to use the cache int minDocFreqFrom = Math.max(5, fromSearcher.maxDoc() >> 13); int minDocFreqTo = Math.max(5, toSearcher.maxDoc() >> 13); // use a smaller size than normal since we will need to sort and dedup the results int maxSortedIntSize = Math.max(10, toSearcher.maxDoc() >> 10); DocSet fromSet = fromSearcher.getDocSet(q); fromSetSize = fromSet.size(); List<DocSet> resultList = new ArrayList<DocSet>(10); // make sure we have a set that is fast for random access, if we will use it for that DocSet fastForRandomSet = fromSet; if (minDocFreqFrom>0 && fromSet instanceof SortedIntDocSet) { SortedIntDocSet sset = (SortedIntDocSet)fromSet; fastForRandomSet = new HashDocSet(sset.getDocs(), 0, sset.size()); } Fields fromFields = fromSearcher.getAtomicReader().fields(); Fields toFields = fromSearcher==toSearcher ? fromFields : toSearcher.getAtomicReader().fields(); if (fromFields == null) return DocSet.EMPTY; Terms terms = fromFields.terms(fromField); Terms toTerms = toFields.terms(toField); if (terms == null || toTerms==null) return DocSet.EMPTY; String prefixStr = TrieField.getMainValuePrefix(fromSearcher.getSchema().getFieldType(fromField)); BytesRef prefix = prefixStr == null ? null : new BytesRef(prefixStr); BytesRef term = null; TermsEnum termsEnum = terms.iterator(null); TermsEnum toTermsEnum = toTerms.iterator(null); SolrIndexSearcher.DocsEnumState fromDeState = null; SolrIndexSearcher.DocsEnumState toDeState = null; if (prefix == null) { term = termsEnum.next(); } else { if (termsEnum.seekCeil(prefix, true) != TermsEnum.SeekStatus.END) { term = termsEnum.term(); } } Bits fromLiveDocs = fromSearcher.getAtomicReader().getLiveDocs(); Bits toLiveDocs = fromSearcher == toSearcher ? 
fromLiveDocs : toSearcher.getAtomicReader().getLiveDocs(); fromDeState = new SolrIndexSearcher.DocsEnumState(); fromDeState.fieldName = fromField; fromDeState.liveDocs = fromLiveDocs; fromDeState.termsEnum = termsEnum; fromDeState.docsEnum = null; fromDeState.minSetSizeCached = minDocFreqFrom; toDeState = new SolrIndexSearcher.DocsEnumState(); toDeState.fieldName = toField; toDeState.liveDocs = toLiveDocs; toDeState.termsEnum = toTermsEnum; toDeState.docsEnum = null; toDeState.minSetSizeCached = minDocFreqTo; while (term != null) { if (prefix != null && !StringHelper.startsWith(term, prefix)) break; fromTermCount++; boolean intersects = false; int freq = termsEnum.docFreq(); fromTermTotalDf++; if (freq < minDocFreqFrom) { fromTermDirectCount++; // OK to skip liveDocs, since we check for intersection with docs matching query fromDeState.docsEnum = fromDeState.termsEnum.docs(null, fromDeState.docsEnum, false); DocsEnum docsEnum = fromDeState.docsEnum; if (docsEnum instanceof MultiDocsEnum) { MultiDocsEnum.EnumWithSlice[] subs = ((MultiDocsEnum)docsEnum).getSubs(); int numSubs = ((MultiDocsEnum)docsEnum).getNumSubs(); outer: for (int subindex = 0; subindex<numSubs; subindex++) { MultiDocsEnum.EnumWithSlice sub = subs[subindex]; if (sub.docsEnum == null) continue; int base = sub.slice.start; int docid; while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { if (fastForRandomSet.exists(docid+base)) { intersects = true; break outer; } } } } else { int docid; while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { if (fastForRandomSet.exists(docid)) { intersects = true; break; } } } } else { // use the filter cache DocSet fromTermSet = fromSearcher.getDocSet(fromDeState); intersects = fromSet.intersects(fromTermSet); } if (intersects) { fromTermHits++; fromTermHitsTotalDf++; TermsEnum.SeekStatus status = toTermsEnum.seekCeil(term); if (status == TermsEnum.SeekStatus.END) break; if (status == TermsEnum.SeekStatus.FOUND) { toTermHits++; int df = toTermsEnum.docFreq(); toTermHitsTotalDf += df; if (resultBits==null && df + resultListDocs > maxSortedIntSize && resultList.size() > 0) { resultBits = new OpenBitSet(toSearcher.maxDoc()); } // if we don't have a bitset yet, or if the resulting set will be too large // use the filterCache to get a DocSet if (toTermsEnum.docFreq() >= minDocFreqTo || resultBits == null) { // use filter cache DocSet toTermSet = toSearcher.getDocSet(toDeState); resultListDocs += toTermSet.size(); if (resultBits != null) { toTermSet.setBitsOn(resultBits); } else { if (toTermSet instanceof BitDocSet) { resultBits = (OpenBitSet)((BitDocSet)toTermSet).bits.clone(); } else { resultList.add(toTermSet); } } } else { toTermDirectCount++; // need to use liveDocs here so we don't map to any deleted ones toDeState.docsEnum = toDeState.termsEnum.docs(toDeState.liveDocs, toDeState.docsEnum, false); DocsEnum docsEnum = toDeState.docsEnum; if (docsEnum instanceof MultiDocsEnum) { MultiDocsEnum.EnumWithSlice[] subs = ((MultiDocsEnum)docsEnum).getSubs(); int numSubs = ((MultiDocsEnum)docsEnum).getNumSubs(); for (int subindex = 0; subindex<numSubs; subindex++) { MultiDocsEnum.EnumWithSlice sub = subs[subindex]; if (sub.docsEnum == null) continue; int base = sub.slice.start; int docid; while ((docid = sub.docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { resultListDocs++; resultBits.fastSet(docid + base); } } } else { int docid; while ((docid = docsEnum.nextDoc()) != DocIdSetIterator.NO_MORE_DOCS) { resultListDocs++; resultBits.fastSet(docid); } } } } } 
term = termsEnum.next(); } smallSetsDeferred = resultList.size(); if (resultBits != null) { for (DocSet set : resultList) { set.setBitsOn(resultBits); } return new BitDocSet(resultBits); } if (resultList.size()==0) { return DocSet.EMPTY; } if (resultList.size() == 1) { return resultList.get(0); } int sz = 0; for (DocSet set : resultList) sz += set.size(); int[] docs = new int[sz]; int pos = 0; for (DocSet set : resultList) { System.arraycopy(((SortedIntDocSet)set).getDocs(), 0, docs, pos, set.size()); pos += set.size(); } Arrays.sort(docs); int[] dedup = new int[sz]; pos = 0; int last = -1; for (int doc : docs) { if (doc != last) dedup[pos++] = doc; last = doc; } if (pos != dedup.length) { dedup = Arrays.copyOf(dedup, pos); } return new SortedIntDocSet(dedup, dedup.length); }
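The tail of getDocSet merges the deferred small sets by concatenating their sorted arrays, sorting, and deduplicating in place. The same logic as a standalone method:

import java.util.Arrays;

// Standalone version of the concat-sort-dedup merge at the end of getDocSet.
public class MergeSortedSetsSketch {
  static int[] merge(int[][] sets) {
    int sz = 0;
    for (int[] set : sets) sz += set.length;
    int[] docs = new int[sz];
    int pos = 0;
    for (int[] set : sets) {
      System.arraycopy(set, 0, docs, pos, set.length);
      pos += set.length;
    }
    Arrays.sort(docs);
    int[] dedup = new int[sz];
    pos = 0;
    int last = -1; // doc ids are non-negative, so -1 is a safe sentinel
    for (int doc : docs) {
      if (doc != last) dedup[pos++] = doc;
      last = doc;
    }
    return pos == dedup.length ? dedup : Arrays.copyOf(dedup, pos);
  }

  public static void main(String[] args) {
    int[][] sets = { {1, 3, 7}, {3, 4}, {0, 7} };
    System.out.println(Arrays.toString(merge(sets))); // [0, 1, 3, 4, 7]
  }
}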
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
@Override public Explanation explain(AtomicReaderContext context, int doc) throws IOException { Scorer scorer = scorer(context, true, false, context.reader().getLiveDocs()); boolean exists = scorer.advance(doc) == doc; ComplexExplanation result = new ComplexExplanation(); if (exists) { result.setDescription(this.toString() + " , product of:"); result.setValue(queryWeight); result.setMatch(Boolean.TRUE); result.addDetail(new Explanation(getBoost(), "boost")); result.addDetail(new Explanation(queryNorm,"queryNorm")); } else { result.setDescription(this.toString() + " doesn't match id " + doc); result.setValue(0); result.setMatch(Boolean.FALSE); } return result; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
@Override public int nextDoc() throws IOException { return iter.nextDoc(); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
@Override public float score() throws IOException { return score; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
@Override public int advance(int target) throws IOException { return iter.advance(target); }
// in core/src/java/org/apache/solr/search/function/distance/StringDistanceFunction.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues str1DV = str1.getValues(context, readerContext); final FunctionValues str2DV = str2.getValues(context, readerContext); return new FloatDocValues(this) { @Override public float floatVal(int doc) { return dist.getDistance(str1DV.strVal(doc), str2DV.strVal(doc)); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append("strdist").append('('); sb.append(str1DV.toString(doc)).append(',').append(str2DV.toString(doc)) .append(", dist=").append(dist.getClass().getName()); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/GeohashHaversineFunction.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues gh1DV = geoHash1.getValues(context, readerContext); final FunctionValues gh2DV = geoHash2.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return distance(doc, gh1DV, gh2DV); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append(name()).append('('); sb.append(gh1DV.toString(doc)).append(',').append(gh2DV.toString(doc)); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/GeohashHaversineFunction.java
@Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { geoHash1.createWeight(context, searcher); geoHash2.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/function/distance/GeohashFunction.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues latDV = lat.getValues(context, readerContext); final FunctionValues lonDV = lon.getValues(context, readerContext); return new FunctionValues() { @Override public String strVal(int doc) { return GeohashUtils.encodeLatLon(latDV.doubleVal(doc), lonDV.doubleVal(doc)); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append(name()).append('('); sb.append(latDV.toString(doc)).append(',').append(lonDV.toString(doc)); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/VectorDistanceFunction.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues vals1 = source1.getValues(context, readerContext); final FunctionValues vals2 = source2.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return distance(doc, vals1, vals2); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append(name()).append('(').append(power).append(','); boolean firstTime = true; sb.append(vals1.toString(doc)).append(','); sb.append(vals2.toString(doc)); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/VectorDistanceFunction.java
@Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { source1.createWeight(context, searcher); source2.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineFunction.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues vals1 = p1.getValues(context, readerContext); final FunctionValues vals2 = p2.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return distance(doc, vals1, vals2); } @Override public String toString(int doc) { StringBuilder sb = new StringBuilder(); sb.append(name()).append('('); sb.append(vals1.toString(doc)).append(',').append(vals2.toString(doc)); sb.append(')'); return sb.toString(); } }; }
// in core/src/java/org/apache/solr/search/function/distance/HaversineFunction.java
@Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { p1.createWeight(context, searcher); p2.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues latVals = latSource.getValues(context, readerContext); final FunctionValues lonVals = lonSource.getValues(context, readerContext); final double latCenterRad = this.latCenter * DEGREES_TO_RADIANS; final double lonCenterRad = this.lonCenter * DEGREES_TO_RADIANS; final double latCenterRad_cos = this.latCenterRad_cos; return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { double latRad = latVals.doubleVal(doc) * DEGREES_TO_RADIANS; double lonRad = lonVals.doubleVal(doc) * DEGREES_TO_RADIANS; double diffX = latCenterRad - latRad; double diffY = lonCenterRad - lonRad; double hsinX = Math.sin(diffX * 0.5); double hsinY = Math.sin(diffY * 0.5); double h = hsinX * hsinX + (latCenterRad_cos * Math.cos(latRad) * hsinY * hsinY); return (EARTH_MEAN_DIAMETER * Math.atan2(Math.sqrt(h), Math.sqrt(1 - h))); } @Override public String toString(int doc) { return name() + '(' + latVals.toString(doc) + ',' + lonVals.toString(doc) + ',' + latCenter + ',' + lonCenter + ')'; } }; }
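The per-document computation above is the standard haversine formula with the center's cosine hoisted out of the loop. As standalone arithmetic (the diameter constant below is an assumed value, 2 × mean earth radius in km, not necessarily Solr's constant):

// Standalone haversine arithmetic matching the doubleVal computation above.
public class HaversineSketch {
  static final double DEGREES_TO_RADIANS = Math.PI / 180;
  static final double EARTH_MEAN_DIAMETER = 2 * 6371.0087714; // km, assumed

  static double haversine(double lat1, double lon1, double lat2, double lon2) {
    double x1 = lat1 * DEGREES_TO_RADIANS, y1 = lon1 * DEGREES_TO_RADIANS;
    double x2 = lat2 * DEGREES_TO_RADIANS, y2 = lon2 * DEGREES_TO_RADIANS;
    double hsinX = Math.sin((x1 - x2) * 0.5);
    double hsinY = Math.sin((y1 - y2) * 0.5);
    double h = hsinX * hsinX + Math.cos(x1) * Math.cos(x2) * hsinY * hsinY;
    return EARTH_MEAN_DIAMETER * Math.atan2(Math.sqrt(h), Math.sqrt(1 - h));
  }

  public static void main(String[] args) {
    // Paris (48.85, 2.35) to Berlin (52.52, 13.40): roughly 880 km
    System.out.println(haversine(48.85, 2.35, 52.52, 13.40));
  }
}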
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
@Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { latSource.createWeight(context, searcher); lonSource.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final int off = readerContext.docBase; IndexReaderContext topLevelContext = ReaderUtil.getTopLevelContext(readerContext); final float[] arr = getCachedFloats(topLevelContext.reader()); return new FloatDocValues(this) { @Override public float floatVal(int doc) { return arr[doc + off]; } @Override public Object objectVal(int doc) { return floatVal(doc); // TODO: keep track of missing values } }; }
// in core/src/java/org/apache/solr/search/function/ValueSourceRangeFilter.java
@Override public DocIdSet getDocIdSet(final Map context, final AtomicReaderContext readerContext, Bits acceptDocs) throws IOException { return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return valueSource.getValues(context, readerContext).getRangeScorer(readerContext.reader(), lowerVal, upperVal, includeLower, includeUpper); } @Override public Bits bits() throws IOException { return null; // don't use random access } }, acceptDocs); }
// in core/src/java/org/apache/solr/search/function/ValueSourceRangeFilter.java
@Override public DocIdSetIterator iterator() throws IOException { return valueSource.getValues(context, readerContext).getRangeScorer(readerContext.reader(), lowerVal, upperVal, includeLower, includeUpper); }
// in core/src/java/org/apache/solr/search/function/ValueSourceRangeFilter.java
@Override public Bits bits() throws IOException { return null; // don't use random access }
// in core/src/java/org/apache/solr/search/function/ValueSourceRangeFilter.java
@Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { valueSource.createWeight(context, searcher); }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
@Override public Filter getTopFilter() { return new Filter() { int lastEndIdx = 0; @Override public DocIdSet getDocIdSet(final AtomicReaderContext context, final Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. int sidx = Math.max(0,lastEndIdx); if (sidx > 0 && docs[sidx-1] >= base) { // oops, the lastEndIdx isn't correct... we must have been used // in a multi-threaded context, or the indexreaders are being // used out-of-order. start at 0. sidx = 0; } if (sidx < docs.length && docs[sidx] < base) { // if docs[sidx] is < base, we need to seek to find the real start. sidx = findIndex(docs, base, sidx, docs.length-1); } final int startIdx = sidx; // Largest possible end index is limited to the start index // plus the number of docs contained in the segment. Subtract 1 since // the end index is inclusive. int eidx = Math.min(docs.length, startIdx + maxDoc) - 1; // find the real end eidx = findIndex(docs, max, startIdx, eidx) - 1; final int endIdx = eidx; lastEndIdx = endIdx; return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int idx = startIdx; int adjustedDoc = -1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { return adjustedDoc = (idx > endIdx) ? NO_MORE_DOCS : (docs[idx++] - base); } @Override public int advance(int target) throws IOException { if (idx > endIdx || target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; target += base; // probe next int rawDoc = docs[idx++]; if (rawDoc >= target) return adjustedDoc=rawDoc-base; int high = endIdx; // TODO: probe more before resorting to binary search? // binary search while (idx <= high) { int mid = (idx+high) >>> 1; rawDoc = docs[mid]; if (rawDoc < target) { idx = mid+1; } else if (rawDoc > target) { high = mid-1; } else { idx=mid+1; return adjustedDoc=rawDoc - base; } } // low is on the insertion point... if (idx <= endIdx) { return adjustedDoc = docs[idx++] - base; } else { return adjustedDoc=NO_MORE_DOCS; } } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { // random access is expensive for this set return null; } }, acceptDocs2); } }; }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
@Override public DocIdSet getDocIdSet(final AtomicReaderContext context, final Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. int sidx = Math.max(0,lastEndIdx); if (sidx > 0 && docs[sidx-1] >= base) { // oops, the lastEndIdx isn't correct... we must have been used // in a multi-threaded context, or the indexreaders are being // used out-of-order. start at 0. sidx = 0; } if (sidx < docs.length && docs[sidx] < base) { // if docs[sidx] is < base, we need to seek to find the real start. sidx = findIndex(docs, base, sidx, docs.length-1); } final int startIdx = sidx; // Largest possible end index is limited to the start index // plus the number of docs contained in the segment. Subtract 1 since // the end index is inclusive. int eidx = Math.min(docs.length, startIdx + maxDoc) - 1; // find the real end eidx = findIndex(docs, max, startIdx, eidx) - 1; final int endIdx = eidx; lastEndIdx = endIdx; return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int idx = startIdx; int adjustedDoc = -1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { return adjustedDoc = (idx > endIdx) ? NO_MORE_DOCS : (docs[idx++] - base); } @Override public int advance(int target) throws IOException { if (idx > endIdx || target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; target += base; // probe next int rawDoc = docs[idx++]; if (rawDoc >= target) return adjustedDoc=rawDoc-base; int high = endIdx; // TODO: probe more before resorting to binary search? // binary search while (idx <= high) { int mid = (idx+high) >>> 1; rawDoc = docs[mid]; if (rawDoc < target) { idx = mid+1; } else if (rawDoc > target) { high = mid-1; } else { idx=mid+1; return adjustedDoc=rawDoc - base; } } // low is on the insertion point... if (idx <= endIdx) { return adjustedDoc = docs[idx++] - base; } else { return adjustedDoc=NO_MORE_DOCS; } } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { // random access is expensive for this set return null; } }, acceptDocs2); }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
@Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int idx = startIdx; int adjustedDoc = -1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { return adjustedDoc = (idx > endIdx) ? NO_MORE_DOCS : (docs[idx++] - base); } @Override public int advance(int target) throws IOException { if (idx > endIdx || target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; target += base; // probe next int rawDoc = docs[idx++]; if (rawDoc >= target) return adjustedDoc=rawDoc-base; int high = endIdx; // TODO: probe more before resorting to binary search? // binary search while (idx <= high) { int mid = (idx+high) >>> 1; rawDoc = docs[mid]; if (rawDoc < target) { idx = mid+1; } else if (rawDoc > target) { high = mid-1; } else { idx=mid+1; return adjustedDoc=rawDoc - base; } } // low is on the insertion point... if (idx <= endIdx) { return adjustedDoc = docs[idx++] - base; } else { return adjustedDoc=NO_MORE_DOCS; } } }; }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
@Override public int nextDoc() throws IOException { return adjustedDoc = (idx > endIdx) ? NO_MORE_DOCS : (docs[idx++] - base); }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
@Override public int advance(int target) throws IOException { if (idx > endIdx || target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; target += base; // probe next int rawDoc = docs[idx++]; if (rawDoc >= target) return adjustedDoc=rawDoc-base; int high = endIdx; // TODO: probe more before resorting to binary search? // binary search while (idx <= high) { int mid = (idx+high) >>> 1; rawDoc = docs[mid]; if (rawDoc < target) { idx = mid+1; } else if (rawDoc > target) { high = mid-1; } else { idx=mid+1; return adjustedDoc=rawDoc - base; } } // low is on the insertion point... if (idx <= endIdx) { return adjustedDoc = docs[idx++] - base; } else { return adjustedDoc=NO_MORE_DOCS; } }
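advance here first probes the next array slot (cheap when callers move forward in small steps) and only then binary-searches the remaining range. A standalone version over a plain sorted int[], without the segment base offset:

// Standalone probe-then-binary-search advance over a sorted int array.
public class SortedAdvanceSketch {
  static final int NO_MORE_DOCS = Integer.MAX_VALUE;
  final int[] docs;
  int idx = 0;

  SortedAdvanceSketch(int[] docs) { this.docs = docs; }

  int advance(int target) {
    if (idx >= docs.length || target == NO_MORE_DOCS) return NO_MORE_DOCS;
    int rawDoc = docs[idx++];      // probe next -- cheap for small forward steps
    if (rawDoc >= target) return rawDoc;
    int high = docs.length - 1;    // binary search the remainder
    while (idx <= high) {
      int mid = (idx + high) >>> 1;
      rawDoc = docs[mid];
      if (rawDoc < target) idx = mid + 1;
      else if (rawDoc > target) high = mid - 1;
      else { idx = mid + 1; return rawDoc; }
    }
    // idx now sits on the insertion point
    return idx < docs.length ? docs[idx++] : NO_MORE_DOCS;
  }

  public static void main(String[] args) {
    SortedAdvanceSketch it = new SortedAdvanceSketch(new int[]{2, 5, 9, 14, 21});
    System.out.println(it.advance(6));  // 9
    System.out.println(it.advance(15)); // 21
    System.out.println(it.advance(22)); // NO_MORE_DOCS (2147483647)
  }
}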
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
@Override public Bits bits() throws IOException { // random access is expensive for this set return null; }
// in core/src/java/org/apache/solr/search/LuceneQueryOptimizer.java
public TopDocs optimize(BooleanQuery original, SolrIndexSearcher searcher, int numHits, Query[] queryOut, Filter[] filterOut ) throws IOException { BooleanQuery query = new BooleanQuery(); BooleanQuery filterQuery = null; for (BooleanClause c : original.clauses()) { /*** System.out.println("required="+c.required); System.out.println("boost="+c.query.getBoost()); System.out.println("isTermQuery="+(c.query instanceof TermQuery)); if (c.query instanceof TermQuery) { System.out.println("term="+((TermQuery)c.query).getTerm()); System.out.println("docFreq="+searcher.docFreq(((TermQuery)c.query).getTerm())); } ***/ Query q = c.getQuery(); if (c.isRequired() // required && q.getBoost() == 0.0f // boost is zero && q instanceof TermQuery // TermQuery && (searcher.docFreq(((TermQuery)q).getTerm()) / (float)searcher.maxDoc()) >= threshold) { // check threshold if (filterQuery == null) filterQuery = new BooleanQuery(); filterQuery.add(q, BooleanClause.Occur.MUST); // filter it //System.out.println("WooHoo... qualified to be hoisted to a filter!"); } else { query.add(c); // query it } } Filter filter = null; if (filterQuery != null) { synchronized (cache) { // check cache filter = (Filter)cache.get(filterQuery); } if (filter == null) { // miss filter = new CachingWrapperFilter(new QueryWrapperFilter(filterQuery)); // construct new entry synchronized (cache) { cache.put(filterQuery, filter); // cache it } } } // YCS: added code to pass out optimized query and filter // so they can be used with Hits if (queryOut != null && filterOut != null) { queryOut[0] = query; filterOut[0] = filter; return null; } else { return searcher.search(query, filter, numHits); } }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public FieldComparator newComparator(String fieldname, int numHits, int sortPos, boolean reversed) throws IOException { return new TermOrdValComparator_SML(numHits, fieldname, sortPos, reversed, missingValueProxy); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { return TermOrdValComparator_SML.createComparator(context.reader(), this); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public FieldComparator setNextReader(AtomicReaderContext context) throws IOException { return TermOrdValComparator_SML.createComparator(context.reader(), parent); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
public static FieldComparator createComparator(AtomicReader reader, TermOrdValComparator_SML parent) throws IOException { parent.termsIndex = FieldCache.DEFAULT.getTermsIndex(reader, parent.field); final PackedInts.Reader docToOrd = parent.termsIndex.getDocToOrd(); PerSegmentComparator perSegComp = null; if (docToOrd.hasArray()) { final Object arr = docToOrd.getArray(); if (arr instanceof byte[]) { perSegComp = new ByteOrdComparator((byte[]) arr, parent); } else if (arr instanceof short[]) { perSegComp = new ShortOrdComparator((short[]) arr, parent); } else if (arr instanceof int[]) { perSegComp = new IntOrdComparator((int[]) arr, parent); } } if (perSegComp == null) { perSegComp = new AnyOrdComparator(docToOrd, parent); } if (perSegComp.bottomSlot != -1) { perSegComp.setBottom(perSegComp.bottomSlot); } parent.current = perSegComp; return perSegComp; }
// in core/src/java/org/apache/solr/search/WrappedQuery.java
@Override public Weight createWeight(IndexSearcher searcher) throws IOException { return q.createWeight(searcher); }
// in core/src/java/org/apache/solr/search/WrappedQuery.java
@Override public Query rewrite(IndexReader reader) throws IOException { // currently no need to continue wrapping at this point. return q.rewrite(reader); }
// in core/src/java/org/apache/solr/search/DocSetBase.java
public Filter getTopFilter() { final OpenBitSet bs = getBits(); return new Filter() { @Override public DocIdSet getDocIdSet(final AtomicReaderContext context, Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); if (context.isTopLevel) { return BitsFilteredDocIdSet.wrap(bs, acceptDocs); } final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { // sparse filters should not use random access return null; } }, acceptDocs2); } }; }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public DocIdSet getDocIdSet(final AtomicReaderContext context, Bits acceptDocs) throws IOException { AtomicReader reader = context.reader(); // all Solr DocSets that are used as filters only include live docs final Bits acceptDocs2 = acceptDocs == null ? null : (reader.getLiveDocs() == acceptDocs ? null : acceptDocs); if (context.isTopLevel) { return BitsFilteredDocIdSet.wrap(bs, acceptDocs); } final int base = context.docBase; final int maxDoc = reader.maxDoc(); final int max = base + maxDoc; // one past the max doc in this segment. return BitsFilteredDocIdSet.wrap(new DocIdSet() { @Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; } @Override public boolean isCacheable() { return true; } @Override public Bits bits() throws IOException { // sparse filters should not use random access return null; } }, acceptDocs2); }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public DocIdSetIterator iterator() throws IOException { return new DocIdSetIterator() { int pos=base-1; int adjustedDoc=-1; @Override public int docID() { return adjustedDoc; } @Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } @Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; } }; }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public int nextDoc() throws IOException { pos = bs.nextSetBit(pos+1); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public int advance(int target) throws IOException { if (target==NO_MORE_DOCS) return adjustedDoc=NO_MORE_DOCS; pos = bs.nextSetBit(target+base); return adjustedDoc = (pos>=0 && pos<max) ? pos-base : NO_MORE_DOCS; }
// in core/src/java/org/apache/solr/search/DocSetBase.java
Override public Bits bits() throws IOException { // sparse filters should not use random access return null; }
// in core/src/java/org/apache/solr/search/DocSetCollector.java
Override public void collect(int doc) throws IOException { doc += base; // optimistically collect the first docs in an array // in case the total number will be small enough to represent // as a small set like SortedIntDocSet instead... // Storing in this array will be quicker to convert // than scanning through a potentially huge bit vector. // FUTURE: when search methods all start returning docs in order, maybe // we could have a ListDocSet() and use the collected array directly. if (pos < scratch.length) { scratch[pos]=doc; } else { // this conditional could be removed if BitSet was preallocated, but that // would take up more memory, and add more GC time... if (bits==null) bits = new OpenBitSet(maxDoc); bits.fastSet(doc); } pos++; }
// in core/src/java/org/apache/solr/search/DocSetCollector.java
Override public void setScorer(Scorer scorer) throws IOException { }
// in core/src/java/org/apache/solr/search/DocSetCollector.java
Override public void setNextReader(AtomicReaderContext context) throws IOException { this.base = context.docBase; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { return new LongDocValues(this) { @Override public float floatVal(int doc) { return fv; } @Override public int intVal(int doc) { return (int) constant; } @Override public long longVal(int doc) { return constant; } @Override public double doubleVal(int doc) { return dv; } @Override public String toString(int doc) { return description(); } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues vals = source.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return func(doc, vals); } @Override public String toString(int doc) { return name() + '(' + vals.toString(doc) + ')'; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { final FunctionValues aVals = a.getValues(context, readerContext); final FunctionValues bVals = b.getValues(context, readerContext); return new DoubleDocValues(this) { @Override public double doubleVal(int doc) { return func(doc, aVals, bVals); } @Override public String toString(int doc) { return name() + '(' + aVals.toString(doc) + ',' + bVals.toString(doc) + ')'; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { return new BoolDocValues(this) { @Override public boolean boolVal(int doc) { return constant; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { if (context.get(this) == null) { SolrRequestInfo requestInfo = SolrRequestInfo.getRequestInfo(); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "testfunc: unweighted value source detected. delegate="+source + " request=" + (requestInfo==null ? "null" : requestInfo.getReq())); } return source.getValues(context, readerContext); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public void createWeight(Map context, IndexSearcher searcher) throws IOException { context.put(this, this); }
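The two ValueSourceParser snippets above work as a pair: createWeight registers the value source in the per-request context map, and getValues throws SolrException(ErrorCode.BAD_REQUEST) when that registration is missing. A minimal sketch of the guard, assuming only java.util.Map and the application's SolrException (WeightedSource is a hypothetical stand-in for the value source involved):

    import java.util.Map;
    import org.apache.solr.common.SolrException;

    abstract class WeightedSource {
      /** Called once per request; registers this source in the shared context. */
      public void createWeight(Map<Object, Object> context) {
        context.put(this, this);
      }

      /** Refuses to run if createWeight was never invoked for this request. */
      public Object getValues(Map<Object, Object> context) {
        if (context.get(this) == null) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "unweighted value source detected: " + this);
        }
        return doGetValues(context);
      }

      protected abstract Object doGetValues(Map<Object, Object> context);
    }

Throwing a domain runtime exception here turns a missing createWeight call into a diagnosable 400-level response instead of a silent wrong result.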
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
Override public SortField getSortField(boolean reverse) throws IOException { return super.getSortField(reverse); }
// in core/src/java/org/apache/solr/search/SolrFilter.java
Override public DocIdSet getDocIdSet(AtomicReaderContext context, Bits acceptDocs) throws IOException { return getDocIdSet(null, context, acceptDocs); }
// in core/src/java/org/apache/solr/search/FastLRUCache.java
public void warm(SolrIndexSearcher searcher, SolrCache old) throws IOException { if (regenerator == null) return; long warmingStartTime = System.currentTimeMillis(); FastLRUCache other = (FastLRUCache) old; // warm entries if (isAutowarmingOn()) { int sz = autowarm.getWarmCount(other.size()); Map items = other.cache.getLatestAccessedItems(sz); Map.Entry[] itemsArr = new Map.Entry[items.size()]; int counter = 0; for (Object mapEntry : items.entrySet()) { itemsArr[counter++] = (Map.Entry) mapEntry; } for (int i = itemsArr.length - 1; i >= 0; i--) { try { boolean continueRegen = regenerator.regenerateItem(searcher, this, old, itemsArr[i].getKey(), itemsArr[i].getValue()); if (!continueRegen) break; } catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); } } } warmupTime = System.currentTimeMillis() - warmingStartTime; }
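warm() above is deliberately fault tolerant: each cache entry is regenerated inside its own catch (Throwable), so one failing entry is logged via SolrException.log and skipped instead of aborting autowarming. A stripped-down sketch of that loop, with a hypothetical BiPredicate regenerator standing in for the real CacheRegenerator and stderr standing in for the logger:

    import java.util.Map;
    import java.util.function.BiPredicate;

    class BestEffortWarmer<K, V> {
      // Returns false to stop warming early, mirroring regenerateItem's contract.
      private final BiPredicate<K, V> regenerator;

      BestEffortWarmer(BiPredicate<K, V> regenerator) { this.regenerator = regenerator; }

      void warm(Map<K, V> items) {
        for (Map.Entry<K, V> e : items.entrySet()) {
          try {
            if (!regenerator.test(e.getKey(), e.getValue())) break;
          } catch (Throwable t) {
            // Warming is best-effort: log and continue with the remaining entries.
            System.err.println("Error during auto-warming of key:" + e.getKey() + " " + t);
          }
        }
      }
    }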
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected Highlighter getPhraseHighlighter(Query query, String fieldName, SolrQueryRequest request, CachingTokenFilter tokenStream) throws IOException { SolrParams params = request.getParams(); Highlighter highlighter = null; highlighter = new Highlighter( getFormatter(fieldName, params), getEncoder(fieldName, params), getSpanQueryScorer(query, fieldName, tokenStream, request)); highlighter.setTextFragmenter(getFragmenter(fieldName, params)); return highlighter; }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private QueryScorer getSpanQueryScorer(Query query, String fieldName, TokenStream tokenStream, SolrQueryRequest request) throws IOException { boolean reqFieldMatch = request.getParams().getFieldBool(fieldName, HighlightParams.FIELD_MATCH, false); Boolean highlightMultiTerm = request.getParams().getBool(HighlightParams.HIGHLIGHT_MULTI_TERM, true); if(highlightMultiTerm == null) { highlightMultiTerm = false; } QueryScorer scorer; if (reqFieldMatch) { scorer = new QueryScorer(query, fieldName); } else { scorer = new QueryScorer(query, null); } scorer.setExpandMultiTermQuery(highlightMultiTerm); return scorer; }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private void doHighlightingByHighlighter( Query query, SolrQueryRequest req, NamedList docSummaries, int docId, Document doc, String fieldName ) throws IOException { final SolrIndexSearcher searcher = req.getSearcher(); final IndexSchema schema = searcher.getSchema(); // TODO: Currently in trunk highlighting numeric fields is broken (Lucene) - // so we disable them until fixed (see LUCENE-3080)! // BEGIN: Hack final SchemaField schemaField = schema.getFieldOrNull(fieldName); if (schemaField != null && ( (schemaField.getType() instanceof org.apache.solr.schema.TrieField) || (schemaField.getType() instanceof org.apache.solr.schema.TrieDateField) )) return; // END: Hack SolrParams params = req.getParams(); IndexableField[] docFields = doc.getFields(fieldName); List<String> listFields = new ArrayList<String>(); for (IndexableField field : docFields) { listFields.add(field.stringValue()); } String[] docTexts = (String[]) listFields.toArray(new String[listFields.size()]); // according to Document javadoc, doc.getValues() never returns null. check empty instead of null if (docTexts.length == 0) return; TokenStream tstream = null; int numFragments = getMaxSnippets(fieldName, params); boolean mergeContiguousFragments = isMergeContiguousFragments(fieldName, params); String[] summaries = null; List<TextFragment> frags = new ArrayList<TextFragment>(); TermOffsetsTokenStream tots = null; // to be non-null iff we're using TermOffsets optimization try { TokenStream tvStream = TokenSources.getTokenStream(searcher.getIndexReader(), docId, fieldName); if (tvStream != null) { tots = new TermOffsetsTokenStream(tvStream); } } catch (IllegalArgumentException e) { // No problem. But we can't use TermOffsets optimization. } for (int j = 0; j < docTexts.length; j++) { if( tots != null ) { // if we're using TermOffsets optimization, then get the next // field value's TokenStream (i.e. get field j's TokenStream) from tots: tstream = tots.getMultiValuedTokenStream( docTexts[j].length() ); } else { // fall back to analyzer tstream = createAnalyzerTStream(schema, fieldName, docTexts[j]); } int maxCharsToAnalyze = params.getFieldInt(fieldName, HighlightParams.MAX_CHARS, Highlighter.DEFAULT_MAX_CHARS_TO_ANALYZE); Highlighter highlighter; if (Boolean.valueOf(req.getParams().get(HighlightParams.USE_PHRASE_HIGHLIGHTER, "true"))) { if (maxCharsToAnalyze < 0) { tstream = new CachingTokenFilter(tstream); } else { tstream = new CachingTokenFilter(new OffsetLimitTokenFilter(tstream, maxCharsToAnalyze)); } // get highlighter highlighter = getPhraseHighlighter(query, fieldName, req, (CachingTokenFilter) tstream); // after highlighter initialization, reset tstream since construction of highlighter already used it tstream.reset(); } else { // use "the old way" highlighter = getHighlighter(query, fieldName, req); } if (maxCharsToAnalyze < 0) { highlighter.setMaxDocCharsToAnalyze(docTexts[j].length()); } else { highlighter.setMaxDocCharsToAnalyze(maxCharsToAnalyze); } try { TextFragment[] bestTextFragments = highlighter.getBestTextFragments(tstream, docTexts[j], mergeContiguousFragments, numFragments); for (int k = 0; k < bestTextFragments.length; k++) { if ((bestTextFragments[k] != null) && (bestTextFragments[k].getScore() > 0)) { frags.add(bestTextFragments[k]); } } } catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } // sort such that the fragments with the highest score come first Collections.sort(frags, new Comparator<TextFragment>() { public int compare(TextFragment arg0, TextFragment arg1) { return Math.round(arg1.getScore() - arg0.getScore()); } }); // convert fragments back into text // TODO: we can include score and position information in output as snippet attributes if (frags.size() > 0) { ArrayList<String> fragTexts = new ArrayList<String>(); for (TextFragment fragment: frags) { if ((fragment != null) && (fragment.getScore() > 0)) { fragTexts.add(fragment.toString()); } if (fragTexts.size() >= numFragments) break; } summaries = fragTexts.toArray(new String[0]); if (summaries.length > 0) docSummaries.add(fieldName, summaries); } // no summaries made, copy text from alternate field if (summaries == null || summaries.length == 0) { alternateField( docSummaries, params, doc, fieldName ); } }
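Two catch idioms in the method above are characteristic of this class: an intentionally empty catch (IllegalArgumentException) that merely disables the TermOffsets optimization, and a catch that wraps the checked InvalidTokenOffsetsException into a SolrException carrying ErrorCode.SERVER_ERROR. A compact sketch of both shapes, with hypothetical fastPath/slowPath methods and a stand-in checked exception:

    import org.apache.solr.common.SolrException;

    // Hypothetical stand-in for a checked library exception such as
    // InvalidTokenOffsetsException.
    class OffsetsException extends Exception {
      OffsetsException(String msg) { super(msg); }
    }

    class HighlightSketch {
      static String highlight(String text) {
        String result = null;
        try {
          result = fastPath(text); // optional optimization
        } catch (IllegalArgumentException e) {
          // No problem: the optimization is unavailable, fall through below.
        }
        if (result == null) {
          try {
            result = slowPath(text);
          } catch (OffsetsException e) {
            // A checked library failure becomes a 500-level domain exception.
            throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
          }
        }
        return result;
      }

      static String fastPath(String t) { throw new IllegalArgumentException("no term vectors"); }
      static String slowPath(String t) throws OffsetsException { return "<em>" + t + "</em>"; }
    }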
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private void doHighlightingByFastVectorHighlighter( FastVectorHighlighter highlighter, FieldQuery fieldQuery, SolrQueryRequest req, NamedList docSummaries, int docId, Document doc, String fieldName ) throws IOException { SolrParams params = req.getParams(); SolrFragmentsBuilder solrFb = getSolrFragmentsBuilder( fieldName, params ); String[] snippets = highlighter.getBestFragments( fieldQuery, req.getSearcher().getIndexReader(), docId, fieldName, params.getFieldInt( fieldName, HighlightParams.FRAGSIZE, 100 ), params.getFieldInt( fieldName, HighlightParams.SNIPPETS, 1 ), getFragListBuilder( fieldName, params ), getFragmentsBuilder( fieldName, params ), solrFb.getPreTags( params, fieldName ), solrFb.getPostTags( params, fieldName ), getEncoder( fieldName, params ) ); if( snippets != null && snippets.length > 0 ) docSummaries.add( fieldName, snippets ); else alternateField( docSummaries, params, doc, fieldName ); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private TokenStream createAnalyzerTStream(IndexSchema schema, String fieldName, String docText) throws IOException { TokenStream tstream; TokenStream ts = schema.getAnalyzer().tokenStream(fieldName, new StringReader(docText)); ts.reset(); tstream = new TokenOrderingFilter(ts, 10); return tstream; }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
Override public boolean incrementToken() throws IOException { while (!done && queue.size() < windowSize) { if (!input.incrementToken()) { done = true; break; } // reverse iterating for better efficiency since we know the // list is already sorted, and most token start offsets will be too. ListIterator<OrderedToken> iter = queue.listIterator(queue.size()); while(iter.hasPrevious()) { if (offsetAtt.startOffset() >= iter.previous().startOffset) { // insertion will be before what next() would return (what // we just compared against), so move back one so the insertion // will be after. iter.next(); break; } } OrderedToken ot = new OrderedToken(); ot.state = captureState(); ot.startOffset = offsetAtt.startOffset(); iter.add(ot); } if (queue.isEmpty()) { return false; } else { restoreState(queue.removeFirst().state); return true; } }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
Override public boolean incrementToken() throws IOException { while( true ){ if( bufferedToken == null ) { if (!bufferedTokenStream.incrementToken()) return false; bufferedToken = bufferedTokenStream.captureState(); bufferedStartOffset = bufferedOffsetAtt.startOffset(); bufferedEndOffset = bufferedOffsetAtt.endOffset(); } if( startOffset <= bufferedStartOffset && bufferedEndOffset <= endOffset ){ restoreState(bufferedToken); bufferedToken = null; offsetAtt.setOffset( offsetAtt.startOffset() - startOffset, offsetAtt.endOffset() - startOffset ); return true; } else if( bufferedEndOffset > endOffset ){ startOffset += length + 1; return false; } bufferedToken = null; } }
// in core/src/java/org/apache/solr/spelling/SpellingQueryConverter.java
protected void analyze(Collection<Token> result, Reader text, int offset) throws IOException { TokenStream stream = analyzer.tokenStream("", text); // TODO: support custom attributes CharTermAttribute termAtt = stream.addAttribute(CharTermAttribute.class); FlagsAttribute flagsAtt = stream.addAttribute(FlagsAttribute.class); TypeAttribute typeAtt = stream.addAttribute(TypeAttribute.class); PayloadAttribute payloadAtt = stream.addAttribute(PayloadAttribute.class); PositionIncrementAttribute posIncAtt = stream.addAttribute(PositionIncrementAttribute.class); OffsetAttribute offsetAtt = stream.addAttribute(OffsetAttribute.class); stream.reset(); while (stream.incrementToken()) { Token token = new Token(); token.copyBuffer(termAtt.buffer(), 0, termAtt.length()); token.setStartOffset(offset + offsetAtt.startOffset()); token.setEndOffset(offset + offsetAtt.endOffset()); token.setFlags(flagsAtt.getFlags()); token.setType(typeAtt.type()); token.setPayload(payloadAtt.getPayload()); token.setPositionIncrement(posIncAtt.getPositionIncrement()); result.add(token); } stream.end(); stream.close(); }
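analyze() above follows the TokenStream consumption protocol (reset, an incrementToken loop, end, close) but runs end() and close() outside any finally block, so an IOException mid-stream would skip close(). A hedged variant of the same protocol with the cleanup guaranteed (TokenCursor is a stand-in interface, not the Lucene API):

    import java.io.IOException;

    // Stand-in for a Lucene TokenStream, reduced to the lifecycle methods.
    interface TokenCursor {
      void reset() throws IOException;
      boolean incrementToken() throws IOException;
      void end() throws IOException;
      void close() throws IOException;
    }

    class AnalyzeSketch {
      static int countTokens(TokenCursor stream) throws IOException {
        int n = 0;
        stream.reset();
        try {
          while (stream.incrementToken()) n++;
          stream.end();
        } finally {
          stream.close(); // runs even if incrementToken or end throws
        }
        return n;
      }
    }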
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
Override public SpellingResult getSuggestions(SpellingOptions options) throws IOException { SpellingResult result = new SpellingResult(options.tokens); IndexReader reader = determineReader(options.reader); Term term = field != null ? new Term(field, "") : null; float theAccuracy = (options.accuracy == Float.MIN_VALUE) ? spellChecker.getAccuracy() : options.accuracy; int count = Math.max(options.count, AbstractLuceneSpellChecker.DEFAULT_SUGGESTION_COUNT); for (Token token : options.tokens) { String tokenText = new String(token.buffer(), 0, token.length()); term = new Term(field, tokenText); int docFreq = 0; if (reader != null) { docFreq = reader.docFreq(term); } String[] suggestions = spellChecker.suggestSimilar(tokenText, ((options.alternativeTermCount == null || docFreq == 0) ? count : options.alternativeTermCount), field != null ? reader : null, // workaround LUCENE-1295 field, options.suggestMode, theAccuracy); if (suggestions.length == 1 && suggestions[0].equals(tokenText) && options.alternativeTermCount == null) { // These are spelled the same, continue on continue; } // If considering alternatives to "correctly-spelled" terms, then add the // original as a viable suggestion. if (options.alternativeTermCount != null && docFreq > 0) { boolean foundOriginal = false; String[] suggestionsWithOrig = new String[suggestions.length + 1]; for (int i = 0; i < suggestions.length; i++) { if (suggestions[i].equals(tokenText)) { foundOriginal = true; break; } suggestionsWithOrig[i + 1] = suggestions[i]; } if (!foundOriginal) { suggestionsWithOrig[0] = tokenText; suggestions = suggestionsWithOrig; } } if (options.extendedResults == true && reader != null && field != null) { result.addFrequency(token, docFreq); int countLimit = Math.min(options.count, suggestions.length); if(countLimit>0) { for (int i = 0; i < countLimit; i++) { term = new Term(field, suggestions[i]); result.add(token, suggestions[i], reader.docFreq(term)); } } else { List<String> suggList = Collections.emptyList(); result.add(token, suggList); } } else { if (suggestions.length > 0) { List<String> suggList = Arrays.asList(suggestions); if (suggestions.length > options.count) { suggList = suggList.subList(0, options.count); } result.add(token, suggList); } else { List<String> suggList = Collections.emptyList(); result.add(token, suggList); } } } return result; }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
Override public void reload(SolrCore core, SolrIndexSearcher searcher) throws IOException { spellChecker.setSpellIndex(index); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
protected void initIndex() throws IOException { if (indexDir != null) { index = FSDirectory.open(new File(indexDir)); } else { index = new RAMDirectory(); } }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
Override public void reload(SolrCore core, SolrIndexSearcher searcher) throws IOException { LOG.info("reload()"); if (dictionary == null && storeDir != null) { // this may be a firstSearcher event, try loading it if (lookup.load(new FileInputStream(new File(storeDir, factory.storeFileName())))) { return; // loaded ok } LOG.debug("load failed, need to build Lookup again"); } // loading was unsuccessful - build it again build(core, searcher); }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
Override public SpellingResult getSuggestions(SpellingOptions options) throws IOException { LOG.debug("getSuggestions: " + options.tokens); if (lookup == null) { LOG.info("Lookup is null - invoke spellchecker.build first"); return EMPTY_RESULT; } SpellingResult res = new SpellingResult(); CharsRef scratch = new CharsRef(); for (Token t : options.tokens) { scratch.chars = t.buffer(); scratch.offset = 0; scratch.length = t.length(); List<LookupResult> suggestions = lookup.lookup(scratch, (options.suggestMode == SuggestMode.SUGGEST_MORE_POPULAR), options.count); if (suggestions == null) { continue; } if (options.suggestMode != SuggestMode.SUGGEST_MORE_POPULAR) { Collections.sort(suggestions); } for (LookupResult lr : suggestions) { res.add(t, lr.key.toString(), (int)lr.value); } } return res; }
// in core/src/java/org/apache/solr/spelling/DirectSolrSpellChecker.java
Override public void reload(SolrCore core, SolrIndexSearcher searcher) throws IOException {}
// in core/src/java/org/apache/solr/spelling/DirectSolrSpellChecker.java
Override public SpellingResult getSuggestions(SpellingOptions options) throws IOException { LOG.debug("getSuggestions: " + options.tokens); SpellingResult result = new SpellingResult(); float accuracy = (options.accuracy == Float.MIN_VALUE) ? checker.getAccuracy() : options.accuracy; for (Token token : options.tokens) { String tokenText = token.toString(); Term term = new Term(field, tokenText); int freq = options.reader.docFreq(term); int count = (options.alternativeTermCount != null && freq > 0) ? options.alternativeTermCount: options.count; SuggestWord[] suggestions = checker.suggestSimilar(term, count,options.reader, options.suggestMode, accuracy); result.addFrequency(token, freq); // If considering alternatives to "correctly-spelled" terms, then add the // original as a viable suggestion. if (options.alternativeTermCount != null && freq > 0) { boolean foundOriginal = false; SuggestWord[] suggestionsWithOrig = new SuggestWord[suggestions.length + 1]; for (int i = 0; i < suggestions.length; i++) { if (suggestions[i].string.equals(tokenText)) { foundOriginal = true; break; } suggestionsWithOrig[i + 1] = suggestions[i]; } if (!foundOriginal) { SuggestWord orig = new SuggestWord(); orig.freq = freq; orig.string = tokenText; suggestionsWithOrig[0] = orig; suggestions = suggestionsWithOrig; } } if(suggestions.length==0 && freq==0) { List<String> empty = Collections.emptyList(); result.add(token, empty); } else { for (SuggestWord suggestion : suggestions) { result.add(token, suggestion.string, suggestion.freq); } } } return result; }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
Override public void reload(SolrCore core, SolrIndexSearcher searcher) throws IOException { super.reload(core, searcher); //reload the source initSourceReader(); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private boolean syncWithReplicas(ZkController zkController, SolrCore core, ZkNodeProps props, String collection, String shardId) throws MalformedURLException, SolrServerException, IOException { List<ZkCoreNodeProps> nodes = zkController.getZkStateReader() .getReplicaProps(collection, shardId, props.get(ZkStateReader.NODE_NAME_PROP), props.get(ZkStateReader.CORE_NAME_PROP), ZkStateReader.ACTIVE); // TODO: should there be a state filter? if (nodes == null) { // I have no replicas return true; } List<String> syncWith = new ArrayList<String>(); for (ZkCoreNodeProps node : nodes) { // if we see a leader, must be stale state, and this is the guy that went down if (!node.getNodeProps().keySet().contains(ZkStateReader.LEADER_PROP)) { syncWith.add(node.getCoreUrl()); } } PeerSync peerSync = new PeerSync(core, syncWith, core.getUpdateHandler().getUpdateLog().numRecordsToKeep); return peerSync.sync(); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private void syncToMe(ZkController zkController, String collection, String shardId, ZkNodeProps leaderProps) throws MalformedURLException, SolrServerException, IOException { // sync everyone else // TODO: we should do this in parallel at least List<ZkCoreNodeProps> nodes = zkController .getZkStateReader() .getReplicaProps(collection, shardId, leaderProps.get(ZkStateReader.NODE_NAME_PROP), leaderProps.get(ZkStateReader.CORE_NAME_PROP), ZkStateReader.ACTIVE); if (nodes == null) { // System.out.println("I have no replicas"); // I have no replicas return; } //System.out.println("tell my replicas to sync"); ZkCoreNodeProps zkLeader = new ZkCoreNodeProps(leaderProps); for (ZkCoreNodeProps node : nodes) { try { // System.out // .println("try and ask " + node.getCoreUrl() + " to sync"); log.info("try and ask " + node.getCoreUrl() + " to sync"); requestSync(zkLeader.getCoreUrl(), node.getCoreName()); } catch (Exception e) { SolrException.log(log, "Error syncing replica to leader", e); } } for(;;) { ShardResponse srsp = shardHandler.takeCompletedOrError(); if (srsp == null) break; boolean success = handleResponse(srsp); //System.out.println("got response:" + success); if (!success) { try { log.info("Sync failed - asking replica to recover."); //System.out.println("Sync failed - asking replica to recover."); RequestRecovery recoverRequestCmd = new RequestRecovery(); recoverRequestCmd.setAction(CoreAdminAction.REQUESTRECOVERY); recoverRequestCmd.setCoreName(((SyncShardRequest)srsp.getShardRequest()).coreName); HttpSolrServer server = new HttpSolrServer(zkLeader.getBaseUrl()); server.request(recoverRequestCmd); } catch (Exception e) { log.info("Could not tell a replica to recover", e); } shardHandler.cancelAll(); break; } } }
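syncToMe() above shows the fan-out style used throughout this class: each per-replica request sits in its own catch (Exception), logged via SolrException.log or log.info, so a single unreachable replica never stops the loop. The skeleton, reduced to its exception structure with a hypothetical Replica type:

    import java.util.List;

    class SyncSketch {
      interface Replica { void requestSync() throws Exception; }

      static void syncAll(List<Replica> replicas) {
        for (Replica r : replicas) {
          try {
            r.requestSync();
          } catch (Exception e) {
            // One replica failing to sync is logged and skipped, never fatal here.
            System.err.println("Error syncing replica to leader: " + e);
          }
        }
      }
    }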
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { try { zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } }
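runLeaderProcess() above recovers instead of failing: when the leader znode already exists (a stale ephemeral left by a dead leader), the NodeExistsException handler deletes the node and retries the create once. A minimal sketch of that delete-and-retry shape, using hypothetical KvStore and NodeExistsException types in place of the ZooKeeper client:

    class LeaderNodeSketch {
      static class NodeExistsException extends Exception {}

      interface KvStore {
        void createEphemeral(String path, byte[] data) throws NodeExistsException;
        void delete(String path);
      }

      static void claimLeadership(KvStore store, String path, byte[] props)
          throws NodeExistsException {
        try {
          store.createEphemeral(path, props);
        } catch (NodeExistsException e) {
          // A previous leader's ephemeral may linger; remove it and retry once.
          store.delete(path);
          store.createEphemeral(path, props); // a second failure propagates
        }
      }
    }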
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { if (cc != null) { String coreName = leaderProps.get(ZkStateReader.CORE_NAME_PROP); SolrCore core = null; try { // the first time we are run, we will get a startupCore - after // we will get null and must use cc.getCore core = cc.getCore(coreName); if (core == null) { cancelElection(); throw new SolrException(ErrorCode.SERVER_ERROR, "Fatal Error, SolrCore not found:" + coreName + " in " + cc.getCoreNames()); } // should I be leader? if (weAreReplacement && !shouldIBeLeader(leaderProps)) { // System.out.println("there is a better leader candidate it appears"); rejoinLeaderElection(leaderSeqPath, core); return; } if (weAreReplacement) { if (zkClient.exists(leaderPath, true)) { zkClient.delete(leaderPath, -1, true); } // System.out.println("I may be the new Leader:" + leaderPath // + " - I need to try and sync"); boolean success = syncStrategy.sync(zkController, core, leaderProps); if (!success && anyoneElseActive()) { rejoinLeaderElection(leaderSeqPath, core); return; } } // If I am going to be the leader I have to be active // System.out.println("I am leader go active"); core.getUpdateHandler().getSolrCoreState().cancelRecovery(); zkController.publish(core.getCoreDescriptor(), ZkStateReader.ACTIVE); } finally { if (core != null ) { core.close(); } } } super.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
private void rejoinLeaderElection(String leaderSeqPath, SolrCore core) throws InterruptedException, KeeperException, IOException { // remove our ephemeral and re join the election // System.out.println("sync failed, delete our election node:" // + leaderSeqPath); zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); cancelElection(); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, core.getName()); leaderElector.joinElection(this); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
Override public void parseProperties(Properties zkProp) throws IOException, ConfigException { for (Entry<Object, Object> entry : zkProp.entrySet()) { String key = entry.getKey().toString().trim(); String value = entry.getValue().toString().trim(); if (key.equals("dataDir")) { dataDir = value; } else if (key.equals("dataLogDir")) { dataLogDir = value; } else if (key.equals("clientPort")) { setClientPort(Integer.parseInt(value)); } else if (key.equals("tickTime")) { tickTime = Integer.parseInt(value); } else if (key.equals("initLimit")) { initLimit = Integer.parseInt(value); } else if (key.equals("syncLimit")) { syncLimit = Integer.parseInt(value); } else if (key.equals("electionAlg")) { electionAlg = Integer.parseInt(value); } else if (key.equals("maxClientCnxns")) { maxClientCnxns = Integer.parseInt(value); } else if (key.startsWith("server.")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); String parts[] = value.split(":"); if ((parts.length != 2) && (parts.length != 3)) { LOG.error(value + " does not have the form host:port or host:port:port"); } InetSocketAddress addr = new InetSocketAddress(parts[0], Integer.parseInt(parts[1])); if (parts.length == 2) { servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr)); } else if (parts.length == 3) { InetSocketAddress electionAddr = new InetSocketAddress( parts[0], Integer.parseInt(parts[2])); servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr, electionAddr)); } } else if (key.startsWith("group")) { int dot = key.indexOf('.'); long gid = Long.parseLong(key.substring(dot + 1)); numGroups++; String parts[] = value.split(":"); for(String s : parts){ long sid = Long.parseLong(s); if(serverGroup.containsKey(sid)) throw new ConfigException("Server " + sid + " is in multiple groups"); else serverGroup.put(sid, gid); } } else if(key.startsWith("weight")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); serverWeight.put(sid, Long.parseLong(value)); } else { System.setProperty("zookeeper." + key, value); } } if (dataDir == null) { throw new IllegalArgumentException("dataDir is not set"); } if (dataLogDir == null) { dataLogDir = dataDir; } else { if (!new File(dataLogDir).isDirectory()) { throw new IllegalArgumentException("dataLogDir " + dataLogDir + " is missing."); } } if (tickTime == 0) { throw new IllegalArgumentException("tickTime is not set"); } if (servers.size() > 1) { if (initLimit == 0) { throw new IllegalArgumentException("initLimit is not set"); } if (syncLimit == 0) { throw new IllegalArgumentException("syncLimit is not set"); } /* * If using FLE, then every server requires a separate election * port. */ if (electionAlg != 0) { for (QuorumPeer.QuorumServer s : servers.values()) { if (s.electionAddr == null) throw new IllegalArgumentException( "Missing election port for server: " + s.id); } } /* * Default of quorum config is majority */ if(serverGroup.size() > 0){ if(servers.size() != serverGroup.size()) throw new ConfigException("Every server must be in exactly one group"); /* * The default weight of a server is 1 */ for(QuorumPeer.QuorumServer s : servers.values()){ if(!serverWeight.containsKey(s.id)) serverWeight.put(s.id, (long) 1); } /* * Set the quorumVerifier to be QuorumHierarchical */ quorumVerifier = new QuorumHierarchical(numGroups, serverWeight, serverGroup); } else { /* * The default QuorumVerifier is QuorumMaj */ LOG.info("Defaulting to majority quorums"); quorumVerifier = new QuorumMaj(servers.size()); } File myIdFile = new File(dataDir, "myid"); if (!myIdFile.exists()) { ///////////////// ADDED FOR SOLR ////// Long myid = getMySeverId(); if (myid != null) { serverId = myid; return; } if (zkRun == null) return; //////////////// END ADDED FOR SOLR ////// throw new IllegalArgumentException(myIdFile.toString() + " file is missing"); } BufferedReader br = new BufferedReader(new FileReader(myIdFile)); String myIdString; try { myIdString = br.readLine(); } finally { br.close(); } try { serverId = Long.parseLong(myIdString); } catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); } } }
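parseProperties() above validates eagerly: each missing or inconsistent setting surfaces immediately, as IllegalArgumentException for caller errors or as the checked ConfigException for quorum misconfiguration. A condensed sketch of that fail-fast style (hypothetical fields; ConfigException stands in for the ZooKeeper one):

    class ZkConfigSketch {
      static class ConfigException extends Exception {
        ConfigException(String msg) { super(msg); }
      }

      String dataDir;
      String dataLogDir;
      int tickTime;
      int serverCount;
      int groupedServerCount;

      void validate() throws ConfigException {
        if (dataDir == null) throw new IllegalArgumentException("dataDir is not set");
        if (dataLogDir == null) dataLogDir = dataDir; // defaulting, not an error
        if (tickTime == 0) throw new IllegalArgumentException("tickTime is not set");
        if (groupedServerCount > 0 && serverCount != groupedServerCount) {
          throw new ConfigException("Every server must be in exactly one group");
        }
      }
    }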
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void replicate(String nodeName, SolrCore core, ZkNodeProps leaderprops, String baseUrl) throws SolrServerException, IOException { String leaderBaseUrl = leaderprops.get(ZkStateReader.BASE_URL_PROP); ZkCoreNodeProps leaderCNodeProps = new ZkCoreNodeProps(leaderprops); String leaderUrl = leaderCNodeProps.getCoreUrl(); log.info("Attempting to replicate from " + leaderUrl); // if we are the leader, either we are trying to recover faster // then our ephemeral timed out or we are the only node if (!leaderBaseUrl.equals(baseUrl)) { // send commit commitOnLeader(leaderUrl); // use rep handler directly, so we can do this sync rather than async SolrRequestHandler handler = core.getRequestHandler(REPLICATION_HANDLER); if (handler instanceof LazyRequestHandlerWrapper) { handler = ((LazyRequestHandlerWrapper)handler).getWrappedHandler(); } ReplicationHandler replicationHandler = (ReplicationHandler) handler; if (replicationHandler == null) { throw new SolrException(ErrorCode.SERVICE_UNAVAILABLE, "Skipping recovery, no " + REPLICATION_HANDLER + " handler found"); } ModifiableSolrParams solrParams = new ModifiableSolrParams(); solrParams.set(ReplicationHandler.MASTER_URL, leaderUrl + "replication"); if (isClosed()) retries = INTERRUPTED; boolean success = replicationHandler.doFetch(solrParams, true); // TODO: look into making sure force=true does not download files we already have if (!success) { throw new SolrException(ErrorCode.SERVER_ERROR, "Replication for recovery failed."); } // solrcloud_debug // try { // RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replicated " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " from " + leaderUrl + " gen:" + core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void commitOnLeader(String leaderUrl) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderUrl); server.setConnectionTimeout(30000); server.setSoTimeout(30000); UpdateRequest ureq = new UpdateRequest(); ureq.setParams(new ModifiableSolrParams()); ureq.getParams().set(DistributedUpdateProcessor.COMMIT_END_POINT, true); ureq.setAction(AbstractUpdateRequest.ACTION.COMMIT, false, true).process( server); server.shutdown(); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void sendPrepRecoveryCmd(String leaderBaseUrl, String leaderCoreName) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderBaseUrl); server.setConnectionTimeout(45000); server.setSoTimeout(45000); WaitForState prepCmd = new WaitForState(); prepCmd.setCoreName(leaderCoreName); prepCmd.setNodeName(zkController.getNodeName()); prepCmd.setCoreNodeName(coreZkNodeName); prepCmd.setState(ZkStateReader.RECOVERING); prepCmd.setCheckLive(true); prepCmd.setPauseFor(6000); server.request(prepCmd); server.shutdown(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private String getHostAddress(String host) throws IOException { if (host == null) { host = "http://" + InetAddress.getLocalHost().getHostName(); } else { Matcher m = URL_PREFIX.matcher(host); if (m.matches()) { String prefix = m.group(1); host = prefix + host; } else { host = "http://" + host; } } return host; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String readConfigName(String collection) throws KeeperException, InterruptedException, IOException { String configName = null; String path = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; if (log.isInfoEnabled()) { log.info("Load collection config from:" + path); } byte[] data = zkClient.getData(path, null, null, true); if(data != null) { ZkNodeProps props = ZkNodeProps.load(data); configName = props.get(CONFIGNAME_PROP); } if (configName != null && !zkClient.exists(CONFIGS_ZKNODE + "/" + configName, true)) { log.error("Specified config does not exist in ZooKeeper:" + configName); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Specified config does not exist in ZooKeeper:" + configName); } return configName; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void joinElection(CoreDescriptor cd) throws InterruptedException, KeeperException, IOException { String shardId = cd.getCloudDescriptor().getShardId(); Map<String,String> props = new HashMap<String,String>(); // we only put a subset of props into the leader node props.put(ZkStateReader.BASE_URL_PROP, getBaseUrl()); props.put(ZkStateReader.CORE_NAME_PROP, cd.getName()); props.put(ZkStateReader.NODE_NAME_PROP, getNodeName()); final String coreZkNodeName = getNodeName() + "_" + cd.getName(); ZkNodeProps ourProps = new ZkNodeProps(props); String collection = cd.getCloudDescriptor() .getCollectionName(); ElectionContext context = new ShardLeaderElectionContext(leaderElector, shardId, collection, coreZkNodeName, ourProps, this, cc); leaderElector.setup(context); electionContexts.put(coreZkNodeName, context); leaderElector.joinElection(context); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private boolean checkRecovery(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores, final boolean isLeader, final CloudDescriptor cloudDesc, final String collection, final String shardZkNodeName, String shardId, ZkNodeProps leaderProps, SolrCore core, CoreContainer cc) throws InterruptedException, KeeperException, IOException, ExecutionException { if (SKIP_AUTO_RECOVERY) { log.warn("Skipping recovery according to sys prop solrcloud.skip.autorecovery"); return false; } boolean doRecovery = true; if (!isLeader) { if (core.isReloaded() && !recoverReloadedCores) { doRecovery = false; } if (doRecovery) { log.info("Core needs to recover:" + core.getName()); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, coreName); return true; } } else { log.info("I am the leader, no recovery necessary"); } return false; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadToZK(File dir, String zkPath) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, zkPath); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadConfigDir(File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void createCollectionZkNode(CloudDescriptor cd) throws KeeperException, InterruptedException, IOException { String collection = cd.getCollectionName(); log.info("Check for collection zkNode:" + collection); String collectionPath = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; try { if(!zkClient.exists(collectionPath, true)) { log.info("Creating collection in ZooKeeper:" + collection); SolrParams params = cd.getParams(); try { Map<String,String> collectionProps = new HashMap<String,String>(); // TODO: if collection.configName isn't set, and there isn't already a conf in zk, just use that? String defaultConfigName = System.getProperty(COLLECTION_PARAM_PREFIX+CONFIGNAME_PROP, collection); // params passed in - currently only done via core admin (create core commmand). if (params != null) { Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String paramName = iter.next(); if (paramName.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(paramName.substring(COLLECTION_PARAM_PREFIX.length()), params.get(paramName)); } } // if the config name wasn't passed in, use the default if (!collectionProps.containsKey(CONFIGNAME_PROP)) getConfName(collection, collectionPath, collectionProps); } else if(System.getProperty("bootstrap_confdir") != null) { // if we are bootstrapping a collection, default the config for // a new collection to the collection we are bootstrapping log.info("Setting config for collection:" + collection + " to " + defaultConfigName); Properties sysProps = System.getProperties(); for (String sprop : System.getProperties().stringPropertyNames()) { if (sprop.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(sprop.substring(COLLECTION_PARAM_PREFIX.length()), sysProps.getProperty(sprop)); } } // if the config name wasn't passed in, use the default if (!collectionProps.containsKey(CONFIGNAME_PROP)) collectionProps.put(CONFIGNAME_PROP, defaultConfigName); } else if (Boolean.getBoolean("bootstrap_conf")) { // the conf name should should be the collection name of this core collectionProps.put(CONFIGNAME_PROP, cd.getCollectionName()); } else { getConfName(collection, collectionPath, collectionProps); } ZkNodeProps zkProps = new ZkNodeProps(collectionProps); zkClient.makePath(collectionPath, ZkStateReader.toJSON(zkProps), CreateMode.PERSISTENT, null, true); // ping that there is a new collection zkClient.setData(ZkStateReader.COLLECTIONS_ZKNODE, (byte[])null, true); } catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } } else { log.info("Collection zkNode exists"); } } catch (KeeperException e) { // its okay if another beats us creating the node if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } }
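createCollectionZkNode() above swallows selectively, twice: a caught KeeperException is rethrown unless its code is NODEEXISTS, because losing the create race to another node is expected and benign. Reduced to its shape, with hypothetical Zk and KeeperException types mirroring the ZooKeeper API:

    class CreateIfAbsentSketch {
      enum Code { NODEEXISTS, CONNECTIONLOSS, OTHER }

      static class KeeperException extends Exception {
        final Code code;
        KeeperException(Code code) { this.code = code; }
        Code code() { return code; }
      }

      interface Zk { void makePath(String path) throws KeeperException; }

      static void ensurePath(Zk zk, String path) throws KeeperException {
        try {
          zk.makePath(path);
        } catch (KeeperException e) {
          // Losing the creation race is fine; anything else propagates.
          if (e.code() != Code.NODEEXISTS) throw e;
        }
      }
    }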
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadToZK(SolrZkClient zkClient, File dir, String zkPath) throws IOException, KeeperException, InterruptedException { File[] files = dir.listFiles(); if (files == null) { throw new IllegalArgumentException("Illegal directory: " + dir); } for(File file : files) { if (!file.getName().startsWith(".")) { if (!file.isDirectory()) { zkClient.makePath(zkPath + "/" + file.getName(), file, false, true); } else { uploadToZK(zkClient, file, zkPath + "/" + file.getName()); } } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadConfigDir(SolrZkClient zkClient, File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void bootstrapConf(SolrZkClient zkClient, Config cfg, String solrHome) throws IOException, KeeperException, InterruptedException { NodeList nodes = (NodeList)cfg.evaluate("solr/cores/core", XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); String rawName = DOMUtil.getAttr(node, "name", null); String instanceDir = DOMUtil.getAttr(node, "instanceDir", null); File idir = new File(instanceDir); if (!idir.isAbsolute()) { idir = new File(solrHome, instanceDir); } String confName = DOMUtil.getAttr(node, "collection", null); if (confName == null) { confName = rawName; } ZkController.uploadConfigDir(zkClient, new File(idir, "conf"), confName); } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
private void checkIfIamLeader(final int seq, final ElectionContext context, boolean replacement) throws KeeperException, InterruptedException, IOException { // get all other numbers... final String holdElectionPath = context.electionPath + ELECTION_NODE; List<String> seqs = zkClient.getChildren(holdElectionPath, null, true); sortSeqs(seqs); List<Integer> intSeqs = getSeqs(seqs); if (seq <= intSeqs.get(0)) { runIamLeaderProcess(context, replacement); } else { // I am not the leader - watch the node below me int i = 1; for (; i < intSeqs.size(); i++) { int s = intSeqs.get(i); if (seq < s) { // we found who we come before - watch the guy in front break; } } int index = i - 2; if (index < 0) { log.warn("Our node is no longer in line to be leader"); return; } try { zkClient.getData(holdElectionPath + "/" + seqs.get(index), new Watcher() { @Override public void process(WatchedEvent event) { // am I the next leader? try { checkIfIamLeader(seq, context, true); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); } catch (IOException e) { log.warn("", e); } catch (Exception e) { log.warn("", e); } } }, null, true); } catch (KeeperException.SessionExpiredException e) { throw e; } catch (KeeperException e) { // we couldn't set our watch - the node before us may already be down? // we need to check if we are the leader again checkIfIamLeader(seq, context, true); } } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
protected void runIamLeaderProcess(final ElectionContext context, boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { context.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public int joinElection(ElectionContext context) throws KeeperException, InterruptedException, IOException { final String shardsElectZkPath = context.electionPath + LeaderElector.ELECTION_NODE; long sessionId = zkClient.getSolrZooKeeper().getSessionId(); String id = sessionId + "-" + context.id; String leaderSeqPath = null; boolean cont = true; int tries = 0; while (cont) { try { leaderSeqPath = zkClient.create(shardsElectZkPath + "/" + id + "-n_", null, CreateMode.EPHEMERAL_SEQUENTIAL, false); context.leaderSeqPath = leaderSeqPath; cont = false; } catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } } catch (KeeperException.NoNodeException e) { // we must have failed in creating the election node - someone else must // be working on it, lets try again if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); } } int seq = getSeq(leaderSeqPath); checkIfIamLeader(seq, context, false); return seq; }
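joinElection() above distinguishes three outcomes of a failed create: ConnectionLossException triggers a lookup to check whether the node was actually created, NoNodeException is retried up to ten times with a 50 ms sleep, and persistent failure is wrapped into the domain ZooKeeperException with ErrorCode.SERVER_ERROR. A sketch of just the bounded retry-then-wrap arm, with hypothetical types:

    class JoinElectionSketch {
      static class NoNodeException extends Exception {}

      // Stand-in for the application's ZooKeeperException (a runtime exception).
      static class ZooKeeperException extends RuntimeException {
        ZooKeeperException(Throwable cause) { super(cause); }
      }

      interface Zk { String create(String path) throws NoNodeException; }

      static String createWithRetry(Zk zk, String path) throws InterruptedException {
        int tries = 0;
        while (true) {
          try {
            return zk.create(path);
          } catch (NoNodeException e) {
            if (tries++ > 9) throw new ZooKeeperException(e); // give up: domain error
            Thread.sleep(50); // brief pause, then retry
          }
        }
      }
    }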
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
private void deleteAll() throws IOException { SolrCore.log.info(core.getLogId()+"REMOVING ALL DOCUMENTS FROM INDEX"); solrCoreState.getIndexWriter(core).deleteAll(); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
protected void rollbackWriter() throws IOException { numDocsPending.set(0); solrCoreState.rollbackIndexWriter(core); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
Override public int addDoc(AddUpdateCommand cmd) throws IOException { IndexWriter writer = solrCoreState.getIndexWriter(core); addCommands.incrementAndGet(); addCommandsCumulative.incrementAndGet(); int rc=-1; // if there is no ID field, don't overwrite if( idField == null ) { cmd.overwrite = false; } try { if (cmd.overwrite) { Term updateTerm; Term idTerm = new Term(idField.getName(), cmd.getIndexedId()); boolean del = false; if (cmd.updateTerm == null) { updateTerm = idTerm; } else { del = true; updateTerm = cmd.updateTerm; } Document luceneDocument = cmd.getLuceneDocument(); // SolrCore.verbose("updateDocument",updateTerm,luceneDocument,writer); writer.updateDocument(updateTerm, luceneDocument); // SolrCore.verbose("updateDocument",updateTerm,"DONE"); if(del) { // ensure id remains unique BooleanQuery bq = new BooleanQuery(); bq.add(new BooleanClause(new TermQuery(updateTerm), Occur.MUST_NOT)); bq.add(new BooleanClause(new TermQuery(idTerm), Occur.MUST)); writer.deleteDocuments(bq); } } else { // allow duplicates writer.addDocument(cmd.getLuceneDocument()); } // Add to the transaction log *after* successfully adding to the index, if there was no error. // This ordering ensures that if we log it, it's definitely been added to the the index. // This also ensures that if a commit sneaks in-between, that we know everything in a particular // log version was definitely committed. if (ulog != null) ulog.add(cmd); if ((cmd.getFlags() & UpdateCommand.IGNORE_AUTOCOMMIT) == 0) { commitTracker.addedDocument( -1 ); softCommitTracker.addedDocument( cmd.commitWithin ); } rc = 1; } finally { if (rc!=1) { numErrors.incrementAndGet(); numErrorsCumulative.incrementAndGet(); } else { numDocsPending.incrementAndGet(); } } return rc; }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
Override public void delete(DeleteUpdateCommand cmd) throws IOException { deleteByIdCommands.incrementAndGet(); deleteByIdCommandsCumulative.incrementAndGet(); IndexWriter writer = solrCoreState.getIndexWriter(core); Term deleteTerm = new Term(idField.getName(), cmd.getIndexedId()); // SolrCore.verbose("deleteDocuments",deleteTerm,writer); writer.deleteDocuments(deleteTerm); // SolrCore.verbose("deleteDocuments",deleteTerm,"DONE"); if (ulog != null) ulog.delete(cmd); updateDeleteTrackers(cmd); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
Override public void deleteByQuery(DeleteUpdateCommand cmd) throws IOException { deleteByQueryCommands.incrementAndGet(); deleteByQueryCommandsCumulative.incrementAndGet(); boolean madeIt=false; try { Query q; try { // TODO: move this higher in the stack? QParser parser = QParser.getParser(cmd.query, "lucene", cmd.req); q = parser.getQuery(); q = QueryUtils.makeQueryable(q); // peer-sync can cause older deleteByQueries to be executed and could // delete newer documents. We prevent this by adding a clause restricting // version. if ((cmd.getFlags() & UpdateCommand.PEER_SYNC) != 0) { BooleanQuery bq = new BooleanQuery(); bq.add(q, Occur.MUST); SchemaField sf = core.getSchema().getField(VersionInfo.VERSION_FIELD); ValueSource vs = sf.getType().getValueSource(sf, null); ValueSourceRangeFilter filt = new ValueSourceRangeFilter(vs, null, Long.toString(Math.abs(cmd.version)), true, true); FunctionRangeQuery range = new FunctionRangeQuery(filt); bq.add(range, Occur.MUST); q = bq; } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } boolean delAll = MatchAllDocsQuery.class == q.getClass(); // // synchronized to prevent deleteByQuery from running during the "open new searcher" // part of a commit. DBQ needs to signal that a fresh reader will be needed for // a realtime view of the index. When a new searcher is opened after a DBQ, that // flag can be cleared. If those thing happen concurrently, it's not thread safe. // synchronized (this) { if (delAll) { deleteAll(); } else { solrCoreState.getIndexWriter(core).deleteDocuments(q); } if (ulog != null) ulog.deleteByQuery(cmd); } madeIt = true; updateDeleteTrackers(cmd); } finally { if (!madeIt) { numErrors.incrementAndGet(); numErrorsCumulative.incrementAndGet(); } } }
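deleteByQuery() above combines two idioms that recur in DirectUpdateHandler2: the checked ParseException is rethrown at once as SolrException(ErrorCode.BAD_REQUEST) so a malformed client query surfaces as a 400 rather than a 500, and a madeIt flag plus finally keeps the error counters accurate on every exit path. A compact sketch with hypothetical parse/execute methods:

    import java.util.concurrent.atomic.AtomicLong;
    import org.apache.solr.common.SolrException;

    class DeleteByQuerySketch {
      static class ParseException extends Exception {}

      static final AtomicLong numErrors = new AtomicLong();

      static void deleteByQuery(String q) {
        boolean madeIt = false;
        try {
          Object query;
          try {
            query = parse(q);
          } catch (ParseException e) {
            // The client sent a bad query: report BAD_REQUEST, not SERVER_ERROR.
            throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
          }
          execute(query);
          madeIt = true;
        } finally {
          if (!madeIt) numErrors.incrementAndGet(); // counted on any failure path
        }
      }

      static Object parse(String q) throws ParseException { return q; }
      static void execute(Object query) {}
    }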
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
Override public int mergeIndexes(MergeIndexesCommand cmd) throws IOException { mergeIndexesCommands.incrementAndGet(); int rc; log.info("start " + cmd); IndexReader[] readers = cmd.readers; if (readers != null && readers.length > 0) { solrCoreState.getIndexWriter(core).addIndexes(readers); rc = 1; } else { rc = 0; } log.info("end_mergeIndexes"); // TODO: consider soft commit issues if (rc == 1 && commitTracker.getTimeUpperBound() > 0) { commitTracker.scheduleCommitWithin(commitTracker.getTimeUpperBound()); } else if (rc == 1 && softCommitTracker.getTimeUpperBound() > 0) { softCommitTracker.scheduleCommitWithin(softCommitTracker.getTimeUpperBound()); } return rc; }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
public void prepareCommit(CommitUpdateCommand cmd) throws IOException { boolean error=true; try { log.info("start "+cmd); IndexWriter writer = solrCoreState.getIndexWriter(core); writer.prepareCommit(); log.info("end_prepareCommit"); error=false; } finally { if (error) numErrors.incrementAndGet(); } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
Override public void commit(CommitUpdateCommand cmd) throws IOException { if (cmd.prepareCommit) { prepareCommit(cmd); return; } IndexWriter writer = solrCoreState.getIndexWriter(core); if (cmd.optimize) { optimizeCommands.incrementAndGet(); } else { commitCommands.incrementAndGet(); if (cmd.expungeDeletes) expungeDeleteCommands.incrementAndGet(); } Future[] waitSearcher = null; if (cmd.waitSearcher) { waitSearcher = new Future[1]; } boolean error=true; try { // only allow one hard commit to proceed at once if (!cmd.softCommit) { commitLock.lock(); } log.info("start "+cmd); // We must cancel pending commits *before* we actually execute the commit. if (cmd.openSearcher) { // we can cancel any pending soft commits if this commit will open a new searcher softCommitTracker.cancelPendingCommit(); } if (!cmd.softCommit && (cmd.openSearcher || !commitTracker.getOpenSearcher())) { // cancel a pending hard commit if this commit is of equal or greater "strength"... // If the autoCommit has openSearcher=true, then this commit must have openSearcher=true // to cancel. commitTracker.cancelPendingCommit(); } if (cmd.optimize) { writer.forceMerge(cmd.maxOptimizeSegments); } else if (cmd.expungeDeletes) { writer.forceMergeDeletes(); } if (!cmd.softCommit) { synchronized (this) { // sync is currently needed to prevent preCommit from being called between preSoft and postSoft... see postSoft comments. if (ulog != null) ulog.preCommit(cmd); } // SolrCore.verbose("writer.commit() start writer=",writer); final Map<String,String> commitData = new HashMap<String,String>(); commitData.put(SolrIndexWriter.COMMIT_TIME_MSEC_KEY, String.valueOf(System.currentTimeMillis())); writer.commit(commitData); // SolrCore.verbose("writer.commit() end"); numDocsPending.set(0); callPostCommitCallbacks(); } else { callPostSoftCommitCallbacks(); } if (cmd.optimize) { callPostOptimizeCallbacks(); } if (cmd.softCommit) { // ulog.preSoftCommit(); synchronized (this) { if (ulog != null) ulog.preSoftCommit(cmd); core.getSearcher(true, false, waitSearcher, true); if (ulog != null) ulog.postSoftCommit(cmd); } // ulog.postSoftCommit(); } else { synchronized (this) { if (ulog != null) ulog.preSoftCommit(cmd); if (cmd.openSearcher) { core.getSearcher(true, false, waitSearcher); } else { // force open a new realtime searcher so realtime-get and versioning code can see the latest RefCounted<SolrIndexSearcher> searchHolder = core.openNewSearcher(true, true); searchHolder.decref(); } if (ulog != null) ulog.postSoftCommit(cmd); } if (ulog != null) ulog.postCommit(cmd); // postCommit currently means new searcher has // also been opened } // reset commit tracking if (cmd.softCommit) { softCommitTracker.didCommit(); } else { commitTracker.didCommit(); } log.info("end_commit_flush"); error=false; } finally { if (!cmd.softCommit) { commitLock.unlock(); } addCommands.set(0); deleteByIdCommands.set(0); deleteByQueryCommands.set(0); if (error) numErrors.incrementAndGet(); } // if we are supposed to wait for the searcher to be registered, then we should do it // outside any synchronized block so that other update operations can proceed. if (waitSearcher!=null && waitSearcher[0] != null) { try { waitSearcher[0].get(); } catch (InterruptedException e) { SolrException.log(log,e); } catch (ExecutionException e) { SolrException.log(log,e); } } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void newIndexWriter() throws IOException { solrCoreState.newIndexWriter(core); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void rollback(RollbackUpdateCommand cmd) throws IOException { rollbackCommands.incrementAndGet(); boolean error=true; try { log.info("start "+cmd); rollbackWriter(); //callPostRollbackCallbacks(); // reset commit tracking commitTracker.didRollback(); softCommitTracker.didRollback(); log.info("end_rollback"); error=false; } finally { addCommandsCumulative.set( addCommandsCumulative.get() - addCommands.getAndSet( 0 ) ); deleteByIdCommandsCumulative.set( deleteByIdCommandsCumulative.get() - deleteByIdCommands.getAndSet( 0 ) ); deleteByQueryCommandsCumulative.set( deleteByQueryCommandsCumulative.get() - deleteByQueryCommands.getAndSet( 0 ) ); if (error) numErrors.incrementAndGet(); } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void close() throws IOException { log.info("closing " + this); commitTracker.close(); softCommitTracker.close(); numDocsPending.set(0); solrCoreState.decref(this); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void closeWriter(IndexWriter writer) throws IOException { boolean clearRequestInfo = false; commitLock.lock(); try { SolrQueryRequest req = new LocalSolrQueryRequest(core, new ModifiableSolrParams()); SolrQueryResponse rsp = new SolrQueryResponse(); if (SolrRequestInfo.getRequestInfo() == null) { clearRequestInfo = true; SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); // important for debugging } if (!commitOnClose) { if (writer != null) { writer.rollback(); } // we shouldn't close the transaction logs either, but leaving them open // means we can't delete them on windows (needed for tests) if (ulog != null) ulog.close(false); return; } // do a commit before we quit? boolean tryToCommit = writer != null && ulog != null && ulog.hasUncommittedChanges() && ulog.getState() == UpdateLog.State.ACTIVE; try { if (tryToCommit) { CommitUpdateCommand cmd = new CommitUpdateCommand(req, false); cmd.openSearcher = false; cmd.waitSearcher = false; cmd.softCommit = false; // TODO: keep other commit callbacks from being called? // this.commit(cmd); // too many test failures using this method... is it because of callbacks? synchronized (this) { ulog.preCommit(cmd); } // todo: refactor this shared code (or figure out why a real CommitUpdateCommand can't be used) final Map<String,String> commitData = new HashMap<String,String>(); commitData.put(SolrIndexWriter.COMMIT_TIME_MSEC_KEY, String.valueOf(System.currentTimeMillis())); writer.commit(commitData); synchronized (this) { ulog.postCommit(cmd); } } } catch (Throwable th) { log.error("Error in final commit", th); } // we went through the normal process to commit, so we don't have to artificially // cap any ulog files. try { if (ulog != null) ulog.close(false); } catch (Throwable th) { log.error("Error closing log files", th); } if (writer != null) writer.close(); } finally { commitLock.unlock(); if (clearRequestInfo) SolrRequestInfo.clearRequestInfo(); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { // TODO: check for id field? int hash = 0; if (zkEnabled) { zkCheck(); hash = hash(cmd); nodes = setupRequest(hash); } else { isLeader = getNonZkLeaderAssumption(req); } boolean dropCmd = false; if (!forwardToLeader) { dropCmd = versionAdd(cmd); } if (dropCmd) { // TODO: do we need to add anything to the response? return; } ModifiableSolrParams params = null; if (nodes != null) { params = new ModifiableSolrParams(req.getParams()); params.set(DISTRIB_UPDATE_PARAM, (isLeader ? DistribPhase.FROMLEADER.toString() : DistribPhase.TOLEADER.toString())); params.remove("commit"); // this will be distributed from the local commit cmdDistrib.distribAdd(cmd, nodes, params); } // TODO: what to do when no idField? if (returnVersions && rsp != null && idField != null) { if (addsResponse == null) { addsResponse = new NamedList<String>(); rsp.add("adds",addsResponse); } if (scratch == null) scratch = new CharsRef(); idField.getType().indexedToReadable(cmd.getIndexedId(), scratch); addsResponse.add(scratch.toString(), cmd.getVersion()); } // TODO: keep track of errors? needs to be done at a higher level though since // an id may fail before it gets to this processor. // Given that, it may also make sense to move the version reporting out of this // processor too. }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private void doLocalAdd(AddUpdateCommand cmd) throws IOException { super.processAdd(cmd); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private void doLocalDelete(DeleteUpdateCommand cmd) throws IOException { super.processDelete(cmd); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private boolean versionAdd(AddUpdateCommand cmd) throws IOException { BytesRef idBytes = cmd.getIndexedId(); if (vinfo == null || idBytes == null) { super.processAdd(cmd); return false; } // This is only the hash for the bucket, and must be based only on the uniqueKey (i.e. do not use a pluggable hash here) int bucketHash = Hash.murmurhash3_x86_32(idBytes.bytes, idBytes.offset, idBytes.length, 0); // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find any existing version in the document // TODO: don't reuse update commands any more! long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { SolrInputField versionField = cmd.getSolrInputDocument().getField(VersionInfo.VERSION_FIELD); if (versionField != null) { Object o = versionField.getValue(); versionOnUpdate = o instanceof Number ? ((Number) o).longValue() : Long.parseLong(o.toString()); } else { // Find the version String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } } boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; VersionBucket bucket = vinfo.bucket(bucketHash); vinfo.lockForUpdate(); try { synchronized (bucket) { // we obtain the version when synchronized and then do the add so we can ensure that // if version1 < version2 then version1 is actually added before version2. // even if we don't store the version field, synchronizing on the bucket // will enable us to know what version happened first, and thus enable // realtime-get to work reliably. // TODO: if versions aren't stored, do we need to set on the cmd anyway for some reason? // there may be other reasons in the future for a version on the commands if (versionsStored) { long bucketVersion = bucket.highest; if (leaderLogic) { boolean updated = getUpdatedDocument(cmd); if (updated && versionOnUpdate == -1) { versionOnUpdate = 1; // implied "doc must exist" for now... } if (versionOnUpdate != 0) { Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); long foundVersion = lastVersion == null ? -1 : lastVersion; if ( versionOnUpdate == foundVersion || (versionOnUpdate < 0 && foundVersion < 0) || (versionOnUpdate==1 && foundVersion > 0) ) { // we're ok if versions match, or if both are negative (all missing docs are equal), or if cmd // specified it must exist (versionOnUpdate==1) and it does. } else { throw new SolrException(ErrorCode.CONFLICT, "version conflict for " + cmd.getPrintableId() + " expected=" + versionOnUpdate + " actual=" + foundVersion); } } long version = vinfo.getNewClock(); cmd.setVersion(version); cmd.getSolrInputDocument().setField(VersionInfo.VERSION_FIELD, version); bucket.updateHighest(version); } else { // The leader forwarded us this update. cmd.setVersion(versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.add(cmd); return true; } // if we aren't the leader, then we need to check that updates were not re-ordered if (bucketVersion != 0 && bucketVersion < versionOnUpdate) { // we're OK... this update has a version higher than anything we've seen // in this bucket so far, so we know that no reordering has yet occurred. bucket.updateHighest(versionOnUpdate); } else { // there have been updates higher than the current update. we need to check // the specific version for this id. Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); if (lastVersion != null && Math.abs(lastVersion) >= versionOnUpdate) { // This update is a repeat, or was reordered. We need to drop this update. return true; } } } } doLocalAdd(cmd); } // end synchronized (bucket) } finally { vinfo.unlockForUpdate(); } return false; }
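The leader-side branch of versionAdd() is optimistic concurrency control: the version supplied with the update must match the last version recorded for the id (negative values mean "not found", 1 means "must exist"), and a mismatch is surfaced as SolrException(ErrorCode.CONFLICT). A simplified sketch of the core check, ignoring the special cases; IllegalStateException stands in for SolrException and the store is illustrative:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    class OptimisticVersionStore {
      private final Map<String, Long> lastVersion = new ConcurrentHashMap<>();

      // expectedVersion == 0 means "no check requested"
      void update(String id, long expectedVersion, long newVersion) {
        long found = lastVersion.getOrDefault(id, -1L);
        if (expectedVersion != 0 && expectedVersion != found) {
          throw new IllegalStateException("version conflict for " + id
              + " expected=" + expectedVersion + " actual=" + found);
        }
        lastVersion.put(id, newVersion);
      }
    }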
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
boolean getUpdatedDocument(AddUpdateCommand cmd) throws IOException { SolrInputDocument sdoc = cmd.getSolrInputDocument(); boolean update = false; for (SolrInputField sif : sdoc.values()) { if (sif.getValue() instanceof Map) { update = true; break; } } if (!update) return false; BytesRef id = cmd.getIndexedId(); SolrInputDocument oldDoc = RealTimeGetComponent.getInputDocument(cmd.getReq().getCore(), id); if (oldDoc == null) { // not found... allow this in the future (depending on the details of the update, or if the user explicitly sets it). // could also just not change anything here and let the optimistic locking throw the error throw new SolrException(ErrorCode.CONFLICT, "Document not found for update. id=" + cmd.getPrintableId()); } oldDoc.remove(VERSION_FIELD); for (SolrInputField sif : sdoc.values()) { Object val = sif.getValue(); if (val instanceof Map) { for (Entry<String,Object> entry : ((Map<String,Object>) val).entrySet()) { String key = entry.getKey(); Object fieldVal = entry.getValue(); if ("add".equals(key)) { oldDoc.addField( sif.getName(), fieldVal, sif.getBoost()); } else if ("set".equals(key)) { oldDoc.setField(sif.getName(), fieldVal, sif.getBoost()); } else if ("inc".equals(key)) { SolrInputField numericField = oldDoc.get(sif.getName()); if (numericField == null) { oldDoc.setField(sif.getName(), fieldVal, sif.getBoost()); } else { // TODO: fieldtype needs externalToObject? String oldValS = numericField.getFirstValue().toString(); SchemaField sf = cmd.getReq().getSchema().getField(sif.getName()); BytesRef term = new BytesRef(); sf.getType().readableToIndexed(oldValS, term); Object oldVal = sf.getType().toObject(sf, term); String fieldValS = fieldVal.toString(); Number result; if (oldVal instanceof Long) { result = ((Long) oldVal).longValue() + Long.parseLong(fieldValS); } else if (oldVal instanceof Float) { result = ((Float) oldVal).floatValue() + Float.parseFloat(fieldValS); } else if (oldVal instanceof Double) { result = ((Double) oldVal).doubleValue() + Double.parseDouble(fieldValS); } else { // int, short, byte result = ((Integer) oldVal).intValue() + Integer.parseInt(fieldValS); } oldDoc.setField(sif.getName(), result, sif.getBoost()); } } } } else { // normal fields are treated as a "set" oldDoc.put(sif.getName(), sif); } } cmd.solrDoc = oldDoc; return true; }
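For the "inc" case above, the stored value is decoded back to a Number and the increment is parsed at the matching width. The arithmetic dispatch in isolation, assuming the old value has already been decoded:

    class AtomicIncSketch {
      static Number inc(Number oldVal, String delta) {
        if (oldVal instanceof Long)   return oldVal.longValue()   + Long.parseLong(delta);
        if (oldVal instanceof Float)  return oldVal.floatValue()  + Float.parseFloat(delta);
        if (oldVal instanceof Double) return oldVal.doubleValue() + Double.parseDouble(delta);
        return oldVal.intValue() + Integer.parseInt(delta);  // int, short, byte
      }
    }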
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
@Override public void processDelete(DeleteUpdateCommand cmd) throws IOException { if (!cmd.isDeleteById()) { doDeleteByQuery(cmd); return; } int hash = 0; if (zkEnabled) { zkCheck(); hash = hash(cmd); nodes = setupRequest(hash); } else { isLeader = getNonZkLeaderAssumption(req); } boolean dropCmd = false; if (!forwardToLeader) { dropCmd = versionDelete(cmd); } if (dropCmd) { // TODO: do we need to add anything to the response? return; } ModifiableSolrParams params = null; if (nodes != null) { params = new ModifiableSolrParams(req.getParams()); params.set(DISTRIB_UPDATE_PARAM, (isLeader ? DistribPhase.FROMLEADER.toString() : DistribPhase.TOLEADER.toString())); params.remove("commit"); // we already will have forwarded this from our local commit cmdDistrib.distribDelete(cmd, nodes, params); } // cmd.getIndexId == null when delete by query // TODO: what to do when no idField? if (returnVersions && rsp != null && cmd.getIndexedId() != null && idField != null) { if (deleteResponse == null) { deleteResponse = new NamedList<String>(); rsp.add("deletes",deleteResponse); } if (scratch == null) scratch = new CharsRef(); idField.getType().indexedToReadable(cmd.getIndexedId(), scratch); deleteResponse.add(scratch.toString(), cmd.getVersion()); // we're returning the version of the delete.. not the version of the doc we deleted. } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
public void doDeleteByQuery(DeleteUpdateCommand cmd) throws IOException { // even in non zk mode, tests simulate updates from a leader if(!zkEnabled) { isLeader = getNonZkLeaderAssumption(req); } else { zkCheck(); } // NONE: we are the first to receive this deleteByQuery // - it must be forwarded to the leader of every shard // TO: we are a leader receiving a forwarded deleteByQuery... we must: // - block all updates (use VersionInfo) // - flush *all* updates going to our replicas // - forward the DBQ to our replicas and wait for the response // - log + execute the local DBQ // FROM: we are a replica receiving a DBQ from our leader // - log + execute the local DBQ DistribPhase phase = DistribPhase.parseParam(req.getParams().get(DISTRIB_UPDATE_PARAM)); if (zkEnabled && DistribPhase.NONE == phase) { boolean leaderForAnyShard = false; // start off by assuming we are not a leader for any shard Map<String,Slice> slices = zkController.getCloudState().getSlices(collection); if (slices == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Cannot find collection:" + collection + " in " + zkController.getCloudState().getCollections()); } ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(DISTRIB_UPDATE_PARAM, DistribPhase.TOLEADER.toString()); List<Node> leaders = new ArrayList<Node>(slices.size()); for (Map.Entry<String,Slice> sliceEntry : slices.entrySet()) { String sliceName = sliceEntry.getKey(); ZkNodeProps leaderProps; try { leaderProps = zkController.getZkStateReader().getLeaderProps(collection, sliceName); } catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); } // TODO: What if leaders changed in the meantime? // should we send out slice-at-a-time and if a node returns "hey, I'm not a leader" (or we get an error because it went down) then look up the new leader? // Am I the leader for this slice? ZkCoreNodeProps coreLeaderProps = new ZkCoreNodeProps(leaderProps); String leaderNodeName = coreLeaderProps.getCoreNodeName(); String coreName = req.getCore().getName(); String coreNodeName = zkController.getNodeName() + "_" + coreName; isLeader = coreNodeName.equals(leaderNodeName); if (isLeader) { // don't forward to ourself leaderForAnyShard = true; } else { leaders.add(new StdNode(coreLeaderProps)); } } params.remove("commit"); // this will be distributed from the local commit cmdDistrib.distribDelete(cmd, leaders, params); if (!leaderForAnyShard) { return; } // change the phase to TOLEADER so we look up and forward to our own replicas (if any) phase = DistribPhase.TOLEADER; } List<Node> replicas = null; if (zkEnabled && DistribPhase.TOLEADER == phase) { // This core should be a leader replicas = setupRequest(); } if (vinfo == null) { super.processDelete(cmd); return; } // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find the version long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } versionOnUpdate = Math.abs(versionOnUpdate); // normalize to positive version boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; if (!leaderLogic && versionOnUpdate==0) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing _version_ on update from leader"); } vinfo.blockUpdates(); try { if (versionsStored) { if (leaderLogic) { long version = vinfo.getNewClock(); cmd.setVersion(-version); // TODO update versions in all buckets doLocalDelete(cmd); } else { cmd.setVersion(-versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.deleteByQuery(cmd); return; } doLocalDelete(cmd); } } // since we don't know which documents were deleted, the easiest thing to do is to invalidate // all real-time caches (i.e. UpdateLog) which involves also getting a new version of the IndexReader // (so cache misses will see up-to-date data) } finally { vinfo.unblockUpdates(); } // TODO: need to handle reorders to replicas somehow // forward to all replicas if (leaderLogic && replicas != null) { ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(VERSION_FIELD, Long.toString(cmd.getVersion())); params.set(DISTRIB_UPDATE_PARAM, DistribPhase.FROMLEADER.toString()); cmdDistrib.distribDelete(cmd, replicas, params); cmdDistrib.finish(); } if (returnVersions && rsp != null) { if (deleteByQueryResponse == null) { deleteByQueryResponse = new NamedList<String>(); rsp.add("deleteByQuery",deleteByQueryResponse); } deleteByQueryResponse.add(cmd.getQuery(), cmd.getVersion()); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private boolean versionDelete(DeleteUpdateCommand cmd) throws IOException { BytesRef idBytes = cmd.getIndexedId(); if (vinfo == null || idBytes == null) { super.processDelete(cmd); return false; } // This is only the hash for the bucket, and must be based only on the uniqueKey (i.e. do not use a pluggable hash here) int bucketHash = Hash.murmurhash3_x86_32(idBytes.bytes, idBytes.offset, idBytes.length, 0); // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find the version long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } long signedVersionOnUpdate = versionOnUpdate; versionOnUpdate = Math.abs(versionOnUpdate); // normalize to positive version boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; if (!leaderLogic && versionOnUpdate==0) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing _version_ on update from leader"); } VersionBucket bucket = vinfo.bucket(bucketHash); vinfo.lockForUpdate(); try { synchronized (bucket) { if (versionsStored) { long bucketVersion = bucket.highest; if (leaderLogic) { if (signedVersionOnUpdate != 0) { Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); long foundVersion = lastVersion == null ? -1 : lastVersion; if ( (signedVersionOnUpdate == foundVersion) || (signedVersionOnUpdate < 0 && foundVersion < 0) || (signedVersionOnUpdate == 1 && foundVersion > 0) ) { // we're ok if versions match, or if both are negative (all missing docs are equal), or if cmd // specified it must exist (versionOnUpdate==1) and it does. } else { throw new SolrException(ErrorCode.CONFLICT, "version conflict for " + cmd.getId() + " expected=" + signedVersionOnUpdate + " actual=" + foundVersion); } } long version = vinfo.getNewClock(); cmd.setVersion(-version); bucket.updateHighest(version); } else { cmd.setVersion(-versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.delete(cmd); return true; } // if we aren't the leader, then we need to check that updates were not re-ordered if (bucketVersion != 0 && bucketVersion < versionOnUpdate) { // we're OK... this update has a version higher than anything we've seen // in this bucket so far, so we know that no reordering has yet occurred. bucket.updateHighest(versionOnUpdate); } else { // there have been updates higher than the current update. we need to check // the specific version for this id. Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); if (lastVersion != null && Math.abs(lastVersion) >= versionOnUpdate) { // This update is a repeat, or was reordered. We need to drop this update. return true; } } } } doLocalDelete(cmd); return false; } // end synchronized (bucket) } finally { vinfo.unlockForUpdate(); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
@Override public void processCommit(CommitUpdateCommand cmd) throws IOException { if (zkEnabled) { zkCheck(); } if (vinfo != null) { vinfo.lockForUpdate(); } try { if (ulog == null || ulog.getState() == UpdateLog.State.ACTIVE || (cmd.getFlags() & UpdateCommand.REPLAY) != 0) { super.processCommit(cmd); } else { log.info("Ignoring commit while not ACTIVE - state: " + ulog.getState() + " replay:" + (cmd.getFlags() & UpdateCommand.REPLAY)); } } finally { if (vinfo != null) { vinfo.unlockForUpdate(); } } // TODO: we should consider this? commit everyone in the current collection if (zkEnabled) { ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); if (!params.getBool(COMMIT_END_POINT, false)) { params.set(COMMIT_END_POINT, true); String nodeName = req.getCore().getCoreDescriptor().getCoreContainer() .getZkController().getNodeName(); String shardZkNodeName = nodeName + "_" + req.getCore().getName(); List<Node> nodes = getCollectionUrls(req, req.getCore().getCoreDescriptor() .getCloudDescriptor().getCollectionName(), shardZkNodeName); if (nodes != null) { cmdDistrib.distribCommit(cmd, nodes, params); finish(); } } } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
@Override public void finish() throws IOException { doFinish(); if (next != null && nodes == null) next.finish(); }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { updateHandler.addDoc(cmd); super.processAdd(cmd); changesSinceCommit = true; }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processDelete(DeleteUpdateCommand cmd) throws IOException { if( cmd.isDeleteById()) { updateHandler.delete(cmd); } else { updateHandler.deleteByQuery(cmd); } super.processDelete(cmd); changesSinceCommit = true; }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processMergeIndexes(MergeIndexesCommand cmd) throws IOException { updateHandler.mergeIndexes(cmd); super.processMergeIndexes(cmd); }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processCommit(CommitUpdateCommand cmd) throws IOException { updateHandler.commit(cmd); super.processCommit(cmd); changesSinceCommit = false; }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void processRollback(RollbackUpdateCommand cmd) throws IOException { updateHandler.rollback(cmd); super.processRollback(cmd); changesSinceCommit = false; }
// in core/src/java/org/apache/solr/update/processor/RunUpdateProcessorFactory.java
@Override public void finish() throws IOException { if (changesSinceCommit && updateHandler.getUpdateLog() != null) { updateHandler.getUpdateLog().finish(null); } super.finish(); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processAdd(AddUpdateCommand cmd) throws IOException { if (next != null) next.processAdd(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processDelete(DeleteUpdateCommand cmd) throws IOException { if (next != null) next.processDelete(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processMergeIndexes(MergeIndexesCommand cmd) throws IOException { if (next != null) next.processMergeIndexes(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processCommit(CommitUpdateCommand cmd) throws IOException { if (next != null) next.processCommit(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void processRollback(RollbackUpdateCommand cmd) throws IOException { if (next != null) next.processRollback(cmd); }
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessor.java
public void finish() throws IOException { if (next != null) next.finish(); }
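UpdateRequestProcessor is the base of a chain of responsibility: each process* method and finish() forwards to the next processor when one exists, and subclasses such as RunUpdateProcessor and LogUpdateProcessor override individual hooks and then call super (or next directly) to keep the chain moving. A self-contained sketch of the wiring, with illustrative names:

    import java.io.IOException;

    abstract class ProcessorSketch {
      protected final ProcessorSketch next;   // null terminates the chain
      ProcessorSketch(ProcessorSketch next) { this.next = next; }

      public void process(String cmd) throws IOException {
        if (next != null) next.process(cmd);  // default behavior: pure pass-through
      }
    }

    class LoggingProcessor extends ProcessorSketch {
      LoggingProcessor(ProcessorSketch next) { super(next); }
      @Override public void process(String cmd) throws IOException {
        System.out.println("PRE_UPDATE " + cmd);
        super.process(cmd);                   // delegate onward
      }
    }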
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { final SolrInputDocument doc = cmd.getSolrInputDocument(); // make a copy we can iterate over while mutating the doc final Collection<String> fieldNames = new ArrayList<String>(doc.getFieldNames()); for (final String fname : fieldNames) { if (! selector.shouldMutate(fname)) continue; final SolrInputField src = doc.get(fname); SolrInputField dest = null; try { dest = mutate(src); } catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); } if (null == dest) { doc.remove(fname); } else { // semantics of what happens if dest has diff name are hard // we could treat it as a copy, or a rename // for now, don't allow it. if (! fname.equals(dest.getName()) ) { throw new SolrException(SERVER_ERROR, "mutate returned field with different name: " + fname + " => " + dest.getName()); } doc.put(dest.getName(), dest); } } super.processAdd(cmd); }
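FieldMutatingUpdateProcessor shows the catch-log-rethrow style counted in the statistics at the top of this sheet: the caught SolrException is logged together with the field name for context, then wrapped and rethrown as a BAD_REQUEST error that keeps the original as its cause. Reduced to its shape, with RuntimeException standing in for SolrException:

    class WrapRethrowSketch {
      static String mutateField(String fname, String value) {
        try {
          return value.trim().toUpperCase();      // stand-in for mutate(src)
        } catch (RuntimeException e) {
          String msg = "Unable to mutate field '" + fname + "': " + e.getMessage();
          System.err.println(msg);                // log with context first
          throw new RuntimeException(msg, e);     // rethrow, keeping the cause
        }
      }
    }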
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
@Override public void processAdd(AddUpdateCommand command) throws IOException { if (isEnabled()) { SolrInputDocument document = command.getSolrInputDocument(); if (document.containsKey(urlFieldname)) { String url = (String) document.getFieldValue(urlFieldname); try { URL normalizedURL = getNormalizedURL(url); document.setField(lengthFieldname, length(normalizedURL)); document.setField(levelsFieldname, levels(normalizedURL)); document.setField(toplevelpageFieldname, isTopLevelPage(normalizedURL) ? 1 : 0); document.setField(landingpageFieldname, isLandingPage(normalizedURL) ? 1 : 0); if (domainFieldname != null) { document.setField(domainFieldname, normalizedURL.getHost()); } if (canonicalUrlFieldname != null) { document.setField(canonicalUrlFieldname, getCanonicalUrl(normalizedURL)); } log.debug(document.toString()); } catch (MalformedURLException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); } catch (URISyntaxException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); } } } super.processAdd(command); }
// in core/src/java/org/apache/solr/update/processor/SignatureUpdateProcessorFactory.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { if (enabled) { SolrInputDocument doc = cmd.getSolrInputDocument(); List<String> currDocSigFields = null; if (sigFields == null || sigFields.size() == 0) { Collection<String> docFields = doc.getFieldNames(); currDocSigFields = new ArrayList<String>(docFields.size()); currDocSigFields.addAll(docFields); Collections.sort(currDocSigFields); } else { currDocSigFields = sigFields; } Signature sig = req.getCore().getResourceLoader().newInstance(signatureClass, Signature.class); sig.init(params); for (String field : currDocSigFields) { SolrInputField f = doc.getField(field); if (f != null) { sig.add(field); Object o = f.getValue(); if (o instanceof Collection) { for (Object oo : (Collection)o) { sig.add(String.valueOf(oo)); } } else { sig.add(String.valueOf(o)); } } } byte[] signature = sig.getSignature(); char[] arr = new char[signature.length<<1]; for (int i=0; i<signature.length; i++) { int b = signature[i]; int idx = i<<1; arr[idx]= StrUtils.HEX_DIGITS[(b >> 4) & 0xf]; arr[idx+1]= StrUtils.HEX_DIGITS[b & 0xf]; } String sigString = new String(arr); doc.addField(signatureField, sigString); if (overwriteDupes) { cmd.updateTerm = new Term(signatureField, sigString); } } if (next != null) next.processAdd(cmd); }
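The signature bytes above are hex-encoded with shift-and-mask: each byte yields two digits, high nibble first. The encoding extracted into a runnable form, with the digit table inlined in place of StrUtils.HEX_DIGITS:

    class HexSketch {
      private static final char[] HEX = "0123456789abcdef".toCharArray();

      static String toHex(byte[] signature) {
        char[] arr = new char[signature.length << 1];
        for (int i = 0; i < signature.length; i++) {
          int b = signature[i];
          int idx = i << 1;
          arr[idx]     = HEX[(b >> 4) & 0xf];  // high nibble
          arr[idx + 1] = HEX[b & 0xf];         // low nibble
        }
        return new String(arr);
      }
    }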
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } // call delegate first so we can log things like the version that get set later if (next != null) next.processAdd(cmd); // Add a list of added id's to the response if (adds == null) { adds = new ArrayList<String>(); toLog.add("add",adds); } if (adds.size() < maxNumToLog) { long version = cmd.getVersion(); String msg = cmd.getPrintableId(); if (version != 0) msg = msg + " (" + version + ')'; adds.add(msg); } numAdds++; }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processDelete( DeleteUpdateCommand cmd ) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } if (next != null) next.processDelete(cmd); if (cmd.isDeleteById()) { if (deletes == null) { deletes = new ArrayList<String>(); toLog.add("delete",deletes); } if (deletes.size() < maxNumToLog) { long version = cmd.getVersion(); String msg = cmd.getId(); if (version != 0) msg = msg + " (" + version + ')'; deletes.add(msg); } } else { if (toLog.size() < maxNumToLog) { long version = cmd.getVersion(); String msg = cmd.query; if (version != 0) msg = msg + " (" + version + ')'; toLog.add("deleteByQuery", msg); } } numDeletes++; }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processMergeIndexes(MergeIndexesCommand cmd) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } if (next != null) next.processMergeIndexes(cmd); toLog.add("mergeIndexes", cmd.toString()); }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processCommit( CommitUpdateCommand cmd ) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } if (next != null) next.processCommit(cmd); final String msg = cmd.optimize ? "optimize" : "commit"; toLog.add(msg, ""); }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void processRollback( RollbackUpdateCommand cmd ) throws IOException { if (logDebug) { log.debug("PRE_UPDATE " + cmd.toString()); } if (next != null) next.processRollback(cmd); toLog.add("rollback", ""); }
// in core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java
@Override public void finish() throws IOException { if (logDebug) { log.debug("PRE_UPDATE finish()"); } if (next != null) next.finish(); // LOG A SUMMARY WHEN ALL DONE (INFO LEVEL) NamedList<Object> stdLog = rsp.getToLog(); StringBuilder sb = new StringBuilder(req.getCore().getLogId()); for (int i=0; i<stdLog.size(); i++) { String name = stdLog.getName(i); Object val = stdLog.getVal(i); if (name != null) { sb.append(name).append('='); } sb.append(val).append(' '); } stdLog.clear(); // make it so SolrCore.exec won't log this again // if id lists were truncated, show how many more there were if (adds != null && numAdds > maxNumToLog) { adds.add("... (" + numAdds + " adds)"); } if (deletes != null && numDeletes > maxNumToLog) { deletes.add("... (" + numDeletes + " deletes)"); } long elapsed = rsp.getEndTime() - req.getStartTime(); sb.append(toLog).append(" 0 ").append(elapsed); log.info(sb.toString()); }
// in core/src/java/org/apache/solr/update/processor/UniqFieldsUpdateProcessorFactory.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { if(fields != null){ SolrInputDocument solrInputDocument = cmd.getSolrInputDocument(); List<Object> uniqList = new ArrayList<Object>(); for (String field : fields) { uniqList.clear(); Collection<Object> col = solrInputDocument.getFieldValues(field); if (col != null) { for (Object o : col) { if(!uniqList.contains(o)) uniqList.add(o); } solrInputDocument.remove(field); for (Object o : uniqList) { solrInputDocument.addField(field, o); } } } } super.processAdd(cmd); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public synchronized IndexWriter getIndexWriter(SolrCore core) throws IOException { if (indexWriter == null) { indexWriter = createMainIndexWriter(core, "DirectUpdateHandler2", false, false); } return indexWriter; }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public synchronized void newIndexWriter(SolrCore core) throws IOException { if (indexWriter != null) { indexWriter.close(); } indexWriter = createMainIndexWriter(core, "DirectUpdateHandler2", false, true); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public void decref(IndexWriterCloser closer) throws IOException { synchronized (this) { refCnt--; if (refCnt == 0) { try { if (closer != null) { closer.closeWriter(indexWriter); } else if (indexWriter != null) { indexWriter.close(); } } catch (Throwable t) { log.error("Error during shutdown of writer.", t); } try { directoryFactory.close(); } catch (Throwable t) { log.error("Error during shutdown of directory factory.", t); } try { cancelRecovery(); } catch (Throwable t) { log.error("Error cancelling recovery", t); } closed = true; } } }
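decref() above (like closeWriter() earlier) wraps each cleanup step in its own catch (Throwable), so a failure while closing the writer cannot prevent the directory factory from being closed or recovery from being cancelled. The shape of that best-effort shutdown, sketched with illustrative names:

    class BestEffortShutdown {
      interface Step { void run() throws Exception; }

      static void runAll(Step... steps) {
        for (Step step : steps) {
          try {
            step.run();
          } catch (Throwable t) {
            // log and keep going: later steps must still get their chance
            System.err.println("Error during shutdown step: " + t);
          }
        }
      }
    }

A call such as runAll(writer::close, directoryFactory::close, this::cancelRecovery) would mirror the three guarded steps in decref().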
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public synchronized void rollbackIndexWriter(SolrCore core) throws IOException { indexWriter.rollback(); newIndexWriter(core); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
protected SolrIndexWriter createMainIndexWriter(SolrCore core, String name, boolean removeAllExisting, boolean forceNewDirectory) throws IOException { return new SolrIndexWriter(name, core.getNewIndexDir(), core.getDirectoryFactory(), removeAllExisting, core.getSchema(), core.getSolrConfig().indexConfig, core.getDeletionPolicy(), core.getCodec(), forceNewDirectory); }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
private static InfoStream toInfoStream(SolrIndexConfig config) throws IOException { String infoStreamFile = config.infoStreamFile; if (infoStreamFile != null) { File f = new File(infoStreamFile); File parent = f.getParentFile(); if (parent != null) parent.mkdirs(); FileOutputStream fos = new FileOutputStream(f, true); return new PrintStreamInfoStream(new PrintStream(fos, true)); } else { return InfoStream.NO_OUTPUT; } }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
@Override public void close() throws IOException { log.debug("Closing Writer " + name); Directory directory = getDirectory(); final InfoStream infoStream = isClosed ? null : getConfig().getInfoStream(); try { super.close(); if(infoStream != null) { infoStream.close(); } } finally { isClosed = true; directoryFactory.release(directory); numCloses.incrementAndGet(); } }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
@Override public void rollback() throws IOException { try { super.rollback(); } finally { isClosed = true; } }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
public void distribDelete(DeleteUpdateCommand cmd, List<Node> urls, ModifiableSolrParams params) throws IOException { checkResponses(false); if (cmd.isDeleteById()) { doDelete(cmd, urls, params); } else { doDelete(cmd, urls, params); } }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
public void distribAdd(AddUpdateCommand cmd, List<Node> nodes, ModifiableSolrParams params) throws IOException { checkResponses(false); // make sure any pending deletes are flushed flushDeletes(1); // TODO: this is brittle // need to make a clone since these commands may be reused AddUpdateCommand clone = new AddUpdateCommand(null); clone.solrDoc = cmd.solrDoc; clone.commitWithin = cmd.commitWithin; clone.overwrite = cmd.overwrite; clone.setVersion(cmd.getVersion()); AddRequest addRequest = new AddRequest(); addRequest.cmd = clone; addRequest.params = params; for (Node node : nodes) { List<AddRequest> alist = adds.get(node); if (alist == null) { alist = new ArrayList<AddRequest>(2); adds.put(node, alist); } alist.add(addRequest); } flushAdds(maxBufferedAddsPerServer); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
public void distribCommit(CommitUpdateCommand cmd, List<Node> nodes, ModifiableSolrParams params) throws IOException { // Wait for all outstanding responses to make sure that a commit // can't sneak in ahead of adds or deletes we already sent. // We could do this on a per-server basis, but it's more complex // and this solution will lead to commits happening closer together. checkResponses(true); // currently, we dont try to piggy back on outstanding adds or deletes UpdateRequestExt ureq = new UpdateRequestExt(); ureq.setParams(params); addCommit(ureq, cmd); for (Node node : nodes) { submit(ureq, node); } // if the command wanted to block until everything was committed, // then do that here. if (cmd.waitSearcher) { checkResponses(true); } }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
private void doDelete(DeleteUpdateCommand cmd, List<Node> nodes, ModifiableSolrParams params) throws IOException { flushAdds(1); DeleteUpdateCommand clonedCmd = clone(cmd); DeleteRequest deleteRequest = new DeleteRequest(); deleteRequest.cmd = clonedCmd; deleteRequest.params = params; for (Node node : nodes) { List<DeleteRequest> dlist = deletes.get(node); if (dlist == null) { dlist = new ArrayList<DeleteRequest>(2); deletes.put(node, dlist); } dlist.add(deleteRequest); } flushDeletes(maxBufferedDeletesPerServer); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public Object resolve(Object o, JavaBinCodec codec) throws IOException { if (o instanceof BytesRef) { BytesRef br = (BytesRef)o; codec.writeByteArray(br.bytes, br.offset, br.length); return null; } return o; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public void writeExternString(String s) throws IOException { if (s == null) { writeTag(NULL); return; } // no need to synchronize globalStringMap - it's only updated before the first record is written to the log Integer idx = globalStringMap.get(s); if (idx == null) { // write a normal string writeStr(s); } else { // write the extern string writeTag(EXTERN_STRING, idx); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public String readExternString(FastInputStream fis) throws IOException { int idx = readSize(fis); if (idx != 0) {// idx != 0 is the index of the extern string // no need to synchronize globalStringList - it's only updated before the first record is written to the log return globalStringList.get(idx - 1); } else {// idx == 0 means it has a string value // this shouldn't happen with this codec subclass. throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Corrupt transaction log"); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public boolean endsWithCommit() throws IOException { long size; synchronized (this) { fos.flush(); size = fos.size(); } // the end of the file should have the end message (added during a commit) plus a 4 byte size byte[] buf = new byte[ END_MESSAGE.length() ]; long pos = size - END_MESSAGE.length() - 4; if (pos < 0) return false; ChannelFastInputStream is = new ChannelFastInputStream(channel, pos); is.read(buf); for (int i=0; i<buf.length; i++) { if (buf[i] != END_MESSAGE.charAt(i)) return false; } return true; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public void rollback(long pos) throws IOException { synchronized (this) { assert snapshot_size == pos; fos.flush(); raf.setLength(pos); fos.setWritten(pos); assert fos.size() == pos; numRecords = snapshot_numRecords; } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
private void readHeader(FastInputStream fis) throws IOException { // read existing header fis = fis != null ? fis : new ChannelFastInputStream(channel, 0); LogCodec codec = new LogCodec(); Map header = (Map)codec.unmarshal(fis); fis.readInt(); // skip size // needed to read other records synchronized (this) { globalStringList = (List<String>)header.get("strings"); globalStringMap = new HashMap<String, Integer>(globalStringList.size()); for (int i=0; i<globalStringList.size(); i++) { globalStringMap.put( globalStringList.get(i), i+1); } } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
private void writeLogHeader(LogCodec codec) throws IOException { long pos = fos.size(); assert pos == 0; Map header = new LinkedHashMap<String,Object>(); header.put("SOLR_TLOG",1); // a magic string + version number header.put("strings",globalStringList); codec.marshal(header, fos); endRecord(pos); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
private void endRecord(long startRecordPosition) throws IOException { fos.writeInt((int)(fos.size() - startRecordPosition)); numRecords++; }
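endRecord() above writes each record's length after its payload, so the log's layout is: header, then records, each followed by a 4-byte size. That trailing size is what makes both endsWithCommit() (peeking at the file tail) and the ReverseReader below possible. A sketch of the framing with RandomAccessFile in place of the channel-based streams used here:

    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.io.RandomAccessFile;

    class LengthSuffixedLog {
      static void writeRecord(DataOutputStream out, byte[] payload) throws IOException {
        out.write(payload);
        out.writeInt(payload.length);            // trailing length, as in endRecord()
      }

      // Read the record whose 4-byte length field ends at position 'end'.
      static byte[] readRecordEndingAt(RandomAccessFile f, long end) throws IOException {
        f.seek(end - 4);
        int len = f.readInt();                   // length of the preceding payload
        byte[] payload = new byte[len];
        f.seek(end - 4 - len);
        f.readFully(payload);
        return payload;
      }
    }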
// in core/src/java/org/apache/solr/update/TransactionLog.java
public ReverseReader getReverseReader() throws IOException { return new ReverseReader(); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public Object next() throws IOException, InterruptedException { long pos = fis.position(); synchronized (TransactionLog.this) { if (trace) { log.trace("Reading log record. pos="+pos+" currentSize="+fos.size()); } if (pos >= fos.size()) { return null; } fos.flushBuffer(); } if (pos == 0) { readHeader(fis); // shouldn't currently happen - header and first record are currently written at the same time synchronized (TransactionLog.this) { if (fis.position() >= fos.size()) { return null; } pos = fis.position(); } } Object o = codec.readVal(fis); // skip over record size int size = fis.readInt(); assert size == fis.position() - pos - 4; return o; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public SolrInputDocument readSolrInputDocument(FastInputStream dis) throws IOException { // Given that the SolrInputDocument is last in an add record, it's OK to just skip // reading it completely. return null; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public Object next() throws IOException { if (prevPos <= 0) return null; long endOfThisRecord = prevPos; int thisLength = nextLength; long recordStart = prevPos - thisLength; // back up to the beginning of the next record prevPos = recordStart - 4; // back up 4 more to read the length of the next record if (prevPos <= 0) return null; // this record is the header long bufferPos = fis.getBufferPos(); if (prevPos >= bufferPos) { // nothing to do... we're within the current buffer } else { // Position buffer so that this record is at the end. // For small records, this will cause subsequent calls to next() to be within the buffer. long seekPos = endOfThisRecord - fis.getBufferSize(); seekPos = Math.min(seekPos, prevPos); // seek to the start of the record if it's larger than the block size. seekPos = Math.max(seekPos, 0); fis.seek(seekPos); fis.peek(); // cause buffer to be filled } fis.seek(prevPos); nextLength = fis.readInt(); // this is the length of the *next* record (i.e. closer to the beginning) // TODO: optionally skip document data Object o = codec.readVal(fis); // assert fis.position() == prevPos + 4 + thisLength; // this is only true if we read all the data (and we currently skip reading SolrInputDocument) return o; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public int readWrappedStream(byte[] target, int offset, int len) throws IOException { ByteBuffer bb = ByteBuffer.wrap(target, offset, len); int ret = ch.read(bb, readFromStream); return ret; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public void seek(long position) throws IOException { if (position <= readFromStream && position >= getBufferPos()) { // seek within buffer pos = (int)(position - getBufferPos()); } else { // long currSize = ch.size(); // not needed - underlying read should handle (unless read never done) // if (position > currSize) throw new EOFException("Read past EOF: seeking to " + position + " on file of size " + currSize + " file=" + ch); readFromStream = position; end = pos = 0; } assert position() == position; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public void close() throws IOException { ch.close(); }
// in core/src/java/org/apache/solr/core/StandardDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return FSDirectory.open(new File(path)); }
// in core/src/java/org/apache/solr/core/RAMDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return new RAMDirectory(); }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
@Override public void close() throws IOException { synchronized (this) { for (CacheValue val : byDirectoryCache.values()) { val.directory.close(); } byDirectoryCache.clear(); byPathCache.clear(); } }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
private void close(Directory directory) throws IOException { synchronized (this) { CacheValue cacheValue = byDirectoryCache.get(directory); if (cacheValue == null) { throw new IllegalArgumentException("Unknown directory: " + directory + " " + byDirectoryCache); } cacheValue.refCnt--; if (cacheValue.refCnt == 0 && cacheValue.doneWithDir) { directory.close(); byDirectoryCache.remove(directory); byPathCache.remove(cacheValue.path); } } }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
@Override public final Directory get(String path, String rawLockType) throws IOException { return get(path, rawLockType, false); }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
@Override public final Directory get(String path, String rawLockType, boolean forceNew) throws IOException { String fullPath = new File(path).getAbsolutePath(); synchronized (this) { CacheValue cacheValue = byPathCache.get(fullPath); Directory directory = null; if (cacheValue != null) { directory = cacheValue.directory; if (forceNew) { cacheValue.doneWithDir = true; if (cacheValue.refCnt == 0) { close(cacheValue.directory); } } } if (directory == null || forceNew) { directory = create(fullPath); CacheValue newCacheValue = new CacheValue(); newCacheValue.directory = directory; newCacheValue.path = fullPath; injectLockFactory(directory, path, rawLockType); byDirectoryCache.put(directory, newCacheValue); byPathCache.put(fullPath, newCacheValue); } else { cacheValue.refCnt++; } return directory; } }
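CachingDirectoryFactory implements a reference-counted cache keyed by absolute path: get() hands out a cached Directory and bumps its count, release() decrements it, and the Directory is only closed once the count reaches zero. The bookkeeping reduced to a generic sketch; this is an illustration, not Solr's API:

    import java.io.Closeable;
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.Supplier;

    class RefCountedCache<T extends Closeable> {
      private static class Entry<T> { T value; int refCnt; }
      private final Map<String, Entry<T>> byPath = new HashMap<>();

      synchronized T get(String path, Supplier<T> factory) {
        Entry<T> e = byPath.get(path);
        if (e == null) {
          e = new Entry<>();
          e.value = factory.get();             // create on first use
          byPath.put(path, e);
        }
        e.refCnt++;
        return e.value;
      }

      synchronized void release(String path) throws IOException {
        Entry<T> e = byPath.get(path);
        if (e == null) throw new IllegalArgumentException("Unknown path: " + path);
        if (--e.refCnt == 0) {                 // last user closes the resource
          e.value.close();
          byPath.remove(path);
        }
      }
    }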
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
@Override public void release(Directory directory) throws IOException { if (directory == null) { throw new NullPointerException(); } close(directory); }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
private static Directory injectLockFactory(Directory dir, String lockPath, String rawLockType) throws IOException { if (null == rawLockType) { // we default to "simple" for backwards compatibility log.warn("No lockType configured for " + dir + " assuming 'simple'"); rawLockType = "simple"; } final String lockType = rawLockType.toLowerCase(Locale.ENGLISH).trim(); if ("simple".equals(lockType)) { // multiple SimpleFSLockFactory instances should be OK dir.setLockFactory(new SimpleFSLockFactory(lockPath)); } else if ("native".equals(lockType)) { dir.setLockFactory(new NativeFSLockFactory(lockPath)); } else if ("single".equals(lockType)) { if (!(dir.getLockFactory() instanceof SingleInstanceLockFactory)) dir .setLockFactory(new SingleInstanceLockFactory()); } else if ("none".equals(lockType)) { // Recipe for disaster log.error("CONFIGURATION WARNING: locks are disabled on " + dir); dir.setLockFactory(NoLockFactory.getNoLockFactory()); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unrecognized lockType: " + rawLockType); } return dir; }
// in core/src/java/org/apache/solr/core/NIOFSDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return new NIOFSDirectory(new File(path)); }
// in core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { MMapDirectory mapDirectory = new MMapDirectory(new File(path)); try { mapDirectory.setUseUnmap(unmapHack); } catch (Exception e) { log.warn("Unmap not supported on this JVM, continuing on without setting unmap", e); } mapDirectory.setMaxChunkSize(maxChunk); return mapDirectory; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public SolrCore reload(SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException { // TODO - what if indexwriter settings have changed SolrConfig config = new SolrConfig(resourceLoader, getSolrConfig().getName(), null); IndexSchema schema = new IndexSchema(config, getSchema().getResourceName(), null); updateHandler.incref(); SolrCore core = new SolrCore(getName(), null, config, schema, coreDescriptor, updateHandler); return core; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public SolrIndexSearcher newSearcher(String name) throws IOException { return new SolrIndexSearcher(this, getNewIndexDir(), schema, getSolrConfig().indexConfig, name, false, directoryFactory); }
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> getSearcher(boolean forceNew, boolean returnSearcher, final Future[] waitSearcher) throws IOException { return getSearcher(forceNew, returnSearcher, waitSearcher, false); }
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> getSearcher(boolean forceNew, boolean returnSearcher, final Future[] waitSearcher, boolean updateHandlerReopens) throws IOException { // it may take some time to open an index.... we may need to make // sure that two threads aren't trying to open one at the same time // if it isn't necessary. synchronized (searcherLock) { // see if we can return the current searcher if (_searcher!=null && !forceNew) { if (returnSearcher) { _searcher.incref(); return _searcher; } else { return null; } } // check to see if we can wait for someone else's searcher to be set if (onDeckSearchers>0 && !forceNew && _searcher==null) { try { searcherLock.wait(); } catch (InterruptedException e) { log.info(SolrException.toStr(e)); } } // check again: see if we can return right now if (_searcher!=null && !forceNew) { if (returnSearcher) { _searcher.incref(); return _searcher; } else { return null; } } // At this point, we know we need to open a new searcher... // first: increment count to signal other threads that we are // opening a new searcher. onDeckSearchers++; if (onDeckSearchers < 1) { // should never happen... just a sanity check log.error(logid+"ERROR!!! onDeckSearchers is " + onDeckSearchers); onDeckSearchers=1; // reset } else if (onDeckSearchers > maxWarmingSearchers) { onDeckSearchers--; String msg="Error opening new searcher. exceeded limit of maxWarmingSearchers="+maxWarmingSearchers + ", try again later."; log.warn(logid+""+ msg); // HTTP 503==service unavailable, or 409==Conflict throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE,msg); } else if (onDeckSearchers > 1) { log.warn(logid+"PERFORMANCE WARNING: Overlapping onDeckSearchers=" + onDeckSearchers); } } // a signal to decrement onDeckSearchers if something goes wrong. final boolean[] decrementOnDeckCount=new boolean[]{true}; RefCounted<SolrIndexSearcher> currSearcherHolder = null; // searcher we are autowarming from RefCounted<SolrIndexSearcher> searchHolder = null; boolean success = false; openSearcherLock.lock(); try { searchHolder = openNewSearcher(updateHandlerReopens, false); // the searchHolder will be incremented once already (and it will eventually be assigned to _searcher when registered) // increment it again if we are going to return it to the caller. if (returnSearcher) { searchHolder.incref(); } final RefCounted<SolrIndexSearcher> newSearchHolder = searchHolder; final SolrIndexSearcher newSearcher = newSearchHolder.get(); boolean alreadyRegistered = false; synchronized (searcherLock) { if (_searcher == null) { // if there isn't a current searcher then we may // want to register this one before warming is complete instead of waiting. if (solrConfig.useColdSearcher) { registerSearcher(newSearchHolder); decrementOnDeckCount[0]=false; alreadyRegistered=true; } } else { // get a reference to the current searcher for purposes of autowarming. currSearcherHolder=_searcher; currSearcherHolder.incref(); } } final SolrIndexSearcher currSearcher = currSearcherHolder==null ? null : currSearcherHolder.get(); Future future=null; // warm the new searcher based on the current searcher. // should this go before the other event handlers or after? if (currSearcher != null) { future = searcherExecutor.submit( new Callable() { public Object call() throws Exception { try { newSearcher.warm(currSearcher); } catch (Throwable e) { SolrException.log(log,e); } return null; } } ); } if (currSearcher==null && firstSearcherListeners.size() > 0) { future = searcherExecutor.submit( new Callable() { public Object call() throws Exception { try { for (SolrEventListener listener : firstSearcherListeners) { listener.newSearcher(newSearcher,null); } } catch (Throwable e) { SolrException.log(log,null,e); } return null; } } ); }
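The onDeckSearchers accounting in getSearcher() is admission control: the number of in-flight warming searchers is checked under searcherLock, and exceeding maxWarmingSearchers is reported to clients as SERVICE_UNAVAILABLE (HTTP 503) rather than queueing without bound. The gate in isolation, with IllegalStateException standing in for SolrException:

    class WarmingGate {
      private final Object lock = new Object();
      private final int maxWarmingSearchers;
      private int onDeckSearchers;

      WarmingGate(int max) { this.maxWarmingSearchers = max; }

      void enter() {
        synchronized (lock) {
          onDeckSearchers++;
          if (onDeckSearchers > maxWarmingSearchers) {
            onDeckSearchers--;                 // undo before failing
            throw new IllegalStateException("Error opening new searcher. exceeded limit of maxWarmingSearchers="
                + maxWarmingSearchers + ", try again later.");
          }
        }
      }

      void exit() {
        synchronized (lock) { onDeckSearchers--; lock.notifyAll(); }
      }
    }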
// in core/src/java/org/apache/solr/core/SolrCore.java
private void registerSearcher(RefCounted<SolrIndexSearcher> newSearcherHolder) throws IOException { synchronized (searcherLock) { try { if (_searcher != null) { _searcher.decref(); // dec refcount for this._searcher _searcher=null; } _searcher = newSearcherHolder; SolrIndexSearcher newSearcher = newSearcherHolder.get(); /*** // a searcher may have been warming asynchronously while the core was being closed. // if this happens, just close the searcher. if (isClosed()) { // NOTE: this should not happen now - see close() for details. // *BUT* if we left it enabled, this could still happen before // close() stopped the executor - so disable this test for now. log.error("Ignoring searcher register on closed core:" + newSearcher); _searcher.decref(); } ***/ newSearcher.register(); // register subitems (caches) log.info(logid+"Registered new searcher " + newSearcher); } catch (Throwable e) { // an exception in register() shouldn't be fatal. log(e); } finally { // wake up anyone waiting for a searcher // even in the face of errors. onDeckSearchers--; searcherLock.notifyAll(); } } }
// in core/src/java/org/apache/solr/core/SolrCore.java
@Override public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { getWrappedWriter().write(writer, request, response); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public List<String> getLines(String resource) throws IOException { return getLines(resource, UTF_8); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public List<String> getLines(String resource, String encoding) throws IOException { return getLines(resource, Charset.forName(encoding)); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public List<String> getLines(String resource, Charset charset) throws IOException{ BufferedReader input = null; ArrayList<String> lines; try { input = new BufferedReader(new InputStreamReader(openResource(resource), charset.newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT))); lines = new ArrayList<String>(); for (String word=null; (word=input.readLine())!=null;) { // skip initial bom marker if (lines.isEmpty() && word.length() > 0 && word.charAt(0) == '\uFEFF') word = word.substring(1); // skip comments if (word.startsWith("#")) continue; word=word.trim(); // skip blank lines if (word.length()==0) continue; lines.add(word); } } finally { if (input != null) input.close(); } return lines; }
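getLines() configures its charset decoder with CodingErrorAction.REPORT for both malformed input and unmappable characters, so a mis-encoded resource file fails loudly with an exception (propagated as an IOException) instead of being silently patched with replacement characters. Setting up such a strict reader in isolation:

    import java.io.BufferedReader;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.nio.charset.Charset;
    import java.nio.charset.CharsetDecoder;
    import java.nio.charset.CodingErrorAction;

    class StrictDecode {
      static BufferedReader open(InputStream in, Charset charset) {
        CharsetDecoder decoder = charset.newDecoder()
            .onMalformedInput(CodingErrorAction.REPORT)       // throw, don't substitute
            .onUnmappableCharacter(CodingErrorAction.REPORT);
        return new BufferedReader(new InputStreamReader(in, decoder));
      }
    }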
// in core/src/java/org/apache/solr/core/CoreContainer.java
public CoreContainer initialize() throws IOException, ParserConfigurationException, SAXException { CoreContainer cores = null; String solrHome = SolrResourceLoader.locateSolrHome(); File fconf = new File(solrHome, containerConfigFilename == null ? "solr.xml" : containerConfigFilename); log.info("looking for solr.xml: " + fconf.getAbsolutePath()); cores = new CoreContainer(solrHome); if (fconf.exists()) { cores.load(solrHome, fconf); } else { log.info("no solr.xml file found - using default"); cores.load(solrHome, new InputSource(new ByteArrayInputStream(DEF_SOLR_XML.getBytes("UTF-8")))); cores.configFile = fconf; } containerConfigFilename = cores.getConfigFile().getName(); return cores; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, File configFile ) throws ParserConfigurationException, IOException, SAXException { this.configFile = configFile; this.load(dir, new InputSource(configFile.toURI().toASCIIString())); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, InputSource cfgis) throws ParserConfigurationException, IOException, SAXException { if (null == dir) { // don't rely on SolrResourceLoader(), determine explicitly first dir = SolrResourceLoader.locateSolrHome(); } log.info("Loading CoreContainer using Solr Home: '{}'", dir); this.loader = new SolrResourceLoader(dir); solrHome = loader.getInstanceDir(); Config cfg = new Config(loader, null, cfgis, null, false); // keep orig config for persist to consult try { this.cfg = new Config(loader, null, copyDoc(cfg.getDocument())); } catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); } cfg.substituteProperties(); // Initialize Logging if(cfg.getBool("solr/logging/@enabled",true)) { String slf4jImpl = null; String fname = cfg.get("solr/logging/watcher/@class", null); try { slf4jImpl = StaticLoggerBinder.getSingleton().getLoggerFactoryClassStr(); if(fname==null) { if( slf4jImpl.indexOf("Log4j") > 0) { log.warn("Log watching is not yet implemented for log4j" ); } else if( slf4jImpl.indexOf("JDK") > 0) { fname = "JUL"; } } } catch(Throwable ex) { log.warn("Unable to read SLF4J version. LogWatcher will be disabled: "+ex); } // Now load the framework if(fname!=null) { if("JUL".equalsIgnoreCase(fname)) { logging = new JulWatcher(slf4jImpl); } // else if( "Log4j".equals(fname) ) { // logging = new Log4jWatcher(slf4jImpl); // } else { try { logging = loader.newInstance(fname, LogWatcher.class); } catch (Throwable e) { log.warn("Unable to load LogWatcher", e); } } if( logging != null ) { ListenerConfig v = new ListenerConfig(); v.size = cfg.getInt("solr/logging/watcher/@size",50); v.threshold = cfg.get("solr/logging/watcher/@threshold",null); if(v.size>0) { log.info("Registering Log Listener"); logging.registerListener(v, this); } } } } String dcoreName = cfg.get("solr/cores/@defaultCoreName", null); if(dcoreName != null && !dcoreName.isEmpty()) { defaultCoreName = dcoreName; } persistent = cfg.getBool("solr/@persistent", false); libDir = cfg.get("solr/@sharedLib", null); zkHost = cfg.get("solr/@zkHost" , null); adminPath = cfg.get("solr/cores/@adminPath", null); shareSchema = cfg.getBool("solr/cores/@shareSchema", DEFAULT_SHARE_SCHEMA); zkClientTimeout = cfg.getInt("solr/cores/@zkClientTimeout", DEFAULT_ZK_CLIENT_TIMEOUT); hostPort = cfg.get("solr/cores/@hostPort", DEFAULT_HOST_PORT); hostContext = cfg.get("solr/cores/@hostContext", DEFAULT_HOST_CONTEXT); host = cfg.get("solr/cores/@host", null); if(shareSchema){ indexSchemaCache = new ConcurrentHashMap<String ,IndexSchema>(); } adminHandler = cfg.get("solr/cores/@adminHandler", null ); managementPath = cfg.get("solr/cores/@managementPath", null ); zkClientTimeout = Integer.parseInt(System.getProperty("zkClientTimeout", Integer.toString(zkClientTimeout))); initZooKeeper(zkHost, zkClientTimeout); if (libDir != null) { File f = FileUtils.resolvePath(new File(dir), libDir); log.info( "loading shared library: "+f.getAbsolutePath() ); libLoader = SolrResourceLoader.createClassLoader(f, null); } if (adminPath != null) { if (adminHandler == null) { coreAdminHandler = new CoreAdminHandler(this); } else { coreAdminHandler = this.createMultiCoreHandler(adminHandler); } } try { containerProperties = readProperties(cfg, ((NodeList) cfg.evaluate(DEFAULT_HOST_CONTEXT, XPathConstants.NODESET)).item(0)); } catch (Throwable e) { SolrException.log(log,null,e); } NodeList nodes = (NodeList)cfg.evaluate("solr/cores/core", XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); try { String rawName = DOMUtil.getAttr(node, "name", null); if (null == rawName) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Each core in solr.xml must have a 'name'"); } String name = rawName; CoreDescriptor p = new CoreDescriptor(this, name, DOMUtil.getAttr(node, "instanceDir", null)); // deal with optional settings String opt = DOMUtil.getAttr(node, "config", null); if (opt != null) { p.setConfigName(opt); } opt = DOMUtil.getAttr(node, "schema", null); if (opt != null) { p.setSchemaName(opt); } if (zkController != null) { opt = DOMUtil.getAttr(node, "shard", null); if (opt != null && opt.length() > 0) { p.getCloudDescriptor().setShardId(opt); } opt = DOMUtil.getAttr(node, "collection", null); if (opt != null) { p.getCloudDescriptor().setCollectionName(opt); } opt = DOMUtil.getAttr(node, "roles", null); if(opt != null){ p.getCloudDescriptor().setRoles(opt); } } opt = DOMUtil.getAttr(node, "properties", null); if (opt != null) { p.setPropertiesName(opt); } opt = DOMUtil.getAttr(node, CoreAdminParams.DATA_DIR, null); if (opt != null) { p.setDataDir(opt); } p.setCoreProperties(readProperties(cfg, node)); SolrCore core = create(p); register(name, core, false); // track original names coreToOrigName.put(core, rawName); } catch (Throwable ex) { SolrException.log(log,null,ex); } } }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore create(CoreDescriptor dcore) throws ParserConfigurationException, IOException, SAXException { // Make the instanceDir relative to the cores instanceDir if not absolute File idir = new File(dcore.getInstanceDir()); if (!idir.isAbsolute()) { idir = new File(solrHome, dcore.getInstanceDir()); } String instanceDir = idir.getPath(); log.info("Creating SolrCore '{}' using instanceDir: {}", dcore.getName(), instanceDir); // Initialize the solr config SolrResourceLoader solrLoader = null; SolrConfig config = null; String zkConfigName = null; if(zkController == null) { solrLoader = new SolrResourceLoader(instanceDir, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties())); config = new SolrConfig(solrLoader, dcore.getConfigName(), null); } else { try { String collection = dcore.getCloudDescriptor().getCollectionName(); zkController.createCollectionZkNode(dcore.getCloudDescriptor()); zkConfigName = zkController.readConfigName(collection); if (zkConfigName == null) { log.error("Could not find config name for collection:" + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection); } solrLoader = new ZkSolrResourceLoader(instanceDir, zkConfigName, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties()), zkController); config = getSolrConfigFromZk(zkConfigName, dcore.getConfigName(), solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } IndexSchema schema = null; if (indexSchemaCache != null) { if (zkController != null) { File schemaFile = new File(dcore.getSchemaName()); if (!schemaFile.isAbsolute()) { schemaFile = new File(solrLoader.getInstanceDir() + "conf" + File.separator + dcore.getSchemaName()); } if (schemaFile.exists()) { String key = schemaFile.getAbsolutePath() + ":" + new SimpleDateFormat("yyyyMMddHHmmss", Locale.US).format(new Date( schemaFile.lastModified())); schema = indexSchemaCache.get(key); if (schema == null) { log.info("creating new schema object for core: " + dcore.name); schema = new IndexSchema(config, dcore.getSchemaName(), null); indexSchemaCache.put(key, schema); } else { log.info("re-using schema object for core: " + dcore.name); } } } else { // TODO: handle caching from ZooKeeper - perhaps using ZooKeepers versioning // Don't like this cache though - how does it empty as last modified changes? } } if(schema == null){ if(zkController != null) { try { schema = getSchemaFromZk(zkConfigName, dcore.getSchemaName(), config, solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } else { schema = new IndexSchema(config, dcore.getSchemaName(), null); } } SolrCore core = new SolrCore(dcore.getName(), null, config, schema, dcore); if (zkController == null && core.getUpdateHandler().getUpdateLog() != null) { // always kick off recovery if we are in standalone mode. core.getUpdateHandler().getUpdateLog().recoverFromLog(); } return core; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void reload(String name) throws ParserConfigurationException, IOException, SAXException { name= checkDefault(name); SolrCore core; synchronized(cores) { core = cores.get(name); } if (core == null) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name ); CoreDescriptor cd = core.getCoreDescriptor(); File instanceDir = new File(cd.getInstanceDir()); if (!instanceDir.isAbsolute()) { instanceDir = new File(getSolrHome(), cd.getInstanceDir()); } log.info("Reloading SolrCore '{}' using instanceDir: {}", cd.getName(), instanceDir.getAbsolutePath()); SolrResourceLoader solrLoader; if(zkController == null) { solrLoader = new SolrResourceLoader(instanceDir.getAbsolutePath(), libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties())); } else { try { String collection = cd.getCloudDescriptor().getCollectionName(); zkController.createCollectionZkNode(cd.getCloudDescriptor()); String zkConfigName = zkController.readConfigName(collection); if (zkConfigName == null) { log.error("Could not find config name for collection:" + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection); } solrLoader = new ZkSolrResourceLoader(instanceDir.getAbsolutePath(), zkConfigName, libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties()), zkController); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } SolrCore newCore = core.reload(solrLoader); // keep core to orig name link String origName = coreToOrigName.remove(core); if (origName != null) { coreToOrigName.put(newCore, origName); } register(name, newCore, false); }
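create and reload above handle ZooKeeper failures identically: KeeperException and InterruptedException are logged and translated into the domain ZooKeeperException, and the InterruptedException branch first restores the thread's interrupt status. A minimal sketch of that translation (ZkAction is an illustrative functional type; ZooKeeperException is modeled on the excerpts as an unchecked domain exception):

    import org.apache.zookeeper.KeeperException;

    // Sketch of the KeeperException/InterruptedException translation seen in
    // CoreContainer.create()/reload(). Names besides KeeperException are illustrative.
    public class ZkTranslateSketch {
      static class ZooKeeperException extends RuntimeException {
        ZooKeeperException(String msg, Throwable cause) { super(msg, cause); }
      }

      interface ZkAction { void run() throws KeeperException, InterruptedException; }

      static void withZk(ZkAction action) {
        try {
          action.run();
        } catch (KeeperException e) {
          throw new ZooKeeperException("", e);
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt();   // restore the interrupted status
          throw new ZooKeeperException("", e);
        }
      }
    }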
// in core/src/java/org/apache/solr/core/CoreContainer.java
private SolrConfig getSolrConfigFromZk(String zkConfigName, String solrConfigFileName, SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException, KeeperException, InterruptedException { byte[] config = zkController.getConfigFileData(zkConfigName, solrConfigFileName); InputSource is = new InputSource(new ByteArrayInputStream(config)); is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(solrConfigFileName)); SolrConfig cfg = solrConfigFileName == null ? new SolrConfig( resourceLoader, SolrConfig.DEFAULT_CONF_FILE, is) : new SolrConfig( resourceLoader, solrConfigFileName, is); return cfg; }
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
public void onInit(List commits) throws IOException { log.info("SolrDeletionPolicy.onInit: commits:" + str(commits)); updateCommits((List<IndexCommit>) commits); }
// in core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
public void onCommit(List commits) throws IOException { log.info("SolrDeletionPolicy.onCommit: commits:" + str(commits)); updateCommits((List<IndexCommit>) commits); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
public void onInit(List list) throws IOException { List<IndexCommitWrapper> wrapperList = wrap(list); deletionPolicy.onInit(wrapperList); updateCommitPoints(wrapperList); cleanReserves(); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
public void onCommit(List list) throws IOException { List<IndexCommitWrapper> wrapperList = wrap(list); deletionPolicy.onCommit(wrapperList); updateCommitPoints(wrapperList); cleanReserves(); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
@Override public Collection getFileNames() throws IOException { return delegate.getFileNames(); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
@Override public Map getUserData() throws IOException { return delegate.getUserData(); }
// in core/src/java/org/apache/solr/core/IndexDeletionPolicyWrapper.java
public static long getCommitTimestamp(IndexCommit commit) throws IOException { final Map<String,String> commitData = commit.getUserData(); String commitTime = commitData.get(SolrIndexWriter.COMMIT_TIME_MSEC_KEY); if (commitTime != null) { return Long.parseLong(commitTime); } else { return 0; } }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
void persist(Writer w, SolrXMLDef solrXMLDef) throws IOException { w.write("<?xml version=\"1.0\" encoding=\"UTF-8\" ?>\n"); w.write("<solr"); Map<String,String> rootSolrAttribs = solrXMLDef.solrAttribs; Set<String> solrAttribKeys = rootSolrAttribs.keySet(); for (String key : solrAttribKeys) { String value = rootSolrAttribs.get(key); writeAttribute(w, key, value); } w.write(">\n"); Properties containerProperties = solrXMLDef.containerProperties; if (containerProperties != null && !containerProperties.isEmpty()) { writeProperties(w, containerProperties, " "); } w.write(INDENT + "<cores"); Map<String,String> coresAttribs = solrXMLDef.coresAttribs; Set<String> coreAttribKeys = coresAttribs.keySet(); for (String key : coreAttribKeys) { String value = coresAttribs.get(key); writeAttribute(w, key, value); } w.write(">\n"); for (SolrCoreXMLDef coreDef : solrXMLDef.coresDefs) { persist(w, coreDef); } w.write(INDENT + "</cores>\n"); w.write("</solr>\n"); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
private void persist(Writer w, SolrCoreXMLDef coreDef) throws IOException { w.write(INDENT + INDENT + "<core"); Set<String> keys = coreDef.coreAttribs.keySet(); for (String key : keys) { writeAttribute(w, key, coreDef.coreAttribs.get(key)); } Properties properties = coreDef.coreProperties; if (properties == null || properties.isEmpty()) w.write("/>\n"); // core else { w.write(">\n"); writeProperties(w, properties, " "); w.write(INDENT + INDENT + "</core>\n"); } }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
private void writeProperties(Writer w, Properties props, String indent) throws IOException { for (Map.Entry<Object,Object> entry : props.entrySet()) { w.write(indent + "<property"); writeAttribute(w, "name", entry.getKey()); writeAttribute(w, "value", entry.getValue()); w.write("/>\n"); } }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
private void writeAttribute(Writer w, String name, Object value) throws IOException { if (value == null) return; w.write(" "); w.write(name); w.write("=\""); XML.escapeAttributeValue(value.toString(), w); w.write("\""); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
private static void fileCopy(File src, File dest) throws IOException { IOException xforward = null; FileInputStream fis = null; FileOutputStream fos = null; FileChannel fcin = null; FileChannel fcout = null; try { fis = new FileInputStream(src); fos = new FileOutputStream(dest); fcin = fis.getChannel(); fcout = fos.getChannel(); // do the file copy 32Mb at a time final int MB32 = 32 * 1024 * 1024; long size = fcin.size(); long position = 0; while (position < size) { position += fcin.transferTo(position, MB32, fcout); } } catch (IOException xio) { xforward = xio; } finally { if (fis != null) try { fis.close(); fis = null; } catch (IOException xio) {} if (fos != null) try { fos.close(); fos = null; } catch (IOException xio) {} if (fcin != null && fcin.isOpen()) try { fcin.close(); fcin = null; } catch (IOException xio) {} if (fcout != null && fcout.isOpen()) try { fcout.close(); fcout = null; } catch (IOException xio) {} } if (xforward != null) { throw xforward; } }
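fileCopy defers its IOException: the catch stores the exception in a local, the finally closes every stream and channel while swallowing secondary close failures, and only then is the original exception rethrown. A minimal sketch of this forward-after-cleanup idiom (copy and ForwardAfterCleanup are illustrative):

    import java.io.*;

    // Sketch of the "forward after cleanup" idiom from SolrXMLSerializer.fileCopy:
    // remember the first IOException, close everything quietly, then rethrow.
    public class ForwardAfterCleanup {
      static void copy(InputStream in, OutputStream out) throws IOException {
        IOException forwarded = null;
        try {
          byte[] buf = new byte[8192];
          for (int n; (n = in.read(buf)) >= 0; ) out.write(buf, 0, n);
        } catch (IOException io) {
          forwarded = io;                         // defer, don't lose it
        } finally {
          try { in.close(); } catch (IOException ignored) {}
          try { out.close(); } catch (IOException ignored) {}
        }
        if (forwarded != null) throw forwarded;   // rethrow the original failure
      }
    }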
// in core/src/java/org/apache/solr/core/SimpleFSDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return new SimpleFSDirectory(new File(path)); }
// in core/src/java/org/apache/solr/core/NRTCachingDirectoryFactory.java
@Override protected Directory create(String path) throws IOException { return new NRTCachingDirectory(FSDirectory.open(new File(path)), 4, 48); }
// in core/src/java/org/apache/solr/core/StandardIndexReaderFactory.java
@Override public DirectoryReader newReader(Directory indexDir, SolrCore core) throws IOException { return DirectoryReader.open(indexDir, termInfosIndexDivisor); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static int numDocs(SolrIndexSearcher s, Query q, Query f) throws IOException { return (null == f) ? s.getDocSet(q).size() : s.numDocs(q,f); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static void optimizePreFetchDocs(ResponseBuilder rb, DocList docs, Query query, SolrQueryRequest req, SolrQueryResponse res) throws IOException { SolrIndexSearcher searcher = req.getSearcher(); if(!searcher.enableLazyFieldLoading) { // nothing to do return; } ReturnFields returnFields = res.getReturnFields(); if(returnFields.getLuceneFieldNames() != null) { Set<String> fieldFilter = returnFields.getLuceneFieldNames(); if (rb.doHighlights) { // copy return fields list fieldFilter = new HashSet<String>(fieldFilter); // add highlight fields SolrHighlighter highlighter = HighlightComponent.getHighlighter(req.getCore()); for (String field: highlighter.getHighlightFields(query, req, null)) fieldFilter.add(field); // fetch unique key if one exists. SchemaField keyField = req.getSearcher().getSchema().getUniqueKeyField(); if(null != keyField) fieldFilter.add(keyField.getName()); } // get documents DocIterator iter = docs.iterator(); for (int i=0; i<docs.size(); i++) { searcher.doc(iter.nextDoc(), fieldFilter); } } }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static NamedList doStandardDebug(SolrQueryRequest req, String userQuery, Query query, DocList results, boolean dbgQuery, boolean dbgResults) throws IOException { NamedList dbg = null; dbg = new SimpleOrderedMap(); SolrIndexSearcher searcher = req.getSearcher(); IndexSchema schema = req.getSchema(); boolean explainStruct = req.getParams().getBool(CommonParams.EXPLAIN_STRUCT, false); if (dbgQuery) { /* userQuery may have been pre-processed .. expose that */ dbg.add("rawquerystring", req.getParams().get(CommonParams.Q)); dbg.add("querystring", userQuery); /* QueryParsing.toString isn't perfect, use it to see converted * values, use regular toString to see any attributes of the * underlying Query it may have missed. */ dbg.add("parsedquery", QueryParsing.toString(query, schema)); dbg.add("parsedquery_toString", query.toString()); } if (dbgResults) { NamedList<Explanation> explain = getExplanations(query, results, searcher, schema); dbg.add("explain", explainStruct ? explanationsToNamedLists(explain) : explanationsToStrings(explain)); String otherQueryS = req.getParams().get(CommonParams.EXPLAIN_OTHER); if (otherQueryS != null && otherQueryS.length() > 0) { DocList otherResults = doSimpleQuery (otherQueryS, req, 0, 10); dbg.add("otherQuery", otherQueryS); NamedList<Explanation> explainO = getExplanations(query, otherResults, searcher, schema); dbg.add("explainOther", explainStruct ? explanationsToNamedLists(explainO) : explanationsToStrings(explainO)); } } return dbg; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static NamedList<Explanation> getExplanations (Query query, DocList docs, SolrIndexSearcher searcher, IndexSchema schema) throws IOException { NamedList<Explanation> explainList = new SimpleOrderedMap<Explanation>(); DocIterator iterator = docs.iterator(); for (int i=0; i<docs.size(); i++) { int id = iterator.nextDoc(); Document doc = searcher.doc(id); String strid = schema.printableUniqueKey(doc); explainList.add(strid, searcher.explain(query, id) ); } return explainList; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static DocList doSimpleQuery(String sreq, SolrQueryRequest req, int start, int limit) throws IOException { List<String> commands = StrUtils.splitSmart(sreq,';'); String qs = commands.size() >= 1 ? commands.get(0) : ""; try { Query query = QParser.getParser(qs, null, req).getQuery(); // If the first non-query, non-filter command is a simple sort on an indexed field, then // we can use the Lucene sort ability. Sort sort = null; if (commands.size() >= 2) { sort = QueryParsing.parseSort(commands.get(1), req); } DocList results = req.getSearcher().getDocList(query,(DocSet)null, sort, start, limit); return results; } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); } }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public boolean regenerateItem(SolrIndexSearcher newSearcher, SolrCache newCache, SolrCache oldCache, Object oldKey, Object oldVal) throws IOException { newCache.put(oldKey,oldVal); return true; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static SolrDocumentList docListToSolrDocumentList( DocList docs, SolrIndexSearcher searcher, Set<String> fields, Map<SolrDocument, Integer> ids ) throws IOException { IndexSchema schema = searcher.getSchema(); SolrDocumentList list = new SolrDocumentList(); list.setNumFound(docs.matches()); list.setMaxScore(docs.maxScore()); list.setStart(docs.offset()); DocIterator dit = docs.iterator(); while (dit.hasNext()) { int docid = dit.nextDoc(); Document luceneDoc = searcher.doc(docid, fields); SolrDocument doc = new SolrDocument(); for( IndexableField field : luceneDoc) { if (null == fields || fields.contains(field.name())) { SchemaField sf = schema.getField( field.name() ); doc.addField( field.name(), sf.getType().toObject( field ) ); } } if (docs.hasScores() && (null == fields || fields.contains("score"))) { doc.addField("score", dit.score()); } list.add( doc ); if( ids != null ) { ids.put( doc, new Integer(docid) ); } } return list; }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void write(int c) throws IOException { write((char)c); }
// in core/src/java/org/apache/solr/util/FastWriter.java
public void write(char c) throws IOException { if (pos >= buf.length) { sink.write(buf,0,pos); pos=0; } buf[pos++] = c; }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public FastWriter append(char c) throws IOException { if (pos >= buf.length) { sink.write(buf,0,pos); pos=0; } buf[pos++] = c; return this; }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void write(char cbuf[], int off, int len) throws IOException { int space = buf.length - pos; if (len < space) { System.arraycopy(cbuf, off, buf, pos, len); pos += len; } else if (len<BUFSIZE) { // if the data to write is small enough, buffer it. System.arraycopy(cbuf, off, buf, pos, space); sink.write(buf, 0, buf.length); pos = len-space; System.arraycopy(cbuf, off+space, buf, 0, pos); } else { sink.write(buf,0,pos); // flush pos=0; // don't buffer, just write to sink sink.write(cbuf, off, len); } }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void write(String str, int off, int len) throws IOException { int space = buf.length - pos; if (len < space) { str.getChars(off, off+len, buf, pos); pos += len; } else if (len<BUFSIZE) { // if the data to write is small enough, buffer it. str.getChars(off, off+space, buf, pos); sink.write(buf, 0, buf.length); str.getChars(off+space, off+len, buf, 0); pos = len-space; } else { sink.write(buf,0,pos); // flush pos=0; // don't buffer, just write to sink sink.write(str, off, len); } }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void flush() throws IOException { sink.write(buf,0,pos); pos=0; sink.flush(); }
// in core/src/java/org/apache/solr/util/FastWriter.java
@Override public void close() throws IOException { flush(); sink.close(); }
// in core/src/java/org/apache/solr/util/FastWriter.java
public void flushBuffer() throws IOException { sink.write(buf, 0, pos); pos=0; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
URI resolveRelativeURI(String baseURI, String systemId) throws IOException,URISyntaxException { URI uri; // special case for backwards compatibility: if relative systemId starts with "/" (we convert that to an absolute solrres:-URI) if (systemId.startsWith("/")) { uri = new URI(RESOURCE_LOADER_URI_SCHEME, RESOURCE_LOADER_AUTHORITY_ABSOLUTE, "/", null, null).resolve(systemId); } else { // simply parse as URI uri = new URI(systemId); } // do relative resolving if (baseURI != null ) { uri = new URI(baseURI).resolve(uri); } return uri; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public InputSource resolveEntity(String name, String publicId, String baseURI, String systemId) throws IOException { if (systemId == null) return null; try { final URI uri = resolveRelativeURI(baseURI, systemId); // check schema and resolve with ResourceLoader if (RESOURCE_LOADER_URI_SCHEME.equals(uri.getScheme())) { String path = uri.getPath(), authority = uri.getAuthority(); if (!RESOURCE_LOADER_AUTHORITY_ABSOLUTE.equals(authority)) { path = path.substring(1); } try { final InputSource is = new InputSource(loader.openResource(path)); is.setSystemId(uri.toASCIIString()); is.setPublicId(publicId); return is; } catch (RuntimeException re) { // unfortunately XInclude fallback only works with IOException, but openResource() never throws that one throw (IOException) (new IOException(re.getMessage()).initCause(re)); } } else { // resolve all other URIs using the standard resolver return null; } } catch (URISyntaxException use) { log.warn("An URI systax problem occurred during resolving SystemId, falling back to default resolver", use); return null; } }
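The cast-and-initCause construction in resolveEntity exists because, as the inline comment notes, the XInclude fallback only reacts to IOException, and on the JDKs targeted here IOException offers no (String, Throwable) constructor, so the cause has to be attached after construction. The idiom in isolation (asIOException is an illustrative name):

    import java.io.IOException;

    // Sketch of the cause-chaining idiom from SystemIdResolver: build the
    // IOException first, then attach the cause via initCause and cast back.
    public class ChainSketch {
      static IOException asIOException(RuntimeException re) {
        return (IOException) new IOException(re.getMessage()).initCause(re);
      }
    }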
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public InputSource resolveEntity(String publicId, String systemId) throws IOException { return resolveEntity(null, publicId, null, systemId); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
public synchronized Transformer getTransformer(SolrConfig solrConfig, String filename,int cacheLifetimeSeconds) throws IOException { // For now, the Templates are blindly reloaded once cacheExpires is over. // It'd be better to check the file modification time to reload only if needed. if(lastTemplates!=null && filename.equals(lastFilename) && System.currentTimeMillis() < cacheExpires) { if(log.isDebugEnabled()) { log.debug("Using cached Templates:" + filename); } } else { lastTemplates = getTemplates(solrConfig.getResourceLoader(), filename,cacheLifetimeSeconds); } Transformer result = null; try { result = lastTemplates.newTransformer(); } catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; } return result; }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
private Templates getTemplates(ResourceLoader loader, String filename,int cacheLifetimeSeconds) throws IOException { Templates result = null; lastFilename = null; try { if(log.isDebugEnabled()) { log.debug("compiling XSLT templates:" + filename); } final String fn = "xslt/" + filename; final TransformerFactory tFactory = TransformerFactory.newInstance(); tFactory.setURIResolver(new SystemIdResolver(loader).asURIResolver()); tFactory.setErrorListener(xmllog); final StreamSource src = new StreamSource(loader.openResource(fn), SystemIdResolver.createSystemIdFromResourceName(fn)); try { result = tFactory.newTemplates(src); } finally { // some XML parsers are broken and don't close the byte stream (but they should according to spec) IOUtils.closeQuietly(src.getInputStream()); } } catch (Exception e) { log.error(getClass().getName(), "newTemplates", e); final IOException ioe = new IOException("Unable to initialize Templates '" + filename + "'"); ioe.initCause(e); throw ioe; } lastFilename = filename; lastTemplates = result; cacheExpires = System.currentTimeMillis() + (cacheLifetimeSeconds * 1000); return result; }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { final File[] files = new File[args.length]; for (int i = 0; i < args.length; i++) { files[i] = new File(args[i]); } final FindClasses finder = new FindClasses(files); final ClassLoader cl = finder.getClassLoader(); final Class TOKENSTREAM = cl.loadClass("org.apache.lucene.analysis.TokenStream"); final Class TOKENIZER = cl.loadClass("org.apache.lucene.analysis.Tokenizer"); final Class TOKENFILTER = cl.loadClass("org.apache.lucene.analysis.TokenFilter"); final Class TOKENIZERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenizerFactory"); final Class TOKENFILTERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenFilterFactory"); final HashSet<Class> result = new HashSet<Class>(finder.findExtends(TOKENIZER)); result.addAll(finder.findExtends(TOKENFILTER)); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENIZERFACTORY), "create", Reader.class).values()); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENFILTERFACTORY), "create", TOKENSTREAM).values()); for (final Class c : result) { System.out.println(c.getName()); } }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { FindClasses finder = new FindClasses(new File(args[1])); ClassLoader cl = finder.getClassLoader(); Class clazz = cl.loadClass(args[0]); if (args.length == 2) { System.out.println("Finding all extenders of " + clazz.getName()); for (Class c : finder.findExtends(clazz)) { System.out.println(c.getName()); } } else { String methName = args[2]; System.out.println("Finding all extenders of " + clazz.getName() + " with method: " + methName); Class[] methArgs = new Class[args.length-3]; for (int i = 3; i < args.length; i++) { methArgs[i-3] = cl.loadClass(args[i]); } Map<Class,Class> map = finder.findMethodReturns (finder.findExtends(clazz),methName, methArgs); for (Class key : map.keySet()) { System.out.println(key.getName() + " => " + map.get(key).getName()); } } }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
private static void pipe(InputStream source, OutputStream dest) throws IOException { byte[] buf = new byte[1024]; int read = 0; while ( (read = source.read(buf) ) >= 0) { if (null != dest) dest.write(buf, 0, read); } if (null != dest) dest.flush(); }
// in core/src/java/org/apache/solr/util/FileUtils.java
public static void copyFile(File src , File destination) throws IOException { FileChannel in = null; FileChannel out = null; try { in = new FileInputStream(src).getChannel(); out = new FileOutputStream(destination).getChannel(); in.transferTo(0, in.size(), out); } finally { try { if (in != null) in.close(); } catch (IOException e) {} try { if (out != null) out.close(); } catch (IOException e) {} } }
// in core/src/java/org/apache/solr/util/FileUtils.java
public static void sync(File fullFile) throws IOException { if (fullFile == null || !fullFile.exists()) throw new FileNotFoundException("File does not exist " + fullFile); boolean success = false; int retryCount = 0; IOException exc = null; while(!success && retryCount < 5) { retryCount++; RandomAccessFile file = null; try { try { file = new RandomAccessFile(fullFile, "rw"); file.getFD().sync(); success = true; } finally { if (file != null) file.close(); } } catch (IOException ioe) { if (exc == null) exc = ioe; try { // Pause 5 msec Thread.sleep(5); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); } } } if (!success) // Throw original exception throw exc; }
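sync retries the fsync a bounded number of times, keeps only the first IOException, restores the interrupt flag if the inter-attempt sleep is interrupted, and rethrows the original exception when every attempt fails. A minimal sketch of that retry idiom (IOAction and RetrySketch are illustrative):

    import java.io.IOException;

    // Sketch of the retry idiom in FileUtils.sync: retry a few times, remember
    // the first IOException, and rethrow it if no attempt succeeds.
    public class RetrySketch {
      interface IOAction { void run() throws IOException; }

      static void retry(IOAction action, int attempts) throws IOException {
        IOException first = null;
        for (int i = 0; i < attempts; i++) {
          try {
            action.run();
            return;                               // success
          } catch (IOException ioe) {
            if (first == null) first = ioe;       // keep the original failure
            try {
              Thread.sleep(5);                    // brief pause between attempts
            } catch (InterruptedException ie) {
              Thread.currentThread().interrupt(); // restore interrupt status
            }
          }
        }
        if (first != null) throw first;           // all attempts failed
      }
    }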
IOException: 170 catch blocks
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (IOException e) { throw new RuntimeException(e); // should never happen w/o using real IO }
// in solrj/src/java/org/apache/solr/common/params/MapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/MultiMapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/ModifiableSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
catch (IOException e) { throw new RuntimeException("Unable to write xml into a stream", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException ex) { throw new SolrServerException("error reading streams", ex); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException e) { throw new SolrServerException( "IOException occured when talking to server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
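The SolrJ catches above share one pattern: an IOException raised while writing to a purely in-memory Writer or CharArr is treated as impossible ("should never happen w/o using real IO") and wrapped in an unchecked RuntimeException rather than declared. A minimal sketch, assuming an illustrative writeTo helper:

    import java.io.IOException;
    import java.io.StringWriter;
    import java.io.Writer;

    // Sketch of the "should never happen with in-memory IO" wrap seen in
    // MapSolrParams, ClientUtils, CharArr, etc. writeTo is illustrative.
    public class UncheckedWrapSketch {
      static void writeTo(Writer w) throws IOException { w.write("<doc/>"); }

      static String render() {
        StringWriter sw = new StringWriter();     // memory-backed: no real IO
        try {
          writeTo(sw);
        } catch (IOException e) {
          throw new RuntimeException(e);          // unreachable in practice
        }
        return sw.toString();
      }
    }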
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { log.warn("Could not read Solr resource " + resourceName); return new IResource[] {}; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/StempelPolishStemFilterFactory.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ClobTransformer.java
catch (IOException e) { DataImportHandlerException.wrapAndThrow(DataImportHandlerException.SEVERE, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
catch (IOException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Problem reading from input", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (IOException e) { propOutput = null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SimplePropertiesWriter.java
catch (IOException e) { propInput = null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinContentStreamDataSource.java
catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinContentStreamDataSource.java
catch (IOException e) { /*no op*/ }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to call finish() on UpdateRequestProcessor", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { log.error("Exception while deleteing: " + id, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { log.error("Exception while deleting by query: " + query, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in full dump while deleting all documents.", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/HTMLStripTransformer.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Failed stripping HTML for column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContentStreamDataSource.java
catch (IOException e) { DataImportHandlerException.wrapAndThrow(SEVERE, e); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContentStreamDataSource.java
catch (IOException e) { }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/PlainTextEntityProcessor.java
catch (IOException e) { IOUtils.closeQuietly(r); wrapAndThrow(SEVERE, e, "Exception reading url : " + url); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (IOException e) { LOG.error("Unable to copy index file from: " + indexFileInTmpDir + " to: " + indexFileInIndex , e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (IOException e) { fsyncException = e; }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback"); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,e); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { //Catch the exception and rethrow it with more line information input_err("can't read line: " + line, null, line, e); }
// in core/src/java/org/apache/solr/handler/FieldAnalysisRequestHandler.java
catch (IOException e) { // do nothing, leave value set to the request parameter }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Exception while reading files for commit " + c, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { rsp.add("status", "unable to get file names for given index generation"); rsp.add("exception", e); LOG.warn("Unable to get file names for indexCommit generation: " + gen, e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Unable to get index version : ", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Unable to get IndexCommit on startup", e); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (IOException e) { LOG.warn("Exception while writing response for params: " + params, e); }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
catch (IOException e) { log.error( "Exception in reloading spell check index for spellchecker: " + checker.getDictionaryName(), e); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/SnapShooter.java
catch (IOException e) { LOG.error("Unable to release snapshoot lock: " + directoryName + ".lock"); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/ElisionFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading articles", e); }
// in core/src/java/org/apache/solr/analysis/StemmerOverrideFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/KeywordMarkerFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/KeepWordFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading words", e); }
// in core/src/java/org/apache/solr/analysis/JapanesePartOfSpeechStopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading tags", e); }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
catch( IOException e ){ throw new InitializationException("IOException thrown while loading mappings", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( IOException ex ) { throw new InitializationException("IOException thrown creating PatternTokenizer instance", ex); }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException while loading types", e); }
// in core/src/java/org/apache/solr/analysis/TypeTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading types", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsQueryFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/analysis/StopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading stopwords", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
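The analysis factory catches above all translate a low-level IOException raised while loading a resource file into a domain InitializationException whose message names what was being loaded. A minimal sketch of that translation; InitializationException is modeled on the excerpts (assumed here to be unchecked), and Loader is illustrative:

    import java.io.IOException;

    // Sketch of the factory-init translation used by the analysis factories:
    // a low-level IOException becomes a domain exception naming the resource.
    public class FactorySketch {
      static class InitializationException extends RuntimeException {
        InitializationException(String msg, Throwable cause) { super(msg, cause); }
      }

      interface Loader { void load(String resource) throws IOException; }

      static void init(Loader loader, String resource) {
        try {
          loader.load(resource);
        } catch (IOException e) {
          throw new InitializationException(
              "IOException thrown while loading " + resource, e);
        }
      }
    }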
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( IOException iox ) { throw iox; }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "exception at docid " + docid + " for valuesource " + valueSource, e); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch (IOException e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IOException e) { throw new RuntimeException("failed to open field cache for: " + f, e); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (IOException e) { // we're pretty freaking screwed if this happens throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PreAnalyzedField.java
catch (IOException e) { return null; }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { // ignore }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { // ignore }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { // safe to ignore, because we know the number of tokens }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { // safe to ignore, because we know the number of tokens }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { // safe to ignore, because we know the number of tokens }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { // safe to ignore, because we know the number of tokens }
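The TextField catches above are empty by design, with a comment recording why: the token stream is consumed over in-memory input, so the declared IOException cannot actually occur. A minimal sketch of such a documented empty catch (countChars is illustrative):

    import java.io.IOException;
    import java.io.StringReader;

    // Sketch of a documented empty catch: reading from an in-memory source
    // cannot raise IOException, and the comment records why it is ignored.
    public class EmptyCatchSketch {
      static int countChars(String s) {
        int n = 0;
        StringReader r = new StringReader(s);    // memory-backed reader
        try {
          while (r.read() != -1) n++;
        } catch (IOException e) {
          // safe to ignore: StringReader never performs real IO
        }
        return n;
      }
    }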
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { // io error throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
catch (IOException e) { // should not happen with StringWriter }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
catch (IOException e) { // TODO: remove this if ConstantScoreQuery.createWeight adds IOException throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException ioe) { throw ioe; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/TopGroupsShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/SearchGroupShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch (IOException e) { // log, use defaults SolrCore.log.error("Error opening external value source file: " +e); return vals; }
// in core/src/java/org/apache/solr/search/function/FileFloatSource.java
catch (IOException e) { // log, use defaults SolrCore.log.error("Error loading external value source: " +e); }
// in core/src/java/org/apache/solr/spelling/SpellingQueryConverter.java
catch (IOException e) { // TODO: shouldn't we log something? }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
catch (IOException e) { LOG.warn("Loading stored lookup data failed", e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { log.error( "Unable to load spellings", e); }
// in core/src/java/org/apache/solr/spelling/SuggestQueryConverter.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (IOException e) { log.warn("", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { log.error("Error inspecting tlog " + ll); ll.decref(); continue; }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { // failure to read a log record isn't fatal log.error("Exception reading versions from log",e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { SolrException.log(log,"Error attempting to roll back log", e); return false; }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException ex) { recoveryInfo.errors++; loglog.warn("REYPLAY_ERR: IOException reading log", ex); // could be caused by an incomplete flush if recovering from log }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: final commit.", ex); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: finish()", ex); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/HTMLStripFieldUpdateProcessorFactory.java
catch (IOException e) { // we tried and failed return s; }
// in core/src/java/org/apache/solr/update/VersionInfo.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error reading version from index", e); }
// in core/src/java/org/apache/solr/update/PeerSync.java
catch (IOException e) { // TODO: should this be handled separately as a problem with us? // I guess it probably already will by causing replication to be kicked off. sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { // TODO: reset our file pointer back to "pos", the start of this record. throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error logging add", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/RunExecutableListener.java
catch (IOException e) { /* don't throw exception, just log it... */ SolrException.log(log,e); ret = INVALID_PROCESS_RETURN_CODE; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { /*no op*/ }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { SolrException.log(log,null,e); return null; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) { log.warn("Error loading properties ",e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.IOException xio) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xio); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) { xforward = xio; }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) {}
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) {}
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) {}
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (IOException xio) {}
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("Can't open/read file: " + file); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("IOException while closing file: "+ e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("An error occured posting data to "+url+". Please check that Solr is running."); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("Connection error (is Solr running at " + solrUrl + " ?): " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("IOException while posting data: " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException x) { /*NOOP*/ }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException e) { fatal("IOException while reading response: " + e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (IOException x) { /*NOOP*/ }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException e) {}
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException e) {}
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException ioe) { if (exc == null) exc = ioe; try { /* Pause 5 msec */ Thread.sleep(5); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); } }
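The catch sites above show the three recovery policies Solr applies to IOException: wrap-and-rethrow as a domain exception, log-and-continue, and (in a few places) silently swallow. A minimal sketch of the dominant wrap-and-rethrow idiom, assuming a hypothetical doOpen() helper; only the SolrException constructor is the real API shown in the snippets:

    import java.io.IOException;
    import org.apache.solr.common.SolrException;

    public class TlogOpenSketch {
      // Hypothetical I/O call standing in for the real tlog-opening code.
      private Object doOpen(String path) throws IOException { return new Object(); }

      public Object open(String path) {
        try {
          return doOpen(path);
        } catch (IOException e) {
          // Translate the checked IOException into Solr's unchecked domain
          // exception, preserving the cause and mapping it to a 500-style code.
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e);
        }
      }
    }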
103
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (IOException e) { throw new RuntimeException(e); /* should never happen w/o using real IO */ }
// in solrj/src/java/org/apache/solr/common/params/MapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/MultiMapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/ModifiableSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
catch (IOException e) { throw new RuntimeException("Unable to write xml into a stream", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException ex) { throw new SolrServerException("error reading streams", ex); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException e) { throw new SolrServerException( "IOException occured when talking to server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/StempelPolishStemFilterFactory.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/LineEntityProcessor.java
catch (IOException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Problem reading from input", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to call finish() on UpdateRequestProcessor", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Exception in full dump while deleting all documents.", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/HTMLStripTransformer.java
catch (IOException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Failed stripping HTML for column: " + column, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback"); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,e); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/ElisionFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading articles", e); }
// in core/src/java/org/apache/solr/analysis/StemmerOverrideFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/KeywordMarkerFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/KeepWordFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading words", e); }
// in core/src/java/org/apache/solr/analysis/JapanesePartOfSpeechStopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading tags", e); }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
catch( IOException e ){ throw new InitializationException("IOException thrown while loading mappings", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( IOException ex ) { throw new InitializationException("IOException thrown creating PatternTokenizer instance", ex); }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException while loading types", e); }
// in core/src/java/org/apache/solr/analysis/TypeTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading types", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsQueryFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/analysis/StopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading stopwords", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( IOException iox ) { throw iox; }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "exception at docid " + docid + " for valuesource " + valueSource, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IOException e) { throw new RuntimeException("failed to open field cache for: " + f, e); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (IOException e) { /* we're pretty freaking screwed if this happens */ throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { /* io error */ throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
catch (IOException e) { /* TODO: remove this if ConstantScoreQuery.createWeight adds IOException */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException ioe) { throw ioe; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/TopGroupsShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/SearchGroupShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/SuggestQueryConverter.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/VersionInfo.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error reading version from index", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { /* TODO: reset our file pointer back to "pos", the start of this record. */ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error logging add", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.IOException xio) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xio); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); }
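The catch-with-throw sites above all translate IOException into an unchecked type, but the target varies by layer: SolrException with SERVER_ERROR for internal failures, BAD_REQUEST when the unreadable bytes came from the client (currency config, sort parsing), SolrServerException in the SolrJ client, InitializationException in analysis factories, and DataImportHandlerException in DIH. A hedged sketch of the BAD_REQUEST vs. SERVER_ERROR choice; parseConfig() and flushIndex() are illustrative stand-ins, the constructors are the real ones used above:

    import java.io.IOException;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    public class ErrorCodeSketch {
      private void parseConfig(String userSupplied) throws IOException {} // illustrative
      private void flushIndex() throws IOException {}                     // illustrative

      public void handle(String userSupplied) {
        try {
          parseConfig(userSupplied);
        } catch (IOException e) {
          // The client sent something unreadable: report it as their error (HTTP 400).
          throw new SolrException(ErrorCode.BAD_REQUEST, "Error parsing currency config.", e);
        }
        try {
          flushIndex();
        } catch (IOException e) {
          // The server's own I/O failed: report it as an internal error (HTTP 500).
          throw new SolrException(ErrorCode.SERVER_ERROR, e);
        }
      }
    }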
81
unknown (Lib) IllegalAccessException 0 0 2
            
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processUpdate(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException, FactoryConfigurationError, InstantiationException, IllegalAccessException, TransformerConfigurationException { AddUpdateCommand addCmd = null; SolrParams params = req.getParams(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.END_DOCUMENT: parser.close(); return; case XMLStreamConstants.START_ELEMENT: String currTag = parser.getLocalName(); if (currTag.equals(UpdateRequestHandler.ADD)) { log.trace("SolrCore.update(add)"); addCmd = new AddUpdateCommand(req); /* First look for commitWithin parameter on the request, will be overwritten for individual <add>'s */ addCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); addCmd.overwrite = params.getBool(UpdateParams.OVERWRITE, true); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.OVERWRITE.equals(attrName)) { addCmd.overwrite = StrUtils.parseBoolean(attrVal); } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { addCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("Unknown attribute id in add:" + attrName); } } } else if ("doc".equals(currTag)) { if(addCmd != null) { log.trace("adding doc..."); addCmd.clear(); addCmd.solrDoc = readDoc(parser); processor.processAdd(addCmd); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unexpected <doc> tag without an <add> tag surrounding it."); } } else if (UpdateRequestHandler.COMMIT.equals(currTag) || UpdateRequestHandler.OPTIMIZE.equals(currTag)) { log.trace("parsing " + currTag); CommitUpdateCommand cmd = new CommitUpdateCommand(req, UpdateRequestHandler.OPTIMIZE.equals(currTag)); ModifiableSolrParams mp = new ModifiableSolrParams(); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); mp.set(attrName, attrVal); } RequestHandlerUtils.validateCommitParams(mp); SolrParams p = SolrParams.wrapDefaults(mp, req.getParams()); /* default to the normal request params for commit options */ RequestHandlerUtils.updateCommit(cmd, p); processor.processCommit(cmd); } /* end commit */ else if (UpdateRequestHandler.ROLLBACK.equals(currTag)) { log.trace("parsing " + currTag); RollbackUpdateCommand cmd = new RollbackUpdateCommand(req); processor.processRollback(cmd); } /* end rollback */ else if (UpdateRequestHandler.DELETE.equals(currTag)) { log.trace("parsing delete"); processDelete(req, processor, parser); } /* end delete */ break; } } }
// in core/src/java/org/apache/solr/handler/component/SearchHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception, ParseException, InstantiationException, IllegalAccessException { /* int sleep = req.getParams().getInt("sleep",0); if (sleep > 0) {log.error("SLEEPING for " + sleep); Thread.sleep(sleep);} */ ResponseBuilder rb = new ResponseBuilder(req, rsp, components); if (rb.requestInfo != null) { rb.requestInfo.setResponseBuilder(rb); } boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false); rb.setDebug(dbg); if (dbg == false){ /* if it's true, we are doing everything anyway. */ SolrPluginUtils.getDebugInterests(req.getParams().getParams(CommonParams.DEBUG), rb); } final RTimer timer = rb.isDebug() ? new RTimer() : null; ShardHandler shardHandler1 = shardHandlerFactory.getShardHandler(); shardHandler1.checkDistributed(rb); if (timer == null) { /* non-debugging prepare phase */ for( SearchComponent c : components ) { c.prepare(rb); } } else { /* debugging prepare phase */ RTimer subt = timer.sub( "prepare" ); for( SearchComponent c : components ) { rb.setTimer( subt.sub( c.getName() ) ); c.prepare(rb); rb.getTimer().stop(); } subt.stop(); } if (!rb.isDistrib) { /* a normal non-distributed request. The semantics of debugging vs not debugging are different enough that it makes sense to have two control loops */ if(!rb.isDebug()) { /* Process */ for( SearchComponent c : components ) { c.process(rb); } } else { /* Process */ RTimer subt = timer.sub( "process" ); for( SearchComponent c : components ) { rb.setTimer( subt.sub( c.getName() ) ); c.process(rb); rb.getTimer().stop(); } subt.stop(); timer.stop(); /* add the timing info */ if (rb.isDebugTimings()) { rb.addDebugInfo("timing", timer.asNamedList() ); } } } else { /* a distributed request */ if (rb.outgoing == null) { rb.outgoing = new LinkedList<ShardRequest>(); } rb.finished = new ArrayList<ShardRequest>(); int nextStage = 0; do { rb.stage = nextStage; nextStage = ResponseBuilder.STAGE_DONE; /* call all components */ for( SearchComponent c : components ) { /* the next stage is the minimum of what all components report */ nextStage = Math.min(nextStage, c.distributedProcess(rb)); } /* check the outgoing queue and send requests */ while (rb.outgoing.size() > 0) { /* submit all current request tasks at once */ while (rb.outgoing.size() > 0) { ShardRequest sreq = rb.outgoing.remove(0); sreq.actualShards = sreq.shards; if (sreq.actualShards==ShardRequest.ALL_SHARDS) { sreq.actualShards = rb.shards; } sreq.responses = new ArrayList<ShardResponse>(); /* TODO: map from shard to address[] */ for (String shard : sreq.actualShards) { ModifiableSolrParams params = new ModifiableSolrParams(sreq.params); params.remove(ShardParams.SHARDS); /* not a top-level request */ params.set("distrib", "false"); /* not a top-level request */ params.remove("indent"); params.remove(CommonParams.HEADER_ECHO_PARAMS); params.set(ShardParams.IS_SHARD, true); /* a sub (shard) request */ params.set(ShardParams.SHARD_URL, shard); /* so the shard knows what was asked */ if (rb.requestInfo != null) { /* we could try and detect when this is needed, but it could be tricky */ params.set("NOW", Long.toString(rb.requestInfo.getNOW().getTime())); } String shardQt = params.get(ShardParams.SHARDS_QT); if (shardQt == null) { params.remove(CommonParams.QT); } else { params.set(CommonParams.QT, shardQt); } shardHandler1.submit(sreq, shard, params); } } /* now wait for replies, but if anyone puts more requests on the outgoing queue, send them out immediately (by exiting this loop) */ boolean tolerant = rb.req.getParams().getBool(ShardParams.SHARDS_TOLERANT, false); while (rb.outgoing.size() == 0) { ShardResponse srsp = tolerant ? shardHandler1.takeCompletedIncludingErrors(): shardHandler1.takeCompletedOrError(); if (srsp == null) break; /* no more requests to wait for */ /* Was there an exception? */ if (srsp.getException() != null) { /* If things are not tolerant, abort everything and rethrow */ if(!tolerant) { shardHandler1.cancelAll(); if (srsp.getException() instanceof SolrException) { throw (SolrException)srsp.getException(); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, srsp.getException()); } } } rb.finished.add(srsp.getShardRequest()); /* let the components see the responses to the request */ for(SearchComponent c : components) { c.handleResponses(rb, srsp.getShardRequest()); } } } for(SearchComponent c : components) { c.finishStage(rb); } /* we are done when the next stage is MAX_VALUE */ } while (nextStage != Integer.MAX_VALUE); } }
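Neither method above contains an explicit throw of IllegalAccessException; a checked type like this typically enters a signature through reflection further down the call chain and then rides up every caller's throws clause. A minimal sketch of one way the declaration arises (createPlugin is an illustrative name):

    // Class.newInstance() declares two checked exceptions, so any caller
    // must either catch them or declare them itself -- which is how
    // `throws InstantiationException, IllegalAccessException` spreads into
    // handler signatures without a single explicit throw.
    static Object createPlugin(Class<?> clazz)
        throws InstantiationException, IllegalAccessException {
      return clazz.newInstance();
    }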
1
            
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
1
            
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
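The same SolrPluginUtils catch site appears in both one-entry groups above; it guards reflective setter invocation. A hedged sketch of that idiom, where invokeSetter() and its parameters are illustrative and only the catch-and-wrap shape comes from the snippet:

    import java.lang.reflect.InvocationTargetException;
    import java.lang.reflect.Method;

    public class SetterInvokeSketch {
      static void invokeSetter(Object bean, Method setter, Object value) {
        try {
          setter.invoke(bean, value);
        } catch (IllegalAccessException e1) {
          // Reflection failed on visibility: surface it unchecked, naming the setter.
          throw new RuntimeException("Error invoking setter " + setter.getName()
              + " on class : " + bean.getClass().getName(), e1);
        } catch (InvocationTargetException e) {
          // The setter itself threw: unwrap and rethrow its cause.
          throw new RuntimeException(e.getCause());
        }
      }
    }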
1
runtime (Lib) IllegalArgumentException 48
            
// in solrj/src/java/org/apache/solr/common/util/Base64.java
public static byte[] base64ToByteArray(String s) { byte[] alphaToInt = base64ToInt; int sLen = s.length(); int numGroups = sLen / 4; if (4 * numGroups != sLen) throw new IllegalArgumentException( "String length must be a multiple of four."); int missingBytesInLastGroup = 0; int numFullGroups = numGroups; if (sLen != 0) { if (s.charAt(sLen - 1) == '=') { missingBytesInLastGroup++; numFullGroups--; } if (s.charAt(sLen - 2) == '=') missingBytesInLastGroup++; } byte[] result = new byte[3 * numGroups - missingBytesInLastGroup]; /* Translate all full groups from base64 to byte array elements */ int inCursor = 0, outCursor = 0; for (int i = 0; i < numFullGroups; i++) { int ch0 = base64toInt(s.charAt(inCursor++), alphaToInt); int ch1 = base64toInt(s.charAt(inCursor++), alphaToInt); int ch2 = base64toInt(s.charAt(inCursor++), alphaToInt); int ch3 = base64toInt(s.charAt(inCursor++), alphaToInt); result[outCursor++] = (byte) ((ch0 << 2) | (ch1 >> 4)); result[outCursor++] = (byte) ((ch1 << 4) | (ch2 >> 2)); result[outCursor++] = (byte) ((ch2 << 6) | ch3); } /* Translate partial group, if present */ if (missingBytesInLastGroup != 0) { int ch0 = base64toInt(s.charAt(inCursor++), alphaToInt); int ch1 = base64toInt(s.charAt(inCursor++), alphaToInt); result[outCursor++] = (byte) ((ch0 << 2) | (ch1 >> 4)); if (missingBytesInLastGroup == 1) { int ch2 = base64toInt(s.charAt(inCursor++), alphaToInt); result[outCursor++] = (byte) ((ch1 << 4) | (ch2 >> 2)); } } /* assert inCursor == s.length()-missingBytesInLastGroup; assert outCursor == result.length; */ return result; }
// in solrj/src/java/org/apache/solr/common/util/Base64.java
private static int base64toInt(char c, byte[] alphaToInt) { int result = alphaToInt[c]; if (result < 0) throw new IllegalArgumentException("Illegal character " + c); return result; }
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate( String dateValue, Collection<String> dateFormats, Date startDate ) throws ParseException { if (dateValue == null) { throw new IllegalArgumentException("dateValue is null"); } if (dateFormats == null) { dateFormats = DEFAULT_HTTP_CLIENT_PATTERNS; } if (startDate == null) { startDate = DEFAULT_TWO_DIGIT_YEAR_START; } /* trim single quotes around date if present; see issue #5279 */ if (dateValue.length() > 1 && dateValue.startsWith("'") && dateValue.endsWith("'") ) { dateValue = dateValue.substring(1, dateValue.length() - 1); } SimpleDateFormat dateParser = null; Iterator formatIter = dateFormats.iterator(); while (formatIter.hasNext()) { String format = (String) formatIter.next(); if (dateParser == null) { dateParser = new SimpleDateFormat(format, Locale.US); dateParser.setTimeZone(GMT); dateParser.set2DigitYearStart(startDate); } else { dateParser.applyPattern(format); } try { return dateParser.parse(dateValue); } catch (ParseException pe) { /* ignore this exception, we will try the next format */ } } /* we were unable to parse the date */ throw new ParseException("Unable to parse the date " + dateValue, 0); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public void setAliveCheckInterval(int interval) { if (interval <= 0) { throw new IllegalArgumentException("Alive check interval must be " + "positive, specified value = " + interval); } this.interval = interval; }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
@Override public void init(Map<String,String> args) { super.init(args); String dictionaryName = args.get(DICTIONARY_SCHEMA_ATTRIBUTE); if (dictionaryName != null && !dictionaryName.isEmpty()) { try { DICTIONARY dictionary = DICTIONARY.valueOf(dictionaryName.toUpperCase(Locale.ENGLISH)); assert dictionary != null; this.dictionary = dictionary; } catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); } } }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
private BytesRef analyzeRangePart(String field, String part) { TokenStream source; try { source = analyzer.tokenStream(field, new StringReader(part)); source.reset(); } catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); } TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class); BytesRef bytes = termAtt.getBytesRef(); /* we control the analyzer here: most errors are impossible */ try { if (!source.incrementToken()) throw new IllegalArgumentException("analyzer returned no terms for range part: " + part); termAtt.fillBytesRef(); assert !source.incrementToken(); } catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); } try { source.end(); source.close(); } catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); } return BytesRef.deepCopyOf(bytes); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
private String findMatchingPkColumn(String pk, Map<String, Object> row) { if (row.containsKey(pk)) throw new IllegalArgumentException( String.format("deltaQuery returned a row with null for primary key %s", pk)); String resolvedPk = null; for (String columnName : row.keySet()) { if (columnName.endsWith("." + pk) || pk.endsWith("." + columnName)) { if (resolvedPk != null) throw new IllegalArgumentException( String.format( "deltaQuery has more than one column (%s and %s) that might resolve to declared primary key pk='%s'", resolvedPk, columnName, pk)); resolvedPk = columnName; } } if (resolvedPk == null) throw new IllegalArgumentException( String.format("deltaQuery has no column to resolve to declared primary key pk='%s'", pk)); LOG.info(String.format("Resolving deltaQuery column '%s' to match entity's declared pk '%s'", resolvedPk, pk)); return resolvedPk; }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeMapOpener(int size) throws IOException, IllegalArgumentException { /* negative size value indicates that something has gone wrong */ if (size < 0) { throw new IllegalArgumentException("Map size must not be negative"); } writer.write("a:"+size+":{"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeArrayOpener(int size) throws IOException, IllegalArgumentException { /* negative size value indicates that something has gone wrong */ if (size < 0) { throw new IllegalArgumentException("Array size must not be negative"); } writer.write("a:"+size+":{"); }
// in core/src/java/org/apache/solr/schema/CollationField.java
private BytesRef analyzeRangePart(String field, String part) { TokenStream source; try { source = analyzer.tokenStream(field, new StringReader(part)); source.reset(); } catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); } TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class); BytesRef bytes = termAtt.getBytesRef(); /* we control the analyzer here: most errors are impossible */ try { if (!source.incrementToken()) throw new IllegalArgumentException("analyzer returned no terms for range part: " + part); termAtt.fillBytesRef(); assert !source.incrementToken(); } catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); } try { source.end(); source.close(); } catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); } return BytesRef.deepCopyOf(bytes); }
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
public static String[][] parse(String s) throws IOException { if (s == null) { throw new IllegalArgumentException("Null argument not allowed."); } String[][] result = (new CSVParser(new StringReader(s))).getAllValues(); if (result == null) { /* since CSVStrategy ignores empty lines an empty array is returned (i.e. not "result = new String[][] {{""}};") */ result = EMPTY_DOUBLE_STRING_ARRAY; } return result; }
// in core/src/java/org/apache/solr/internal/csv/CSVUtils.java
public static String[] parseLine(String s) throws IOException { if (s == null) { throw new IllegalArgumentException("Null argument not allowed."); } /* uh,jh: make sure that parseLine("").length == 0 */ if (s.length() == 0) { return EMPTY_STRING_ARRAY; } return (new CSVParser(new StringReader(s))).getLine(); }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skip(long n) throws IllegalArgumentException, IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } /* illegal argument */ if (n < 0) { throw new IllegalArgumentException("negative argument not supported"); } /* no skipping */ if (n == 0 || lookaheadChar == END_OF_STREAM) { return 0; } /* skip and reread the lookahead-char */ long skiped = 0; if (n > 1) { skiped = super.skip(n - 1); } lookaheadChar = super.read(); /* fixme uh: we should check the skiped sequence for line-terminations... */ lineCounter = Integer.MIN_VALUE; return skiped + 1; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public QueryCommand setFilterList(List<Query> filterList) { if( filter != null ) { throw new IllegalArgumentException( "Either filter or filterList may be set in the QueryCommand, but not both." ); } this.filterList = filterList; return this; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public QueryCommand setFilterList(Query f) { if( filter != null ) { throw new IllegalArgumentException( "Either filter or filterList may be set in the QueryCommand, but not both." ); } filterList = null; if (f != null) { filterList = new ArrayList<Query>(2); filterList.add(f); } return this; }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
public QueryCommand setFilter(DocSet filter) { if( filterList != null ) { throw new IllegalArgumentException( "Either filter or filterList may be set in the QueryCommand, but not both." ); } this.filter = filter; return this; }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public static Properties getProperties(String path) throws ConfigException { File configFile = new File(path); LOG.info("Reading configuration from: " + configFile); try { if (!configFile.exists()) { throw new IllegalArgumentException(configFile.toString() + " file is missing"); } Properties cfg = new Properties(); FileInputStream in = new FileInputStream(configFile); try { cfg.load(in); } finally { in.close(); } return cfg; } catch (IOException e) { throw new ConfigException("Error processing " + path, e); } catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); } }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
@Override public void parseProperties(Properties zkProp) throws IOException, ConfigException { for (Entry<Object, Object> entry : zkProp.entrySet()) { String key = entry.getKey().toString().trim(); String value = entry.getValue().toString().trim(); if (key.equals("dataDir")) { dataDir = value; } else if (key.equals("dataLogDir")) { dataLogDir = value; } else if (key.equals("clientPort")) { setClientPort(Integer.parseInt(value)); } else if (key.equals("tickTime")) { tickTime = Integer.parseInt(value); } else if (key.equals("initLimit")) { initLimit = Integer.parseInt(value); } else if (key.equals("syncLimit")) { syncLimit = Integer.parseInt(value); } else if (key.equals("electionAlg")) { electionAlg = Integer.parseInt(value); } else if (key.equals("maxClientCnxns")) { maxClientCnxns = Integer.parseInt(value); } else if (key.startsWith("server.")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); String parts[] = value.split(":"); if ((parts.length != 2) && (parts.length != 3)) { LOG.error(value + " does not have the form host:port or host:port:port"); } InetSocketAddress addr = new InetSocketAddress(parts[0], Integer.parseInt(parts[1])); if (parts.length == 2) { servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr)); } else if (parts.length == 3) { InetSocketAddress electionAddr = new InetSocketAddress( parts[0], Integer.parseInt(parts[2])); servers.put(Long.valueOf(sid), new QuorumPeer.QuorumServer(sid, addr, electionAddr)); } } else if (key.startsWith("group")) { int dot = key.indexOf('.'); long gid = Long.parseLong(key.substring(dot + 1)); numGroups++; String parts[] = value.split(":"); for(String s : parts){ long sid = Long.parseLong(s); if(serverGroup.containsKey(sid)) throw new ConfigException("Server " + sid + "is in multiple groups"); else serverGroup.put(sid, gid); } } else if(key.startsWith("weight")) { int dot = key.indexOf('.'); long sid = Long.parseLong(key.substring(dot + 1)); serverWeight.put(sid, Long.parseLong(value)); } else { System.setProperty("zookeeper." + key, value); } } if (dataDir == null) { throw new IllegalArgumentException("dataDir is not set"); } if (dataLogDir == null) { dataLogDir = dataDir; } else { if (!new File(dataLogDir).isDirectory()) { throw new IllegalArgumentException("dataLogDir " + dataLogDir + " is missing."); } } if (tickTime == 0) { throw new IllegalArgumentException("tickTime is not set"); } if (servers.size() > 1) { if (initLimit == 0) { throw new IllegalArgumentException("initLimit is not set"); } if (syncLimit == 0) { throw new IllegalArgumentException("syncLimit is not set"); } /* If using FLE, then every server requires a separate election port. */ if (electionAlg != 0) { for (QuorumPeer.QuorumServer s : servers.values()) { if (s.electionAddr == null) throw new IllegalArgumentException( "Missing election port for server: " + s.id); } } /* Default of quorum config is majority */ if(serverGroup.size() > 0){ if(servers.size() != serverGroup.size()) throw new ConfigException("Every server must be in exactly one group"); /* The default weight of a server is 1 */ for(QuorumPeer.QuorumServer s : servers.values()){ if(!serverWeight.containsKey(s.id)) serverWeight.put(s.id, (long) 1); } /* Set the quorumVerifier to be QuorumHierarchical */ quorumVerifier = new QuorumHierarchical(numGroups, serverWeight, serverGroup); } else { /* The default QuorumVerifier is QuorumMaj */ LOG.info("Defaulting to majority quorums"); quorumVerifier = new QuorumMaj(servers.size()); } File myIdFile = new File(dataDir, "myid"); if (!myIdFile.exists()) { /* ADDED FOR SOLR */ Long myid = getMySeverId(); if (myid != null) { serverId = myid; return; } if (zkRun == null) return; /* END ADDED FOR SOLR */ throw new IllegalArgumentException(myIdFile.toString() + " file is missing"); } BufferedReader br = new BufferedReader(new FileReader(myIdFile)); String myIdString; try { myIdString = br.readLine(); } finally { br.close(); } try { serverId = Long.parseLong(myIdString); } catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadToZK(SolrZkClient zkClient, File dir, String zkPath) throws IOException, KeeperException, InterruptedException { File[] files = dir.listFiles(); if (files == null) { throw new IllegalArgumentException("Illegal directory: " + dir); } for(File file : files) { if (!file.getName().startsWith(".")) { if (!file.isDirectory()) { zkClient.makePath(zkPath + "/" + file.getName(), file, false, true); } else { uploadToZK(zkClient, file, zkPath + "/" + file.getName()); } } } }
// in core/src/java/org/apache/solr/core/DefaultCodecFactory.java
@Override public PostingsFormat getPostingsFormatForField(String field) { final SchemaField fieldOrNull = schema.getFieldOrNull(field); if (fieldOrNull == null) { throw new IllegalArgumentException("no such field " + field); } String postingsFormatName = fieldOrNull.getType().getPostingsFormat(); if (postingsFormatName != null) { return PostingsFormat.forName(postingsFormatName); } return super.getPostingsFormatForField(field); }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
private void close(Directory directory) throws IOException { synchronized (this) { CacheValue cacheValue = byDirectoryCache.get(directory); if (cacheValue == null) { throw new IllegalArgumentException("Unknown directory: " + directory + " " + byDirectoryCache); } cacheValue.refCnt--; if (cacheValue.refCnt == 0 && cacheValue.doneWithDir) { directory.close(); byDirectoryCache.remove(directory); byPathCache.remove(cacheValue.path); } } }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
public void incRef(Directory directory) { synchronized (this) { CacheValue cacheValue = byDirectoryCache.get(directory); if (cacheValue == null) { throw new IllegalArgumentException("Unknown directory: " + directory); } cacheValue.refCnt++; } }
// in core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
@Override public void init(NamedList args) { SolrParams params = SolrParams.toSolrParams( args ); maxChunk = params.getInt("maxChunkSize", MMapDirectory.DEFAULT_MAX_BUFF); if (maxChunk <= 0){ throw new IllegalArgumentException("maxChunk must be greater than 0"); } unmapHack = params.getBool("unmap", true); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
protected void initZooKeeper(String zkHost, int zkClientTimeout) { /* if zkHost sys property is not set, we are not using ZooKeeper */ String zookeeperHost; if(zkHost == null) { zookeeperHost = System.getProperty("zkHost"); } else { zookeeperHost = zkHost; } String zkRun = System.getProperty("zkRun"); if (zkRun == null && zookeeperHost == null) return; /* not in zk mode */ /* zookeeper in quorum mode currently causes a failure when trying to register log4j mbeans. See SOLR-2369. TODO: remove after updating to an slf4j based zookeeper */ System.setProperty("zookeeper.jmx.log4j.disable", "true"); if (zkRun != null) { String zkDataHome = System.getProperty("zkServerDataDir", solrHome + "zoo_data"); String zkConfHome = System.getProperty("zkServerConfDir", solrHome); zkServer = new SolrZkServer(zkRun, zookeeperHost, zkDataHome, zkConfHome, hostPort); zkServer.parseConfig(); zkServer.start(); /* set client from server config if not already set */ if (zookeeperHost == null) { zookeeperHost = zkServer.getClientString(); } } int zkClientConnectTimeout = 15000; if (zookeeperHost != null) { /* we are ZooKeeper enabled */ try { /* If this is an ensemble, allow for a long connect time for other servers to come up */ if (zkRun != null && zkServer.getServers().size() > 1) { zkClientConnectTimeout = 24 * 60 * 60 * 1000; /* 1 day for embedded ensemble */ log.info("Zookeeper client=" + zookeeperHost + " Waiting for a quorum."); } else { log.info("Zookeeper client=" + zookeeperHost); } zkController = new ZkController(this, zookeeperHost, zkClientTimeout, zkClientConnectTimeout, host, hostPort, hostContext, new CurrentCoreDescriptorProvider() { @Override public List<CoreDescriptor> getCurrentDescriptors() { List<CoreDescriptor> descriptors = new ArrayList<CoreDescriptor>(getCoreNames().size()); for (SolrCore core : getCores()) { descriptors.add(core.getCoreDescriptor()); } return descriptors; } }); String confDir = System.getProperty("bootstrap_confdir"); if(confDir != null) { File dir = new File(confDir); if(!dir.isDirectory()) { throw new IllegalArgumentException("bootstrap_confdir must be a directory of configuration files"); } String confName = System.getProperty(ZkController.COLLECTION_PARAM_PREFIX+ZkController.CONFIGNAME_PROP, "configuration1"); zkController.uploadConfigDir(dir, confName); } boolean boostrapConf = Boolean.getBoolean("bootstrap_conf"); if(boostrapConf) { ZkController.bootstrapConf(zkController.getZkClient(), cfg, solrHome); } } catch (InterruptedException e) { /* Restore the interrupted status */ Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (TimeoutException e) { log.error("Could not connect to ZooKeeper", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (IOException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } }
// in core/src/java/org/apache/solr/core/CoreDescriptor.java
public void setConfigName(String name) { if (name == null || name.length() == 0) throw new IllegalArgumentException("name can not be null or empty"); this.configName = name; }
// in core/src/java/org/apache/solr/core/CoreDescriptor.java
public void setSchemaName(String name) { if (name == null || name.length() == 0) throw new IllegalArgumentException("name can not be null or empty"); this.schemaName = name; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public static String createSystemIdFromResourceName(String name) { name = name.replace(File.separatorChar, '/'); final String authority; if (name.startsWith("/")) { /* a hack to preserve absolute filenames and keep them absolute after resolving, we set the URI's authority to "@" on absolute filenames: */ authority = RESOURCE_LOADER_AUTHORITY_ABSOLUTE; } else { authority = null; name = "/" + name; } try { return new URI(RESOURCE_LOADER_URI_SCHEME, authority, name, null, null).toASCIIString(); } catch (URISyntaxException use) { throw new IllegalArgumentException("Invalid syntax of Solr Resource URI", use); } }
// in core/src/java/org/apache/solr/util/DateMathParser.java
public static void add(Calendar c, int val, String unit) { Integer uu = CALENDAR_UNITS.get(unit); if (null == uu) { throw new IllegalArgumentException("Adding Unit not recognized: " + unit); } c.add(uu.intValue(), val); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
public static void round(Calendar c, String unit) { Integer uu = CALENDAR_UNITS.get(unit); if (null == uu) { throw new IllegalArgumentException("Rounding Unit not recognized: " + unit); } int u = uu.intValue(); switch (u) { case Calendar.YEAR: c.clear(Calendar.MONTH); /* fall through */ case Calendar.MONTH: c.clear(Calendar.DAY_OF_MONTH); c.clear(Calendar.DAY_OF_WEEK); c.clear(Calendar.DAY_OF_WEEK_IN_MONTH); c.clear(Calendar.DAY_OF_YEAR); c.clear(Calendar.WEEK_OF_MONTH); c.clear(Calendar.WEEK_OF_YEAR); /* fall through */ case Calendar.DATE: c.clear(Calendar.HOUR_OF_DAY); c.clear(Calendar.HOUR); c.clear(Calendar.AM_PM); /* fall through */ case Calendar.HOUR_OF_DAY: c.clear(Calendar.MINUTE); /* fall through */ case Calendar.MINUTE: c.clear(Calendar.SECOND); /* fall through */ case Calendar.SECOND: c.clear(Calendar.MILLISECOND); break; default: throw new IllegalStateException ("No logic for rounding value ("+u+") " + unit); } }
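The throw sites above are almost all fail-fast precondition guards: validate an argument (or an exclusive-setter invariant) on entry and throw IllegalArgumentException before any state changes. The three catch-rethrow sites that follow show the complementary move, catching a lower-level exception (NumberFormatException, URISyntaxException, or a bare IllegalArgumentException) and rethrowing with a message the user can act on. A compact sketch of both halves, with Interval as an illustrative class:

    public class Interval {
      private final int millis;

      public Interval(int millis) {
        // Guard: reject the bad argument before the object is constructed.
        if (millis <= 0) {
          throw new IllegalArgumentException("interval must be positive, specified value = " + millis);
        }
        this.millis = millis;
      }

      public static Interval parse(String s) {
        try {
          return new Interval(Integer.parseInt(s));
        } catch (NumberFormatException e) {
          // Rewrap with a message that names the offending input.
          throw new IllegalArgumentException("interval " + s + " is not a number", e);
        }
      }
    }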
3
            
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { throw new IllegalArgumentException("Invalid syntax of Solr Resource URI", use); }
7
            
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeMapOpener(int size) throws IOException, IllegalArgumentException { writer.write('{'); }
// in core/src/java/org/apache/solr/response/JSONResponseWriter.java
public void writeArrayOpener(int size) throws IOException, IllegalArgumentException { writer.write('['); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeMapOpener(int size) throws IOException, IllegalArgumentException { /* negative size value indicates that something has gone wrong */ if (size < 0) { throw new IllegalArgumentException("Map size must not be negative"); } writer.write("a:"+size+":{"); }
// in core/src/java/org/apache/solr/response/PHPSerializedResponseWriter.java
@Override public void writeArrayOpener(int size) throws IOException, IllegalArgumentException { /* negative size value indicates that something has gone wrong */ if (size < 0) { throw new IllegalArgumentException("Array size must not be negative"); } writer.write("a:"+size+":{"); }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skip(long n) throws IllegalArgumentException, IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } /* illegal argument */ if (n < 0) { throw new IllegalArgumentException("negative argument not supported"); } /* no skipping */ if (n == 0 || lookaheadChar == END_OF_STREAM) { return 0; } /* skip and reread the lookahead-char */ long skiped = 0; if (n > 1) { skiped = super.skip(n - 1); } lookaheadChar = super.read(); /* fixme uh: we should check the skiped sequence for line-terminations... */ lineCounter = Integer.MIN_VALUE; return skiped + 1; }
// in core/src/java/org/apache/solr/internal/csv/ExtendedBufferedReader.java
public long skipUntil(char c) throws IllegalArgumentException, IOException { if (lookaheadChar == UNDEFINED) { lookaheadChar = super.read(); } long counter = 0; while (lookaheadChar != c && lookaheadChar != END_OF_STREAM) { if (lookaheadChar == '\n') { lineCounter++; } lookaheadChar = super.read(); counter++; } return counter; }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
public Document getDoc() throws IllegalArgumentException { /* Check for all required fields -- Note, all fields with a default value are defacto 'required' fields. */ List<String> missingFields = null; for (SchemaField field : schema.getRequiredFields()) { if (doc.getField(field.getName() ) == null) { if (field.getDefaultValue() != null) { addField(doc, field, field.getDefaultValue(), 1.0f); } else { if (missingFields==null) { missingFields = new ArrayList<String>(1); } missingFields.add(field.getName()); } } } if (missingFields != null) { StringBuilder builder = new StringBuilder(); /* add the uniqueKey if possible */ if( schema.getUniqueKeyField() != null ) { String n = schema.getUniqueKeyField().getName(); String v = doc.getField( n ).stringValue(); builder.append( "Document ["+n+"="+v+"] " ); } builder.append("missing required fields: " ); for (String field : missingFields) { builder.append(field); builder.append(" "); } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, builder.toString()); } Document ret = doc; doc=null; return ret; }
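Since IllegalArgumentException extends RuntimeException, the seven throws clauses above are documentation only: the compiler neither requires them nor forces callers to handle the exception. A minimal sketch of that distinction (MapOpenerSketch is an illustrative class modeled on the writers above):

    import java.io.IOException;
    import java.io.Writer;

    public class MapOpenerSketch {
      private final Writer writer;
      public MapOpenerSketch(Writer writer) { this.writer = writer; }

      // `throws IllegalArgumentException` is legal but unenforced: it merely
      // documents the size >= 0 contract. Removing it changes nothing for callers.
      public void writeMapOpener(int size) throws IOException, IllegalArgumentException {
        if (size < 0) {
          throw new IllegalArgumentException("Map size must not be negative");
        }
        writer.write("a:" + size + ":{");
      }
    }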
20
            
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of for range 'include' information",e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( IllegalArgumentException ex ) { /* Other implementations will likely throw this exception since "reuse-instance" is implementation specific. */ log.debug( "Unable to set the 'reuse-instance' property for the input factory: "+factory ); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
catch (IllegalArgumentException ex) { /* Other implementations will likely throw this exception since "reuse-instance" is implementation specific. */ log.debug("Unable to set the 'reuse-instance' property for the input factory: " + inputFactory); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (IllegalArgumentException ex) { /* Other implementations will likely throw this exception since "reuse-instance" is implementation specific. */ log.debug("Unable to set the 'reuse-instance' property for the input chain: " + inputFactory); }
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + flagParam + "'"); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (IllegalArgumentException e) { /* path doesn't exist (must have been removed) */ writeKeyValue(json, "warning", "(path gone)", false); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (IllegalArgumentException e) { /* path doesn't exist (must have been removed) */ json.writeString("(children gone)"); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (IllegalArgumentException iae) { /* one of our date headers was not formatted properly, ignore it - NOOP */ }
// in core/src/java/org/apache/solr/schema/TrieField.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
catch (IllegalArgumentException e) { /* No problem. But we can't use TermOffsets optimization. */ }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Illegal value for " + DISTRIB_UPDATE_PARAM + ": " + param, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
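Most of the catch blocks above translate a low-level IllegalArgumentException into a SolrException carrying an HTTP-style error code and the original exception as cause. A minimal sketch of the pattern, assuming solr-common (for SolrException) is on the classpath; the ParamParsing class and its local enum are illustrative, not the actual Solr types:

import java.util.Locale;
import org.apache.solr.common.SolrException;

public class ParamParsing {
  // illustrative local enum, not the Solr FacetParams enum
  enum FacetRangeOther { BEFORE, AFTER, BETWEEN, ALL, NONE }

  static FacetRangeOther parseOther(String label) {
    try {
      // Enum.valueOf throws IllegalArgumentException for an unknown constant
      return FacetRangeOther.valueOf(label.toUpperCase(Locale.ROOT));
    } catch (IllegalArgumentException e) {
      // translate to a 400-level SolrException, keeping the original as cause
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          label + " is not a valid type of 'other' range facet information", e);
    }
  }
}

Chaining the original exception as the cause preserves the stack trace, while the ErrorCode drives the HTTP status eventually returned to the client.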
13
            
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of for range 'include' information",e); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/MorfologikFilterFactory.java
catch (IllegalArgumentException e) { throw new IllegalArgumentException("The " + DICTIONARY_SCHEMA_ATTRIBUTE + " attribute accepts the " + "following constants: " + Arrays.toString(DICTIONARY.values()) + ", this value is invalid: " + dictionaryName); }
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + flagParam + "'"); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); }
// in core/src/java/org/apache/solr/schema/TrieField.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IllegalArgumentException e) { throw new ConfigException("Error processing " + path, e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Illegal value for " + DISTRIB_UPDATE_PARAM + ": " + param, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
12
runtime (Lib) IllegalStateException 19
            
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
public String getShard(int hash, String collection) { RangeInfo rangInfo = getRanges(collection); int cnt = 0; for (Range range : rangInfo.ranges) { if (hash < range.max) { return rangInfo.shardList.get(cnt); } cnt++; } throw new IllegalStateException("The HashPartitioner failed"); }
// in solrj/src/java/org/apache/solr/client/solrj/request/FieldAnalysisRequest.java
@Override public FieldAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { if (fieldTypes == null && fieldNames == null) { throw new IllegalStateException("At least one field type or field name need to be specified"); } if (fieldValue == null) { throw new IllegalStateException("The field value must be set"); } long startTime = System.currentTimeMillis(); FieldAnalysisResponse res = new FieldAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SortedMapBackedCache.java
private void checkOpen(boolean shouldItBe) { if (!isOpen && shouldItBe) { throw new IllegalStateException( "Must call open() before using this cache."); } if (isOpen && !shouldItBe) { throw new IllegalStateException("The cache is already open."); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SortedMapBackedCache.java
private void checkReadOnly() { if (isReadOnly) { throw new IllegalStateException("Cache is read-only."); } }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public int getLocalPort() { if (lastPort == -1) { throw new IllegalStateException("You cannot get the port until this instance has started"); } return lastPort; }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/TopGroupsFieldCommand.java
public TopGroupsFieldCommand build() { if (field == null || groupSort == null || sortWithinGroup == null || firstPhaseGroups == null || maxDocPerGroup == null) { throw new IllegalStateException("All required fields must be set"); } return new TopGroupsFieldCommand(field, groupSort, sortWithinGroup, firstPhaseGroups, maxDocPerGroup, needScores, needMaxScore); }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/SearchGroupsFieldCommand.java
public SearchGroupsFieldCommand build() { if (field == null || groupSort == null || topNGroups == null) { throw new IllegalStateException("All fields must be set"); } return new SearchGroupsFieldCommand(field, groupSort, topNGroups, includeGroupCount); }
// in core/src/java/org/apache/solr/search/grouping/distributed/command/QueryCommand.java
public QueryCommand build() { if (sort == null || query == null || docSet == null || docsToCollect == null) { throw new IllegalStateException("All fields must be set"); } return new QueryCommand(sort, query, docsToCollect, needScores, docSet, queryString); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
public CommandHandler build() { if (queryCommand == null || searcher == null) { throw new IllegalStateException("All fields must be set"); } return new CommandHandler(queryCommand, commands, searcher, needDocSet, truncateGroups, includeHitCount); }
// in core/src/java/org/apache/solr/logging/jul/JulWatcher.java
@Override public void setThreshold(String level) { if(handler==null) { throw new IllegalStateException("Must have an handler"); } handler.setLevel( Level.parse(level) ); }
// in core/src/java/org/apache/solr/logging/jul/JulWatcher.java
@Override public String getThreshold() { if(handler==null) { throw new IllegalStateException("Must have an handler"); } return handler.getLevel().toString(); }
// in core/src/java/org/apache/solr/logging/jul/JulWatcher.java
@Override public void registerListener(ListenerConfig cfg, CoreContainer container) { if(history!=null) { throw new IllegalStateException("History already registered"); } history = new CircularList<LogRecord>(cfg.size); handler = new RecordHandler(this); if(cfg.threshold != null) { handler.setLevel(Level.parse(cfg.threshold)); } else { handler.setLevel(Level.WARNING); } Logger log = LogManager.getLogManager().getLogger(""); log.addHandler(handler); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
private int getSeq(String nStringSequence) { int seq = 0; Matcher m = LEADER_SEQ.matcher(nStringSequence); if (m.matches()) { seq = Integer.parseInt(m.group(1)); } else { throw new IllegalStateException("Could not find regex match in:" + nStringSequence); } return seq; }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
private String getNodeId(String nStringSequence) { String id; Matcher m = SESSION_ID.matcher(nStringSequence); if (m.matches()) { id = m.group(1); } else { throw new IllegalStateException("Could not find regex match in:" + nStringSequence); } return id; }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
@Override public synchronized void incref() { if (refCnt == 0) { throw new IllegalStateException("IndexWriter has been closed"); } refCnt++; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore register(String name, SolrCore core, boolean returnPrevNotClosed) { if( core == null ) { throw new RuntimeException( "Can not register a null core." ); } if( name == null || name.indexOf( '/' ) >= 0 || name.indexOf( '\\' ) >= 0 ){ throw new RuntimeException( "Invalid core name: "+name ); } if (zkController != null) { /* this happens before we can receive requests */ zkController.preRegister(core.getCoreDescriptor()); } SolrCore old = null; synchronized (cores) { if (isShutDown) { core.close(); throw new IllegalStateException("This CoreContainer has been shutdown"); } old = cores.put(name, core); /* set both the name of the descriptor and the name of the core, since the descriptor's name is used for persisting. */ core.setName(name); core.getCoreDescriptor().name = name; } if( old == null || old == core) { log.info( "registering core: "+name ); registerInZk(core); return null; } else { log.info( "replacing core: "+name ); if (!returnPrevNotClosed) { old.close(); } registerInZk(core); return old; } }
// in core/src/java/org/apache/solr/util/DateMathParser.java
public static void round(Calendar c, String unit) { Integer uu = CALENDAR_UNITS.get(unit); if (null == uu) { throw new IllegalArgumentException("Rounding Unit not recognized: " + unit); } int u = uu.intValue(); switch (u) { case Calendar.YEAR: c.clear(Calendar.MONTH); /* fall through */ case Calendar.MONTH: c.clear(Calendar.DAY_OF_MONTH); c.clear(Calendar.DAY_OF_WEEK); c.clear(Calendar.DAY_OF_WEEK_IN_MONTH); c.clear(Calendar.DAY_OF_YEAR); c.clear(Calendar.WEEK_OF_MONTH); c.clear(Calendar.WEEK_OF_YEAR); /* fall through */ case Calendar.DATE: c.clear(Calendar.HOUR_OF_DAY); c.clear(Calendar.HOUR); c.clear(Calendar.AM_PM); /* fall through */ case Calendar.HOUR_OF_DAY: c.clear(Calendar.MINUTE); /* fall through */ case Calendar.MINUTE: c.clear(Calendar.SECOND); /* fall through */ case Calendar.SECOND: c.clear(Calendar.MILLISECOND); break; default: throw new IllegalStateException ("No logic for rounding value ("+u+") " + unit); } }
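Several of the throw sites above are build() methods that guard against an incomplete builder. A minimal sketch of that guard, with illustrative names (not the actual Solr command classes):

// Illustrative builder, not the actual Solr grouping command classes.
public class QueryCommandBuilder {
  private String query;
  private String sort;

  public QueryCommandBuilder setQuery(String query) { this.query = query; return this; }
  public QueryCommandBuilder setSort(String sort) { this.sort = sort; return this; }

  public String build() {
    // a missing required field is misuse of the builder, not bad user input
    if (query == null || sort == null) {
      throw new IllegalStateException("All fields must be set");
    }
    return query + " sorted by " + sort;
  }
}

IllegalStateException fits here because the failure is a programming mistake in how the builder was driven, whereas IllegalArgumentException (previous section) signals a bad value passed to a single call.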
0 2
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public InputStream getContent() throws IOException, IllegalStateException { return new GZIPInputStream(wrappedEntity.getContent()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
public InputStream getContent() throws IOException, IllegalStateException { return new InflaterInputStream(wrappedEntity.getContent()); }
1
            
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IllegalStateException ise) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, ise.getMessage()); }
1
            
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IllegalStateException ise) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, ise.getMessage()); }
1
unknown (Lib) IndexOutOfBoundsException 1
            
// in core/src/java/org/apache/solr/logging/CircularList.java
private void checkIndex(int index) { if (index >= size || index < 0) throw new IndexOutOfBoundsException("Index: "+index+", Size: "+size); }
0 0 0 0 0
unknown (Lib) InitializationException 52
            
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
@Override public void init(Map<String,String> args) { super.init( args ); inject = getBoolean(INJECT, true); String name = args.get( ENCODER ); if( name == null ) { throw new InitializationException("Missing required parameter: " + ENCODER + " [" + registry.keySet() + "]"); } clazz = registry.get(name.toUpperCase(Locale.ENGLISH)); if( clazz == null ) { clazz = resolveEncoder(name); } String v = args.get(MAX_CODE_LENGTH); if (v != null) { maxCodeLength = Integer.valueOf(v); try { setMaxCodeLenMethod = clazz.getMethod("setMaxCodeLen", int.class); } catch (Exception e) { throw new InitializationException("Encoder " + name + " / " + clazz + " does not support " + MAX_CODE_LENGTH, e); } } getEncoder(); /* trigger initialization for potential problems to be thrown now */ }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
private Class<? extends Encoder> resolveEncoder(String name) { String lookupName = name; if (name.indexOf('.') == -1) { lookupName = PACKAGE_CONTAINING_ENCODERS + name; } try { return Class.forName(lookupName).asSubclass(Encoder.class); } catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); } catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); } }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
protected Encoder getEncoder() { /* Unfortunately, Commons-Codec doesn't offer any thread-safe guarantees so we must play it safe and instantiate every time. A simple benchmark showed this as negligible. */ try { Encoder encoder = clazz.newInstance(); /* Try to set the maxCodeLength */ if(maxCodeLength != null && setMaxCodeLenMethod != null) { setMaxCodeLenMethod.invoke(encoder, maxCodeLength); } return encoder; } catch (Exception e) { final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e; throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t); } }
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
public void inform(ResourceLoader loader) { assureMatchVersion(); String dictionaryFiles[] = args.get(PARAM_DICTIONARY).split(","); String affixFile = args.get(PARAM_AFFIX); String pic = args.get(PARAM_IGNORE_CASE); if(pic != null) { if(pic.equalsIgnoreCase(TRUE)) ignoreCase = true; else if(pic.equalsIgnoreCase(FALSE)) ignoreCase = false; else throw new InitializationException("Unknown value for " + PARAM_IGNORE_CASE + ": " + pic + ". Must be true or false"); } try { List<InputStream> dictionaries = new ArrayList<InputStream>(); for (String file : dictionaryFiles) { dictionaries.add(loader.openResource(file)); } this.dictionary = new HunspellDictionary(loader.openResource(affixFile), dictionaries, luceneMatchVersion, ignoreCase); } catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); } }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
@Override public void inform(ResourceLoader loader) { final boolean ignoreCase = getBoolean("ignoreCase", false); this.ignoreCase = ignoreCase; String tf = args.get("tokenizerFactory"); final TokenizerFactory factory = tf == null ? null : loadTokenizerFactory(loader, tf); Analyzer analyzer = new Analyzer() { @Override protected TokenStreamComponents createComponents(String fieldName, Reader reader) { Tokenizer tokenizer = factory == null ? new WhitespaceTokenizer(Version.LUCENE_50, reader) : factory.create(reader); TokenStream stream = ignoreCase ? new LowerCaseFilter(Version.LUCENE_50, tokenizer) : tokenizer; return new TokenStreamComponents(tokenizer, stream); } }; String format = args.get("format"); try { if (format == null || format.equals("solr")) { /* TODO: expose dedup as a parameter? */ map = loadSolrSynonyms(loader, true, analyzer); } else if (format.equals("wordnet")) { map = loadWordnetSynonyms(loader, true, analyzer); } else { /* TODO: somehow make this more pluggable */ throw new InitializationException("Unrecognized synonyms format: " + format); } } catch (Exception e) { throw new InitializationException("Exception thrown while loading synonyms", e); } if (map.fst == null) { log.warn("Synonyms loaded with " + args + " has empty rule set!"); } }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadSolrSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); SolrSynonymParser parser = new SolrSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadWordnetSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); WordnetSynonymParser parser = new WordnetSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/analysis/ElisionFilterFactory.java
public void inform(ResourceLoader loader) { String articlesFile = args.get("articles"); boolean ignoreCase = getBoolean("ignoreCase", false); if (articlesFile != null) { try { articles = getWordSet(loader, articlesFile, ignoreCase); } catch (IOException e) { throw new InitializationException("IOException thrown while loading articles", e); } } }
// in core/src/java/org/apache/solr/analysis/TrimFilterFactory.java
Override public void init(Map<String,String> args) { super.init( args ); String v = args.get( "updateOffsets" ); if( v != null ) { try { updateOffsets = Boolean.valueOf( v ); } catch( Exception ex ) { throw new InitializationException("Error reading updateOffsets value. Must be true or false.", ex); } } }
// in core/src/java/org/apache/solr/analysis/GreekLowerCaseFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); assureMatchVersion(); if (args.containsKey("charset")) throw new InitializationException( "The charset parameter is no longer supported. " + "Please process your documents as Unicode instead."); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); assureMatchVersion(); dictFile = args.get("dictionary"); if (args.containsKey("encoding")) encoding = args.get("encoding"); hypFile = args.get("hyphenator"); if (null == hypFile) { throw new InitializationException("Missing required parameter: hyphenator"); } minWordSize = getInt("minWordSize", CompoundWordTokenFilterBase.DEFAULT_MIN_WORD_SIZE); minSubwordSize = getInt("minSubwordSize", CompoundWordTokenFilterBase.DEFAULT_MIN_SUBWORD_SIZE); maxSubwordSize = getInt("maxSubwordSize", CompoundWordTokenFilterBase.DEFAULT_MAX_SUBWORD_SIZE); onlyLongestMatch = getBoolean("onlyLongestMatch", false); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
public void inform(ResourceLoader loader) { InputStream stream = null; try { if (dictFile != null) /* the dictionary can be empty */ dictionary = getWordSet(loader, dictFile, false); /* TODO: Broken, because we cannot resolve real system id; ResourceLoader should also supply method like ClassLoader to get resource URL */ stream = loader.openResource(hypFile); final InputSource is = new InputSource(stream); is.setEncoding(encoding); /* if it's null let xml parser decide */ is.setSystemId(hypFile); hyphenator = HyphenationCompoundWordTokenFilter.getHyphenationTree(is); } catch (Exception e) { /* TODO: getHyphenationTree really shouldn't throw "Exception" */ throw new InitializationException("Exception thrown while loading dictionary and hyphenation file", e); } finally { IOUtils.closeQuietly(stream); } }
// in core/src/java/org/apache/solr/analysis/StemmerOverrideFilterFactory.java
public void inform(ResourceLoader loader) { String dictionaryFiles = args.get("dictionary"); ignoreCase = getBoolean("ignoreCase", false); if (dictionaryFiles != null) { assureMatchVersion(); List<String> files = StrUtils.splitFileNames(dictionaryFiles); try { if (files.size() > 0) { dictionary = new CharArrayMap<String>(luceneMatchVersion, files.size() * 10, ignoreCase); for (String file : files) { List<String> list = loader.getLines(file.trim()); for (String line : list) { String[] mapping = line.split("\t", 2); dictionary.put(mapping[0], mapping[1]); } } } } catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); } } }
// in core/src/java/org/apache/solr/analysis/JapaneseKatakanaStemFilterFactory.java
@Override public void init(Map<String, String> args) { super.init(args); minimumLength = getInt(MINIMUM_LENGTH_PARAM, JapaneseKatakanaStemFilter.DEFAULT_MINIMUM_LENGTH); if (minimumLength < 2) { throw new InitializationException("Illegal " + MINIMUM_LENGTH_PARAM + " " + minimumLength + " (must be 2 or greater)"); } }
// in core/src/java/org/apache/solr/analysis/KeywordMarkerFilterFactory.java
public void inform(ResourceLoader loader) { String wordFiles = args.get(PROTECTED_TOKENS); ignoreCase = getBoolean("ignoreCase", false); if (wordFiles != null) { try { protectedWords = getWordSet(loader, wordFiles, ignoreCase); } catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); } } }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
public void inform(ResourceLoader loader) { String wordFiles = args.get(PROTECTED_TOKENS); if (wordFiles != null) { try { protectedWords = getWordSet(loader, wordFiles, false); } catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); } } }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
@Override public void init(Map<String, String> args) { super.init(args); final String cfgLanguage = args.get("language"); if(cfgLanguage!=null) language = cfgLanguage; try { stemClass = Class.forName("org.tartarus.snowball.ext." + language + "Stemmer"); } catch (ClassNotFoundException e) { throw new InitializationException("Can't find class for stemmer language " + language, e); } }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
public TokenFilter create(TokenStream input) { SnowballProgram program; try { program = (SnowballProgram)stemClass.newInstance(); } catch (Exception e) { throw new InitializationException("Error instantiating stemmer for language " + language + "from class " + stemClass, e); } if (protectedWords != null) input = new KeywordMarkerFilter(input, protectedWords); return new SnowballFilter(input, program); }
// in core/src/java/org/apache/solr/analysis/KeepWordFilterFactory.java
public void inform(ResourceLoader loader) { String wordFiles = args.get("words"); ignoreCase = getBoolean("ignoreCase", false); enablePositionIncrements = getBoolean("enablePositionIncrements",false); if (wordFiles != null) { try { words = getWordSet(loader, wordFiles, ignoreCase); } catch (IOException e) { throw new InitializationException("IOException thrown while loading words", e); } } }
// in core/src/java/org/apache/solr/analysis/ShingleFilterFactory.java
@Override public void init(Map<String, String> args) { super.init(args); maxShingleSize = getInt("maxShingleSize", ShingleFilter.DEFAULT_MAX_SHINGLE_SIZE); if (maxShingleSize < 2) { throw new InitializationException("Invalid maxShingleSize (" + maxShingleSize + ") - must be at least 2"); } minShingleSize = getInt("minShingleSize", ShingleFilter.DEFAULT_MIN_SHINGLE_SIZE); if (minShingleSize < 2) { throw new InitializationException("Invalid minShingleSize (" + minShingleSize + ") - must be at least 2"); } if (minShingleSize > maxShingleSize) { throw new InitializationException("Invalid minShingleSize (" + minShingleSize + ") - must be no greater than maxShingleSize (" + maxShingleSize + ")"); } outputUnigrams = getBoolean("outputUnigrams", true); outputUnigramsIfNoShingles = getBoolean("outputUnigramsIfNoShingles", false); tokenSeparator = args.containsKey("tokenSeparator") ? args.get("tokenSeparator") : ShingleFilter.TOKEN_SEPARATOR; }
// in core/src/java/org/apache/solr/analysis/JapanesePartOfSpeechStopFilterFactory.java
public void inform(ResourceLoader loader) { String stopTagFiles = args.get("tags"); enablePositionIncrements = getBoolean("enablePositionIncrements", false); try { CharArraySet cas = getWordSet(loader, stopTagFiles, false); stopTags = new HashSet<String>(); for (Object element : cas) { char chars[] = (char[]) element; stopTags.add(new String(chars)); } } catch (IOException e) { throw new InitializationException("IOException thrown while loading tags", e); } }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
public void inform(ResourceLoader loader) { mapping = args.get( "mapping" ); if( mapping != null ){ List<String> wlist = null; try{ File mappingFile = new File( mapping ); if( mappingFile.exists() ){ wlist = loader.getLines( mapping ); } else{ List<String> files = StrUtils.splitFileNames( mapping ); wlist = new ArrayList<String>(); for( String file : files ){ List<String> lines = loader.getLines( file.trim() ); wlist.addAll( lines ); } } } catch( IOException e ){ throw new InitializationException("IOException thrown while loading mappings", e); } final NormalizeCharMap.Builder builder = new NormalizeCharMap.Builder(); parseRules( wlist, builder ); normMap = builder.build(); } }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
protected void parseRules( List<String> rules, NormalizeCharMap.Builder builder ){ for( String rule : rules ){ Matcher m = p.matcher( rule ); if( !m.find() ) throw new InitializationException("Invalid Mapping Rule : [" + rule + "], file = " + mapping); builder.add( parseString( m.group( 1 ) ), parseString( m.group( 2 ) ) ); } }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
protected String parseString( String s ){ int readPos = 0; int len = s.length(); int writePos = 0; while( readPos < len ){ char c = s.charAt( readPos++ ); if( c == '\\' ){ if( readPos >= len ) throw new InitializationException("Invalid escaped char in [" + s + "]"); c = s.charAt( readPos++ ); switch( c ) { case '\\' : c = '\\'; break; case '"' : c = '"'; break; case 'n' : c = '\n'; break; case 't' : c = '\t'; break; case 'r' : c = '\r'; break; case 'b' : c = '\b'; break; case 'f' : c = '\f'; break; case 'u' : if( readPos + 3 >= len ) throw new InitializationException("Invalid escaped char in [" + s + "]"); c = (char)Integer.parseInt( s.substring( readPos, readPos + 4 ), 16 ); readPos += 4; break; } } out[writePos++] = c; } return new String( out, 0, writePos ); }
// in core/src/java/org/apache/solr/analysis/JapaneseTokenizerFactory.java
@Override public void inform(ResourceLoader loader) { mode = getMode(args); String userDictionaryPath = args.get(USER_DICT_PATH); try { if (userDictionaryPath != null) { InputStream stream = loader.openResource(userDictionaryPath); String encoding = args.get(USER_DICT_ENCODING); if (encoding == null) { encoding = IOUtils.UTF_8; } CharsetDecoder decoder = Charset.forName(encoding).newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); Reader reader = new InputStreamReader(stream, decoder); userDictionary = new UserDictionary(reader); } else { userDictionary = null; } } catch (Exception e) { throw new InitializationException("Exception thrown while loading dictionary", e); } }
// in core/src/java/org/apache/solr/analysis/DelimitedPayloadTokenFilterFactory.java
public void inform(ResourceLoader loader) { String encoderClass = args.get(ENCODER_ATTR); if (encoderClass.equals("float")){ encoder = new FloatEncoder(); } else if (encoderClass.equals("integer")){ encoder = new IntegerEncoder(); } else if (encoderClass.equals("identity")){ encoder = new IdentityEncoder(); } else { encoder = loader.newInstance(encoderClass, PayloadEncoder.class); } String delim = args.get(DELIMITER_ATTR); if (delim != null){ if (delim.length() == 1) { delimiter = delim.charAt(0); } else{ throw new InitializationException("Delimiter must be one character only"); } } }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
@Override public void init(Map<String,String> args) { super.init(args); pattern = getPattern( PATTERN ); group = -1; /* use 'split' */ String g = args.get( GROUP ); if( g != null ) { try { group = Integer.parseInt( g ); } catch( Exception ex ) { throw new InitializationException("invalid group argument: " + g); } } }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
public Tokenizer create(final Reader in) { try { return new PatternTokenizer(in, pattern, group); } catch( IOException ex ) { throw new InitializationException("IOException thrown creating PatternTokenizer instance", ex); } }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); assureMatchVersion(); dictFile = args.get("dictionary"); if (null == dictFile) { throw new InitializationException("Missing required parameter: dictionary"); } minWordSize= getInt("minWordSize",CompoundWordTokenFilterBase.DEFAULT_MIN_WORD_SIZE); minSubwordSize= getInt("minSubwordSize",CompoundWordTokenFilterBase.DEFAULT_MIN_SUBWORD_SIZE); maxSubwordSize= getInt("maxSubwordSize",CompoundWordTokenFilterBase.DEFAULT_MAX_SUBWORD_SIZE); onlyLongestMatch = getBoolean("onlyLongestMatch",true); }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
public void inform(ResourceLoader loader) { try { dictionary = super.getWordSet(loader, dictFile, false); } catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); } }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
public void inform(ResourceLoader loader) { String wordFiles = args.get(PROTECTED_TOKENS); if (wordFiles != null) { try { protectedWords = getWordSet(loader, wordFiles, false); } catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); } } String types = args.get(TYPES); if (types != null) { try { List<String> files = StrUtils.splitFileNames( types ); List<String> wlist = new ArrayList<String>(); for( String file : files ){ List<String> lines = loader.getLines( file.trim() ); wlist.addAll( lines ); } typeTable = parseTypes(wlist); } catch (IOException e) { throw new InitializationException("IOException while loading types", e); } } }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
private byte[] parseTypes(List<String> rules) { SortedMap<Character,Byte> typeMap = new TreeMap<Character,Byte>(); for( String rule : rules ){ Matcher m = typePattern.matcher(rule); if( !m.find() ) throw new InitializationException("Invalid Mapping Rule : [" + rule + "]"); String lhs = parseString(m.group(1).trim()); Byte rhs = parseType(m.group(2).trim()); if (lhs.length() != 1) throw new InitializationException("Invalid Mapping Rule : [" + rule + "]. Only a single character is allowed."); if (rhs == null) throw new InitializationException("Invalid Mapping Rule : [" + rule + "]. Illegal type."); typeMap.put(lhs.charAt(0), rhs); } /* ensure the table is always at least as big as DEFAULT_WORD_DELIM_TABLE for performance */ byte types[] = new byte[Math.max(typeMap.lastKey()+1, WordDelimiterIterator.DEFAULT_WORD_DELIM_TABLE.length)]; for (int i = 0; i < types.length; i++) types[i] = WordDelimiterIterator.getType(i); for (Map.Entry<Character,Byte> mapping : typeMap.entrySet()) types[mapping.getKey()] = mapping.getValue(); return types; }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
private String parseString(String s){ int readPos = 0; int len = s.length(); int writePos = 0; while( readPos < len ){ char c = s.charAt( readPos++ ); if( c == '\\' ){ if( readPos >= len ) throw new InitializationException("Invalid escaped char in [" + s + "]"); c = s.charAt( readPos++ ); switch( c ) { case '\\' : c = '\\'; break; case 'n' : c = '\n'; break; case 't' : c = '\t'; break; case 'r' : c = '\r'; break; case 'b' : c = '\b'; break; case 'f' : c = '\f'; break; case 'u' : if( readPos + 3 >= len ) throw new InitializationException("Invalid escaped char in [" + s + "]"); c = (char)Integer.parseInt( s.substring( readPos, readPos + 4 ), 16 ); readPos += 4; break; } } out[writePos++] = c; } return new String( out, 0, writePos ); }
// in core/src/java/org/apache/solr/analysis/PathHierarchyTokenizerFactory.java
Override public void init(Map<String,String> args){ super.init( args ); String v = args.get( "delimiter" ); if( v != null ){ if( v.length() != 1 ){ throw new InitializationException("delimiter should be a char. \"" + v + "\" is invalid"); } else{ delimiter = v.charAt(0); } } else{ delimiter = PathHierarchyTokenizer.DEFAULT_DELIMITER; } v = args.get( "replace" ); if( v != null ){ if( v.length() != 1 ){ throw new InitializationException("replace should be a char. \"" + v + "\" is invalid"); } else{ replacement = v.charAt(0); } } else{ replacement = delimiter; } v = args.get( "reverse" ); if( v != null ){ reverse = "true".equals( v ); } v = args.get( "skip" ); if( v != null ){ skip = Integer.parseInt( v ); } }
// in core/src/java/org/apache/solr/analysis/PatternReplaceFilterFactory.java
Override public void init(Map<String, String> args) { super.init(args); p = getPattern("pattern"); replacement = args.get("replacement"); String r = args.get("replace"); if (null != r) { if (r.equals("all")) { all = true; } else { if (r.equals("first")) { all = false; } else { throw new InitializationException ("Configuration Error: 'replace' must be 'first' or 'all' in " + this.getClass().getName()); } } } }
// in core/src/java/org/apache/solr/analysis/TypeTokenFilterFactory.java
@Override public void inform(ResourceLoader loader) { String stopTypesFiles = args.get("types"); enablePositionIncrements = getBoolean("enablePositionIncrements", false); useWhitelist = getBoolean("useWhitelist", false); if (stopTypesFiles != null) { try { List<String> files = StrUtils.splitFileNames(stopTypesFiles); if (files.size() > 0) { stopTypes = new HashSet<String>(); for (String file : files) { List<String> typesLines = loader.getLines(file.trim()); stopTypes.addAll(typesLines); } } } catch (IOException e) { throw new InitializationException("IOException thrown while loading types", e); } } else { throw new InitializationException("Missing required parameter: types."); } }
// in core/src/java/org/apache/solr/analysis/CommonGramsQueryFilterFactory.java
public void inform(ResourceLoader loader) { String commonWordFiles = args.get("words"); ignoreCase = getBoolean("ignoreCase", false); if (commonWordFiles != null) { try { if ("snowball".equalsIgnoreCase(args.get("format"))) { commonWords = getSnowballWordSet(loader, commonWordFiles, ignoreCase); } else { commonWords = getWordSet(loader, commonWordFiles, ignoreCase); } } catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); } } else { commonWords = StopAnalyzer.ENGLISH_STOP_WORDS_SET; } }
// in core/src/java/org/apache/solr/analysis/StopFilterFactory.java
@Override public void inform(ResourceLoader loader) { String stopWordFiles = args.get("words"); ignoreCase = getBoolean("ignoreCase",false); enablePositionIncrements = getBoolean("enablePositionIncrements",false); if (stopWordFiles != null) { try { if ("snowball".equalsIgnoreCase(args.get("format"))) { stopWords = getSnowballWordSet(loader, stopWordFiles, ignoreCase); } else { stopWords = getWordSet(loader, stopWordFiles, ignoreCase); } } catch (IOException e) { throw new InitializationException("IOException thrown while loading stopwords", e); } } else { stopWords = new CharArraySet(luceneMatchVersion, StopAnalyzer.ENGLISH_STOP_WORDS_SET, ignoreCase); } }
// in core/src/java/org/apache/solr/analysis/CommonGramsFilterFactory.java
public void inform(ResourceLoader loader) { String commonWordFiles = args.get("words"); ignoreCase = getBoolean("ignoreCase", false); if (commonWordFiles != null) { try { if ("snowball".equalsIgnoreCase(args.get("format"))) { commonWords = getSnowballWordSet(loader, commonWordFiles, ignoreCase); } else { commonWords = getWordSet(loader, commonWordFiles, ignoreCase); } } catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); } } else { commonWords = StopAnalyzer.ENGLISH_STOP_WORDS_SET; } }
27
            
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { throw new InitializationException("Encoder " + name + " / " + clazz + " does not support " + MAX_CODE_LENGTH, e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassNotFoundException cnfe) { throw new InitializationException("Unknown encoder: " + name + " must be full class name or one of " + registry.keySet(), cnfe); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (ClassCastException e) { throw new InitializationException("Not an encoder: " + name + " must be full class name or one of " + registry.keySet(), e); }
// in core/src/java/org/apache/solr/analysis/PhoneticFilterFactory.java
catch (Exception e) { final Throwable t = (e instanceof InvocationTargetException) ? e.getCause() : e; throw new InitializationException("Error initializing encoder: " + name + " / " + clazz, t); }
// in core/src/java/org/apache/solr/analysis/HunspellStemFilterFactory.java
catch (Exception e) { throw new InitializationException("Unable to load hunspell data! [dictionary=" + args.get("dictionary") + ",affix=" + affixFile + "]", e); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading synonyms", e); }
// in core/src/java/org/apache/solr/analysis/ElisionFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading articles", e); }
// in core/src/java/org/apache/solr/analysis/TrimFilterFactory.java
catch( Exception ex ) { throw new InitializationException("Error reading updateOffsets value. Must be true or false.", ex); }
// in core/src/java/org/apache/solr/analysis/HyphenationCompoundWordTokenFilterFactory.java
catch (Exception e) { // TODO: getHyphenationTree really shouldn't throw "Exception" throw new InitializationException("Exception thrown while loading dictionary and hyphenation file", e); }
// in core/src/java/org/apache/solr/analysis/StemmerOverrideFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/KeywordMarkerFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (ClassNotFoundException e) { throw new InitializationException("Can't find class for stemmer language " + language, e); }
// in core/src/java/org/apache/solr/analysis/SnowballPorterFilterFactory.java
catch (Exception e) { throw new InitializationException("Error instantiating stemmer for language " + language + "from class " + stemClass, e); }
// in core/src/java/org/apache/solr/analysis/KeepWordFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading words", e); }
// in core/src/java/org/apache/solr/analysis/JapanesePartOfSpeechStopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading tags", e); }
// in core/src/java/org/apache/solr/analysis/MappingCharFilterFactory.java
catch( IOException e ){ throw new InitializationException("IOException thrown while loading mappings", e); }
// in core/src/java/org/apache/solr/analysis/JapaneseTokenizerFactory.java
catch (Exception e) { throw new InitializationException("Exception thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( Exception ex ) { throw new InitializationException("invalid group argument: " + g); }
// in core/src/java/org/apache/solr/analysis/PatternTokenizerFactory.java
catch( IOException ex ) { throw new InitializationException("IOException thrown creating PatternTokenizer instance", ex); }
// in core/src/java/org/apache/solr/analysis/DictionaryCompoundWordTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading dictionary", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading protected words", e); }
// in core/src/java/org/apache/solr/analysis/WordDelimiterFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException while loading types", e); }
// in core/src/java/org/apache/solr/analysis/TypeTokenFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading types", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsQueryFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
// in core/src/java/org/apache/solr/analysis/StopFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading stopwords", e); }
// in core/src/java/org/apache/solr/analysis/CommonGramsFilterFactory.java
catch (IOException e) { throw new InitializationException("IOException thrown while loading common word file", e); }
0 0 0 0
unknown (Lib) InterruptedException 2
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
boolean fetchLatestIndex(SolrCore core, boolean force) throws IOException, InterruptedException { successfulInstall = false; replicationStartTime = System.currentTimeMillis(); try { /* get the current 'replicateable' index version in the master */ NamedList response = null; try { response = getLatestVersion(); } catch (Exception e) { LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage()); return false; } long latestVersion = (Long) response.get(CMD_INDEX_VERSION); long latestGeneration = (Long) response.get(GENERATION); IndexCommit commit; RefCounted<SolrIndexSearcher> searcherRefCounted = null; try { searcherRefCounted = core.getNewestSearcher(false); if (searcherRefCounted == null) { SolrException.log(LOG, "No open searcher found - fetch aborted"); return false; } commit = searcherRefCounted.get().getIndexReader().getIndexCommit(); } finally { if (searcherRefCounted != null) searcherRefCounted.decref(); } if (latestVersion == 0L) { if (force && commit.getGeneration() != 0) { /* since we won't get the files for an empty index, we just clear ours and commit */ core.getUpdateHandler().getSolrCoreState().getIndexWriter(core).deleteAll(); SolrQueryRequest req = new LocalSolrQueryRequest(core, new ModifiableSolrParams()); core.getUpdateHandler().commit(new CommitUpdateCommand(req, false)); } /* there is nothing to be replicated */ successfulInstall = true; return true; } if (!force && IndexDeletionPolicyWrapper.getCommitTimestamp(commit) == latestVersion) { /* master and slave are already in sync, just return */ LOG.info("Slave in sync with master."); successfulInstall = true; return true; } LOG.info("Master's generation: " + latestGeneration); LOG.info("Slave's generation: " + commit.getGeneration()); LOG.info("Starting replication process"); /* get the list of files first */ fetchFileList(latestGeneration); /* this can happen if the commit point is deleted before we fetch the file list */ if (filesToDownload.isEmpty()) return false; LOG.info("Number of files in latest index in master: " + filesToDownload.size()); /* create the sync service */ fsyncService = Executors.newSingleThreadExecutor(); /* use a synchronized list because the list is read by other threads (to show details) */ filesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>()); /* if the generation of the master is older than that of the slave, they are not compatible to be copied, so a new index directory is created and all the files are copied */ boolean isFullCopyNeeded = IndexDeletionPolicyWrapper.getCommitTimestamp(commit) >= latestVersion || force; File tmpIndexDir = createTempindexDir(core); if (isIndexStale()) isFullCopyNeeded = true; successfulInstall = false; boolean deleteTmpIdxDir = true; File indexDir = null; try { indexDir = new File(core.getIndexDir()); downloadIndexFiles(isFullCopyNeeded, tmpIndexDir, latestGeneration); LOG.info("Total time taken for download : " + ((System.currentTimeMillis() - replicationStartTime) / 1000) + " secs"); Collection<Map<String, Object>> modifiedConfFiles = getModifiedConfFiles(confFilesToDownload); if (!modifiedConfFiles.isEmpty()) { downloadConfFiles(confFilesToDownload, latestGeneration); if (isFullCopyNeeded) { successfulInstall = modifyIndexProps(tmpIndexDir.getName()); deleteTmpIdxDir = false; } else { successfulInstall = copyIndexFiles(tmpIndexDir, indexDir); } if (successfulInstall) { LOG.info("Configuration files are modified, core will be reloaded"); logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall); /* write to a file the time of replication and conf files */ reloadCore(); } } else { terminateAndWaitFsyncService(); if (isFullCopyNeeded) { successfulInstall = modifyIndexProps(tmpIndexDir.getName()); deleteTmpIdxDir = false; } else { successfulInstall = copyIndexFiles(tmpIndexDir, indexDir); } if (successfulInstall) { logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall); doCommit(); } } replicationStartTime = 0; return successfulInstall; } catch (ReplicationHandlerException e) { LOG.error("User aborted Replication"); return false; } catch (SolrException e) { throw e; } catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); } finally { if (deleteTmpIdxDir) delTree(tmpIndexDir); else delTree(indexDir); } } finally { if (!successfulInstall) { logReplicationTimeAndConfFiles(null, successfulInstall); } filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null; replicationStartTime = 0; fileFetcher = null; if (fsyncService != null && !fsyncService.isShutdown()) fsyncService.shutdownNow(); fsyncService = null; stop = false; fsyncException = null; } }
2
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); }
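The two catch blocks above show the usual discipline around InterruptedException: ZkCmdExecutor checks the thread's interrupt flag, re-asserts it, and surfaces an InterruptedException instead of silently retrying, while SnapPuller rethrows it with a more specific message. A minimal sketch of the first variant, assuming an illustrative retry loop (not the Solr class):

// Illustrative retry loop, not the Solr ZkCmdExecutor.
public class RetryLoop {
  interface Task { void run() throws Exception; }

  static void retry(Task task, int attempts) throws InterruptedException {
    Exception last = null;
    for (int i = 0; i < attempts; i++) {
      try {
        task.run();
        return;
      } catch (Exception e) {
        last = e;
        if (Thread.currentThread().isInterrupted()) {
          // re-assert the flag for callers further up the stack, then surface it
          Thread.currentThread().interrupt();
          throw new InterruptedException();
        }
        // otherwise fall through and try again
      }
    }
    throw new RuntimeException("all attempts failed", last);
  }
}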
111
            
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
Override public void connect(String serverAddress, int timeout, Watcher watcher, ZkUpdate updater) throws IOException, InterruptedException, TimeoutException { updater.update(new SolrZooKeeper(serverAddress, timeout, watcher)); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void delete(final String path, final int version, boolean retryOnConnLoss) throws InterruptedException, KeeperException { if (retryOnConnLoss) { zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Stat execute() throws KeeperException, InterruptedException { keeper.delete(path, version); return null; } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Stat execute() throws KeeperException, InterruptedException { keeper.delete(path, version); return null; }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public Stat exists(final String path, final Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Stat execute() throws KeeperException, InterruptedException { return keeper.exists(path, watcher); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Stat execute() throws KeeperException, InterruptedException { return keeper.exists(path, watcher); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public Boolean exists(final String path, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Boolean execute() throws KeeperException, InterruptedException { return keeper.exists(path, null) != null; } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Boolean execute() throws KeeperException, InterruptedException { return keeper.exists(path, null) != null; }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public String create(final String path, final byte data[], final List<ACL> acl, final CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, acl, createMode); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, acl, createMode); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public List<String> getChildren(final String path, final Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public List<String> execute() throws KeeperException, InterruptedException { return keeper.getChildren(path, watcher); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public List<String> execute() throws KeeperException, InterruptedException { return keeper.getChildren(path, watcher); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public byte[] getData(final String path, final Watcher watcher, final Stat stat, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public byte[] execute() throws KeeperException, InterruptedException { return keeper.getData(path, watcher, stat); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public byte[] execute() throws KeeperException, InterruptedException { return keeper.getData(path, watcher, stat); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public Stat setData(final String path, final byte[] data, final int version, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Stat execute() throws KeeperException, InterruptedException { return keeper.setData(path, data, version); } }); } else { return keeper.setData(path, data, version); } }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Stat execute() throws KeeperException, InterruptedException { return keeper.setData(path, data, version); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public String create(final String path, final byte[] data, final CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, createMode); } }); } else { return keeper.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, createMode); } }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, createMode); }
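Every SolrZkClient wrapper above follows the same retry-on-connection-loss idiom: the raw keeper call is packaged as a ZkOperation and handed to zkCmdExecutor when retryOnConnLoss is true, with a plain direct call otherwise. A minimal, self-contained sketch of what such an executor could look like (the class names, attempt limit, and back-off are assumptions for illustration, not the actual ZkCmdExecutor shown later in these listings):

    import org.apache.zookeeper.KeeperException;

    abstract class ZkOperationSketch {
      public abstract Object execute() throws KeeperException, InterruptedException;
    }

    class RetryExecutorSketch {
      private static final int MAX_ATTEMPTS = 3;     // assumed limit
      private static final long RETRY_DELAY = 1500L; // assumed ms between attempts

      Object retryOperation(ZkOperationSketch op) throws KeeperException, InterruptedException {
        KeeperException last = null;
        for (int attempt = 0; attempt < MAX_ATTEMPTS; attempt++) {
          try {
            return op.execute(); // the wrapped keeper.* call
          } catch (KeeperException.ConnectionLossException e) {
            last = e;                            // only connection loss is retried
            Thread.sleep(attempt * RETRY_DELAY); // linear back-off, cf. retryDelay() below
          }
        }
        throw last;
      }
    }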
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, null, CreateMode.PERSISTENT, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, boolean failOnExists, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, null, CreateMode.PERSISTENT, null, failOnExists, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean failOnExists, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), CreateMode.PERSISTENT, null, failOnExists, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, null, createMode, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, data, CreateMode.PERSISTENT, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, data, createMode, null, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, CreateMode createMode, Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, data, createMode, watcher, true, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, CreateMode createMode, Watcher watcher, boolean failOnExists, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (log.isInfoEnabled()) { log.info("makePath: " + path); } boolean retry = true; if (path.startsWith("/")) { path = path.substring(1, path.length()); } String[] paths = path.split("/"); StringBuilder sbPath = new StringBuilder(); for (int i = 0; i < paths.length; i++) { byte[] bytes = null; String pathPiece = paths[i]; sbPath.append("/" + pathPiece); final String currentPath = sbPath.toString(); Object exists = exists(currentPath, watcher, retryOnConnLoss); if (exists == null || ((i == paths.length - 1) && failOnExists)) { CreateMode mode = CreateMode.PERSISTENT; if (i == paths.length - 1) { mode = createMode; bytes = data; if (!retryOnConnLoss) retry = false; } try { if (retry) { final CreateMode finalMode = mode; final byte[] finalBytes = bytes; zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Object execute() throws KeeperException, InterruptedException { keeper.create(currentPath, finalBytes, ZooDefs.Ids.OPEN_ACL_UNSAFE, finalMode); return null; } }); } else { keeper.create(currentPath, bytes, ZooDefs.Ids.OPEN_ACL_UNSAFE, mode); } } catch (NodeExistsException e) { if (!failOnExists) { /* TODO: version? for now, don't worry about race */ setData(currentPath, data, -1, retryOnConnLoss); /* set new watch */ exists(currentPath, watcher, retryOnConnLoss); return; } /* ignore unless it's the last node in the path */ if (i == paths.length - 1) { throw e; } } if (i == paths.length - 1) { /* set new watch */ exists(currentPath, watcher, retryOnConnLoss); } } else if (i == paths.length - 1) { /* TODO: version? for now, don't worry about race */ setData(currentPath, data, -1, retryOnConnLoss); /* set new watch */ exists(currentPath, watcher, retryOnConnLoss); } } }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public Object execute() throws KeeperException, InterruptedException { keeper.create(currentPath, finalBytes, ZooDefs.Ids.OPEN_ACL_UNSAFE, finalMode); return null; }
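Taken together with the overloads above, makePath walks the path one segment at a time, creating any missing ancestor as a PERSISTENT node and applying the caller's CreateMode and data only to the final segment. A hedged usage sketch against the four-argument overload listed above (the path and payload are hypothetical):

    import org.apache.solr.common.cloud.SolrZkClient;
    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.KeeperException;

    class MakePathExample {
      // Creates /collections/demo in one call, building missing ancestors as
      // PERSISTENT nodes; "demo" and the JSON payload are invented for the sketch.
      static void createDemoNode(SolrZkClient zkClient)
          throws KeeperException, InterruptedException, java.io.UnsupportedEncodingException {
        byte[] props = "{\"configName\":\"demo\"}".getBytes("UTF-8");
        zkClient.makePath("/collections/demo", props, CreateMode.PERSISTENT, true);
      }
    }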
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String zkPath, CreateMode createMode, Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(zkPath, null, createMode, watcher, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void setData(String path, byte[] data, boolean retryOnConnLoss) throws KeeperException, InterruptedException { setData(path, data, -1, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void setData(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { if (log.isInfoEnabled()) { log.info("Write to ZooKeeper " + file.getAbsolutePath() + " to " + path); } String data = FileUtils.readFileToString(file); setData(path, data.getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void printLayout(String path, int indent, StringBuilder string) throws KeeperException, InterruptedException { byte[] data = getData(path, null, null, true); List<String> children = getChildren(path, null, true); StringBuilder dent = new StringBuilder(); for (int i = 0; i < indent; i++) { dent.append(" "); } string.append(dent + path + " (" + children.size() + ")" + NEWL); if (data != null) { try { String dataString = new String(data, "UTF-8"); if ((!path.endsWith(".txt") && !path.endsWith(".xml")) || path.endsWith(ZkStateReader.CLUSTER_STATE)) { if (path.endsWith(".xml")) { /* this is the cluster state in xml format - let's pretty print */ dataString = prettyPrint(dataString); } string.append(dent + "DATA:\n" + dent + " " + dataString.replaceAll("\n", "\n" + dent + " ") + NEWL); } else { string.append(dent + "DATA: ...suppressed..." + NEWL); } } catch (UnsupportedEncodingException e) { /* can't happen - UTF-8 */ throw new RuntimeException(e); } } for (String child : children) { if (!child.equals("quota")) { try { printLayout(path + (path.equals("/") ? "" : "/") + child, indent + 1, string); } catch (NoNodeException e) { /* must have gone away */ } } } }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void printLayoutToStdOut() throws KeeperException, InterruptedException { StringBuilder sb = new StringBuilder(); printLayout("/", 0, sb); System.out.println(sb.toString()); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void close() throws InterruptedException { if (isClosed) return; /* it's okay if we over-close - same as SolrCore */ isClosed = true; keeper.close(); numCloses.incrementAndGet(); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
void updateKeeper(SolrZooKeeper keeper) throws InterruptedException { SolrZooKeeper oldKeeper = this.keeper; this.keeper = keeper; if (oldKeeper != null) { oldKeeper.close(); } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void process(WatchedEvent event) { if (log.isInfoEnabled()) { log.info("Watcher " + this + " name:" + name + " got event " + event + " path:" + event.getPath() + " type:" + event.getType()); } state = event.getState(); if (state == KeeperState.SyncConnected) { connected = true; clientConnected.countDown(); } else if (state == KeeperState.Expired) { connected = false; log.info("Attempting to reconnect to recover relationship with ZooKeeper..."); try { connectionStrategy.reconnect(zkServerAddress, zkClientTimeout, this, new ZkClientConnectionStrategy.ZkUpdate() { @Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } } }); } catch (Exception e) { SolrException.log(log, "", e); } log.info("Connected:" + connected); } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
@Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForConnected(long waitForConnection) throws InterruptedException, TimeoutException, IOException { long expire = System.currentTimeMillis() + waitForConnection; long left = waitForConnection; while (!connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (!connected) { throw new TimeoutException("Could not connect to ZooKeeper " + zkServerAddress + " within " + waitForConnection + " ms"); } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForDisconnected(long timeout) throws InterruptedException, TimeoutException { long expire = System.currentTimeMillis() + timeout; long left = timeout; while (connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (connected) { throw new TimeoutException("Did not disconnect"); } }
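Both wait methods are instances of the classic guarded-wait idiom: loop on the condition with a shrinking deadline, so a spurious wake-up cannot end the wait early and a missed notification still times out. The same idiom in isolation (all names here are illustrative):

    import java.util.concurrent.TimeoutException;

    final class ConnectedLatch {
      private boolean connected;

      synchronized void markConnected() { connected = true; notifyAll(); }

      synchronized void awaitConnected(long timeoutMs) throws InterruptedException, TimeoutException {
        long expire = System.currentTimeMillis() + timeoutMs;
        long left = timeoutMs;
        while (!connected && left > 0) {
          wait(left);                                 // may wake spuriously
          left = expire - System.currentTimeMillis(); // recompute remaining time
        }
        if (!connected) throw new TimeoutException("not connected within " + timeoutMs + " ms");
      }
    }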
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void updateCloudState(boolean immediate) throws KeeperException, InterruptedException { updateCloudState(immediate, false); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void updateLiveNodes() throws KeeperException, InterruptedException { updateCloudState(true, true); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public synchronized void createClusterStateWatchersAndUpdate() throws KeeperException, InterruptedException { /* we need to fetch the current cluster state and the set of live nodes */ synchronized (getUpdateLock()) { cmdExecutor.ensureExists(CLUSTER_STATE, zkClient); log.info("Updating cluster state from ZooKeeper... "); zkClient.exists(CLUSTER_STATE, new Watcher() { @Override public void process(WatchedEvent event) { log.info("A cluster state change has occurred"); try { /* delayed approach: ZkStateReader.this.updateCloudState(false, false); */ synchronized (ZkStateReader.this.getUpdateLock()) { /* remake watch */ final Watcher thisWatch = this; byte[] data = zkClient.getData(CLUSTER_STATE, thisWatch, null, true); CloudState clusterState = CloudState.load(data, ZkStateReader.this.cloudState.getLiveNodes()); /* update volatile */ cloudState = clusterState; } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { /* restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); return; } } }, true); } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
private synchronized void updateCloudState(boolean immediate, final boolean onlyLiveNodes) throws KeeperException, InterruptedException { log.info("Manual update of cluster state initiated"); /* build immutable CloudInfo */ if (immediate) { CloudState clusterState; synchronized (getUpdateLock()) { List<String> liveNodes = zkClient.getChildren(LIVE_NODES_ZKNODE, null, true); Set<String> liveNodesSet = new HashSet<String>(); liveNodesSet.addAll(liveNodes); if (!onlyLiveNodes) { log.info("Updating cloud state from ZooKeeper... "); clusterState = CloudState.load(zkClient, liveNodesSet); } else { log.info("Updating live nodes from ZooKeeper... "); clusterState = new CloudState(liveNodesSet, ZkStateReader.this.cloudState.getCollectionStates()); } } this.cloudState = clusterState; } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public String getLeaderUrl(String collection, String shard) throws InterruptedException, KeeperException { return getLeaderUrl(collection, shard, 1000); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public String getLeaderUrl(String collection, String shard, int timeout) throws InterruptedException, KeeperException { ZkCoreNodeProps props = new ZkCoreNodeProps(getLeaderProps(collection, shard, timeout)); return props.getCoreUrl(); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public ZkNodeProps getLeaderProps(String collection, String shard) throws InterruptedException { return getLeaderProps(collection, shard, 1000); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public ZkNodeProps getLeaderProps(String collection, String shard, int timeout) throws InterruptedException { long timeoutAt = System.currentTimeMillis() + timeout; while (System.currentTimeMillis() < timeoutAt) { if (cloudState != null) { final CloudState currentState = cloudState; final ZkNodeProps nodeProps = currentState.getLeader(collection, shard); if (nodeProps != null) { return nodeProps; } } Thread.sleep(50); } throw new RuntimeException("No registered leader was found, collection:" + collection + " slice:" + shard); }
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
public static CloudState load(SolrZkClient zkClient, Set<String> liveNodes) throws KeeperException, InterruptedException { byte[] state = zkClient.getData(ZkStateReader.CLUSTER_STATE, null, null, true); return load(state, liveNodes); }
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
public static CloudState load(byte[] bytes, Set<String> liveNodes) throws KeeperException, InterruptedException { if (bytes == null || bytes.length == 0) { return new CloudState(liveNodes, Collections.<String, Map<String,Slice>>emptyMap()); } LinkedHashMap<String, Object> stateMap = (LinkedHashMap<String, Object>) ZkStateReader.fromJSON(bytes); HashMap<String,Map<String, Slice>> state = new HashMap<String,Map<String,Slice>>(); for(String collectionName: stateMap.keySet()){ Map<String, Object> collection = (Map<String, Object>)stateMap.get(collectionName); Map<String, Slice> slices = new LinkedHashMap<String,Slice>(); for(String sliceName: collection.keySet()) { Map<String, Map<String, String>> sliceMap = (Map<String, Map<String, String>>)collection.get(sliceName); Map<String, ZkNodeProps> shards = new LinkedHashMap<String,ZkNodeProps>(); for(String shardName: sliceMap.keySet()) { shards.put(shardName, new ZkNodeProps(sliceMap.get(shardName))); } Slice slice = new Slice(sliceName, shards); slices.put(sliceName, slice); } state.put(collectionName, slices); } return new CloudState(liveNodes, state); }
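The casts in load imply a three-level structure for the serialized cluster state: collection name, then slice name, then core-node name mapping to string properties. A hypothetical example of JSON this method would accept (names and values are invented; the property keys mirror ZkStateReader.STATE_PROP and BASE_URL_PROP used elsewhere in these listings):

    {
      "collection1": {
        "shard1": {
          "node1_collection1": { "state": "active", "base_url": "http://host:8983/solr" }
        }
      }
    }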
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
public void ensureExists(String path, final SolrZkClient zkClient) throws KeeperException, InterruptedException { ensureExists(path, null, CreateMode.PERSISTENT, zkClient); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
public void ensureExists(final String path, final byte[] data, CreateMode createMode, final SolrZkClient zkClient) throws KeeperException, InterruptedException { if (zkClient.exists(path, true)) { return; } try { zkClient.makePath(path, data, true); } catch (NodeExistsException e) { /* it's okay if another beats us creating the node */ } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
protected void retryDelay(int attemptCount) throws InterruptedException { if (attemptCount > 0) { Thread.sleep(attemptCount * retryDelay); } }
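retryDelay(attemptCount) sleeps attemptCount * retryDelay milliseconds, so the back-off grows linearly with the attempt number and the first attempt (count 0) pays no delay at all. For example, with a hypothetical retryDelay of 1500 ms, attempts 0 through 3 would wait 0, 1500, 3000, and 4500 ms respectively.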
// in core/src/java/org/apache/solr/handler/SnapPuller.java
boolean fetchLatestIndex(SolrCore core, boolean force) throws IOException, InterruptedException { successfulInstall = false; replicationStartTime = System.currentTimeMillis(); try { /* get the current 'replicateable' index version in the master */ NamedList response = null; try { response = getLatestVersion(); } catch (Exception e) { LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage()); return false; } long latestVersion = (Long) response.get(CMD_INDEX_VERSION); long latestGeneration = (Long) response.get(GENERATION); IndexCommit commit; RefCounted<SolrIndexSearcher> searcherRefCounted = null; try { searcherRefCounted = core.getNewestSearcher(false); if (searcherRefCounted == null) { SolrException.log(LOG, "No open searcher found - fetch aborted"); return false; } commit = searcherRefCounted.get().getIndexReader().getIndexCommit(); } finally { if (searcherRefCounted != null) searcherRefCounted.decref(); } if (latestVersion == 0L) { if (force && commit.getGeneration() != 0) { /* since we won't get the files for an empty index, we just clear ours and commit */ core.getUpdateHandler().getSolrCoreState().getIndexWriter(core).deleteAll(); SolrQueryRequest req = new LocalSolrQueryRequest(core, new ModifiableSolrParams()); core.getUpdateHandler().commit(new CommitUpdateCommand(req, false)); } /* there is nothing to be replicated */ successfulInstall = true; return true; } if (!force && IndexDeletionPolicyWrapper.getCommitTimestamp(commit) == latestVersion) { /* master and slave are already in sync - just return */ LOG.info("Slave in sync with master."); successfulInstall = true; return true; } LOG.info("Master's generation: " + latestGeneration); LOG.info("Slave's generation: " + commit.getGeneration()); LOG.info("Starting replication process"); /* get the list of files first */ fetchFileList(latestGeneration); /* an empty list can happen if the commit point is deleted before we fetch the file list */ if (filesToDownload.isEmpty()) return false; LOG.info("Number of files in latest index in master: " + filesToDownload.size()); /* create the sync service */ fsyncService = Executors.newSingleThreadExecutor(); /* use a synchronized list because the list is read by other threads (to show details) */ filesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>()); /* if the generation of the master is older than that of the slave, the indexes are not compatible to be copied over; a new index directory is created and all the files are copied */ boolean isFullCopyNeeded = IndexDeletionPolicyWrapper.getCommitTimestamp(commit) >= latestVersion || force; File tmpIndexDir = createTempindexDir(core); if (isIndexStale()) isFullCopyNeeded = true; successfulInstall = false; boolean deleteTmpIdxDir = true; File indexDir = null; try { indexDir = new File(core.getIndexDir()); downloadIndexFiles(isFullCopyNeeded, tmpIndexDir, latestGeneration); LOG.info("Total time taken for download : " + ((System.currentTimeMillis() - replicationStartTime) / 1000) + " secs"); Collection<Map<String, Object>> modifiedConfFiles = getModifiedConfFiles(confFilesToDownload); if (!modifiedConfFiles.isEmpty()) { downloadConfFiles(confFilesToDownload, latestGeneration); if (isFullCopyNeeded) { successfulInstall = modifyIndexProps(tmpIndexDir.getName()); deleteTmpIdxDir = false; } else { successfulInstall = copyIndexFiles(tmpIndexDir, indexDir); } if (successfulInstall) { LOG.info("Configuration files are modified, core will be reloaded"); logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall); /* write the time of replication and the conf files to a file */ reloadCore(); } } else { terminateAndWaitFsyncService(); if (isFullCopyNeeded) { successfulInstall = modifyIndexProps(tmpIndexDir.getName()); deleteTmpIdxDir = false; } else { successfulInstall = copyIndexFiles(tmpIndexDir, indexDir); } if (successfulInstall) { logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall); doCommit(); } } replicationStartTime = 0; return successfulInstall; } catch (ReplicationHandlerException e) { LOG.error("User aborted Replication"); return false; } catch (SolrException e) { throw e; } catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); } finally { if (deleteTmpIdxDir) delTree(tmpIndexDir); else delTree(indexDir); } } finally { if (!successfulInstall) { logReplicationTimeAndConfFiles(null, successfulInstall); } filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null; replicationStartTime = 0; fileFetcher = null; if (fsyncService != null && !fsyncService.isShutdown()) fsyncService.shutdownNow(); fsyncService = null; stop = false; fsyncException = null; } }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, KeeperException, InterruptedException { CoreContainer coreContainer = req.getCore().getCoreDescriptor().getCoreContainer(); if (coreContainer.isZooKeeperAware()) { showFromZooKeeper(req, rsp, coreContainer); } else { showFromFileSystem(req, rsp); } }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromZooKeeper(SolrQueryRequest req, SolrQueryResponse rsp, CoreContainer coreContainer) throws KeeperException, InterruptedException, UnsupportedEncodingException { String adminFile = null; SolrCore core = req.getCore(); SolrZkClient zkClient = coreContainer.getZkController().getZkClient(); final ZkSolrResourceLoader loader = (ZkSolrResourceLoader) core.getResourceLoader(); String confPath = loader.getCollectionZkPath(); String fname = req.getParams().get("file", null); if (fname == null) { adminFile = confPath; } else { fname = fname.replace('\\', '/'); /* normalize slashes */ if (hiddenFiles.contains(fname.toUpperCase(Locale.ENGLISH))) { throw new SolrException(ErrorCode.FORBIDDEN, "Can not access: " + fname); } if (fname.indexOf("..") >= 0) { throw new SolrException(ErrorCode.FORBIDDEN, "Invalid path: " + fname); } adminFile = confPath + "/" + fname; } /* make sure the file exists, is readable, and is not a hidden file */ if (!zkClient.exists(adminFile, true)) { throw new SolrException(ErrorCode.BAD_REQUEST, "Can not find: " + adminFile); } /* show a directory listing */ List<String> children = zkClient.getChildren(adminFile, null, true); if (children.size() > 0) { NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>(); for (String f : children) { if (hiddenFiles.contains(f.toUpperCase(Locale.ENGLISH))) { continue; /* don't show 'hidden' files */ } if (f.startsWith(".")) { continue; /* skip hidden system files... */ } SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>(); files.add(f, fileInfo); List<String> fchildren = zkClient.getChildren(adminFile, null, true); if (fchildren.size() > 0) { fileInfo.add("directory", true); } else { /* TODO? content type */ fileInfo.add("size", f.length()); } /* TODO: fileInfo.add("modified", new Date(f.lastModified())); */ } rsp.add("files", files); } else { /* include the file contents; the file logic depends on RawResponseWriter, so force its use */ ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(CommonParams.WT, "raw"); req.setParams(params); ContentStreamBase content = new ContentStreamBase.StringStream(new String(zkClient.getData(adminFile, null, null, true), "UTF-8")); content.setContentType(req.getParams().get(USE_CONTENT_TYPE)); rsp.add(RawResponseWriter.CONTENT, content); } rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleWaitForStateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException { final SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); if (cname == null) { cname = ""; } String nodeName = params.get("nodeName"); String coreNodeName = params.get("coreNodeName"); String waitForState = params.get("state"); Boolean checkLive = params.getBool("checkLive"); int pauseFor = params.getInt("pauseFor", 0); String state = null; boolean live = false; int retry = 0; while (true) { SolrCore core = null; try { core = coreContainer.getCore(cname); if (core == null && retry == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "core not found:" + cname); } if (core != null) { /* wait until we are sure the recovering node is ready to accept updates */ CloudDescriptor cloudDescriptor = core.getCoreDescriptor().getCloudDescriptor(); CloudState cloudState = coreContainer.getZkController().getCloudState(); String collection = cloudDescriptor.getCollectionName(); Slice slice = cloudState.getSlice(collection, cloudDescriptor.getShardId()); if (slice != null) { ZkNodeProps nodeProps = slice.getShards().get(coreNodeName); if (nodeProps != null) { state = nodeProps.get(ZkStateReader.STATE_PROP); live = cloudState.liveNodesContain(nodeName); if (state.equals(waitForState)) { if (checkLive == null) { break; } else if (checkLive && live) { break; } else if (!checkLive && !live) { break; } } } } } if (retry++ == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "I was asked to wait on state " + waitForState + " for " + nodeName + " but I still do not see the request state. I see state: " + state + " live:" + live); } } finally { if (core != null) { core.close(); } } Thread.sleep(1000); } /* small safety net for any updates that started with a state that kept them from being buffered - pause for a while to let any outstanding updates finish. System.out.println("I saw state:" + state + " sleep for " + pauseFor + " live:" + live); */ Thread.sleep(pauseFor); /* solrcloud_debug try { LocalSolrQueryRequest r = new LocalSolrQueryRequest(core, new ModifiableSolrParams()); CommitUpdateCommand commitCmd = new CommitUpdateCommand(r, false); commitCmd.softCommit = true; core.getUpdateHandler().commit(commitCmd); RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); SolrIndexSearcher searcher = searchHolder.get(); try { System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " to replicate " + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " gen:" + core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + core.getDataDir()); } finally { searchHolder.decref(); } } catch (Exception e) { } */ }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleDistribUrlAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException, SolrServerException { /* TODO: finish this and tests */ SolrParams params = req.getParams(); final ModifiableSolrParams newParams = new ModifiableSolrParams(params); newParams.remove("action"); SolrParams required = params.required(); final String subAction = required.get("subAction"); String collection = required.get("collection"); newParams.set(CoreAdminParams.ACTION, subAction); SolrCore core = req.getCore(); ZkController zkController = core.getCoreDescriptor().getCoreContainer().getZkController(); CloudState cloudState = zkController.getCloudState(); Map<String,Slice> slices = cloudState.getCollectionStates().get(collection); for (Map.Entry<String,Slice> entry : slices.entrySet()) { Slice slice = entry.getValue(); Map<String,ZkNodeProps> shards = slice.getShards(); Set<Map.Entry<String,ZkNodeProps>> shardEntries = shards.entrySet(); for (Map.Entry<String,ZkNodeProps> shardEntry : shardEntries) { final ZkNodeProps node = shardEntry.getValue(); if (cloudState.liveNodesContain(node.get(ZkStateReader.NODE_NAME_PROP))) { newParams.set(CoreAdminParams.CORE, node.get(ZkStateReader.CORE_NAME_PROP)); String replica = node.get(ZkStateReader.BASE_URL_PROP); ShardRequest sreq = new ShardRequest(); newParams.set("qt", "/admin/cores"); sreq.purpose = 1; /* TODO: this sucks */ if (replica.startsWith("http://")) replica = replica.substring(7); sreq.shards = new String[]{replica}; sreq.actualShards = sreq.shards; sreq.params = newParams; shardHandler.submit(sreq, replica, sreq.params); } } } ShardResponse srsp; do { srsp = shardHandler.takeCompletedOrError(); if (srsp != null) { Throwable e = srsp.getException(); if (e != null) { log.error("Error talking to shard: " + srsp.getShard(), e); } } } while (srsp != null); }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
private void processLeaderChange() throws KeeperException, InterruptedException { if (closed) return; try { byte[] data = zkClient.getData(path, this, null, true); if (data != null) { final ZkCoreNodeProps leaderProps = new ZkCoreNodeProps(ZkNodeProps.load(data)); listener.announceLeader(collection, shard, leaderProps); } } catch (KeeperException ke) { /* check if we lost the connection or the node was gone */ if (ke.code() != Code.CONNECTIONLOSS && ke.code() != Code.SESSIONEXPIRED && ke.code() != Code.NONODE) { throw ke; } } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private CloudState updateState(CloudState state, String nodeName, CoreState coreState) throws KeeperException, InterruptedException { String collection = coreState.getCollectionName(); String zkCoreNodeName = coreState.getCoreNodeName(); /* collection does not yet exist, create placeholders if num shards is specified */ if (!state.getCollections().contains(coreState.getCollectionName()) && coreState.getNumShards() != null) { state = createCollection(state, collection, coreState.getNumShards()); } /* use the provided non-null shardId */ String shardId = coreState.getProperties().get(ZkStateReader.SHARD_ID_PROP); if (shardId == null) { /* use shardId from CloudState */ shardId = getAssignedId(state, nodeName, coreState); } if (shardId == null) { /* request a new shardId */ shardId = AssignShard.assignShard(collection, state, coreState.getNumShards()); } Map<String,String> props = new HashMap<String,String>(); Map<String,String> coreProps = new HashMap<String,String>(coreState.getProperties().size()); coreProps.putAll(coreState.getProperties()); /* we don't put num_shards in the clusterstate */ coreProps.remove("num_shards"); for (Entry<String,String> entry : coreProps.entrySet()) { props.put(entry.getKey(), entry.getValue()); } ZkNodeProps zkProps = new ZkNodeProps(props); Slice slice = state.getSlice(collection, shardId); Map<String,ZkNodeProps> shardProps; if (slice == null) { shardProps = new HashMap<String,ZkNodeProps>(); } else { shardProps = state.getSlice(collection, shardId).getShardsCopy(); } shardProps.put(zkCoreNodeName, zkProps); slice = new Slice(shardId, shardProps); CloudState newCloudState = updateSlice(state, collection, slice); return newCloudState; }
// in core/src/java/org/apache/solr/cloud/Overseer.java
public synchronized void createWatches() throws KeeperException, InterruptedException { addCollectionsWatch(); addLiveNodesWatch(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addCollectionsWatch() throws KeeperException, InterruptedException { zkCmdExecutor.ensureExists(ZkStateReader.COLLECTIONS_ZKNODE, zkClient); List<String> collections = zkClient.getChildren(ZkStateReader.COLLECTIONS_ZKNODE, new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> collections = zkClient.getChildren(ZkStateReader.COLLECTIONS_ZKNODE, this, true); collectionsChanged(collections); } catch (KeeperException e) { if (e.code() == Code.CONNECTIONLOSS || e.code() == Code.SESSIONEXPIRED) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } } catch (InterruptedException e) { /* restore the interrupted status */ Thread.currentThread().interrupt(); log.warn("", e); } } }, true); collectionsChanged(collections); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void collectionsChanged(Collection<String> collections) throws KeeperException, InterruptedException { synchronized (shardLeaderWatches) { for (String collection : collections) { if (!shardLeaderWatches.containsKey(collection)) { shardLeaderWatches.put(collection, new HashMap<String,ShardLeaderWatcher>()); addShardLeadersWatch(collection); } } /* XXX not handling collection deletes */ } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addShardLeadersWatch(final String collection) throws KeeperException, InterruptedException { zkCmdExecutor.ensureExists(ZkStateReader.getShardLeadersPath(collection, null), zkClient); final List<String> leaderNodes = zkClient.getChildren(ZkStateReader.getShardLeadersPath(collection, null), new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> leaderNodes = zkClient.getChildren(ZkStateReader.getShardLeadersPath(collection, null), this, true); processLeaderNodesChanged(collection, leaderNodes); } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { /* restore the interrupted status */ Thread.currentThread().interrupt(); } } }, true); processLeaderNodesChanged(collection, leaderNodes); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addLiveNodesWatch() throws KeeperException, InterruptedException { List<String> liveNodes = zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Object execute() throws KeeperException, InterruptedException { return zkClient.getChildren(ZkStateReader.LIVE_NODES_ZKNODE, new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> liveNodes = zkClient.getChildren(ZkStateReader.LIVE_NODES_ZKNODE, this, true); synchronized (nodeStateWatches) { processLiveNodesChanged(nodeStateWatches.keySet(), liveNodes); } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { /* restore the interrupted status */ Thread.currentThread().interrupt(); } } }, true); } }); processLiveNodesChanged(Collections.<String>emptySet(), liveNodes); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public Object execute() throws KeeperException, InterruptedException { return zkClient.getChildren(ZkStateReader.LIVE_NODES_ZKNODE, new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> liveNodes = zkClient.getChildren(ZkStateReader.LIVE_NODES_ZKNODE, this, true); synchronized (nodeStateWatches) { processLiveNodesChanged(nodeStateWatches.keySet(), liveNodes); } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { /* restore the interrupted status */ Thread.currentThread().interrupt(); } } }, true); }
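ZooKeeper watches fire only once, which is why every process() above immediately calls getChildren with this again before reacting: the watcher re-arms itself so no later change is missed. The idiom reduced to its core, against the plain ZooKeeper API (error handling simplified for the sketch):

    import java.util.List;
    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.WatchedEvent;
    import org.apache.zookeeper.Watcher;
    import org.apache.zookeeper.ZooKeeper;

    class SelfRearmingWatcher implements Watcher {
      private final ZooKeeper zk;
      private final String path;

      SelfRearmingWatcher(ZooKeeper zk, String path) { this.zk = zk; this.path = path; }

      @Override
      public void process(WatchedEvent event) {
        try {
          List<String> children = zk.getChildren(path, this); // re-register before reacting
          System.out.println(path + " now has " + children.size() + " children");
        } catch (KeeperException e) {
          // a real watcher would distinguish connection loss from other codes
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt();
        }
      }
    }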
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void processLiveNodesChanged(Collection<String> oldLiveNodes, Collection<String> liveNodes) throws InterruptedException, KeeperException { Set<String> upNodes = complement(liveNodes, oldLiveNodes); if (upNodes.size() > 0) { addNodeStateWatches(upNodes); } Set<String> downNodes = complement(oldLiveNodes, liveNodes); for(String node: downNodes) { synchronized (nodeStateWatches) { NodeStateWatcher watcher = nodeStateWatches.remove(node); } log.debug("Removed NodeStateWatcher for node:" + node); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addNodeStateWatches(Set<String> nodeNames) throws InterruptedException, KeeperException { for (String nodeName : nodeNames) { final String path = STATES_NODE + "/" + nodeName; synchronized (nodeStateWatches) { if (!nodeStateWatches.containsKey(nodeName)) { zkCmdExecutor.ensureExists(path, zkClient); nodeStateWatches.put(nodeName, new NodeStateWatcher(zkClient, nodeName, path, this)); log.debug("Added NodeStateWatcher for node " + nodeName); } else { log.debug("watch already added"); } } } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public void coreChanged(final String nodeName, final Set<CoreState> states) throws KeeperException, InterruptedException { log.info("Core change pooled: " + nodeName + " states:" + states); for (CoreState state : states) { fifo.add(new CloudStateUpdateRequest(Op.StateChange, nodeName, state)); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public void coreDeleted(String nodeName, Collection<CoreState> states) throws KeeperException, InterruptedException { for (CoreState state : states) { fifo.add(new CloudStateUpdateRequest(Op.CoreDeleted, state.getCollectionName(), state.getCoreNodeName())); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
public static void createClientNodes(SolrZkClient zkClient, String nodeName) throws KeeperException, InterruptedException { final String node = STATES_NODE + "/" + nodeName; if (log.isInfoEnabled()) { log.info("creating node:" + node); } ZkCmdExecutor zkCmdExecutor = new ZkCmdExecutor(); zkCmdExecutor.ensureExists(node, zkClient); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
public void cancelElection() throws InterruptedException, KeeperException { zkClient.delete(leaderSeqPath, -1, true); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { try { zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } catch (NodeExistsException e) { /* if a previous leader ephemeral still exists for some reason, try and remove it */ zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { if (cc != null) { String coreName = leaderProps.get(ZkStateReader.CORE_NAME_PROP); SolrCore core = null; try { /* the first time we are run, we will get a startupCore - after that we will get null and must use cc.getCore */ core = cc.getCore(coreName); if (core == null) { cancelElection(); throw new SolrException(ErrorCode.SERVER_ERROR, "Fatal Error, SolrCore not found:" + coreName + " in " + cc.getCoreNames()); } /* should I be leader? */ if (weAreReplacement && !shouldIBeLeader(leaderProps)) { /* there is a better leader candidate, it appears */ rejoinLeaderElection(leaderSeqPath, core); return; } if (weAreReplacement) { if (zkClient.exists(leaderPath, true)) { zkClient.delete(leaderPath, -1, true); } /* I may be the new leader - I need to try and sync */ boolean success = syncStrategy.sync(zkController, core, leaderProps); if (!success && anyoneElseActive()) { rejoinLeaderElection(leaderSeqPath, core); return; } } /* if I am going to be the leader I have to be active */ core.getUpdateHandler().getSolrCoreState().cancelRecovery(); zkController.publish(core.getCoreDescriptor(), ZkStateReader.ACTIVE); } finally { if (core != null) { core.close(); } } } super.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
private void rejoinLeaderElection(String leaderSeqPath, SolrCore core) throws InterruptedException, KeeperException, IOException { /* remove our ephemeral and rejoin the election */ zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); cancelElection(); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, core.getName()); leaderElector.joinElection(this); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException { final String id = leaderSeqPath.substring(leaderSeqPath.lastIndexOf("/") + 1); ZkNodeProps myProps = new ZkNodeProps("id", id); try { zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); } catch (NodeExistsException e) { /* if a previous leader ephemeral still exists for some reason, try and remove it */ zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); } new Overseer(zkClient, stateReader, id); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
private void processStateChange() throws KeeperException, InterruptedException { byte[] data = zkClient.getData(path, this, null, true); if (data != null) { CoreState[] states = CoreState.load(data); List<CoreState> stateList = Arrays.asList(states); HashSet<CoreState> modifiedCores = new HashSet<CoreState>(); modifiedCores.addAll(stateList); modifiedCores.removeAll(currentState); HashSet<CoreState> newState = new HashSet<CoreState>(); newState.addAll(stateList); HashMap<String, CoreState> lookup = new HashMap<String, CoreState>(); for (CoreState state : states) { lookup.put(state.getCoreName(), state); } /* check for status changes */ for (CoreState state : currentState) { if (lookup.containsKey(state.getCoreName())) { if (!state.getProperties().equals(lookup.get(state.getCoreName()).getProperties())) { modifiedCores.add(lookup.get(state.getCoreName())); } } } HashMap<String, CoreState> deletedCores = new HashMap<String, CoreState>(); for (CoreState state : currentState) { deletedCores.put(state.getCoreNodeName(), state); } for (CoreState state : stateList) { deletedCores.remove(state.getCoreNodeName()); } if (deletedCores.size() > 0) { listener.coreDeleted(nodeName, deletedCores.values()); } currentState = Collections.unmodifiableSet(newState); if (modifiedCores.size() > 0) { try { listener.coreChanged(nodeName, Collections.unmodifiableSet(modifiedCores)); } catch (KeeperException e) { log.warn("Could not talk to ZK", e); } catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Could not talk to ZK", e); } } } else { /* ignore null state */ } }
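The bookkeeping in processStateChange amounts to a diff over core states: anything present before but absent now is reported deleted, and anything new or with changed properties is reported modified. The same diff with plain maps (CoreState replaced by a String payload purely for brevity):

    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    class CoreDiffSketch {
      static void diff(Map<String, String> before, Map<String, String> after) {
        Set<String> deleted = new HashSet<String>(before.keySet());
        deleted.removeAll(after.keySet());          // cores that vanished entirely
        Set<String> modified = new HashSet<String>();
        for (Map.Entry<String, String> e : after.entrySet()) {
          // new cores (before.get returns null) and changed properties both count
          if (!e.getValue().equals(before.get(e.getKey()))) modified.add(e.getKey());
        }
        System.out.println("deleted=" + deleted + " modified=" + modified);
      }
    }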
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private Future<RecoveryInfo> replay(UpdateLog ulog) throws InterruptedException, ExecutionException, TimeoutException { Future<RecoveryInfo> future = ulog.applyBufferedUpdates(); if (future == null) { /* no replay needed */ log.info("No replay needed"); } else { log.info("Replaying buffered documents"); /* wait for replay */ future.get(); } /* solrcloud_debug try { RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); SolrIndexSearcher searcher = searchHolder.get(); try { System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replayed " + searcher.search(new MatchAllDocsQuery(), 1).totalHits); } finally { searchHolder.decref(); } } catch (Exception e) { } */ return future; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public boolean configFileExists(String collection, String fileName) throws KeeperException, InterruptedException { Stat stat = zkClient.exists(CONFIGS_ZKNODE + "/" + collection + "/" + fileName, null, true); return stat != null; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public byte[] getConfigFileData(String zkConfigName, String fileName) throws KeeperException, InterruptedException { String zkPath = CONFIGS_ZKNODE + "/" + zkConfigName + "/" + fileName; byte[] bytes = zkClient.getData(zkPath, null, null, true); if (bytes == null) { log.error("Config file contains no data:" + zkPath); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Config file contains no data:" + zkPath); } return bytes; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void syncNodeState() throws KeeperException, InterruptedException { log.debug("Syncing internal state with zk. Current: " + coreStates); final String path = Overseer.STATES_NODE + "/" + getNodeName(); final byte[] data = zkClient.getData(path, null, null, true); if (data != null) { CoreState[] states = CoreState.load(data); synchronized (coreStates) { coreStates.clear(); /* TODO: should we do this? */ for (CoreState coreState : states) { coreStates.put(coreState.getCoreName(), coreState); } } } log.debug("after sync: " + coreStates); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void createEphemeralLiveNode() throws KeeperException, InterruptedException { String nodeName = getNodeName(); String nodePath = ZkStateReader.LIVE_NODES_ZKNODE + "/" + nodeName; log.info("Register node as live in ZooKeeper:" + nodePath); try { boolean nodeDeleted = true; try { /* we attempt a delete in the case of a quick server bounce - if there was not a graceful shutdown, the node may exist until the expiration timeout, so a node won't be created here because it exists, but eventually the node will be removed; so delete in case it exists and create a new node */ zkClient.delete(nodePath, -1, true); } catch (KeeperException.NoNodeException e) { /* fine if there is nothing to delete; TODO: annoying that ZK logs a warning on us */ nodeDeleted = false; } if (nodeDeleted) { log.info("Found a previous node that still exists while trying to register a new live node " + nodePath + " - removing existing node to create another."); } zkClient.makePath(nodePath, CreateMode.EPHEMERAL, true); } catch (KeeperException e) { /* it's okay if the node already exists */ if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public boolean pathExists(String path) throws KeeperException, InterruptedException { return zkClient.exists(path, true); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String readConfigName(String collection) throws KeeperException, InterruptedException, IOException { String configName = null; String path = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; if (log.isInfoEnabled()) { log.info("Load collection config from:" + path); } byte[] data = zkClient.getData(path, null, null, true); if(data != null) { ZkNodeProps props = ZkNodeProps.load(data); configName = props.get(CONFIGNAME_PROP); } if (configName != null && !zkClient.exists(CONFIGS_ZKNODE + "/" + configName, true)) { log.error("Specified config does not exist in ZooKeeper:" + configName); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Specified config does not exist in ZooKeeper:" + configName); } return configName; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private ZkCoreNodeProps getLeaderProps(final String collection, final String slice) throws KeeperException, InterruptedException { int iterCount = 60; while (iterCount-- > 0) try { byte[] data = zkClient.getData( ZkStateReader.getShardLeadersPath(collection, slice), null, null, true); ZkCoreNodeProps leaderProps = new ZkCoreNodeProps( ZkNodeProps.load(data)); return leaderProps; } catch (NoNodeException e) { Thread.sleep(500); } throw new RuntimeException("Could not get leader props"); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void joinElection(CoreDescriptor cd) throws InterruptedException, KeeperException, IOException { String shardId = cd.getCloudDescriptor().getShardId(); Map<String,String> props = new HashMap<String,String>(); /* we only put a subset of props into the leader node */ props.put(ZkStateReader.BASE_URL_PROP, getBaseUrl()); props.put(ZkStateReader.CORE_NAME_PROP, cd.getName()); props.put(ZkStateReader.NODE_NAME_PROP, getNodeName()); final String coreZkNodeName = getNodeName() + "_" + cd.getName(); ZkNodeProps ourProps = new ZkNodeProps(props); String collection = cd.getCloudDescriptor().getCollectionName(); ElectionContext context = new ShardLeaderElectionContext(leaderElector, shardId, collection, coreZkNodeName, ourProps, this, cc); leaderElector.setup(context); electionContexts.put(coreZkNodeName, context); leaderElector.joinElection(context); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private boolean checkRecovery(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores, final boolean isLeader, final CloudDescriptor cloudDesc, final String collection, final String shardZkNodeName, String shardId, ZkNodeProps leaderProps, SolrCore core, CoreContainer cc) throws InterruptedException, KeeperException, IOException, ExecutionException { if (SKIP_AUTO_RECOVERY) { log.warn("Skipping recovery according to sys prop solrcloud.skip.autorecovery"); return false; } boolean doRecovery = true; if (!isLeader) { if (core.isReloaded() && !recoverReloadedCores) { doRecovery = false; } if (doRecovery) { log.info("Core needs to recover:" + core.getName()); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, coreName); return true; } } else { log.info("I am the leader, no recovery necessary"); } return false; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void unregister(String coreName, CloudDescriptor cloudDesc) throws InterruptedException, KeeperException { synchronized (coreStates) { coreStates.remove(coreName); } publishState(); final String zkNodeName = getNodeName() + "_" + coreName; ElectionContext context = electionContexts.remove(zkNodeName); if (context != null) { context.cancelElection(); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadToZK(File dir, String zkPath) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, zkPath); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadConfigDir(File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
void printLayoutToStdOut() throws KeeperException, InterruptedException { zkClient.printLayoutToStdOut(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void createCollectionZkNode(CloudDescriptor cd) throws KeeperException, InterruptedException, IOException { String collection = cd.getCollectionName(); log.info("Check for collection zkNode:" + collection); String collectionPath = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; try { if (!zkClient.exists(collectionPath, true)) { log.info("Creating collection in ZooKeeper:" + collection); SolrParams params = cd.getParams(); try { Map<String,String> collectionProps = new HashMap<String,String>(); /* TODO: if collection.configName isn't set, and there isn't already a conf in zk, just use that? */ String defaultConfigName = System.getProperty(COLLECTION_PARAM_PREFIX + CONFIGNAME_PROP, collection); /* params passed in - currently only done via core admin (create core command) */ if (params != null) { Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String paramName = iter.next(); if (paramName.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(paramName.substring(COLLECTION_PARAM_PREFIX.length()), params.get(paramName)); } } /* if the config name wasn't passed in, use the default */ if (!collectionProps.containsKey(CONFIGNAME_PROP)) getConfName(collection, collectionPath, collectionProps); } else if (System.getProperty("bootstrap_confdir") != null) { /* if we are bootstrapping a collection, default the config for a new collection to the collection we are bootstrapping */ log.info("Setting config for collection:" + collection + " to " + defaultConfigName); Properties sysProps = System.getProperties(); for (String sprop : System.getProperties().stringPropertyNames()) { if (sprop.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(sprop.substring(COLLECTION_PARAM_PREFIX.length()), sysProps.getProperty(sprop)); } } /* if the config name wasn't passed in, use the default */ if (!collectionProps.containsKey(CONFIGNAME_PROP)) collectionProps.put(CONFIGNAME_PROP, defaultConfigName); } else if (Boolean.getBoolean("bootstrap_conf")) { /* the conf name should be the collection name of this core */ collectionProps.put(CONFIGNAME_PROP, cd.getCollectionName()); } else { getConfName(collection, collectionPath, collectionProps); } ZkNodeProps zkProps = new ZkNodeProps(collectionProps); zkClient.makePath(collectionPath, ZkStateReader.toJSON(zkProps), CreateMode.PERSISTENT, null, true); /* ping that there is a new collection */ zkClient.setData(ZkStateReader.COLLECTIONS_ZKNODE, (byte[]) null, true); } catch (KeeperException e) { /* it's okay if the node already exists */ if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } } else { log.info("Collection zkNode exists"); } } catch (KeeperException e) { /* it's okay if another beats us creating the node */ if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void getConfName(String collection, String collectionPath, Map<String,String> collectionProps) throws KeeperException, InterruptedException { /* check for configName */ log.info("Looking for collection configName"); List<String> configNames = null; int retry = 1; int retryLimit = 6; for (; retry < retryLimit; retry++) { if (zkClient.exists(collectionPath, true)) { ZkNodeProps cProps = ZkNodeProps.load(zkClient.getData(collectionPath, null, null, true)); if (cProps.containsKey(CONFIGNAME_PROP)) { break; } } /* if there is only one conf, use that */ try { configNames = zkClient.getChildren(CONFIGS_ZKNODE, null, true); } catch (NoNodeException e) { /* just keep trying */ } if (configNames != null && configNames.size() == 1) { /* no config set named, but there is only one - use it */ log.info("Only one config set found in zk - using it:" + configNames.get(0)); collectionProps.put(CONFIGNAME_PROP, configNames.get(0)); break; } if (configNames != null && configNames.contains(collection)) { log.info("Could not find explicit collection configName, but found config name matching collection name - using that set."); collectionProps.put(CONFIGNAME_PROP, collection); break; } log.info("Could not find collection configName - pausing for 3 seconds and trying again - try: " + retry); Thread.sleep(3000); } if (retry == retryLimit) { log.error("Could not find configName for collection " + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find configName for collection " + collection + " found:" + configNames); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private String doGetShardIdProcess(String coreName, CloudDescriptor descriptor) throws InterruptedException {
  final String shardZkNodeName = getNodeName() + "_" + coreName;
  int retryCount = 120;
  while (retryCount-- > 0) {
    final String shardId = zkStateReader.getCloudState().getShardId(shardZkNodeName);
    if (shardId != null) {
      return shardId;
    }
    try {
      Thread.sleep(500);
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
    }
  }
  throw new SolrException(ErrorCode.SERVER_ERROR, "Could not get shard_id for core: " + coreName);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadToZK(SolrZkClient zkClient, File dir, String zkPath) throws IOException, KeeperException, InterruptedException {
  File[] files = dir.listFiles();
  if (files == null) {
    throw new IllegalArgumentException("Illegal directory: " + dir);
  }
  for (File file : files) {
    if (!file.getName().startsWith(".")) {
      if (!file.isDirectory()) {
        zkClient.makePath(zkPath + "/" + file.getName(), file, false, true);
      } else {
        uploadToZK(zkClient, file, zkPath + "/" + file.getName());
      }
    }
  }
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadConfigDir(SolrZkClient zkClient, File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void bootstrapConf(SolrZkClient zkClient, Config cfg, String solrHome) throws IOException, KeeperException, InterruptedException {
  NodeList nodes = (NodeList) cfg.evaluate("solr/cores/core", XPathConstants.NODESET);
  for (int i = 0; i < nodes.getLength(); i++) {
    Node node = nodes.item(i);
    String rawName = DOMUtil.getAttr(node, "name", null);
    String instanceDir = DOMUtil.getAttr(node, "instanceDir", null);
    File idir = new File(instanceDir);
    if (!idir.isAbsolute()) {
      idir = new File(solrHome, instanceDir);
    }
    String confName = DOMUtil.getAttr(node, "collection", null);
    if (confName == null) {
      confName = rawName;
    }
    ZkController.uploadConfigDir(zkClient, new File(idir, "conf"), confName);
  }
}
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
private void checkIfIamLeader(final int seq, final ElectionContext context, boolean replacement) throws KeeperException, InterruptedException, IOException {
  // get all other numbers...
  final String holdElectionPath = context.electionPath + ELECTION_NODE;
  List<String> seqs = zkClient.getChildren(holdElectionPath, null, true);
  sortSeqs(seqs);
  List<Integer> intSeqs = getSeqs(seqs);
  if (seq <= intSeqs.get(0)) {
    runIamLeaderProcess(context, replacement);
  } else {
    // I am not the leader - watch the node below me
    int i = 1;
    for (; i < intSeqs.size(); i++) {
      int s = intSeqs.get(i);
      if (seq < s) {
        // we found who we come before - watch the guy in front
        break;
      }
    }
    int index = i - 2;
    if (index < 0) {
      log.warn("Our node is no longer in line to be leader");
      return;
    }
    try {
      zkClient.getData(holdElectionPath + "/" + seqs.get(index), new Watcher() {
        @Override
        public void process(WatchedEvent event) {
          // am I the next leader?
          try {
            checkIfIamLeader(seq, context, true);
          } catch (InterruptedException e) {
            // Restore the interrupted status
            Thread.currentThread().interrupt();
            log.warn("", e);
          } catch (IOException e) {
            log.warn("", e);
          } catch (Exception e) {
            log.warn("", e);
          }
        }
      }, null, true);
    } catch (KeeperException.SessionExpiredException e) {
      throw e;
    } catch (KeeperException e) {
      // we couldn't set our watch - the node before us may already be down?
      // we need to check if we are the leader again
      checkIfIamLeader(seq, context, true);
    }
  }
}
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
protected void runIamLeaderProcess(final ElectionContext context, boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { context.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public int joinElection(ElectionContext context) throws KeeperException, InterruptedException, IOException {
  final String shardsElectZkPath = context.electionPath + LeaderElector.ELECTION_NODE;
  long sessionId = zkClient.getSolrZooKeeper().getSessionId();
  String id = sessionId + "-" + context.id;
  String leaderSeqPath = null;
  boolean cont = true;
  int tries = 0;
  while (cont) {
    try {
      leaderSeqPath = zkClient.create(shardsElectZkPath + "/" + id + "-n_", null, CreateMode.EPHEMERAL_SEQUENTIAL, false);
      context.leaderSeqPath = leaderSeqPath;
      cont = false;
    } catch (ConnectionLossException e) {
      // we don't know if we made our node or not...
      List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true);
      boolean foundId = false;
      for (String entry : entries) {
        String nodeId = getNodeId(entry);
        if (id.equals(nodeId)) {
          // we did create our node...
          foundId = true;
          break;
        }
      }
      if (!foundId) {
        throw e;
      }
    } catch (KeeperException.NoNodeException e) {
      // we must have failed in creating the election node - someone else must
      // be working on it, let's try again
      if (tries++ > 9) {
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
      }
      cont = true;
      Thread.sleep(50);
    }
  }
  int seq = getSeq(leaderSeqPath);
  checkIfIamLeader(seq, context, false);
  return seq;
}
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public void setup(final ElectionContext context) throws InterruptedException, KeeperException { String electZKPath = context.electionPath + LeaderElector.ELECTION_NODE; zkCmdExecutor.ensureExists(electZKPath, zkClient); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public Object next() throws IOException, InterruptedException {
  long pos = fis.position();
  synchronized (TransactionLog.this) {
    if (trace) {
      log.trace("Reading log record. pos=" + pos + " currentSize=" + fos.size());
    }
    if (pos >= fos.size()) {
      return null;
    }
    fos.flushBuffer();
  }
  if (pos == 0) {
    readHeader(fis);
    // shouldn't currently happen - header and first record are currently written at the same time
    synchronized (TransactionLog.this) {
      if (fis.position() >= fos.size()) {
        return null;
      }
      pos = fis.position();
    }
  }
  Object o = codec.readVal(fis);
  // skip over record size
  int size = fis.readInt();
  assert size == fis.position() - pos - 4;
  return o;
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
private SolrConfig getSolrConfigFromZk(String zkConfigName, String solrConfigFileName, SolrResourceLoader resourceLoader)
    throws IOException, ParserConfigurationException, SAXException, KeeperException, InterruptedException {
  byte[] config = zkController.getConfigFileData(zkConfigName, solrConfigFileName);
  InputSource is = new InputSource(new ByteArrayInputStream(config));
  is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(solrConfigFileName));
  SolrConfig cfg = solrConfigFileName == null
      ? new SolrConfig(resourceLoader, SolrConfig.DEFAULT_CONF_FILE, is)
      : new SolrConfig(resourceLoader, solrConfigFileName, is);
  return cfg;
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
private IndexSchema getSchemaFromZk(String zkConfigName, String schemaName, SolrConfig config, SolrResourceLoader resourceLoader) throws KeeperException, InterruptedException { byte[] configBytes = zkController.getConfigFileData(zkConfigName, schemaName); InputSource is = new InputSource(new ByteArrayInputStream(configBytes)); is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(schemaName)); IndexSchema schema = new IndexSchema(config, schemaName, is); return schema; }
// in core/src/java/org/apache/solr/util/RTimer.java
public static void main(String[] argv) throws InterruptedException {
  RTimer rt = new RTimer(), subt, st;
  Thread.sleep(100);
  subt = rt.sub("sub1");
  Thread.sleep(50);
  st = subt.sub("sub1.1");
  st.resume();
  Thread.sleep(10);
  st.pause();
  Thread.sleep(50);
  st.resume();
  Thread.sleep(10);
  st.pause();
  subt.stop();
  rt.stop();
  System.out.println(rt.toString());
}
70
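A pattern worth noting across the methods above (doGetShardIdProcess being the clearest case) is the bounded poll-and-sleep loop: re-check shared state, re-assert the interrupt flag if the sleep is interrupted, and fail with a server-error exception once retries run out. The following is a minimal, self-contained sketch of that idiom; the Supplier-based signature, attempt count, and message are illustrative, not Solr API:

    import java.util.function.Supplier;

    public class PollWithRetry {
      // Polls source up to maxAttempts times, sleeping between tries,
      // and restores the interrupt flag if the sleep is interrupted.
      static <T> T poll(Supplier<T> source, int maxAttempts, long sleepMs) {
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
          T value = source.get();
          if (value != null) {
            return value; // the awaited state has appeared
          }
          try {
            Thread.sleep(sleepMs);
          } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // keep cancellation observable
          }
        }
        // retries exhausted: surface a runtime error, in the spirit of
        // SolrException(ErrorCode.SERVER_ERROR, ...) in the listings above
        throw new IllegalStateException("value did not appear after " + maxAttempts + " polls");
      }
    }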
            
// in solrj/src/java/org/apache/zookeeper/SolrZooKeeper.java
catch (InterruptedException e) {}
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); return; }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); return; }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { e.printStackTrace(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException ie) { scheduler.shutdownNow(); Thread.currentThread().interrupt(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (InterruptedException e) { return; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (InterruptedException e) { LOG.debug("Caught InterruptedException while waiting for row. Aborting."); isEnd.set(true); return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn( "Could not persist properties to " + path + " :" + e.getClass(), e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { SolrException.log(LOG,e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); writeError(503, "Could not connect to zookeeper at '" + addr + "'\""); zkClient = null; return; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { // ignore exception on close }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("InterruptedException", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (InterruptedException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (InterruptedException e) { Thread.interrupted(); logger.warn("Shard leader watch triggered but Solr cannot talk to zk."); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); return; }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); return; }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Could not talk to ZK", e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Recovery was interrupted", e); retries = INTERRUPTED; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Interrupted"); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e1) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e1) { Thread.currentThread().interrupt(); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (InterruptedException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (InterruptedException e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); break; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); return false; }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (InterruptedException e) { }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (InterruptedException e) { }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "interrupted waiting for shard update response", e); }
// in core/src/java/org/apache/solr/core/RunExecutableListener.java
catch (InterruptedException e) { SolrException.log(log,e); ret = INVALID_PROCESS_RETURN_CODE; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { searcherExecutor.shutdownNow(); try { if (!searcherExecutor.awaitTermination(30, TimeUnit.SECONDS)) { log.error("Timeout waiting for searchExecutor to terminate"); } } catch (InterruptedException e2) { SolrException.log(log, e2); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e2) { SolrException.log(log, e2); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (InterruptedException e) { log.info(SolrException.toStr(e)); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/util/ConcurrentLRUCache.java
catch (InterruptedException e) {}
// in core/src/java/org/apache/solr/util/ConcurrentLFUCache.java
catch (InterruptedException e) { }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (IOException ioe) { if (exc == null) exc = ioe; try { // Pause 5 msec Thread.sleep(5); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); } }
// in core/src/java/org/apache/solr/util/FileUtils.java
catch (InterruptedException ie) { Thread.currentThread().interrupt(); }
26
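Note the spread in how the catch blocks above treat the interrupt itself: a few swallow it outright (the empty catch in SolrZooKeeper, the bare e.printStackTrace() in ConcurrentUpdateSolrServer), while most re-assert it before logging or rethrowing. A small sketch of why the distinction matters; this is plain JDK code for illustration, not taken from Solr:

    public class InterruptHandling {
      static void swallowed() {
        try {
          Thread.sleep(1000);
        } catch (InterruptedException e) {
          // empty: the interrupted flag is now cleared, so callers can no
          // longer tell that cancellation was requested
        }
      }

      static void restored() {
        try {
          Thread.sleep(1000);
        } catch (InterruptedException e) {
          // re-assert the flag so code further up the stack still sees
          // the interruption - the pattern most handlers above follow
          Thread.currentThread().interrupt();
        }
      }
    }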
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Interrupted"); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "interrupted waiting for shard update response", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
24
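The single most repeated catch-with-throw shape in this application is: restore the interrupted status, optionally log, then wrap the checked InterruptedException in a domain runtime exception. A reduced sketch of that shape; ZkWrapException below is an illustrative stand-in for Solr's ZooKeeperException, not the real class:

    class ZkWrapException extends RuntimeException {
      ZkWrapException(String msg, Throwable cause) { super(msg, cause); }
    }

    class NodePublisher {
      void publishState() {
        try {
          talkToZooKeeper();
        } catch (InterruptedException e) {
          // Restore the interrupted status, then convert the checked
          // exception into an unchecked domain one, as the catches above do
          Thread.currentThread().interrupt();
          throw new ZkWrapException("could not publish node state", e);
        }
      }

      void talkToZooKeeper() throws InterruptedException {
        Thread.sleep(10); // placeholder for a blocking ZooKeeper call
      }
    }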
unknown (Lib) InvalidShapeException 0 0 0 11
            
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); }
11
            
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); }
11
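All eleven InvalidShapeException catches follow one rule: a shape that fails to parse is the client's fault, so the library exception is wrapped as a 400-style error (SolrException with ErrorCode.BAD_REQUEST) rather than a 500. A self-contained sketch of the idea; BadShapeException and RequestError are illustrative stand-ins for the spatial library's exception and SolrException:

    class BadShapeException extends Exception {
      BadShapeException(String msg) { super(msg); }
    }

    class RequestError extends RuntimeException {
      final int httpCode;
      RequestError(int httpCode, Throwable cause) {
        super(cause);
        this.httpCode = httpCode;
      }
    }

    class PointParser {
      double[] parsePoint(String externalVal) {
        try {
          return parse(externalVal);
        } catch (BadShapeException e) {
          // malformed user input -> client error (400), not server error (500)
          throw new RequestError(400, e);
        }
      }

      private double[] parse(String v) throws BadShapeException {
        String[] parts = v.split(",");
        if (parts.length != 2) throw new BadShapeException("Bad spatial pt:" + v);
        return new double[] { Double.parseDouble(parts[0]), Double.parseDouble(parts[1]) };
      }
    }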
unknown (Lib) InvalidTokenOffsetsException 0 0 0 1
            
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
1
            
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
1
unknown (Lib) InvocationTargetException 0 0 0 1
            
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
1
            
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
1
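InvocationTargetException is reflection's wrapper: when a reflectively invoked method throws, the real failure sits in getCause(). The SolrPluginUtils catch above rewraps it in a RuntimeException whose message names the setter and the target class. A hedged, self-contained sketch of that shape (the class and method names here are illustrative):

    import java.lang.reflect.InvocationTargetException;
    import java.lang.reflect.Method;

    class SetterInvoker {
      // Invokes a one-argument setter reflectively; on failure, names the
      // setter and target class while keeping the original cause attached.
      static void callSetter(Object target, String setterName, Object value) {
        try {
          Method m = target.getClass().getMethod(setterName, value.getClass());
          m.invoke(target, value);
        } catch (InvocationTargetException e) {
          // the setter body itself threw; e.getCause() is the real failure
          throw new RuntimeException("Error invoking setter " + setterName
              + " on class : " + target.getClass().getName(), e);
        } catch (ReflectiveOperationException e) {
          // no such method, or it was not accessible
          throw new RuntimeException(e);
        }
      }
    }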
unknown (Lib) KeeperException 0 0 89
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void delete(final String path, final int version, boolean retryOnConnLoss) throws InterruptedException, KeeperException { if (retryOnConnLoss) { zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Stat execute() throws KeeperException, InterruptedException { keeper.delete(path, version); return null; } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
Override public Stat execute() throws KeeperException, InterruptedException { keeper.delete(path, version); return null; }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public Stat exists(final String path, final Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Stat execute() throws KeeperException, InterruptedException { return keeper.exists(path, watcher); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
Override public Stat execute() throws KeeperException, InterruptedException { return keeper.exists(path, watcher); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public Boolean exists(final String path, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Boolean execute() throws KeeperException, InterruptedException { return keeper.exists(path, null) != null; } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
Override public Boolean execute() throws KeeperException, InterruptedException { return keeper.exists(path, null) != null; }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public String create(final String path, final byte data[], final List<ACL> acl, final CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, acl, createMode); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, acl, createMode); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public List<String> getChildren(final String path, final Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public List<String> execute() throws KeeperException, InterruptedException { return keeper.getChildren(path, watcher); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
Override public List<String> execute() throws KeeperException, InterruptedException { return keeper.getChildren(path, watcher); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public byte[] getData(final String path, final Watcher watcher, final Stat stat, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public byte[] execute() throws KeeperException, InterruptedException { return keeper.getData(path, watcher, stat); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
Override public byte[] execute() throws KeeperException, InterruptedException { return keeper.getData(path, watcher, stat); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public Stat setData(final String path, final byte data[], final int version, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Stat execute() throws KeeperException, InterruptedException { return keeper.setData(path, data, version); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
Override public Stat execute() throws KeeperException, InterruptedException { return keeper.setData(path, data, version); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public String create(final String path, final byte[] data, final CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { if (retryOnConnLoss) { return zkCmdExecutor.retryOperation(new ZkOperation() { @Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, createMode); } }); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
Override public String execute() throws KeeperException, InterruptedException { return keeper.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, createMode); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, null, CreateMode.PERSISTENT, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, boolean failOnExists, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, null, CreateMode.PERSISTENT, null, failOnExists, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean failOnExists, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), CreateMode.PERSISTENT, null, failOnExists, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { makePath(path, FileUtils.readFileToString(file).getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, null, createMode, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, data, CreateMode.PERSISTENT, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, CreateMode createMode, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, data, createMode, null, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, CreateMode createMode, Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(path, data, createMode, watcher, true, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String path, byte[] data, CreateMode createMode, Watcher watcher, boolean failOnExists, boolean retryOnConnLoss) throws KeeperException, InterruptedException {
  if (log.isInfoEnabled()) {
    log.info("makePath: " + path);
  }
  boolean retry = true;
  if (path.startsWith("/")) {
    path = path.substring(1, path.length());
  }
  String[] paths = path.split("/");
  StringBuilder sbPath = new StringBuilder();
  for (int i = 0; i < paths.length; i++) {
    byte[] bytes = null;
    String pathPiece = paths[i];
    sbPath.append("/" + pathPiece);
    final String currentPath = sbPath.toString();
    Object exists = exists(currentPath, watcher, retryOnConnLoss);
    if (exists == null || ((i == paths.length - 1) && failOnExists)) {
      CreateMode mode = CreateMode.PERSISTENT;
      if (i == paths.length - 1) {
        mode = createMode;
        bytes = data;
        if (!retryOnConnLoss) retry = false;
      }
      try {
        if (retry) {
          final CreateMode finalMode = mode;
          final byte[] finalBytes = bytes;
          zkCmdExecutor.retryOperation(new ZkOperation() {
            @Override
            public Object execute() throws KeeperException, InterruptedException {
              keeper.create(currentPath, finalBytes, ZooDefs.Ids.OPEN_ACL_UNSAFE, finalMode);
              return null;
            }
          });
        } else {
          keeper.create(currentPath, bytes, ZooDefs.Ids.OPEN_ACL_UNSAFE, mode);
        }
      } catch (NodeExistsException e) {
        if (!failOnExists) {
          // TODO: version ? for now, don't worry about race
          setData(currentPath, data, -1, retryOnConnLoss);
          // set new watch
          exists(currentPath, watcher, retryOnConnLoss);
          return;
        }
        // ignore unless it's the last node in the path
        if (i == paths.length - 1) {
          throw e;
        }
      }
      if (i == paths.length - 1) {
        // set new watch
        exists(currentPath, watcher, retryOnConnLoss);
      }
    } else if (i == paths.length - 1) {
      // TODO: version ? for now, don't worry about race
      setData(currentPath, data, -1, retryOnConnLoss);
      // set new watch
      exists(currentPath, watcher, retryOnConnLoss);
    }
  }
}
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
Override public Object execute() throws KeeperException, InterruptedException { keeper.create(currentPath, finalBytes, ZooDefs.Ids.OPEN_ACL_UNSAFE, finalMode); return null; }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void makePath(String zkPath, CreateMode createMode, Watcher watcher, boolean retryOnConnLoss) throws KeeperException, InterruptedException { makePath(zkPath, null, createMode, watcher, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void setData(String path, byte[] data, boolean retryOnConnLoss) throws KeeperException, InterruptedException { setData(path, data, -1, retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void setData(String path, File file, boolean retryOnConnLoss) throws IOException, KeeperException, InterruptedException { if (log.isInfoEnabled()) { log.info("Write to ZooKeepeer " + file.getAbsolutePath() + " to " + path); } String data = FileUtils.readFileToString(file); setData(path, data.getBytes("UTF-8"), retryOnConnLoss); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void printLayout(String path, int indent, StringBuilder string) throws KeeperException, InterruptedException {
  byte[] data = getData(path, null, null, true);
  List<String> children = getChildren(path, null, true);
  StringBuilder dent = new StringBuilder();
  for (int i = 0; i < indent; i++) {
    dent.append(" ");
  }
  string.append(dent + path + " (" + children.size() + ")" + NEWL);
  if (data != null) {
    try {
      String dataString = new String(data, "UTF-8");
      if ((!path.endsWith(".txt") && !path.endsWith(".xml")) || path.endsWith(ZkStateReader.CLUSTER_STATE)) {
        if (path.endsWith(".xml")) {
          // this is the cluster state in xml format - lets pretty print
          dataString = prettyPrint(dataString);
        }
        string.append(dent + "DATA:\n" + dent + " " + dataString.replaceAll("\n", "\n" + dent + " ") + NEWL);
      } else {
        string.append(dent + "DATA: ...supressed..." + NEWL);
      }
    } catch (UnsupportedEncodingException e) {
      // can't happen - UTF-8
      throw new RuntimeException(e);
    }
  }
  for (String child : children) {
    if (!child.equals("quota")) {
      try {
        printLayout(path + (path.equals("/") ? "" : "/") + child, indent + 1, string);
      } catch (NoNodeException e) {
        // must have gone away
      }
    }
  }
}
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void printLayoutToStdOut() throws KeeperException, InterruptedException { StringBuilder sb = new StringBuilder(); printLayout("/", 0, sb); System.out.println(sb.toString()); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void updateCloudState(boolean immediate) throws KeeperException, InterruptedException { updateCloudState(immediate, false); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void updateLiveNodes() throws KeeperException, InterruptedException { updateCloudState(true, true); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public synchronized void createClusterStateWatchersAndUpdate() throws KeeperException, InterruptedException {
  // We need to fetch the current cluster state and the set of live nodes
  synchronized (getUpdateLock()) {
    cmdExecutor.ensureExists(CLUSTER_STATE, zkClient);
    log.info("Updating cluster state from ZooKeeper... ");
    zkClient.exists(CLUSTER_STATE, new Watcher() {
      @Override
      public void process(WatchedEvent event) {
        log.info("A cluster state change has occurred");
        try {
          // delayed approach
          // ZkStateReader.this.updateCloudState(false, false);
          synchronized (ZkStateReader.this.getUpdateLock()) {
            // remake watch
            final Watcher thisWatch = this;
            byte[] data = zkClient.getData(CLUSTER_STATE, thisWatch, null, true);
            CloudState clusterState = CloudState.load(data, ZkStateReader.this.cloudState.getLiveNodes());
            // update volatile
            cloudState = clusterState;
          }
        } catch (KeeperException e) {
          if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) {
            log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
            return;
          }
          log.error("", e);
          throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
        } catch (InterruptedException e) {
          // Restore the interrupted status
          Thread.currentThread().interrupt();
          log.warn("", e);
          return;
        }
      }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
private synchronized void updateCloudState(boolean immediate, final boolean onlyLiveNodes) throws KeeperException, InterruptedException {
  log.info("Manual update of cluster state initiated");
  // build immutable CloudInfo
  if (immediate) {
    CloudState clusterState;
    synchronized (getUpdateLock()) {
      List<String> liveNodes = zkClient.getChildren(LIVE_NODES_ZKNODE, null, true);
      Set<String> liveNodesSet = new HashSet<String>();
      liveNodesSet.addAll(liveNodes);
      if (!onlyLiveNodes) {
        log.info("Updating cloud state from ZooKeeper... ");
        clusterState = CloudState.load(zkClient, liveNodesSet);
      } else {
        log.info("Updating live nodes from ZooKeeper... ");
        clusterState = new CloudState(liveNodesSet, ZkStateReader.this.cloudState.getCollectionStates());
      }
    }
    this.cloudState = clusterState;
  }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public String getLeaderUrl(String collection, String shard) throws InterruptedException, KeeperException { return getLeaderUrl(collection, shard, 1000); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public String getLeaderUrl(String collection, String shard, int timeout) throws InterruptedException, KeeperException { ZkCoreNodeProps props = new ZkCoreNodeProps(getLeaderProps(collection, shard, timeout)); return props.getCoreUrl(); }
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
public static CloudState load(SolrZkClient zkClient, Set<String> liveNodes) throws KeeperException, InterruptedException { byte[] state = zkClient.getData(ZkStateReader.CLUSTER_STATE, null, null, true); return load(state, liveNodes); }
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
public static CloudState load(byte[] bytes, Set<String> liveNodes) throws KeeperException, InterruptedException {
  if (bytes == null || bytes.length == 0) {
    return new CloudState(liveNodes, Collections.<String, Map<String,Slice>>emptyMap());
  }
  LinkedHashMap<String, Object> stateMap = (LinkedHashMap<String, Object>) ZkStateReader.fromJSON(bytes);
  HashMap<String, Map<String, Slice>> state = new HashMap<String, Map<String,Slice>>();
  for (String collectionName : stateMap.keySet()) {
    Map<String, Object> collection = (Map<String, Object>) stateMap.get(collectionName);
    Map<String, Slice> slices = new LinkedHashMap<String,Slice>();
    for (String sliceName : collection.keySet()) {
      Map<String, Map<String, String>> sliceMap = (Map<String, Map<String, String>>) collection.get(sliceName);
      Map<String, ZkNodeProps> shards = new LinkedHashMap<String,ZkNodeProps>();
      for (String shardName : sliceMap.keySet()) {
        shards.put(shardName, new ZkNodeProps(sliceMap.get(shardName)));
      }
      Slice slice = new Slice(sliceName, shards);
      slices.put(sliceName, slice);
    }
    state.put(collectionName, slices);
  }
  return new CloudState(liveNodes, state);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
public void ensureExists(String path, final SolrZkClient zkClient) throws KeeperException, InterruptedException { ensureExists(path, null, CreateMode.PERSISTENT, zkClient); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
public void ensureExists(final String path, final byte[] data, CreateMode createMode, final SolrZkClient zkClient) throws KeeperException, InterruptedException {
  if (zkClient.exists(path, true)) {
    return;
  }
  try {
    zkClient.makePath(path, data, true);
  } catch (NodeExistsException e) {
    // it's okay if another beats us creating the node
  }
}
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, KeeperException, InterruptedException {
  CoreContainer coreContainer = req.getCore().getCoreDescriptor().getCoreContainer();
  if (coreContainer.isZooKeeperAware()) {
    showFromZooKeeper(req, rsp, coreContainer);
  } else {
    showFromFileSystem(req, rsp);
  }
}
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromZooKeeper(SolrQueryRequest req, SolrQueryResponse rsp, CoreContainer coreContainer) throws KeeperException, InterruptedException, UnsupportedEncodingException {
  String adminFile = null;
  SolrCore core = req.getCore();
  SolrZkClient zkClient = coreContainer.getZkController().getZkClient();
  final ZkSolrResourceLoader loader = (ZkSolrResourceLoader) core.getResourceLoader();
  String confPath = loader.getCollectionZkPath();
  String fname = req.getParams().get("file", null);
  if (fname == null) {
    adminFile = confPath;
  } else {
    fname = fname.replace('\\', '/'); // normalize slashes
    if (hiddenFiles.contains(fname.toUpperCase(Locale.ENGLISH))) {
      throw new SolrException(ErrorCode.FORBIDDEN, "Can not access: " + fname);
    }
    if (fname.indexOf("..") >= 0) {
      throw new SolrException(ErrorCode.FORBIDDEN, "Invalid path: " + fname);
    }
    adminFile = confPath + "/" + fname;
  }
  // Make sure the file exists, is readable and is not a hidden file
  if (!zkClient.exists(adminFile, true)) {
    throw new SolrException(ErrorCode.BAD_REQUEST, "Can not find: " + adminFile);
  }
  // Show a directory listing
  List<String> children = zkClient.getChildren(adminFile, null, true);
  if (children.size() > 0) {
    NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>();
    for (String f : children) {
      if (hiddenFiles.contains(f.toUpperCase(Locale.ENGLISH))) {
        continue; // don't show 'hidden' files
      }
      if (f.startsWith(".")) {
        continue; // skip hidden system files...
      }
      SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>();
      files.add(f, fileInfo);
      List<String> fchildren = zkClient.getChildren(adminFile, null, true);
      if (fchildren.size() > 0) {
        fileInfo.add("directory", true);
      } else {
        // TODO? content type
        fileInfo.add("size", f.length());
      }
      // TODO: ?
      // fileInfo.add( "modified", new Date( f.lastModified() ) );
    }
    rsp.add("files", files);
  } else {
    // Include the file contents
    // The file logic depends on RawResponseWriter, so force its use.
    ModifiableSolrParams params = new ModifiableSolrParams(req.getParams());
    params.set(CommonParams.WT, "raw");
    req.setParams(params);
    ContentStreamBase content = new ContentStreamBase.StringStream(
        new String(zkClient.getData(adminFile, null, null, true), "UTF-8"));
    content.setContentType(req.getParams().get(USE_CONTENT_TYPE));
    rsp.add(RawResponseWriter.CONTENT, content);
  }
  rsp.setHttpCaching(false);
}
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
private void processLeaderChange() throws KeeperException, InterruptedException {
  if (closed) return;
  try {
    byte[] data = zkClient.getData(path, this, null, true);
    if (data != null) {
      final ZkCoreNodeProps leaderProps = new ZkCoreNodeProps(ZkNodeProps.load(data));
      listener.announceLeader(collection, shard, leaderProps);
    }
  } catch (KeeperException ke) {
    // check if we lost connection or the node was gone
    if (ke.code() != Code.CONNECTIONLOSS && ke.code() != Code.SESSIONEXPIRED && ke.code() != Code.NONODE) {
      throw ke;
    }
  }
}
// in core/src/java/org/apache/solr/cloud/Overseer.java
private CloudState updateState(CloudState state, String nodeName, CoreState coreState) throws KeeperException, InterruptedException {
  String collection = coreState.getCollectionName();
  String zkCoreNodeName = coreState.getCoreNodeName();
  // collection does not yet exist, create placeholders if num shards is specified
  if (!state.getCollections().contains(coreState.getCollectionName()) && coreState.getNumShards() != null) {
    state = createCollection(state, collection, coreState.getNumShards());
  }
  // use the provided non null shardId
  String shardId = coreState.getProperties().get(ZkStateReader.SHARD_ID_PROP);
  if (shardId == null) {
    // use shardId from CloudState
    shardId = getAssignedId(state, nodeName, coreState);
  }
  if (shardId == null) {
    // request new shardId
    shardId = AssignShard.assignShard(collection, state, coreState.getNumShards());
  }
  Map<String,String> props = new HashMap<String,String>();
  Map<String,String> coreProps = new HashMap<String,String>(coreState.getProperties().size());
  coreProps.putAll(coreState.getProperties());
  // we don't put num_shards in the clusterstate
  coreProps.remove("num_shards");
  for (Entry<String,String> entry : coreProps.entrySet()) {
    props.put(entry.getKey(), entry.getValue());
  }
  ZkNodeProps zkProps = new ZkNodeProps(props);
  Slice slice = state.getSlice(collection, shardId);
  Map<String,ZkNodeProps> shardProps;
  if (slice == null) {
    shardProps = new HashMap<String,ZkNodeProps>();
  } else {
    shardProps = state.getSlice(collection, shardId).getShardsCopy();
  }
  shardProps.put(zkCoreNodeName, zkProps);
  slice = new Slice(shardId, shardProps);
  CloudState newCloudState = updateSlice(state, collection, slice);
  return newCloudState;
}
// in core/src/java/org/apache/solr/cloud/Overseer.java
public synchronized void createWatches() throws KeeperException, InterruptedException { addCollectionsWatch(); addLiveNodesWatch(); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addCollectionsWatch() throws KeeperException, InterruptedException {
  zkCmdExecutor.ensureExists(ZkStateReader.COLLECTIONS_ZKNODE, zkClient);
  List<String> collections = zkClient.getChildren(ZkStateReader.COLLECTIONS_ZKNODE, new Watcher() {
    @Override
    public void process(WatchedEvent event) {
      try {
        List<String> collections = zkClient.getChildren(ZkStateReader.COLLECTIONS_ZKNODE, this, true);
        collectionsChanged(collections);
      } catch (KeeperException e) {
        if (e.code() == Code.CONNECTIONLOSS || e.code() == Code.SESSIONEXPIRED) {
          log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
          return;
        }
      } catch (InterruptedException e) {
        // Restore the interrupted status
        Thread.currentThread().interrupt();
        log.warn("", e);
      }
    }
  }, true);
  collectionsChanged(collections);
}
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void collectionsChanged(Collection<String> collections) throws KeeperException, InterruptedException { synchronized (shardLeaderWatches) { for(String collection: collections) { if(!shardLeaderWatches.containsKey(collection)) { shardLeaderWatches.put(collection, new HashMap<String,ShardLeaderWatcher>()); addShardLeadersWatch(collection); } } //XXX not handling delete collections.. } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addShardLeadersWatch(final String collection) throws KeeperException, InterruptedException { zkCmdExecutor.ensureExists(ZkStateReader.getShardLeadersPath(collection, null), zkClient); final List<String> leaderNodes = zkClient.getChildren( ZkStateReader.getShardLeadersPath(collection, null), new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> leaderNodes = zkClient.getChildren( ZkStateReader.getShardLeadersPath(collection, null), this, true); processLeaderNodesChanged(collection, leaderNodes); } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); } } }, true); processLeaderNodesChanged(collection, leaderNodes); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addLiveNodesWatch() throws KeeperException, InterruptedException { List<String> liveNodes = zkCmdExecutor.retryOperation(new ZkOperation() { @Override public Object execute() throws KeeperException, InterruptedException { return zkClient.getChildren( ZkStateReader.LIVE_NODES_ZKNODE, new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> liveNodes = zkClient.getChildren( ZkStateReader.LIVE_NODES_ZKNODE, this, true); synchronized (nodeStateWatches) { processLiveNodesChanged(nodeStateWatches.keySet(), liveNodes); } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); } } }, true); } }); processLiveNodesChanged(Collections.<String>emptySet(), liveNodes); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public Object execute() throws KeeperException, InterruptedException { return zkClient.getChildren( ZkStateReader.LIVE_NODES_ZKNODE, new Watcher() { @Override public void process(WatchedEvent event) { try { List<String> liveNodes = zkClient.getChildren( ZkStateReader.LIVE_NODES_ZKNODE, this, true); synchronized (nodeStateWatches) { processLiveNodesChanged(nodeStateWatches.keySet(), liveNodes); } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); } } }, true); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void processLiveNodesChanged(Collection<String> oldLiveNodes, Collection<String> liveNodes) throws InterruptedException, KeeperException { Set<String> upNodes = complement(liveNodes, oldLiveNodes); if (upNodes.size() > 0) { addNodeStateWatches(upNodes); } Set<String> downNodes = complement(oldLiveNodes, liveNodes); for(String node: downNodes) { synchronized (nodeStateWatches) { NodeStateWatcher watcher = nodeStateWatches.remove(node); } log.debug("Removed NodeStateWatcher for node:" + node); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
private void addNodeStateWatches(Set<String> nodeNames) throws InterruptedException, KeeperException { for (String nodeName : nodeNames) { final String path = STATES_NODE + "/" + nodeName; synchronized (nodeStateWatches) { if (!nodeStateWatches.containsKey(nodeName)) { zkCmdExecutor.ensureExists(path, zkClient); nodeStateWatches.put(nodeName, new NodeStateWatcher(zkClient, nodeName, path, this)); log.debug("Added NodeStateWatcher for node " + nodeName); } else { log.debug("watch already added"); } } } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public void coreChanged(final String nodeName, final Set<CoreState> states) throws KeeperException, InterruptedException { log.info("Core change pooled: " + nodeName + " states:" + states); for (CoreState state : states) { fifo.add(new CloudStateUpdateRequest(Op.StateChange, nodeName, state)); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public void coreDeleted(String nodeName, Collection<CoreState> states) throws KeeperException, InterruptedException { for (CoreState state : states) { fifo.add(new CloudStateUpdateRequest(Op.CoreDeleted, state.getCollectionName(), state.getCoreNodeName())); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
public static void createClientNodes(SolrZkClient zkClient, String nodeName) throws KeeperException, InterruptedException { final String node = STATES_NODE + "/" + nodeName; if (log.isInfoEnabled()) { log.info("creating node:" + node); } ZkCmdExecutor zkCmdExecutor = new ZkCmdExecutor(); zkCmdExecutor.ensureExists(node, zkClient); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
public void cancelElection() throws InterruptedException, KeeperException { zkClient.delete(leaderSeqPath, -1, true); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { try { zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); } }
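The catch above encodes a common leader-election recovery: an ephemeral node left by a dead session may linger until its session times out, so the create is retried once after an explicit delete. A compact sketch of the same move on the raw client (claim is a hypothetical name):

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    public class LeaderNode {
      // Creates an ephemeral leader node; if a stale ephemeral from a previous
      // session is still present, delete it and retry the create once.
      static void claim(ZooKeeper zk, String leaderPath, byte[] props)
          throws KeeperException, InterruptedException {
        try {
          zk.create(leaderPath, props, ZooDefs.Ids.OPEN_ACL_UNSAFE,
              CreateMode.EPHEMERAL);
        } catch (KeeperException.NodeExistsException e) {
          zk.delete(leaderPath, -1); // -1 ignores the node version
          zk.create(leaderPath, props, ZooDefs.Ids.OPEN_ACL_UNSAFE,
              CreateMode.EPHEMERAL);
        }
      }
    }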
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { if (cc != null) { String coreName = leaderProps.get(ZkStateReader.CORE_NAME_PROP); SolrCore core = null; try { // the first time we are run, we will get a startupCore - after // we will get null and must use cc.getCore core = cc.getCore(coreName); if (core == null) { cancelElection(); throw new SolrException(ErrorCode.SERVER_ERROR, "Fatal Error, SolrCore not found:" + coreName + " in " + cc.getCoreNames()); } // should I be leader? if (weAreReplacement && !shouldIBeLeader(leaderProps)) { // System.out.println("there is a better leader candidate it appears"); rejoinLeaderElection(leaderSeqPath, core); return; } if (weAreReplacement) { if (zkClient.exists(leaderPath, true)) { zkClient.delete(leaderPath, -1, true); } // System.out.println("I may be the new Leader:" + leaderPath // + " - I need to try and sync"); boolean success = syncStrategy.sync(zkController, core, leaderProps); if (!success && anyoneElseActive()) { rejoinLeaderElection(leaderSeqPath, core); return; } } // If I am going to be the leader I have to be active // System.out.println("I am leader go active"); core.getUpdateHandler().getSolrCoreState().cancelRecovery(); zkController.publish(core.getCoreDescriptor(), ZkStateReader.ACTIVE); } finally { if (core != null ) { core.close(); } } } super.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
private void rejoinLeaderElection(String leaderSeqPath, SolrCore core) throws InterruptedException, KeeperException, IOException { // remove our ephemeral and re join the election // System.out.println("sync failed, delete our election node:" // + leaderSeqPath); zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); cancelElection(); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, core.getName()); leaderElector.joinElection(this); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException { final String id = leaderSeqPath.substring(leaderSeqPath.lastIndexOf("/")+1); ZkNodeProps myProps = new ZkNodeProps("id", id); try { zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); } catch (NodeExistsException e) { // if a previous leader ephemeral still exists for some reason, try and // remove it zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); } new Overseer(zkClient, stateReader, id); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
private void processStateChange() throws KeeperException, InterruptedException { byte[] data = zkClient.getData(path, this, null, true); if (data != null) { CoreState[] states = CoreState.load(data); List<CoreState> stateList = Arrays.asList(states); HashSet<CoreState> modifiedCores = new HashSet<CoreState>(); modifiedCores.addAll(stateList); modifiedCores.removeAll(currentState); HashSet<CoreState> newState = new HashSet<CoreState>(); newState.addAll(stateList); HashMap<String, CoreState> lookup = new HashMap<String, CoreState>(); for(CoreState state: states) { lookup.put(state.getCoreName(), state); } //check for status change for(CoreState state: currentState) { if(lookup.containsKey(state.getCoreName())) { if(!state.getProperties().equals(lookup.get(state.getCoreName()).getProperties())) { modifiedCores.add(lookup.get(state.getCoreName())); } } } HashMap<String, CoreState> deletedCores = new HashMap<String, CoreState>(); for(CoreState state: currentState) { deletedCores.put(state.getCoreNodeName(), state); } for(CoreState state: stateList) { deletedCores.remove(state.getCoreNodeName()); } if (deletedCores.size() > 0) { listener.coreDeleted(nodeName, deletedCores.values()); } currentState = Collections.unmodifiableSet(newState); if (modifiedCores.size() > 0) { try { listener.coreChanged(nodeName, Collections.unmodifiableSet(modifiedCores)); } catch (KeeperException e) { log.warn("Could not talk to ZK", e); } catch (InterruptedException e) { Thread.currentThread().interrupt(); log.warn("Could not talk to ZK", e); } } } else { // ignore null state } }
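Note the InterruptedException branch above: catching the exception clears the thread's interrupted flag, so the handler restores the flag before carrying on. The same two-line idiom appears in nearly every watcher in this section; a self-contained sketch (putQuietly is a hypothetical name):

    import java.util.concurrent.BlockingQueue;

    public class InterruptIdiom {
      // Catching InterruptedException clears the interrupted flag; a handler
      // that swallows the exception should set the flag back so that outer
      // loops and executors can still observe the interruption.
      static <T> void putQuietly(BlockingQueue<T> queue, T item) {
        try {
          queue.put(item);
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt(); // restore the interrupted status
        }
      }
    }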
// in core/src/java/org/apache/solr/cloud/ZkController.java
public boolean configFileExists(String collection, String fileName) throws KeeperException, InterruptedException { Stat stat = zkClient.exists(CONFIGS_ZKNODE + "/" + collection + "/" + fileName, null, true); return stat != null; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public byte[] getConfigFileData(String zkConfigName, String fileName) throws KeeperException, InterruptedException { String zkPath = CONFIGS_ZKNODE + "/" + zkConfigName + "/" + fileName; byte[] bytes = zkClient.getData(zkPath, null, null, true); if (bytes == null) { log.error("Config file contains no data:" + zkPath); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Config file contains no data:" + zkPath); } return bytes; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void syncNodeState() throws KeeperException, InterruptedException { log.debug("Syncing internal state with zk. Current: " + coreStates); final String path = Overseer.STATES_NODE + "/" + getNodeName(); final byte[] data = zkClient.getData(path, null, null, true); if (data != null) { CoreState[] states = CoreState.load(data); synchronized (coreStates) { coreStates.clear(); // TODO: should we do this? for(CoreState coreState: states) { coreStates.put(coreState.getCoreName(), coreState); } } } log.debug("after sync: " + coreStates); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void createEphemeralLiveNode() throws KeeperException, InterruptedException { String nodeName = getNodeName(); String nodePath = ZkStateReader.LIVE_NODES_ZKNODE + "/" + nodeName; log.info("Register node as live in ZooKeeper:" + nodePath); try { boolean nodeDeleted = true; try { // we attempt a delete in the case of a quick server bounce - // if there was not a graceful shutdown, the node may exist // until expiration timeout - so a node won't be created here because // it exists, but eventually the node will be removed. So delete // in case it exists and create a new node. zkClient.delete(nodePath, -1, true); } catch (KeeperException.NoNodeException e) { // fine if there is nothing to delete // TODO: annoying that ZK logs a warning on us nodeDeleted = false; } if (nodeDeleted) { log .info("Found a previous node that still exists while trying to register a new live node " + nodePath + " - removing existing node to create another."); } zkClient.makePath(nodePath, CreateMode.EPHEMERAL, true); } catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } }
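createEphemeralLiveNode combines two tolerant operations: a delete that ignores NoNodeException and a create that ignores NODEEXISTS. The latter is also the essence of the zkCmdExecutor.ensureExists calls used elsewhere in this section; a sketch of it on the raw client (ensureExists here is a hypothetical stand-in, not Solr's ZkCmdExecutor):

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.KeeperException.Code;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    public class EnsureNode {
      // Create-if-absent: losing the race to another client is fine, so only
      // NODEEXISTS is swallowed; every other failure still propagates.
      static void ensureExists(ZooKeeper zk, String path)
          throws KeeperException, InterruptedException {
        try {
          zk.create(path, null, ZooDefs.Ids.OPEN_ACL_UNSAFE,
              CreateMode.PERSISTENT);
        } catch (KeeperException e) {
          if (e.code() != Code.NODEEXISTS) {
            throw e;
          }
        }
      }
    }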
// in core/src/java/org/apache/solr/cloud/ZkController.java
public boolean pathExists(String path) throws KeeperException, InterruptedException { return zkClient.exists(path, true); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String readConfigName(String collection) throws KeeperException, InterruptedException, IOException { String configName = null; String path = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; if (log.isInfoEnabled()) { log.info("Load collection config from:" + path); } byte[] data = zkClient.getData(path, null, null, true); if(data != null) { ZkNodeProps props = ZkNodeProps.load(data); configName = props.get(CONFIGNAME_PROP); } if (configName != null && !zkClient.exists(CONFIGS_ZKNODE + "/" + configName, true)) { log.error("Specified config does not exist in ZooKeeper:" + configName); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Specified config does not exist in ZooKeeper:" + configName); } return configName; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private ZkCoreNodeProps getLeaderProps(final String collection, final String slice) throws KeeperException, InterruptedException { int iterCount = 60; while (iterCount-- > 0) try { byte[] data = zkClient.getData( ZkStateReader.getShardLeadersPath(collection, slice), null, null, true); ZkCoreNodeProps leaderProps = new ZkCoreNodeProps( ZkNodeProps.load(data)); return leaderProps; } catch (NoNodeException e) { Thread.sleep(500); } throw new RuntimeException("Could not get leader props"); }
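getLeaderProps shows the bounded-poll variant of retry: NoNodeException is treated as "not there yet", with a fixed sleep, and an exhausted budget (60 tries at 500 ms) ends in an unchecked failure. A distilled sketch of the shape (waitForNode is a hypothetical name):

    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.ZooKeeper;

    public class BoundedRetry {
      // Polls for a node that is expected to appear soon (e.g. a leader
      // registering itself); gives up with an unchecked exception once the
      // retry budget is spent, as getLeaderProps does above.
      static byte[] waitForNode(ZooKeeper zk, String path, int attempts,
          long sleepMs) throws KeeperException, InterruptedException {
        while (attempts-- > 0) {
          try {
            return zk.getData(path, false, null);
          } catch (KeeperException.NoNodeException e) {
            Thread.sleep(sleepMs); // not there yet: wait and retry
          }
        }
        throw new RuntimeException("Node never appeared: " + path);
      }
    }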
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void joinElection(CoreDescriptor cd) throws InterruptedException, KeeperException, IOException { String shardId = cd.getCloudDescriptor().getShardId(); Map<String,String> props = new HashMap<String,String>(); // we only put a subset of props into the leader node props.put(ZkStateReader.BASE_URL_PROP, getBaseUrl()); props.put(ZkStateReader.CORE_NAME_PROP, cd.getName()); props.put(ZkStateReader.NODE_NAME_PROP, getNodeName()); final String coreZkNodeName = getNodeName() + "_" + cd.getName(); ZkNodeProps ourProps = new ZkNodeProps(props); String collection = cd.getCloudDescriptor() .getCollectionName(); ElectionContext context = new ShardLeaderElectionContext(leaderElector, shardId, collection, coreZkNodeName, ourProps, this, cc); leaderElector.setup(context); electionContexts.put(coreZkNodeName, context); leaderElector.joinElection(context); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private boolean checkRecovery(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores, final boolean isLeader, final CloudDescriptor cloudDesc, final String collection, final String shardZkNodeName, String shardId, ZkNodeProps leaderProps, SolrCore core, CoreContainer cc) throws InterruptedException, KeeperException, IOException, ExecutionException { if (SKIP_AUTO_RECOVERY) { log.warn("Skipping recovery according to sys prop solrcloud.skip.autorecovery"); return false; } boolean doRecovery = true; if (!isLeader) { if (core.isReloaded() && !recoverReloadedCores) { doRecovery = false; } if (doRecovery) { log.info("Core needs to recover:" + core.getName()); core.getUpdateHandler().getSolrCoreState().doRecovery(cc, coreName); return true; } } else { log.info("I am the leader, no recovery necessary"); } return false; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void unregister(String coreName, CloudDescriptor cloudDesc) throws InterruptedException, KeeperException { synchronized (coreStates) { coreStates.remove(coreName); } publishState(); final String zkNodeName = getNodeName() + "_" + coreName; ElectionContext context = electionContexts.remove(zkNodeName); if (context != null) { context.cancelElection(); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadToZK(File dir, String zkPath) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, zkPath); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void uploadConfigDir(File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
void printLayoutToStdOut() throws KeeperException, InterruptedException { zkClient.printLayoutToStdOut(); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void createCollectionZkNode(CloudDescriptor cd) throws KeeperException, InterruptedException, IOException { String collection = cd.getCollectionName(); log.info("Check for collection zkNode:" + collection); String collectionPath = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; try { if(!zkClient.exists(collectionPath, true)) { log.info("Creating collection in ZooKeeper:" + collection); SolrParams params = cd.getParams(); try { Map<String,String> collectionProps = new HashMap<String,String>(); // TODO: if collection.configName isn't set, and there isn't already a conf in zk, just use that? String defaultConfigName = System.getProperty(COLLECTION_PARAM_PREFIX+CONFIGNAME_PROP, collection); // params passed in - currently only done via core admin (create core commmand). if (params != null) { Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String paramName = iter.next(); if (paramName.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(paramName.substring(COLLECTION_PARAM_PREFIX.length()), params.get(paramName)); } } // if the config name wasn't passed in, use the default if (!collectionProps.containsKey(CONFIGNAME_PROP)) getConfName(collection, collectionPath, collectionProps); } else if(System.getProperty("bootstrap_confdir") != null) { // if we are bootstrapping a collection, default the config for // a new collection to the collection we are bootstrapping log.info("Setting config for collection:" + collection + " to " + defaultConfigName); Properties sysProps = System.getProperties(); for (String sprop : System.getProperties().stringPropertyNames()) { if (sprop.startsWith(COLLECTION_PARAM_PREFIX)) { collectionProps.put(sprop.substring(COLLECTION_PARAM_PREFIX.length()), sysProps.getProperty(sprop)); } } // if the config name wasn't passed in, use the default if (!collectionProps.containsKey(CONFIGNAME_PROP)) collectionProps.put(CONFIGNAME_PROP, defaultConfigName); } else if (Boolean.getBoolean("bootstrap_conf")) { // the conf name should should be the collection name of this core collectionProps.put(CONFIGNAME_PROP, cd.getCollectionName()); } else { getConfName(collection, collectionPath, collectionProps); } ZkNodeProps zkProps = new ZkNodeProps(collectionProps); zkClient.makePath(collectionPath, ZkStateReader.toJSON(zkProps), CreateMode.PERSISTENT, null, true); // ping that there is a new collection zkClient.setData(ZkStateReader.COLLECTIONS_ZKNODE, (byte[])null, true); } catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } } else { log.info("Collection zkNode exists"); } } catch (KeeperException e) { // its okay if another beats us creating the node if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void getConfName(String collection, String collectionPath, Map<String,String> collectionProps) throws KeeperException, InterruptedException { // check for configName log.info("Looking for collection configName"); List<String> configNames = null; int retry = 1; int retryLimt = 6; for (; retry < retryLimt; retry++) { if (zkClient.exists(collectionPath, true)) { ZkNodeProps cProps = ZkNodeProps.load(zkClient.getData(collectionPath, null, null, true)); if (cProps.containsKey(CONFIGNAME_PROP)) { break; } } // if there is only one conf, use that try { configNames = zkClient.getChildren(CONFIGS_ZKNODE, null, true); } catch (NoNodeException e) { // just keep trying } if (configNames != null && configNames.size() == 1) { // no config set named, but there is only 1 - use it log.info("Only one config set found in zk - using it:" + configNames.get(0)); collectionProps.put(CONFIGNAME_PROP, configNames.get(0)); break; } if (configNames != null && configNames.contains(collection)) { log.info("Could not find explicit collection configName, but found config name matching collection name - using that set."); collectionProps.put(CONFIGNAME_PROP, collection); break; } log.info("Could not find collection configName - pausing for 3 seconds and trying again - try: " + retry); Thread.sleep(3000); } if (retry == retryLimt) { log.error("Could not find configName for collection " + collection); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "Could not find configName for collection " + collection + " found:" + configNames); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadToZK(SolrZkClient zkClient, File dir, String zkPath) throws IOException, KeeperException, InterruptedException { File[] files = dir.listFiles(); if (files == null) { throw new IllegalArgumentException("Illegal directory: " + dir); } for(File file : files) { if (!file.getName().startsWith(".")) { if (!file.isDirectory()) { zkClient.makePath(zkPath + "/" + file.getName(), file, false, true); } else { uploadToZK(zkClient, file, zkPath + "/" + file.getName()); } } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void uploadConfigDir(SolrZkClient zkClient, File dir, String configName) throws IOException, KeeperException, InterruptedException { uploadToZK(zkClient, dir, ZkController.CONFIGS_ZKNODE + "/" + configName); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public static void bootstrapConf(SolrZkClient zkClient, Config cfg, String solrHome) throws IOException, KeeperException, InterruptedException { NodeList nodes = (NodeList)cfg.evaluate("solr/cores/core", XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); String rawName = DOMUtil.getAttr(node, "name", null); String instanceDir = DOMUtil.getAttr(node, "instanceDir", null); File idir = new File(instanceDir); if (!idir.isAbsolute()) { idir = new File(solrHome, instanceDir); } String confName = DOMUtil.getAttr(node, "collection", null); if (confName == null) { confName = rawName; } ZkController.uploadConfigDir(zkClient, new File(idir, "conf"), confName); } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
private void checkIfIamLeader(final int seq, final ElectionContext context, boolean replacement) throws KeeperException, InterruptedException, IOException { // get all other numbers... final String holdElectionPath = context.electionPath + ELECTION_NODE; List<String> seqs = zkClient.getChildren(holdElectionPath, null, true); sortSeqs(seqs); List<Integer> intSeqs = getSeqs(seqs); if (seq <= intSeqs.get(0)) { runIamLeaderProcess(context, replacement); } else { // I am not the leader - watch the node below me int i = 1; for (; i < intSeqs.size(); i++) { int s = intSeqs.get(i); if (seq < s) { // we found who we come before - watch the guy in front break; } } int index = i - 2; if (index < 0) { log.warn("Our node is no longer in line to be leader"); return; } try { zkClient.getData(holdElectionPath + "/" + seqs.get(index), new Watcher() { @Override public void process(WatchedEvent event) { // am I the next leader? try { checkIfIamLeader(seq, context, true); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); } catch (IOException e) { log.warn("", e); } catch (Exception e) { log.warn("", e); } } }, null, true); } catch (KeeperException.SessionExpiredException e) { throw e; } catch (KeeperException e) { // we couldn't set our watch - the node before us may already be down? // we need to check if we are the leader again checkIfIamLeader(seq, context, true); } } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
protected void runIamLeaderProcess(final ElectionContext context, boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { context.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public int joinElection(ElectionContext context) throws KeeperException, InterruptedException, IOException { final String shardsElectZkPath = context.electionPath + LeaderElector.ELECTION_NODE; long sessionId = zkClient.getSolrZooKeeper().getSessionId(); String id = sessionId + "-" + context.id; String leaderSeqPath = null; boolean cont = true; int tries = 0; while (cont) { try { leaderSeqPath = zkClient.create(shardsElectZkPath + "/" + id + "-n_", null, CreateMode.EPHEMERAL_SEQUENTIAL, false); context.leaderSeqPath = leaderSeqPath; cont = false; } catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } } catch (KeeperException.NoNodeException e) { // we must have failed in creating the election node - someone else must // be working on it, lets try again if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); } } int seq = getSeq(leaderSeqPath); checkIfIamLeader(seq, context, false); return seq; }
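The ConnectionLossException branch in joinElection deals with a genuine ambiguity: after a connection loss the client cannot know whether its create was applied, so it lists the election children and searches for its own id before deciding to rethrow. A reduced sketch of that check (createdDespiteConnectionLoss is a hypothetical name; the "-n_" suffix mirrors the snippet):

    import java.util.List;
    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.KeeperException;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    public class SafeSequentialCreate {
      // After CONNECTIONLOSS the client does not know whether its create was
      // applied, so it re-reads the children and looks for its own id prefix
      // before giving up - the same recovery joinElection performs.
      static boolean createdDespiteConnectionLoss(ZooKeeper zk, String parent,
          String id) throws KeeperException, InterruptedException {
        try {
          zk.create(parent + "/" + id + "-n_", null,
              ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.EPHEMERAL_SEQUENTIAL);
          return true;
        } catch (KeeperException.ConnectionLossException e) {
          List<String> entries = zk.getChildren(parent, false);
          for (String entry : entries) {
            if (entry.startsWith(id)) {
              return true; // the create landed before the connection dropped
            }
          }
          throw e; // it really failed
        }
      }
    }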
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public void setup(final ElectionContext context) throws InterruptedException, KeeperException { String electZKPath = context.electionPath + LeaderElector.ELECTION_NODE; zkCmdExecutor.ensureExists(electZKPath, zkClient); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private SolrConfig getSolrConfigFromZk(String zkConfigName, String solrConfigFileName, SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException, KeeperException, InterruptedException { byte[] config = zkController.getConfigFileData(zkConfigName, solrConfigFileName); InputSource is = new InputSource(new ByteArrayInputStream(config)); is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(solrConfigFileName)); SolrConfig cfg = solrConfigFileName == null ? new SolrConfig( resourceLoader, SolrConfig.DEFAULT_CONF_FILE, is) : new SolrConfig( resourceLoader, solrConfigFileName, is); return cfg; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private IndexSchema getSchemaFromZk(String zkConfigName, String schemaName, SolrConfig config, SolrResourceLoader resourceLoader) throws KeeperException, InterruptedException { byte[] configBytes = zkController.getConfigFileData(zkConfigName, schemaName); InputSource is = new InputSource(new ByteArrayInputStream(configBytes)); is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(schemaName)); IndexSchema schema = new IndexSchema(config, schemaName, is); return schema; }
34
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
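This ZkCmdExecutor catch is one arm of a retry loop: the first ConnectionLossException is remembered for the eventual rethrow, the thread's interrupted state is honored, and retryDelay backs off before the next attempt. A self-contained sketch of such a loop, under the assumption of a linear back-off (RetryExecutor and ZkOp are hypothetical names, not Solr's actual classes):

    import org.apache.zookeeper.KeeperException;

    public class RetryExecutor {
      interface ZkOp<T> {
        T execute() throws KeeperException, InterruptedException;
      }

      // Retries across connection loss, keeping the first failure for the
      // final rethrow and honoring thread interruption between attempts.
      static <T> T retry(ZkOp<T> op, int attempts, long baseDelayMs)
          throws KeeperException, InterruptedException {
        KeeperException.ConnectionLossException first = null;
        for (int i = 0; i < attempts; i++) {
          try {
            return op.execute();
          } catch (KeeperException.ConnectionLossException e) {
            if (first == null) {
              first = e; // remember the original failure
            }
            if (Thread.currentThread().isInterrupted()) {
              throw new InterruptedException();
            }
            Thread.sleep(baseDelayMs * (i + 1)); // linear back-off
          }
        }
        if (first == null) {
          throw new IllegalArgumentException("attempts must be > 0");
        }
        throw first;
      }
    }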
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
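The dominant recovery in this list is a wrap: the checked KeeperException is converted at the boundary into the domain runtime type ZooKeeperException carrying ErrorCode.SERVER_ERROR, so callers above the boundary need no throws clauses. A minimal stand-in illustrating the shape (ZooKeeperFailure and atBoundary are hypothetical, not Solr's actual classes):

    import java.util.concurrent.Callable;

    public class DomainWrap {
      // Hypothetical stand-in for a domain runtime exception such as Solr's
      // ZooKeeperException: it carries an error code and the library cause.
      static class ZooKeeperFailure extends RuntimeException {
        final int code;
        ZooKeeperFailure(int code, String msg, Throwable cause) {
          super(msg, cause);
          this.code = code;
        }
      }

      // Converts any checked failure from the ZooKeeper call into the domain
      // runtime type at the module boundary.
      static <T> T atBoundary(Callable<T> zkCall) {
        try {
          return zkCall.call();
        } catch (Exception e) {
          throw new ZooKeeperFailure(500, "ZooKeeper operation failed", e);
        }
      }
    }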
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeKeyValue(json, "warning", e.toString(), false); log.warn("Keeper Exception", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (KeeperException e) { writeError(500, e.toString()); return false; }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (KeeperException ke) { //check if we lost connection or the node was gone if (ke.code() != Code.CONNECTIONLOSS && ke.code() != Code.SESSIONEXPIRED && ke.code() != Code.NONODE) { throw ke; } }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (KeeperException e) { logger.warn("Shard leader watch triggered but Solr cannot talk to zk."); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { log.warn("", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == Code.CONNECTIONLOSS || e.code() == Code.SESSIONEXPIRED) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { log.error("Failed to create watcher for shard leader col:" + collection + " shard:" + shardId + ", exception: " + e.getClass()); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/NodeStateWatcher.java
catch (KeeperException e) { log.warn("Could not talk to ZK", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException.NoNodeException e) { // fine if there is nothing to delete // TODO: annoying that ZK logs a warning on us nodeDeleted = false; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { // its okay if another beats us creating the node if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException e) { // we couldn't set our watch - the node before us may already be down? // we need to check if we are the leader again checkIfIamLeader(seq, context, true); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) { // we must have failed in creating the election node - someone else must // be working on it, lets try again if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
25
            
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/cloud/ShardLeaderWatcher.java
catch (KeeperException ke) { //check if we lost connection or the node was gone if (ke.code() != Code.CONNECTIONLOSS && ke.code() != Code.SESSIONEXPIRED && ke.code() != Code.NONODE) { throw ke; } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { // its okay if the node already exists if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { // its okay if another beats us creating the node if (e.code() != KeeperException.Code.NODEEXISTS) { throw e; } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) { // we must have failed in creating the election node - someone else must // be working on it, lets try again if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
19
unknown (Lib) LangDetectException 0 0 1
            
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
public static synchronized void loadData() throws IOException, LangDetectException { if (loaded) { return; } loaded = true; List<String> profileData = new ArrayList<String>(); Charset encoding = Charset.forName("UTF-8"); for (String language : languages) { InputStream stream = LangDetectLanguageIdentifierUpdateProcessor.class.getResourceAsStream("langdetect-profiles/" + language); BufferedReader reader = new BufferedReader(new InputStreamReader(stream, encoding)); profileData.add(new String(IOUtils.toCharArray(reader))); reader.close(); } DetectorFactory.loadProfile(profileData); DetectorFactory.setSeed(0); }
1
            
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessor.java
catch (LangDetectException e) { log.debug("Could not determine language, returning empty list: ", e); return Collections.emptyList(); }
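Here the exception marks an optional feature rather than an error: detection failure is logged at debug level and an empty list is returned, so the document is indexed without language fields instead of being rejected. A sketch of that graceful-degradation shape (detectLanguages and doDetect are hypothetical; the real code catches the library's LangDetectException):

    import java.util.Collections;
    import java.util.List;

    public class BestEffortDetection {
      // Best-effort language detection: any failure degrades to an empty
      // result instead of propagating and failing the whole update.
      static List<String> detectLanguages(String text) {
        try {
          return doDetect(text);
        } catch (Exception e) {
          return Collections.emptyList(); // degrade gracefully
        }
      }

      // Placeholder for the actual detector call.
      static List<String> doDetect(String text) throws Exception {
        throw new Exception("no detector wired up in this sketch");
      }
    }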
0 0
unknown (Lib) LockObtainFailedException 1
            
// in core/src/java/org/apache/solr/core/SolrCore.java
void initIndex() { try { String indexDir = getNewIndexDir(); boolean indexExists = getDirectoryFactory().exists(indexDir); boolean firstTime; synchronized (SolrCore.class) { firstTime = dirs.add(new File(indexDir).getCanonicalPath()); } boolean removeLocks = solrConfig.unlockOnStartup; initIndexReaderFactory(); if (indexExists && firstTime) { // to remove locks, the directory must already exist... so we create it // if it didn't exist already... Directory dir = directoryFactory.get(indexDir, getSolrConfig().indexConfig.lockType); if (dir != null) { if (IndexWriter.isLocked(dir)) { if (removeLocks) { log.warn(logid + "WARNING: Solr index directory '{}' is locked. Unlocking...", indexDir); IndexWriter.unlock(dir); } else { log.error(logid + "Solr index directory '{}' is locked. Throwing exception", indexDir); throw new LockObtainFailedException("Index locked for write for core " + name); } } directoryFactory.release(dir); } } // Create the index if it doesn't exist. if(!indexExists) { log.warn(logid+"Solr index directory '" + new File(indexDir) + "' doesn't exist." + " Creating new index..."); SolrIndexWriter writer = new SolrIndexWriter("SolrCore.initIndex", indexDir, getDirectoryFactory(), true, schema, solrConfig.indexConfig, solrDelPolicy, codec, false); writer.close(); } } catch (IOException e) { throw new RuntimeException(e); } }
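initIndex turns a configuration flag into an exception-handling policy: with unlockOnStartup the stale write lock is cleared, otherwise the core refuses to open by throwing LockObtainFailedException. A sketch of just that decision against the Lucene 3/4 API this code uses (checkWriteLock is a hypothetical name):

    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.LockObtainFailedException;

    public class StartupLockPolicy {
      // Mirrors the initIndex decision: either clear a stale write lock at
      // startup or refuse to open the core, depending on configuration.
      static void checkWriteLock(Directory dir, String coreName,
          boolean unlockOnStartup) throws java.io.IOException {
        if (IndexWriter.isLocked(dir)) {
          if (unlockOnStartup) {
            IndexWriter.unlock(dir); // assume the previous holder is dead
          } else {
            throw new LockObtainFailedException(
                "Index locked for write for core " + coreName);
          }
        }
      }
    }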
0 0 0 0 0
unknown (Lib) MalformedURLException 0 0 13
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
protected HttpSolrServer makeServer(String server) throws MalformedURLException { return new HttpSolrServer(server, httpClient, binaryParser); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public void addSolrServer(String server) throws MalformedURLException { HttpSolrServer solrServer = makeServer(server); addToAlive(new ServerWrapper(solrServer)); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private boolean syncWithReplicas(ZkController zkController, SolrCore core, ZkNodeProps props, String collection, String shardId) throws MalformedURLException, SolrServerException, IOException { List<ZkCoreNodeProps> nodes = zkController.getZkStateReader() .getReplicaProps(collection, shardId, props.get(ZkStateReader.NODE_NAME_PROP), props.get(ZkStateReader.CORE_NAME_PROP), ZkStateReader.ACTIVE); // TODO: // should // there // be a // state // filter? if (nodes == null) { // I have no replicas return true; } List<String> syncWith = new ArrayList<String>(); for (ZkCoreNodeProps node : nodes) { // if we see a leader, must be stale state, and this is the guy that went down if (!node.getNodeProps().keySet().contains(ZkStateReader.LEADER_PROP)) { syncWith.add(node.getCoreUrl()); } } PeerSync peerSync = new PeerSync(core, syncWith, core.getUpdateHandler().getUpdateLog().numRecordsToKeep); return peerSync.sync(); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private void syncToMe(ZkController zkController, String collection, String shardId, ZkNodeProps leaderProps) throws MalformedURLException, SolrServerException, IOException { // sync everyone else // TODO: we should do this in parallel at least List<ZkCoreNodeProps> nodes = zkController .getZkStateReader() .getReplicaProps(collection, shardId, leaderProps.get(ZkStateReader.NODE_NAME_PROP), leaderProps.get(ZkStateReader.CORE_NAME_PROP), ZkStateReader.ACTIVE); if (nodes == null) { // System.out.println("I have no replicas"); // I have no replicas return; } //System.out.println("tell my replicas to sync"); ZkCoreNodeProps zkLeader = new ZkCoreNodeProps(leaderProps); for (ZkCoreNodeProps node : nodes) { try { // System.out // .println("try and ask " + node.getCoreUrl() + " to sync"); log.info("try and ask " + node.getCoreUrl() + " to sync"); requestSync(zkLeader.getCoreUrl(), node.getCoreName()); } catch (Exception e) { SolrException.log(log, "Error syncing replica to leader", e); } } for(;;) { ShardResponse srsp = shardHandler.takeCompletedOrError(); if (srsp == null) break; boolean success = handleResponse(srsp); //System.out.println("got response:" + success); if (!success) { try { log.info("Sync failed - asking replica to recover."); //System.out.println("Sync failed - asking replica to recover."); RequestRecovery recoverRequestCmd = new RequestRecovery(); recoverRequestCmd.setAction(CoreAdminAction.REQUESTRECOVERY); recoverRequestCmd.setCoreName(((SyncShardRequest)srsp.getShardRequest()).coreName); HttpSolrServer server = new HttpSolrServer(zkLeader.getBaseUrl()); server.request(recoverRequestCmd); } catch (Exception e) { log.info("Could not tell a replica to recover", e); } shardHandler.cancelAll(); break; } } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void commitOnLeader(String leaderUrl) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderUrl); server.setConnectionTimeout(30000); server.setSoTimeout(30000); UpdateRequest ureq = new UpdateRequest(); ureq.setParams(new ModifiableSolrParams()); ureq.getParams().set(DistributedUpdateProcessor.COMMIT_END_POINT, true); ureq.setAction(AbstractUpdateRequest.ACTION.COMMIT, false, true).process( server); server.shutdown(); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void sendPrepRecoveryCmd(String leaderBaseUrl, String leaderCoreName) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderBaseUrl); server.setConnectionTimeout(45000); server.setSoTimeout(45000); WaitForState prepCmd = new WaitForState(); prepCmd.setCoreName(leaderCoreName); prepCmd.setNodeName(zkController.getNodeName()); prepCmd.setCoreNodeName(coreZkNodeName); prepCmd.setState(ZkStateReader.RECOVERING); prepCmd.setCheckLive(true); prepCmd.setPauseFor(6000); server.request(prepCmd); server.shutdown(); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
public URL getNormalizedURL(String url) throws MalformedURLException, URISyntaxException { return new URI(url).normalize().toURL(); }
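getNormalizedURL leans on java.net.URI: normalize() collapses "." and ".." path segments, and toURL() can still reject the result, which is why both MalformedURLException and URISyntaxException are declared. A small usage example:

    import java.net.MalformedURLException;
    import java.net.URI;
    import java.net.URISyntaxException;
    import java.net.URL;

    public class UrlNormalize {
      public static void main(String[] args) {
        try {
          // "." and ".." segments are collapsed by normalize()
          URL u = new URI("http://example.com/a/b/../c/./d.html")
              .normalize().toURL();
          System.out.println(u); // prints http://example.com/a/c/d.html
        } catch (MalformedURLException | URISyntaxException e) {
          System.err.println("not a usable URL: " + e.getMessage());
        }
      }
    }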
13
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (MalformedURLException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in core/src/java/org/apache/solr/handler/StandardRequestHandler.java
catch( MalformedURLException ex ) { return null; }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch( MalformedURLException ex ) { return null; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (MalformedURLException e) { // should be impossible since we're not passing any URLs here throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
catch( MalformedURLException ex ) { return null; }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (MalformedURLException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (MalformedURLException e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (MalformedURLException e) { SolrException.log(log, "Can't add element to classloader: " + files[j], e); }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (MalformedURLException e) { throw new RuntimeException ("WTF, how can JarFile.toURL() be malformed?", e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (MalformedURLException e) { fatal("System Property 'url' is not a valid URL: " + u); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (MalformedURLException e) { fatal("The specified URL "+url+" is not a valid URL. Please check"); }
5
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (MalformedURLException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, e); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (MalformedURLException e) { // should be impossible since we're not passing any URLs here throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (MalformedURLException e) { throw new RuntimeException ("WTF, how can JarFile.toURL() be malformed?", e); }
5
unknown (Lib) MessagingException 0 0 2
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
private void addEnvelopToDocument(Part part, Map<String, Object> row) throws MessagingException { MimeMessage mail = (MimeMessage) part; Address[] adresses; if ((adresses = mail.getFrom()) != null && adresses.length > 0) row.put(FROM, adresses[0].toString()); List<String> to = new ArrayList<String>(); if ((adresses = mail.getRecipients(Message.RecipientType.TO)) != null) addAddressToList(adresses, to); if ((adresses = mail.getRecipients(Message.RecipientType.CC)) != null) addAddressToList(adresses, to); if ((adresses = mail.getRecipients(Message.RecipientType.BCC)) != null) addAddressToList(adresses, to); if (to.size() > 0) row.put(TO_CC_BCC, to); row.put(MESSAGE_ID, mail.getMessageID()); row.put(SUBJECT, mail.getSubject()); Date d = mail.getSentDate(); if (d != null) { row.put(SENT_DATE, d); } List<String> flags = new ArrayList<String>(); for (Flags.Flag flag : mail.getFlags().getSystemFlags()) { if (flag == Flags.Flag.ANSWERED) flags.add(FLAG_ANSWERED); else if (flag == Flags.Flag.DELETED) flags.add(FLAG_DELETED); else if (flag == Flags.Flag.DRAFT) flags.add(FLAG_DRAFT); else if (flag == Flags.Flag.FLAGGED) flags.add(FLAG_FLAGGED); else if (flag == Flags.Flag.RECENT) flags.add(FLAG_RECENT); else if (flag == Flags.Flag.SEEN) flags.add(FLAG_SEEN); } flags.addAll(Arrays.asList(mail.getFlags().getUserFlags())); row.put(FLAGS, flags); String[] hdrs = mail.getHeader("X-Mailer"); if (hdrs != null) row.put(XMAILER, hdrs[0]); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
private void getNextBatch(int batchSize, Folder folder) throws MessagingException { // after each batch invalidate cache if (messagesInCurBatch != null) { for (Message m : messagesInCurBatch) { if (m instanceof IMAPMessage) ((IMAPMessage) m).invalidateHeaders(); } } int lastMsg = (currentBatch + 1) * batchSize; lastMsg = lastMsg > totalInFolder ? totalInFolder : lastMsg; messagesInCurBatch = folder.getMessages(currentBatch * batchSize + 1, lastMsg); folder.fetch(messagesInCurBatch, fp); current = 0; currentBatch++; LOG.info("Current Batch : " + currentBatch); LOG.info("Messages in this batch : " + messagesInCurBatch.length); }
6
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Connection failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { //throw new DataImportHandlerException(DataImportHandlerException.SEVERE, // "Folder open failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { // skip bad ones unless its the last one and still no good folder if (folders.size() == 0 && i == topLevelFolders.size() - 1) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
5
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Connection failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { /* skip bad ones unless its the last one and still no good folder */ if (folders.size() == 0 && i == topLevelFolders.size() - 1) throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Folder retreival failed"); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (MessagingException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Message retreival failed", e); }
5
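The pattern throughout this group is catch-and-wrap: the checked javax.mail MessagingException is converted into the unchecked domain DataImportHandlerException, usually preserving the cause. A minimal, self-contained sketch of that shape (the simplified exception class and the connect()/open() methods here are illustrative, not Solr code):

  class DataImportHandlerException extends RuntimeException {
    public static final int SEVERE = 500;
    public final int errCode;
    public DataImportHandlerException(int errCode, String msg, Throwable cause) {
      super(msg, cause);
      this.errCode = errCode;
    }
  }

  class MailConnector {
    void connect() throws Exception { /* stands in for the javax.mail call */ }
    void open() {
      try {
        connect();
      } catch (Exception e) { // the checked library exception
        // rethrow as an unchecked domain exception, keeping the cause
        throw new DataImportHandlerException(DataImportHandlerException.SEVERE,
            "Connection failed", e);
      }
    }
  }

Note that two of the excerpts above drop the cause ("Folder retreival failed" is thrown without e), which loses the original stack trace.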
unknown (Lib) MimeTypeException 0 0 1
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
private TikaConfig getDefaultConfig(ClassLoader classLoader) throws MimeTypeException, IOException { return new TikaConfig(classLoader); }
1
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (MimeTypeException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
1
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (MimeTypeException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
1
unknown (Lib) NamingException 0 0 0 1
            
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (NamingException e) { log.info("No /"+project+"/home in JNDI"); }
0 0
unknown (Lib) NoClassDefFoundError 0 0 0 1
            
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (NoClassDefFoundError e1) { throw new ClassNotFoundException ("Can't load: " + cn, e1); }
1
            
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (NoClassDefFoundError e1) { throw new ClassNotFoundException ("Can't load: " + cn, e1); }
0
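SuggestMissingFactories turns a JVM linkage Error into a checked ClassNotFoundException, so callers of the class-scanning utility see one uniform failure type whether the class itself or one of its dependencies is missing. A minimal sketch of the conversion (loadOrReport is a hypothetical helper name):

  static Class<?> loadOrReport(ClassLoader cl, String cn) throws ClassNotFoundException {
    try {
      return cl.loadClass(cn);
    } catch (NoClassDefFoundError e1) {
      // a dependency of cn failed to link; surface it like a normal lookup failure
      throw new ClassNotFoundException("Can't load: " + cn, e1);
    }
  }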
unknown (Lib) NoHttpResponseException 0 0 0 1
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch( NoHttpResponseException r ) { method = null; if(is != null) { is.close(); } /* If out of tries then just rethrow (as normal error). */ if (tries < 1) { throw r; } }
1
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch( NoHttpResponseException r ) { method = null; if(is != null) { is.close(); } /* If out of tries then just rethrow (as normal error). */ if (tries < 1) { throw r; } }
0
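HttpSolrServer treats a dropped HTTP response as retryable: the catch block cleans up, and only rethrows once the retry budget is exhausted. A hedged sketch of that control flow, with fetchOnce() as a hypothetical stand-in for the single HTTP attempt:

  import java.io.IOException;

  class RetryingClient {
    byte[] fetchWithRetry(int maxTries) throws IOException {
      for (int tries = maxTries; ; tries--) {
        try {
          return fetchOnce();
        } catch (IOException r) { // NoHttpResponseException extends IOException
          if (tries <= 1) {
            throw r; // out of tries: rethrow as a normal error
          }
          // otherwise fall through and retry
        }
      }
    }
    byte[] fetchOnce() throws IOException { throw new IOException("no response"); }
  }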
unknown (Lib) NoInitialContextException 0 0 0 1
            
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (NoInitialContextException e) { log.info("JNDI not configured for "+project+" (NoInitialContextEx)"); }
0 0
unknown (Lib) NoNodeException 0 0 0 5
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (NoNodeException e) { /* must have gone away */ }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException.NoNodeException e) { /* fine if there is nothing to delete; TODO: annoying that ZK logs a warning on us */ nodeDeleted = false; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (NoNodeException e) { Thread.sleep(500); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (NoNodeException e) { /* just keep trying */ }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) { /* we must have failed in creating the election node - someone else must be working on it, lets try again */ if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); }
1
            
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) { /* we must have failed in creating the election node - someone else must be working on it, lets try again */ if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); }
1
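LeaderElector's NoNodeException handler is a bounded retry loop: the election node may still be getting created by another participant, so the code sleeps briefly and tries again, escalating to a domain ZooKeeperException only after ten attempts. A simplified sketch of the loop shape (tryCreate() and the RuntimeException stand in for the ZooKeeper call and ZooKeeperException):

  class ElectionJoiner {
    void joinElection() throws InterruptedException {
      int tries = 0;
      boolean cont = true;
      while (cont) {
        cont = false;
        try {
          tryCreate();
        } catch (IllegalStateException e) { // stands in for KeeperException.NoNodeException
          if (tries++ > 9) {
            throw new RuntimeException("election node never appeared", e);
          }
          cont = true;
          Thread.sleep(50);
        }
      }
    }
    void tryCreate() { /* stand-in for creating the ephemeral election node */ }
  }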
unknown (Lib) NoSuchAlgorithmException 0 0 0 1
            
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (NoSuchAlgorithmException e) { throw new RuntimeException(e); }
1
            
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (NoSuchAlgorithmException e) { throw new RuntimeException(e); }
1
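MD5Signature's handler is the standard treatment of a checked exception that cannot actually occur: every compliant JRE is required to provide the MD5 digest, so NoSuchAlgorithmException is wrapped in a bare RuntimeException rather than propagated. Sketch:

  import java.security.MessageDigest;
  import java.security.NoSuchAlgorithmException;

  class Digests {
    // MD5 is mandated by the Java security spec, so this "cannot happen";
    // convert the checked exception to an unchecked failure.
    static MessageDigest md5() {
      try {
        return MessageDigest.getInstance("MD5");
      } catch (NoSuchAlgorithmException e) {
        throw new RuntimeException(e);
      }
    }
  }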
unknown (Lib) NoSuchElementException 1
            
// in core/src/java/org/apache/solr/spelling/PossibilityIterator.java
private RankedSpellPossibility internalNext() { if (done) { throw new NoSuchElementException(); } List<SpellCheckCorrection> possibleCorrection = new ArrayList<SpellCheckCorrection>(); int rank = 0; for (int i = 0; i < correctionIndex.length; i++) { List<SpellCheckCorrection> singleWordPossibilities = possibilityList.get(i); SpellCheckCorrection singleWordPossibility = singleWordPossibilities.get(correctionIndex[i]); rank += correctionIndex[i]; if (i == correctionIndex.length - 1) { correctionIndex[i]++; if (correctionIndex[i] == singleWordPossibilities.size()) { correctionIndex[i] = 0; if (correctionIndex.length == 1) { done = true; } for (int ii = i - 1; ii >= 0; ii--) { correctionIndex[ii]++; if (correctionIndex[ii] >= possibilityList.get(ii).size() && ii > 0) { correctionIndex[ii] = 0; } else { break; } } } } possibleCorrection.add(singleWordPossibility); } if(correctionIndex[0] == possibilityList.get(0).size()) { done = true; } RankedSpellPossibility rsl = new RankedSpellPossibility(); rsl.setCorrections(possibleCorrection); rsl.setRank(rank); return rsl; }
0 0 0 0 0
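internalNext() follows the java.util.Iterator contract: advancing past the end signals exhaustion with NoSuchElementException instead of returning null or a sentinel. A minimal illustration of the contract (CountDown is not Solr code):

  import java.util.Iterator;
  import java.util.NoSuchElementException;

  class CountDown implements Iterator<Integer> {
    private int n;
    CountDown(int n) { this.n = n; }
    public boolean hasNext() { return n > 0; }
    public Integer next() {
      if (!hasNext()) throw new NoSuchElementException(); // exhausted, per the contract
      return n--;
    }
  }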
unknown (Lib) NoSuchMethodException 0 0 3
            
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { final File[] files = new File[args.length]; for (int i = 0; i < args.length; i++) { files[i] = new File(args[i]); } final FindClasses finder = new FindClasses(files); final ClassLoader cl = finder.getClassLoader(); final Class TOKENSTREAM = cl.loadClass("org.apache.lucene.analysis.TokenStream"); final Class TOKENIZER = cl.loadClass("org.apache.lucene.analysis.Tokenizer"); final Class TOKENFILTER = cl.loadClass("org.apache.lucene.analysis.TokenFilter"); final Class TOKENIZERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenizerFactory"); final Class TOKENFILTERFACTORY = cl.loadClass("org.apache.solr.analysis.TokenFilterFactory"); final HashSet<Class> result = new HashSet<Class>(finder.findExtends(TOKENIZER)); result.addAll(finder.findExtends(TOKENFILTER)); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENIZERFACTORY), "create", Reader.class).values()); result.removeAll(finder.findMethodReturns (finder.findExtends(TOKENFILTERFACTORY), "create", TOKENSTREAM).values()); for (final Class c : result) { System.out.println(c.getName()); } }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public static void main(String[] args) throws ClassNotFoundException, IOException, NoSuchMethodException { FindClasses finder = new FindClasses(new File(args[1])); ClassLoader cl = finder.getClassLoader(); Class clazz = cl.loadClass(args[0]); if (args.length == 2) { System.out.println("Finding all extenders of " + clazz.getName()); for (Class c : finder.findExtends(clazz)) { System.out.println(c.getName()); } } else { String methName = args[2]; System.out.println("Finding all extenders of " + clazz.getName() + " with method: " + methName); Class[] methArgs = new Class[args.length-3]; for (int i = 3; i < args.length; i++) { methArgs[i-3] = cl.loadClass(args[i]); } Map<Class,Class> map = finder.findMethodReturns (finder.findExtends(clazz),methName, methArgs); for (Class key : map.keySet()) { System.out.println(key.getName() + " => " + map.get(key).getName()); } } }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
public Map<Class,Class> findMethodReturns(Collection<Class> clazzes, String methodName, Class... parameterTypes) throws NoSuchMethodException{ HashMap<Class,Class> results = new HashMap<Class,Class>(); for (Class clazz : clazzes) { try { Method m = clazz.getMethod(methodName, parameterTypes); results.put(clazz, m.getReturnType()); } catch (NoSuchMethodException e) { /* :NOOP: we expect this and skip clazz */ } } return results; }
3
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
catch (NoSuchMethodException nsme){ String msg = "Transformer :" + trans + "does not implement Transformer interface or does not have a transformRow(Map<String.Object> m)method"; log.error(msg); wrapAndThrow(SEVERE, nsme,msg); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (NoSuchMethodException nsme) { /* otherwise use default ctor */ return clazz.newInstance(); }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (NoSuchMethodException e) { /* :NOOP: we expect this and skip clazz */ }
0 0
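In findMethodReturns, NoSuchMethodException is expected control flow rather than an error: the code probes each class for a method and deliberately skips classes that lack it. A self-contained sketch of the probe, mirroring the excerpt above:

  import java.lang.reflect.Method;
  import java.util.Collection;
  import java.util.HashMap;
  import java.util.Map;

  class MethodProbe {
    static Map<Class<?>, Class<?>> findMethodReturns(
        Collection<Class<?>> clazzes, String methodName, Class<?>... parameterTypes) {
      Map<Class<?>, Class<?>> results = new HashMap<Class<?>, Class<?>>();
      for (Class<?> clazz : clazzes) {
        try {
          Method m = clazz.getMethod(methodName, parameterTypes);
          results.put(clazz, m.getReturnType());
        } catch (NoSuchMethodException e) {
          // expected for classes without the method; skip this class
        }
      }
      return results;
    }
  }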
unknown (Lib) NodeExistsException 0 0 0 5
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (NodeExistsException e) { if (!failOnExists) { /* TODO: version ? for now, don't worry about race */ setData(currentPath, data, -1, retryOnConnLoss); /* set new watch */ exists(currentPath, watcher, retryOnConnLoss); return; } /* ignore unless it's the last node in the path */ if (i == paths.length - 1) { throw e; } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (NodeExistsException e) { /* its okay if another beats us creating the node */ }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (NodeExistsException e) {}
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) { /* if a previous leader ephemeral still exists for some reason, try and remove it */ zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, leaderProps == null ? null : ZkStateReader.toJSON(leaderProps), CreateMode.EPHEMERAL, true); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
catch (NodeExistsException e) { /* if a previous leader ephemeral still exists for some reason, try and remove it */ zkClient.delete(leaderPath, -1, true); zkClient.makePath(leaderPath, ZkStateReader.toJSON(myProps), CreateMode.EPHEMERAL, true); }
1
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (NodeExistsException e) { if (!failOnExists) { /* TODO: version ? for now, don't worry about race */ setData(currentPath, data, -1, retryOnConnLoss); /* set new watch */ exists(currentPath, watcher, retryOnConnLoss); return; } /* ignore unless it's the last node in the path */ if (i == paths.length - 1) { throw e; } }
0
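These NodeExistsException handlers implement idempotent creation: when two nodes race to create the same ZooKeeper path, the loser treats "already exists" as success, repairs the stale node, or rethrows only when the conflict is on the final path element. A generic sketch of the simplest variant (ensureNode and createPath are illustrative stand-ins for the SolrZkClient calls):

  class ZkPaths {
    void ensureNode(String path, byte[] data) {
      try {
        createPath(path, data); // stand-in for zkClient.makePath(...)
      } catch (IllegalStateException e) { // stands in for NodeExistsException
        // it's okay if another participant beat us to creating the node
      }
    }
    void createPath(String path, byte[] data) { /* would call ZooKeeper here */ }
  }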
runtime (Lib) NullPointerException 6
            
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Deprecated public EmbeddedSolrServer( SolrCore core ) { if ( core == null ) { throw new NullPointerException("SolrCore instance required"); } CoreDescriptor dcore = core.getCoreDescriptor(); if (dcore == null) throw new NullPointerException("CoreDescriptor required"); CoreContainer cores = dcore.getCoreContainer(); if (cores == null) throw new NullPointerException("CoreContainer required"); coreName = dcore.getName(); coreContainer = cores; _parser = new SolrRequestParsers( null ); }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
@Override public void release(Directory directory) throws IOException { if (directory == null) { throw new NullPointerException(); } close(directory); }
0 0 0 0 0
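Both NullPointerException throw sites are fail-fast precondition checks on constructor arguments. Since Java 7 the same contract can be written with Objects.requireNonNull; a sketch of the equivalent check (the class here is illustrative, not Solr code):

  import java.util.Objects;

  class EmbeddedServerLike {
    private final String coreName;
    EmbeddedServerLike(String coreName) {
      // throws NullPointerException with the given message if coreName is null
      this.coreName = Objects.requireNonNull(coreName, "coreName required");
    }
  }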
unknown (Lib) NumberFormatException 1
            
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
protected int unicodeEscapeLexer(int c) throws IOException { int ret = 0; /* ignore 'u' (assume c==\ now) and read 4 hex digits */ c = in.read(); code.clear(); try { for (int i = 0; i < 4; i++) { c = in.read(); if (isEndOfFile(c) || isEndOfLine(c)) { throw new NumberFormatException("number too short"); } code.append((char) c); } ret = Integer.parseInt(code.toString(), 16); } catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); } return ret; }
0 0 31
            
// in solrj/src/java/org/apache/solr/client/solrj/response/QueryResponse.java
catch (NumberFormatException e) { /* Ignore for non-number responses which are already handled above */ }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (NumberFormatException e) { /* do nothing */ }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (NumberFormatException e) { if (vr.resolve(ss[i]) == null) { wrapAndThrow( SEVERE, e, "Invalid number :" + ss[i] + "in parameters " + expression); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid batch size: " + bsz); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid connection timeout: " + cTimeout); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/BinURLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid read timeout: " + rTimeout); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid connection timeout: " + cTimeout); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/URLDataSource.java
catch (NumberFormatException e) { LOG.warn("Invalid read timeout: " + rTimeout); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
catch (NumberFormatException e) {/*no op*/ }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); }
// in core/src/java/org/apache/solr/analysis/LegacyHTMLStripCharFilter.java
catch (NumberFormatException e) { return MISMATCH; }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (NumberFormatException e) { log.warn("Couldn't parse maxChars attribute for copyField from " + source + " to " + dest + " as integer. The whole field will be copied."); }
// in core/src/java/org/apache/solr/schema/DoubleField.java
catch (NumberFormatException e){ /* can't parse - write out the contents as a string so nothing is lost and clients don't get a parse error. */ writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/IntField.java
catch (NumberFormatException e){ /* can't parse - write out the contents as a string so nothing is lost and clients don't get a parse error. */ writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/FloatField.java
catch (NumberFormatException e){ /* can't parse - write out the contents as a string so nothing is lost and clients don't get a parse error. */ writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/ByteField.java
catch (NumberFormatException e){ /* can't parse - write out the contents as a string so nothing is lost and clients don't get a parse error. */ writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/LongField.java
catch (NumberFormatException e){ /* can't parse - write out the contents as a string so nothing is lost and clients don't get a parse error. */ writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/ShortField.java
catch (NumberFormatException e){ /* can't parse - write out the contents as a string so nothing is lost and clients don't get a parse error. */ writer.writeStr(name, s, true); }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + OFFSET_START_KEY + " attribute, skipped: '" + obj + "'"); hasOffsetStart = false; }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + OFFSET_END_KEY + " attribute, skipped: '" + obj + "'"); hasOffsetEnd = false; }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + POSINCR_KEY + " attribute, skipped: '" + obj + "'"); }
// in core/src/java/org/apache/solr/schema/JsonPreAnalyzedParser.java
catch (NumberFormatException nfe) { LOG.warn("Invalid " + FLAGS_KEY + " attribute, skipped: '" + e.getValue() + "'"); }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (NumberFormatException e) { throw new RuntimeException( "Unparseable accuracy given for dictionary: " + name, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); }
// in core/src/java/org/apache/solr/util/DOMUtil.java
catch (NumberFormatException nfe) { throw new SolrException (SolrException.ErrorCode.SERVER_ERROR, "Value " + (null != name ? ("of '" +name+ "' ") : "") + "can not be parsed as '" +type+ "': \"" + textValue + "\"", nfe); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); }
10
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/internal/csv/CSVParser.java
catch (NumberFormatException e) { throw new IOException( "(line " + getLineNumber() + ") Wrong unicode escape sequence found '" + code.toString() + "'" + e.toString()); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (NumberFormatException e) { throw new RuntimeException( "Unparseable accuracy given for dictionary: " + name, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (NumberFormatException e) { throw new IllegalArgumentException("serverid " + myIdString + " is not a number"); }
// in core/src/java/org/apache/solr/util/DOMUtil.java
catch (NumberFormatException nfe) { throw new SolrException (SolrException.ErrorCode.SERVER_ERROR, "Value " + (null != name ? ("of '" +name+ "' ") : "") + "can not be parsed as '" +type+ "': \"" + textValue + "\"", nfe); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); }
9
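The recurring mapping in this group: a NumberFormatException on client-supplied input becomes SolrException(BAD_REQUEST, ...), one on server-side configuration becomes SERVER_ERROR, and the schema field writers fall back to emitting the raw string instead of failing. A minimal sketch of the request-side mapping (SimpleSolrException is a simplified stand-in for SolrException):

  class SimpleSolrException extends RuntimeException {
    final int code; // 400 = BAD_REQUEST, 500 = SERVER_ERROR
    SimpleSolrException(int code, String msg, Throwable cause) {
      super(msg, cause);
      this.code = code;
    }
  }

  class ParamParsing {
    static int parseUserInt(String v) {
      try {
        return Integer.parseInt(v);
      } catch (NumberFormatException e) {
        // the client sent the bad value, so report 400, not 500
        throw new SimpleSolrException(400, "Invalid Number: " + v, e);
      }
    }
  }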
runtime (Domain) ParseException
public static class ParseException extends RuntimeException {
    public ParseException(String msg) {
      super(msg);
    }
  }
37
            
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate( String dateValue, Collection<String> dateFormats, Date startDate ) throws ParseException { if (dateValue == null) { throw new IllegalArgumentException("dateValue is null"); } if (dateFormats == null) { dateFormats = DEFAULT_HTTP_CLIENT_PATTERNS; } if (startDate == null) { startDate = DEFAULT_TWO_DIGIT_YEAR_START; } /* trim single quotes around date if present; see issue #5279 */ if (dateValue.length() > 1 && dateValue.startsWith("'") && dateValue.endsWith("'") ) { dateValue = dateValue.substring(1, dateValue.length() - 1); } SimpleDateFormat dateParser = null; Iterator formatIter = dateFormats.iterator(); while (formatIter.hasNext()) { String format = (String) formatIter.next(); if (dateParser == null) { dateParser = new SimpleDateFormat(format, Locale.US); dateParser.setTimeZone(GMT); dateParser.set2DigitYearStart(startDate); } else { dateParser.applyPattern(format); } try { return dateParser.parse(dateValue); } catch (ParseException pe) { /* ignore this exception, we will try the next format */ } } /* we were unable to parse the date */ throw new ParseException("Unable to parse the date " + dateValue, 0); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
private Number parseNumber(String val, NumberFormat numFormat) throws ParseException { ParsePosition parsePos = new ParsePosition(0); Number num = numFormat.parse(val, parsePos); if (parsePos.getIndex() != val.length()) { throw new ParseException("illegal number format", parsePos.getIndex()); } return num; }
// in core/src/java/org/apache/solr/search/QParser.java
private void checkRecurse() throws ParseException { if (recurseCount++ >= 100) { throw new ParseException("Infinite Recursion detected parsing query '" + qstr + "'"); } }
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
@Override public Query parse() throws org.apache.lucene.queryparser.classic.ParseException { SrndQuery sq; String qstr = getString(); if (qstr == null) return null; String mbqparam = getParam(MBQParam); if (mbqparam == null) { this.maxBasicQueries = DEFMAXBASICQUERIES; } else { try { this.maxBasicQueries = Integer.parseInt(mbqparam); } catch (Exception e) { LOG.warn("Couldn't parse maxBasicQueries value " + mbqparam +", using default of 1000"); this.maxBasicQueries = DEFMAXBASICQUERIES; } } /* ugh .. colliding ParseExceptions */ try { sq = org.apache.lucene.queryparser.surround.parser.QueryParser .parse(qstr); } catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); } /* so what do we do with the SrndQuery ?? processing based on example in LIA Ch 9 */ String defaultField = getParam(CommonParams.DF); if (defaultField == null) { defaultField = getReq().getSchema().getDefaultSearchFieldName(); } BasicQueryFactory bqFactory = new BasicQueryFactory(this.maxBasicQueries); Query lquery = sq.makeLuceneQueryField(defaultField, bqFactory); return lquery; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private void validateCyclicAliasing(String field) throws ParseException { Set<String> set = new HashSet<String>(); set.add(field); if(validateField(field, set)) { throw new ParseException("Field aliases lead to a cycle"); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static int parseLocalParams(String txt, int start, Map<String, String> target, SolrParams params, String startString, char endChar) throws ParseException { int off = start; if (!txt.startsWith(startString, off)) return start; StrParser p = new StrParser(txt, start, txt.length()); p.pos += startString.length(); /* skip over "{!" */ for (; ;) { /* if (p.pos>=txt.length()) { throw new ParseException("Missing '}' parsing local params '" + txt + '"'); } */ char ch = p.peek(); if (ch == endChar) { return p.pos + 1; } String id = p.getId(); if (id.length() == 0) { throw new ParseException("Expected ending character '" + endChar + "' parsing local params '" + txt + '"'); } String val = null; ch = p.peek(); if (ch != '=') { /* single word... treat {!func} as type=func for easy lookup */ val = id; id = TYPE; } else { /* saw equals, so read value */ p.pos++; ch = p.peek(); boolean deref = false; if (ch == '$') { p.pos++; ch = p.peek(); deref = true; /* dereference whatever value is read by treating it as a variable name */ } if (ch == '\"' || ch == '\'') { val = p.getQuotedString(); } else { /* read unquoted literal ended by whitespace or endChar (normally '}'); there is no escaping. */ int valStart = p.pos; for (; ;) { if (p.pos >= p.end) { throw new ParseException("Missing end to unquoted value starting at " + valStart + " str='" + txt + "'"); } char c = p.val.charAt(p.pos); if (c == endChar || Character.isWhitespace(c)) { val = p.val.substring(valStart, p.pos); break; } p.pos++; } } if (deref) { /* dereference parameter */ if (params != null) { val = params.get(val); } } } if (target != null) target.put(id, val); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
void expect(String s) throws ParseException { eatws(); int slen = s.length(); if (val.regionMatches(pos, s, 0, slen)) { pos += slen; } else { throw new ParseException("Expected '" + s + "' at position " + pos + " in '" + val + "'"); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getId(String errMessage) throws ParseException { eatws(); int id_start = pos; char ch; if (pos < end && (ch = val.charAt(pos)) != '$' && Character.isJavaIdentifierStart(ch)) { pos++; while (pos < end) { ch = val.charAt(pos); /* if (!Character.isJavaIdentifierPart(ch) && ch != '.' && ch != ':') { */ if (!Character.isJavaIdentifierPart(ch) && ch != '.') { break; } pos++; } return val.substring(id_start, pos); } if (errMessage != null) { throw new ParseException(errMessage + " at pos " + pos + " str='" + val + "'"); } return null; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public String getGlobbedId(String errMessage) throws ParseException { eatws(); int id_start = pos; char ch; if (pos < end && (ch = val.charAt(pos)) != '$' && (Character.isJavaIdentifierStart(ch) || ch=='?' || ch=='*')) { pos++; while (pos < end) { ch = val.charAt(pos); if (!(Character.isJavaIdentifierPart(ch) || ch=='?' || ch=='*') && ch != '.') { break; } pos++; } return val.substring(id_start, pos); } if (errMessage != null) { throw new ParseException(errMessage + " at pos " + pos + " str='" + val + "'"); } return null; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getQuotedString() throws ParseException { eatws(); char delim = peekChar(); if (!(delim == '\"' || delim == '\'')) { return null; } int val_start = ++pos; StringBuilder sb = new StringBuilder(); /* needed for escaping */ for (; ;) { if (pos >= end) { throw new ParseException("Missing end quote for string at pos " + (val_start - 1) + " str='" + val + "'"); } char ch = val.charAt(pos); if (ch == '\\') { pos++; if (pos >= end) break; ch = val.charAt(pos); switch (ch) { case 'n': ch = '\n'; break; case 't': ch = '\t'; break; case 'r': ch = '\r'; break; case 'b': ch = '\b'; break; case 'f': ch = '\f'; break; case 'u': if (pos + 4 >= end) { throw new ParseException("bad unicode escape \\uxxxx at pos" + (val_start - 1) + " str='" + val + "'"); } ch = (char) Integer.parseInt(val.substring(pos + 1, pos + 5), 16); pos += 4; break; } } else if (ch == delim) { pos++; /* skip over the quote */ break; } sb.append(ch); pos++; } return sb.toString(); }
// in core/src/java/org/apache/solr/search/LuceneQParserPlugin.java
@Override public Query parse() throws ParseException { /* handle legacy "query;sort" syntax */ if (getLocalParams() == null) { String qstr = getString(); if (qstr == null || qstr.length() == 0) return null; sortStr = getParams().get(CommonParams.SORT); if (sortStr == null) { /* sort may be legacy form, included in the query string */ List<String> commands = StrUtils.splitSmart(qstr,';'); if (commands.size() == 2) { qstr = commands.get(0); sortStr = commands.get(1); } else if (commands.size() == 1) { /* This is need to support the case where someone sends: "q=query;" */ qstr = commands.get(0); } else if (commands.size() > 2) { throw new ParseException("If you want to use multiple ';' in the query, use the 'sort' param."); } } setString(qstr); } return super.parse(); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { /* TODO: dispatch through SpatialQueriable in the future? */ List<ValueSource> sources = fp.parseValueSourceList(); /* "m" is a multi-value source, "x" is a single-value source; allow (m,m) (m,x,x) (x,x,m) (x,x,x,x); if not enough points are present, "pt" will be checked first, followed by "sfield". */ MultiValueSource mv1 = null; MultiValueSource mv2 = null; if (sources.size() == 0) { /* nothing to do now */ } else if (sources.size() == 1) { ValueSource vs = sources.get(0); if (!(vs instanceof MultiValueSource)) { throw new ParseException("geodist - invalid parameters:" + sources); } mv1 = (MultiValueSource)vs; } else if (sources.size() == 2) { ValueSource vs1 = sources.get(0); ValueSource vs2 = sources.get(1); if (vs1 instanceof MultiValueSource && vs2 instanceof MultiValueSource) { mv1 = (MultiValueSource)vs1; mv2 = (MultiValueSource)vs2; } else { mv1 = makeMV(sources, sources); } } else if (sources.size()==3) { ValueSource vs1 = sources.get(0); ValueSource vs2 = sources.get(1); if (vs1 instanceof MultiValueSource) { /* (m,x,x) */ mv1 = (MultiValueSource)vs1; mv2 = makeMV(sources.subList(1,3), sources); } else { /* (x,x,m) */ mv1 = makeMV(sources.subList(0,2), sources); vs1 = sources.get(2); if (!(vs1 instanceof MultiValueSource)) { throw new ParseException("geodist - invalid parameters:" + sources); } mv2 = (MultiValueSource)vs1; } } else if (sources.size()==4) { mv1 = makeMV(sources.subList(0,2), sources); mv2 = makeMV(sources.subList(2,4), sources); } else if (sources.size() > 4) { throw new ParseException("geodist - invalid parameters:" + sources); } if (mv1 == null) { mv1 = parsePoint(fp); mv2 = parseSfield(fp); } else if (mv2 == null) { mv2 = parsePoint(fp); if (mv2 == null) mv2 = parseSfield(fp); } if (mv1 == null || mv2 == null) { throw new ParseException("geodist - not enough parameters:" + sources); } /* We have all the parameters at this point, now check if one of the points is constant */ double[] constants; constants = getConstants(mv1); MultiValueSource other = mv2; if (constants == null) { constants = getConstants(mv2); other = mv1; } if (constants != null && other instanceof VectorValueSource) { return new HaversineConstFunction(constants[0], constants[1], (VectorValueSource)other); } return new HaversineFunction(mv1, mv2, DistanceUtils.EARTH_MEAN_RADIUS_KM, true); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static VectorValueSource makeMV(List<ValueSource> sources, List<ValueSource> orig) throws ParseException { ValueSource vs1 = sources.get(0); ValueSource vs2 = sources.get(1); if (vs1 instanceof MultiValueSource || vs2 instanceof MultiValueSource) { throw new ParseException("geodist - invalid parameters:" + orig); } return new VectorValueSource(sources); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static MultiValueSource parsePoint(FunctionQParser fp) throws ParseException { String pt = fp.getParam(SpatialParams.POINT); if (pt == null) return null; double[] point = null; try { point = ParseUtils.parseLatitudeLongitude(pt); } catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); } return new VectorValueSource(Arrays.<ValueSource>asList(new DoubleConstValueSource(point[0]),new DoubleConstValueSource(point[1]))); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static MultiValueSource parseSfield(FunctionQParser fp) throws ParseException { String sfield = fp.getParam(SpatialParams.FIELD); if (sfield == null) return null; SchemaField sf = fp.getReq().getSchema().getField(sfield); ValueSource vs = sf.getType().getValueSource(sf, fp); if (!(vs instanceof MultiValueSource)) { throw new ParseException("Spatial field must implement MultiValueSource:" + sf); } return (MultiValueSource)vs; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
@Override public Query parse() throws ParseException { sp = new QueryParsing.StrParser(getString()); ValueSource vs = null; List<ValueSource> lst = null; for(;;) { ValueSource valsource = parseValueSource(false); sp.eatws(); if (!parseMultipleSources) { vs = valsource; break; } else { if (lst != null) { lst.add(valsource); } else { vs = valsource; } } /* check if there is a "," separator */ if (sp.peek() != ',') break; consumeArgumentDelimiter(); if (lst == null) { lst = new ArrayList<ValueSource>(2); lst.add(valsource); } } if (parseToEnd && sp.pos < sp.end) { throw new ParseException("Unexpected text after function: " + sp.val.substring(sp.pos, sp.end)); } if (lst != null) { vs = new VectorValueSource(lst); } return new FunctionQuery(vs); }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public String parseId() throws ParseException { String value = parseArg(); if (argWasQuoted) throw new ParseException("Expected identifier instead of quoted string:" + value); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public Float parseFloat() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected float instead of quoted string:" + str); float value = Float.parseFloat(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public double parseDouble() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected double instead of quoted string:" + str); double value = Double.parseDouble(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public int parseInt() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected double instead of quoted string:" + str); int value = Integer.parseInt(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public String parseArg() throws ParseException { argWasQuoted = false; sp.eatws(); char ch = sp.peek(); String val = null; switch (ch) { case ')': return null; case '$': sp.pos++; String param = sp.getId(); val = getParam(param); break; case '\'': case '"': val = sp.getQuotedString(); argWasQuoted = true; break; default: /* read unquoted literal ended by whitespace ',' or ')'; there is no escaping. */ int valStart = sp.pos; for (;;) { if (sp.pos >= sp.end) { throw new ParseException("Missing end to unquoted value starting at " + valStart + " str='" + sp.val +"'"); } char c = sp.val.charAt(sp.pos); if (c==')' || c==',' || Character.isWhitespace(c)) { val = sp.val.substring(valStart, sp.pos); break; } sp.pos++; } } sp.eatws(); consumeArgumentDelimiter(); return val; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public Query parseNestedQuery() throws ParseException { Query nestedQuery; if (sp.opt("$")) { String param = sp.getId(); String qstr = getParam(param); qstr = qstr==null ? "" : qstr; nestedQuery = subQuery(qstr, null).getQuery(); } else { int start = sp.pos; String v = sp.val; String qs = v; HashMap nestedLocalParams = new HashMap<String,String>(); int end = QueryParsing.parseLocalParams(qs, start, nestedLocalParams, getParams()); QParser sub; if (end>start) { if (nestedLocalParams.get(QueryParsing.V) != null) { /* value specified directly in local params... so the end of the query should be the end of the local params. */ sub = subQuery(qs.substring(start, end), null); } else { /* value here is *after* the local params... ask the parser. */ sub = subQuery(qs, null); /* int subEnd = sub.findEnd(')'); TODO.. implement functions to find the end of a nested query */ throw new ParseException("Nested local params must have value in v parameter. got '" + qs + "'"); } } else { throw new ParseException("Nested function query must use $param or {!v=value} forms. got '" + qs + "'"); } sp.pos += end-start; /* advance past nested query */ nestedQuery = sub.getQuery(); } consumeArgumentDelimiter(); return nestedQuery; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
protected ValueSource parseValueSource(boolean doConsumeDelimiter) throws ParseException { ValueSource valueSource; int ch = sp.peek(); if (ch>='0' && ch<='9' || ch=='.' || ch=='+' || ch=='-') { Number num = sp.getNumber(); if (num instanceof Long) { valueSource = new LongConstValueSource(num.longValue()); } else if (num instanceof Double) { valueSource = new DoubleConstValueSource(num.doubleValue()); } else { /* shouldn't happen */ valueSource = new ConstValueSource(num.floatValue()); } } else if (ch == '"' || ch == '\''){ valueSource = new LiteralValueSource(sp.getQuotedString()); } else if (ch == '$') { sp.pos++; String param = sp.getId(); String val = getParam(param); if (val == null) { throw new ParseException("Missing param " + param + " while parsing function '" + sp.val + "'"); } QParser subParser = subQuery(val, "func"); if (subParser instanceof FunctionQParser) { ((FunctionQParser)subParser).setParseMultipleSources(true); } Query subQuery = subParser.getQuery(); if (subQuery instanceof FunctionQuery) { valueSource = ((FunctionQuery) subQuery).getValueSource(); } else { valueSource = new QueryValueSource(subQuery, 0.0f); } /*** // dereference *simple* argument (i.e., can't currently be a function) // In the future we could support full function dereferencing via a stack of ValueSource (or StringParser) objects ch = val.length()==0 ? '\0' : val.charAt(0); if (ch>='0' && ch<='9' || ch=='.' || ch=='+' || ch=='-') { QueryParsing.StrParser sp = new QueryParsing.StrParser(val); Number num = sp.getNumber(); if (num instanceof Long) { valueSource = new LongConstValueSource(num.longValue()); } else if (num instanceof Double) { valueSource = new DoubleConstValueSource(num.doubleValue()); } else { // shouldn't happen valueSource = new ConstValueSource(num.floatValue()); } } else if (ch == '"' || ch == '\'') { QueryParsing.StrParser sp = new QueryParsing.StrParser(val); val = sp.getQuotedString(); valueSource = new LiteralValueSource(val); } else { if (val.length()==0) { valueSource = new LiteralValueSource(val); } else { String id = val; SchemaField f = req.getSchema().getField(id); valueSource = f.getType().getValueSource(f, this); } } ***/ } else { String id = sp.getId(); if (sp.opt("(")) { /* a function... look it up. */ ValueSourceParser argParser = req.getCore().getValueSourceParser(id); if (argParser==null) { throw new ParseException("Unknown function " + id + " in FunctionQuery(" + sp + ")"); } valueSource = argParser.parse(this); sp.expect(")"); } else { if ("true".equals(id)) { valueSource = new BoolConstValueSource(true); } else if ("false".equals(id)) { valueSource = new BoolConstValueSource(false); } else { SchemaField f = req.getSchema().getField(id); valueSource = f.getType().getValueSource(f, this); } } } if (doConsumeDelimiter) consumeArgumentDelimiter(); return valueSource; }
// in core/src/java/org/apache/solr/util/DateMathParser.java
public Date parseMath(String math) throws ParseException { Calendar cal = Calendar.getInstance(zone, loc); cal.setTime(getNow()); /* check for No-Op */ if (0==math.length()) { return cal.getTime(); } String[] ops = splitter.split(math); int pos = 0; while ( pos < ops.length ) { if (1 != ops[pos].length()) { throw new ParseException ("Multi character command found: \"" + ops[pos] + "\"", pos); } char command = ops[pos++].charAt(0); switch (command) { case '/': if (ops.length < pos + 1) { throw new ParseException ("Need a unit after command: \"" + command + "\"", pos); } try { round(cal, ops[pos++]); } catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); } break; case '+': /* fall through */ case '-': if (ops.length < pos + 2) { throw new ParseException ("Need a value and unit for command: \"" + command + "\"", pos); } int val = 0; try { val = Integer.valueOf(ops[pos++]); } catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); } if ('-' == command) { val = 0 - val; } try { String unit = ops[pos++]; add(cal, val, unit); } catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); } break; default: throw new ParseException ("Unrecognized command: \"" + command + "\"", pos-1); } } return cal.getTime(); }
5
            
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); }
// in core/src/java/org/apache/solr/util/DateMathParser.java
catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); }
161
            
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate(String d) throws ParseException { return parseDate(d, DEFAULT_DATE_FORMATS); }
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate(String d, Collection<String> fmts) throws ParseException { /* 2007-04-26T08:05:04Z */ if (d.endsWith("Z") && d.length() > 20) { return getThreadLocalDateFormat().parse(d); } return parseDate(d, fmts, null); }
// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
public static Date parseDate( String dateValue, Collection<String> dateFormats, Date startDate ) throws ParseException { if (dateValue == null) { throw new IllegalArgumentException("dateValue is null"); } if (dateFormats == null) { dateFormats = DEFAULT_HTTP_CLIENT_PATTERNS; } if (startDate == null) { startDate = DEFAULT_TWO_DIGIT_YEAR_START; } /* trim single quotes around date if present; see issue #5279 */ if (dateValue.length() > 1 && dateValue.startsWith("'") && dateValue.endsWith("'") ) { dateValue = dateValue.substring(1, dateValue.length() - 1); } SimpleDateFormat dateParser = null; Iterator formatIter = dateFormats.iterator(); while (formatIter.hasNext()) { String format = (String) formatIter.next(); if (dateParser == null) { dateParser = new SimpleDateFormat(format, Locale.US); dateParser.setTimeZone(GMT); dateParser.set2DigitYearStart(startDate); } else { dateParser.applyPattern(format); } try { return dateParser.parse(dateValue); } catch (ParseException pe) { /* ignore this exception, we will try the next format */ } } /* we were unable to parse the date */ throw new ParseException("Unable to parse the date " + dateValue, 0); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
@Deprecated public static Date parseDate( String d ) throws ParseException { return DateUtil.parseDate(d); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DateFormatTransformer.java
private Date process(Object value, String format, Locale locale) throws ParseException { if (value == null) return null; String strVal = value.toString().trim(); if (strVal.length() == 0) return null; SimpleDateFormat fmt = fmtCache.get(format); if (fmt == null) { fmt = new SimpleDateFormat(format, locale); fmtCache.put(format, fmt); } return fmt.parse(strVal); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
private Number process(String val, String style, Locale locale) throws ParseException { if (INTEGER.equals(style)) { return parseNumber(val, NumberFormat.getIntegerInstance(locale)); } else if (NUMBER.equals(style)) { return parseNumber(val, NumberFormat.getNumberInstance(locale)); } else if (CURRENCY.equals(style)) { return parseNumber(val, NumberFormat.getCurrencyInstance(locale)); } else if (PERCENT.equals(style)) { return parseNumber(val, NumberFormat.getPercentInstance(locale)); } return null; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
private Number parseNumber(String val, NumberFormat numFormat) throws ParseException { ParsePosition parsePos = new ParsePosition(0); Number num = numFormat.parse(val, parsePos); if (parsePos.getIndex() != val.length()) { throw new ParseException("illegal number format", parsePos.getIndex()); } return num; }
// in core/src/java/org/apache/solr/handler/component/SearchHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception, ParseException, InstantiationException, IllegalAccessException { /* int sleep = req.getParams().getInt("sleep",0); if (sleep > 0) {log.error("SLEEPING for " + sleep); Thread.sleep(sleep);} */ ResponseBuilder rb = new ResponseBuilder(req, rsp, components); if (rb.requestInfo != null) { rb.requestInfo.setResponseBuilder(rb); } boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false); rb.setDebug(dbg); if (dbg == false){ /* if it's true, we are doing everything anyway. */ SolrPluginUtils.getDebugInterests(req.getParams().getParams(CommonParams.DEBUG), rb); } final RTimer timer = rb.isDebug() ? new RTimer() : null; ShardHandler shardHandler1 = shardHandlerFactory.getShardHandler(); shardHandler1.checkDistributed(rb); if (timer == null) { /* non-debugging prepare phase */ for( SearchComponent c : components ) { c.prepare(rb); } } else { /* debugging prepare phase */ RTimer subt = timer.sub( "prepare" ); for( SearchComponent c : components ) { rb.setTimer( subt.sub( c.getName() ) ); c.prepare(rb); rb.getTimer().stop(); } subt.stop(); } if (!rb.isDistrib) { /* a normal non-distributed request; the semantics of debugging vs not debugging are different enough that it makes sense to have two control loops */ if(!rb.isDebug()) { /* Process */ for( SearchComponent c : components ) { c.process(rb); } } else { /* Process */ RTimer subt = timer.sub( "process" ); for( SearchComponent c : components ) { rb.setTimer( subt.sub( c.getName() ) ); c.process(rb); rb.getTimer().stop(); } subt.stop(); timer.stop(); /* add the timing info */ if (rb.isDebugTimings()) { rb.addDebugInfo("timing", timer.asNamedList() ); } } } else { /* a distributed request */ if (rb.outgoing == null) { rb.outgoing = new LinkedList<ShardRequest>(); } rb.finished = new ArrayList<ShardRequest>(); int nextStage = 0; do { rb.stage = nextStage; nextStage = ResponseBuilder.STAGE_DONE; /* call all components */ for( SearchComponent c : components ) { /* the next stage is the minimum of what all components report */ nextStage = Math.min(nextStage, c.distributedProcess(rb)); } /* check the outgoing queue and send requests */ while (rb.outgoing.size() > 0) { /* submit all current request tasks at once */ while (rb.outgoing.size() > 0) { ShardRequest sreq = rb.outgoing.remove(0); sreq.actualShards = sreq.shards; if (sreq.actualShards==ShardRequest.ALL_SHARDS) { sreq.actualShards = rb.shards; } sreq.responses = new ArrayList<ShardResponse>(); /* TODO: map from shard to address[] */ for (String shard : sreq.actualShards) { ModifiableSolrParams params = new ModifiableSolrParams(sreq.params); params.remove(ShardParams.SHARDS); /* not a top-level request */ params.set("distrib", "false"); /* not a top-level request */ params.remove("indent"); params.remove(CommonParams.HEADER_ECHO_PARAMS); params.set(ShardParams.IS_SHARD, true); /* a sub (shard) request */ params.set(ShardParams.SHARD_URL, shard); /* so the shard knows what was asked */ if (rb.requestInfo != null) { /* we could try and detect when this is needed, but it could be tricky */ params.set("NOW", Long.toString(rb.requestInfo.getNOW().getTime())); } String shardQt = params.get(ShardParams.SHARDS_QT); if (shardQt == null) { params.remove(CommonParams.QT); } else { params.set(CommonParams.QT, shardQt); } shardHandler1.submit(sreq, shard, params); } } /* now wait for replies, but if anyone puts more requests on the outgoing queue, send them out immediately (by exiting this loop) */ boolean tolerant = rb.req.getParams().getBool(ShardParams.SHARDS_TOLERANT, false); while (rb.outgoing.size() == 0) { ShardResponse srsp = tolerant ? shardHandler1.takeCompletedIncludingErrors(): shardHandler1.takeCompletedOrError(); if (srsp == null) break; /* no more requests to wait for */ /* Was there an exception? */ if (srsp.getException() != null) { /* If things are not tolerant, abort everything and rethrow */ if(!tolerant) { shardHandler1.cancelAll(); if (srsp.getException() instanceof SolrException) { throw (SolrException)srsp.getException(); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, srsp.getException()); } } } rb.finished.add(srsp.getShardRequest()); /* let the components see the responses to the request */ for(SearchComponent c : components) { c.handleResponses(rb, srsp.getShardRequest()); } } } for(SearchComponent c : components) { c.finishStage(rb); } /* we are done when the next stage is MAX_VALUE */ } while (nextStage != Integer.MAX_VALUE); } }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadSolrSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); SolrSynonymParser parser = new SolrSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/analysis/SynonymFilterFactory.java
private SynonymMap loadWordnetSynonyms(ResourceLoader loader, boolean dedup, Analyzer analyzer) throws IOException, ParseException { final boolean expand = getBoolean("expand", true); String synonyms = args.get("synonyms"); if (synonyms == null) throw new InitializationException("Missing required argument 'synonyms'."); CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder() .onMalformedInput(CodingErrorAction.REPORT) .onUnmappableCharacter(CodingErrorAction.REPORT); WordnetSynonymParser parser = new WordnetSynonymParser(dedup, expand, analyzer); File synonymFile = new File(synonyms); if (synonymFile.exists()) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(synonyms), decoder)); } else { List<String> files = StrUtils.splitFileNames(synonyms); for (String file : files) { decoder.reset(); parser.add(new InputStreamReader(loader.openResource(file), decoder)); } } return parser.build(); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void parseParams(String type, String param) throws ParseException, IOException { localParams = QueryParsing.getLocalParams(param, req.getParams()); base = docs; facetValue = param; key = param; threads = -1; if (localParams == null) return; /* remove local params unless it's a query */ if (type != FacetParams.FACET_QUERY) { /* TODO Cut over to an Enum here */ facetValue = localParams.get(CommonParams.VALUE); } /* reset set the default key now that localParams have been removed */ key = facetValue; /* allow explicit set of the key */ key = localParams.get(CommonParams.OUTPUT_KEY, key); String threadStr = localParams.get(CommonParams.THREADS); if (threadStr != null) { threads = Integer.parseInt(threadStr); } /* figure out if we need a new base DocSet */ String excludeStr = localParams.get(CommonParams.EXCLUDE); if (excludeStr == null) return; Map<?,?> tagMap = (Map<?,?>)req.getContext().get("tags"); if (tagMap != null && rb != null) { List<String> excludeTagList = StrUtils.splitSmart(excludeStr,','); IdentityHashMap<Query,Boolean> excludeSet = new IdentityHashMap<Query,Boolean>(); for (String excludeTag : excludeTagList) { Object olst = tagMap.get(excludeTag); /* tagMap has entries of List<String,List<QParser>>, but subject to change in the future */ if (!(olst instanceof Collection)) continue; for (Object o : (Collection<?>)olst) { if (!(o instanceof QParser)) continue; QParser qp = (QParser)o; excludeSet.put(qp.getQuery(), Boolean.TRUE); } } if (excludeSet.size() == 0) return; List<Query> qlist = new ArrayList<Query>(); /* add the base query */ if (!excludeSet.containsKey(rb.getQuery())) { qlist.add(rb.getQuery()); } /* add the filters */ if (rb.getFilters() != null) { for (Query q : rb.getFilters()) { if (!excludeSet.containsKey(q)) { qlist.add(q); } } } /* get the new base docset for this facet */ DocSet base = searcher.getDocSet(qlist); if (rb.grouping() && rb.getGroupingSpec().isTruncateGroups()) { Grouping grouping = new Grouping(searcher, null, rb.getQueryCommand(), false, 0, false); if (rb.getGroupingSpec().getFields().length > 0) { grouping.addFieldCommand(rb.getGroupingSpec().getFields()[0], req); } else if (rb.getGroupingSpec().getFunctions().length > 0) { grouping.addFunctionCommand(rb.getGroupingSpec().getFunctions()[0], req); } else { this.base = base; return; } AbstractAllGroupHeadsCollector allGroupHeadsCollector = grouping.getCommands().get(0).createAllGroupCollector(); searcher.search(new MatchAllDocsQuery(), base.getTopFilter(), allGroupHeadsCollector); int maxDoc = searcher.maxDoc(); FixedBitSet fixedBitSet = allGroupHeadsCollector.retrieveGroupHeads(maxDoc); long[] bits = fixedBitSet.getBits(); this.base = new BitDocSet(new OpenBitSet(bits, bits.length)); } else { this.base = base; } } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getFacetQueryCounts() throws IOException,ParseException { NamedList<Integer> res = new SimpleOrderedMap<Integer>(); /* Ignore CommonParams.DF - could have init param facet.query assuming * the schema default with query param DF intented to only affect Q. * If user doesn't want schema default for facet.query, they should be * explicit. */ /* SolrQueryParser qp = searcher.getSchema().getSolrQueryParser(null); */ String[] facetQs = params.getParams(FacetParams.FACET_QUERY); if (null != facetQs && 0 != facetQs.length) { for (String q : facetQs) { parseParams(FacetParams.FACET_QUERY, q); /* TODO: slight optimization would prevent double-parsing of any localParams */ Query qobj = QParser.getParser(q, null, req).getQuery(); res.add(key, searcher.numDocs(qobj, base)); } } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetFieldCounts() throws IOException, ParseException { NamedList<Object> res = new SimpleOrderedMap<Object>(); String[] facetFs = params.getParams(FacetParams.FACET_FIELD); if (null != facetFs) { for (String f : facetFs) { parseParams(FacetParams.FACET_FIELD, f); String termList = localParams == null ? null : localParams.get(CommonParams.TERMS); if (termList != null) { res.add(key, getListedTermCounts(facetValue, termList)); } else { res.add(key, getTermCounts(facetValue)); } } } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated public NamedList<Object> getFacetDateCounts() throws IOException, ParseException { final NamedList<Object> resOuter = new SimpleOrderedMap<Object>(); final String[] fields = params.getParams(FacetParams.FACET_DATE); if (null == fields || 0 == fields.length) return resOuter; for (String f : fields) { getFacetDateCounts(f, resOuter); } return resOuter; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated public void getFacetDateCounts(String dateFacet, NamedList<Object> resOuter) throws IOException, ParseException { final IndexSchema schema = searcher.getSchema(); parseParams(FacetParams.FACET_DATE, dateFacet); String f = facetValue; final NamedList<Object> resInner = new SimpleOrderedMap<Object>(); resOuter.add(key, resInner); final SchemaField sf = schema.getField(f); if (! (sf.getType() instanceof DateField)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Can not date facet on a field which is not a DateField: " + f); } final DateField ft = (DateField) sf.getType(); final String startS = required.getFieldParam(f,FacetParams.FACET_DATE_START); final Date start; try { start = ft.parseMath(null, startS); } catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); } final String endS = required.getFieldParam(f,FacetParams.FACET_DATE_END); Date end; // not final, hard end may change this try { end = ft.parseMath(null, endS); } catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); } if (end.before(start)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' comes before 'start': "+endS+" < "+startS); } final String gap = required.getFieldParam(f,FacetParams.FACET_DATE_GAP); final DateMathParser dmp = new DateMathParser(); final int minCount = params.getFieldInt(f,FacetParams.FACET_MINCOUNT, 0); String[] iStrs = params.getFieldParams(f,FacetParams.FACET_DATE_INCLUDE); // Legacy support for default of [lower,upper,edge] for date faceting // this is not handled by FacetRangeInclude.parseParam because // range faceting has different defaults final EnumSet<FacetRangeInclude> include = (null == iStrs || 0 == iStrs.length ) ?
EnumSet.of(FacetRangeInclude.LOWER, FacetRangeInclude.UPPER, FacetRangeInclude.EDGE) : FacetRangeInclude.parseParam(iStrs); try { Date low = start; while (low.before(end)) { dmp.setNow(low); String label = ft.toExternal(low); Date high = dmp.parseMath(gap); if (end.before(high)) { if (params.getFieldBool(f,FacetParams.FACET_DATE_HARD_END,false)) { high = end; } else { end = high; } } if (high.before(low)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet infinite loop (is gap negative?)"); } final boolean includeLower = (include.contains(FacetRangeInclude.LOWER) || (include.contains(FacetRangeInclude.EDGE) && low.equals(start))); final boolean includeUpper = (include.contains(FacetRangeInclude.UPPER) || (include.contains(FacetRangeInclude.EDGE) && high.equals(end))); final int count = rangeCount(sf,low,high,includeLower,includeUpper); if (count >= minCount) { resInner.add(label, count); } low = high; } } catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); } // explicitly return the gap and end so all the counts // (including before/after/between) are meaningful - even if mincount // has removed the neighboring ranges resInner.add("gap", gap); resInner.add("start", start); resInner.add("end", end); final String[] othersP = params.getFieldParams(f,FacetParams.FACET_DATE_OTHER); if (null != othersP && 0 < othersP.length ) { final Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class); for (final String o : othersP) { others.add(FacetRangeOther.get(o)); } // no matter what other values are listed, we don't do // anything if "none" is specified. if (! others.contains(FacetRangeOther.NONE) ) { boolean all = others.contains(FacetRangeOther.ALL); if (all || others.contains(FacetRangeOther.BEFORE)) { // include upper bound if "outer" or if first gap doesn't already include it resInner.add(FacetRangeOther.BEFORE.toString(), rangeCount(sf,null,start, false, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)))))); } if (all || others.contains(FacetRangeOther.AFTER)) { // include lower bound if "outer" or if last gap doesn't already include it resInner.add(FacetRangeOther.AFTER.toString(), rangeCount(sf,end,null, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))), false)); } if (all || others.contains(FacetRangeOther.BETWEEN)) { resInner.add(FacetRangeOther.BETWEEN.toString(), rangeCount(sf,start,end, (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)), (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))); } } } }
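The catch blocks above all follow the same catch-translate idiom: a low-level parse failure (a SolrException from DateField.parseMath, or a java.text.ParseException from the gap math) is caught and rethrown as a SolrException carrying ErrorCode.BAD_REQUEST and a message naming the offending parameter, with the original exception chained as the cause. A minimal, self-contained sketch of the idiom follows; the parseStart helper is hypothetical, and IllegalArgumentException stands in for SolrException so the snippet compiles without Solr on the classpath.

    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class CatchTranslateSketch {
      // Hypothetical helper: translate a low-level ParseException into a
      // client-facing error (Solr would throw SolrException/BAD_REQUEST here),
      // chaining the original as the cause so no stack trace is lost.
      static Date parseStart(String startS) {
        try {
          return new SimpleDateFormat("yyyy-MM-dd").parse(startS);
        } catch (ParseException e) {
          throw new IllegalArgumentException(
              "date facet 'start' is not a valid Date string: " + startS, e);
        }
      }
    }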
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetRangeCounts() throws IOException, ParseException { final NamedList<Object> resOuter = new SimpleOrderedMap<Object>(); final String[] fields = params.getParams(FacetParams.FACET_RANGE); if (null == fields || 0 == fields.length) return resOuter; for (String f : fields) { getFacetRangeCounts(f, resOuter); } return resOuter; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void getFacetRangeCounts(String facetRange, NamedList<Object> resOuter) throws IOException, ParseException { final IndexSchema schema = searcher.getSchema(); parseParams(FacetParams.FACET_RANGE, facetRange); String f = facetValue; final SchemaField sf = schema.getField(f); final FieldType ft = sf.getType(); RangeEndpointCalculator<?> calc = null; if (ft instanceof TrieField) { final TrieField trie = (TrieField)ft; switch (trie.getType()) { case FLOAT: calc = new FloatRangeEndpointCalculator(sf); break; case DOUBLE: calc = new DoubleRangeEndpointCalculator(sf); break; case INTEGER: calc = new IntegerRangeEndpointCalculator(sf); break; case LONG: calc = new LongRangeEndpointCalculator(sf); break; default: throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Unable to range facet on trie field of unexpected type:" + f); } } else if (ft instanceof DateField) { calc = new DateRangeEndpointCalculator(sf, null); } else if (ft instanceof SortableIntField) { calc = new IntegerRangeEndpointCalculator(sf); } else if (ft instanceof SortableLongField) { calc = new LongRangeEndpointCalculator(sf); } else if (ft instanceof SortableFloatField) { calc = new FloatRangeEndpointCalculator(sf); } else if (ft instanceof SortableDoubleField) { calc = new DoubleRangeEndpointCalculator(sf); } else { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Unable to range facet on field:" + sf); } resOuter.add(key, getFacetRangeCounts(sf, calc)); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
protected Object parseGap(final String rawval) throws java.text.ParseException { return parseVal(rawval); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Override public Date parseAndAddGap(Date value, String gap) throws java.text.ParseException { final DateMathParser dmp = new DateMathParser(); dmp.setNow(value); return dmp.parseMath(gap); }
// in core/src/java/org/apache/solr/schema/DateField.java
public Date toObject(String indexedForm) throws java.text.ParseException { return parseDate(indexedToReadable(indexedForm)); }
// in core/src/java/org/apache/solr/schema/DateField.java
public static Date parseDate(String s) throws ParseException { return fmtThreadLocal.get().parse(s); }
// in core/src/java/org/apache/solr/schema/DateField.java
public Date parseDateLenient(String s, SolrQueryRequest req) throws ParseException { // request could define timezone in the future try { return fmtThreadLocal.get().parse(s); } catch (Exception e) { return DateUtil.parseDate(s); } }
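parseDateLenient tries the strict thread-local format first and treats any Exception as the cue to fall back to DateUtil.parseDate, so only the fallback's checked ParseException can reach the caller. A generic sketch of this try-strict-then-lenient shape, with illustrative format strings rather than Solr's actual ones:

    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class LenientParseSketch {
      public static Date parseLenient(String s) throws ParseException {
        try {
          // strict attempt first
          return new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss").parse(s);
        } catch (Exception e) {
          // broad catch: any strict failure triggers the lenient fallback,
          // whose own ParseException is the only one callers will see
          return new SimpleDateFormat("yyyy-MM-dd").parse(s);
        }
      }
    }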
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
@Override public Query parse() throws ParseException { SolrParams solrParams = SolrParams.wrapDefaults(localParams, params); queryFields = SolrPluginUtils.parseFieldBoosts(solrParams.getParams(DisMaxParams.QF)); if (0 == queryFields.size()) { queryFields.put(req.getSchema().getDefaultSearchFieldName(), 1.0f); } /* the main query we will execute. we disable the coord because * this query is an artificial construct */ BooleanQuery query = new BooleanQuery(true); boolean notBlank = addMainQuery(query, solrParams); if (!notBlank) return null; addBoostQuery(query, solrParams); addBoostFunctions(query, solrParams); return query; }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected void addBoostFunctions(BooleanQuery query, SolrParams solrParams) throws ParseException { String[] boostFuncs = solrParams.getParams(DisMaxParams.BF); if (null != boostFuncs && 0 != boostFuncs.length) { for (String boostFunc : boostFuncs) { if (null == boostFunc || "".equals(boostFunc)) continue; Map<String, Float> ff = SolrPluginUtils.parseFieldBoosts(boostFunc); for (String f : ff.keySet()) { Query fq = subQuery(f, FunctionQParserPlugin.NAME).getQuery(); Float b = ff.get(f); if (null != b) { fq.setBoost(b); } query.add(fq, BooleanClause.Occur.SHOULD); } } } }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected void addBoostQuery(BooleanQuery query, SolrParams solrParams) throws ParseException { boostParams = solrParams.getParams(DisMaxParams.BQ); //List<Query> boostQueries = SolrPluginUtils.parseQueryStrings(req, boostParams); boostQueries = null; if (boostParams != null && boostParams.length > 0) { boostQueries = new ArrayList<Query>(); for (String qs : boostParams) { if (qs.trim().length() == 0) continue; Query q = subQuery(qs, null).getQuery(); boostQueries.add(q); } } if (null != boostQueries) { if (1 == boostQueries.size() && 1 == boostParams.length) { /* legacy logic */ Query f = boostQueries.get(0); if (1.0f == f.getBoost() && f instanceof BooleanQuery) { /* if the default boost was used, and we've got a BooleanQuery * extract the subqueries out and use them directly */ for (Object c : ((BooleanQuery) f).clauses()) { query.add((BooleanClause) c); } } else { query.add(f, BooleanClause.Occur.SHOULD); } } else { for (Query f : boostQueries) { query.add(f, BooleanClause.Occur.SHOULD); } } } }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected boolean addMainQuery(BooleanQuery query, SolrParams solrParams) throws ParseException { Map<String, Float> phraseFields = SolrPluginUtils.parseFieldBoosts(solrParams.getParams(DisMaxParams.PF)); float tiebreaker = solrParams.getFloat(DisMaxParams.TIE, 0.0f); /* a parser for dealing with user input, which will convert * things to DisjunctionMaxQueries */ SolrPluginUtils.DisjunctionMaxQueryParser up = getParser(queryFields, DisMaxParams.QS, solrParams, tiebreaker); /* for parsing sloppy phrases using DisjunctionMaxQueries */ SolrPluginUtils.DisjunctionMaxQueryParser pp = getParser(phraseFields, DisMaxParams.PS, solrParams, tiebreaker); /* * * Main User Query * * */ parsedUserQuery = null; String userQuery = getString(); altUserQuery = null; if (userQuery == null || userQuery.trim().length() < 1) { // If no query is specified, we may have an alternate altUserQuery = getAlternateUserQuery(solrParams); if (altUserQuery == null) return false; query.add(altUserQuery, BooleanClause.Occur.MUST); } else { // There is a valid query string userQuery = SolrPluginUtils.partialEscape(SolrPluginUtils.stripUnbalancedQuotes(userQuery)).toString(); userQuery = SolrPluginUtils.stripIllegalOperators(userQuery).toString(); parsedUserQuery = getUserQuery(userQuery, up, solrParams); query.add(parsedUserQuery, BooleanClause.Occur.MUST); Query phrase = getPhraseQuery(userQuery, pp); if (null != phrase) { query.add(phrase, BooleanClause.Occur.SHOULD); } } return true; }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected Query getAlternateUserQuery(SolrParams solrParams) throws ParseException { String altQ = solrParams.get(DisMaxParams.ALTQ); if (altQ != null) { QParser altQParser = subQuery(altQ, null); return altQParser.getQuery(); } else { return null; } }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected Query getPhraseQuery(String userQuery, SolrPluginUtils.DisjunctionMaxQueryParser pp) throws ParseException { /* * * Add on Phrases for the Query * * */ /* build up phrase boosting queries */ /* if the userQuery already has some quotes, strip them out. * we've already done the phrases they asked for in the main * part of the query, this is to boost docs that may not have * matched those phrases but do match looser phrases. */ String userPhraseQuery = userQuery.replace("\"", ""); return pp.parse("\"" + userPhraseQuery + "\""); }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
protected Query getUserQuery(String userQuery, SolrPluginUtils.DisjunctionMaxQueryParser up, SolrParams solrParams) throws ParseException { String minShouldMatch = parseMinShouldMatch(req.getSchema(), solrParams); Query dis = up.parse(userQuery); Query query = dis; if (dis instanceof BooleanQuery) { BooleanQuery t = new BooleanQuery(); SolrPluginUtils.flattenBooleanQuery(t, (BooleanQuery) dis); SolrPluginUtils.setMinShouldMatch(t, minShouldMatch); query = t; } return query; }
// in core/src/java/org/apache/solr/search/DisMaxQParser.java
@Override public Query getHighlightQuery() throws ParseException { return parsedUserQuery == null ? altUserQuery : parsedUserQuery; }
// in core/src/java/org/apache/solr/search/SpatialFilterQParser.java
@Override public Query parse() throws ParseException { //if more than one, we need to treat them as a point... //TODO: Should we accept multiple fields String[] fields = localParams.getParams("f"); if (fields == null || fields.length == 0) { String field = getParam(SpatialParams.FIELD); if (field == null) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, " missing sfield for spatial request"); fields = new String[] {field}; } String pointStr = getParam(SpatialParams.POINT); if (pointStr == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, SpatialParams.POINT + " missing."); } double dist = -1; String distS = getParam(SpatialParams.DISTANCE); if (distS != null) dist = Double.parseDouble(distS); if (dist < 0) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, SpatialParams.DISTANCE + " must be >= 0"); } String measStr = localParams.get(SpatialParams.MEASURE); //TODO: Need to do something with Measures Query result = null; //fields is valid at this point if (fields.length == 1) { SchemaField sf = req.getSchema().getField(fields[0]); FieldType type = sf.getType(); if (type instanceof SpatialQueryable) { double radius = localParams.getDouble(SpatialParams.SPHERE_RADIUS, DistanceUtils.EARTH_MEAN_RADIUS_KM); SpatialOptions opts = new SpatialOptions(pointStr, dist, sf, measStr, radius, DistanceUnits.KILOMETERS); opts.bbox = bbox; result = ((SpatialQueryable)type).createSpatialQuery(this, opts); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "The field " + fields[0] + " does not support spatial filtering"); } } else {// fields.length > 1 //TODO: Not sure about this just yet, is there a way to delegate, or do we just have a helper class? //Seems like we could just use FunctionQuery, but then what about scoring /*List<ValueSource> sources = new ArrayList<ValueSource>(fields.length); for (String field : fields) { SchemaField sf = schema.getField(field); sources.add(sf.getType().getValueSource(sf, this)); } MultiValueSource vs = new VectorValueSource(sources); ValueSourceRangeFilter rf = new ValueSourceRangeFilter(vs, "0", String.valueOf(dist), true, true); result = new SolrConstantScoreQuery(rf);*/ } return result; }
// in core/src/java/org/apache/solr/search/TermQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { @Override public Query parse() throws ParseException { String fname = localParams.get(QueryParsing.F); FieldType ft = req.getSchema().getFieldTypeNoEx(fname); String val = localParams.get(QueryParsing.V); BytesRef term = new BytesRef(); if (ft != null) { ft.readableToIndexed(val, term); } else { term.copyChars(val); } return new TermQuery(new Term(fname, term)); } }; }
// in core/src/java/org/apache/solr/search/TermQParserPlugin.java
@Override public Query parse() throws ParseException { String fname = localParams.get(QueryParsing.F); FieldType ft = req.getSchema().getFieldTypeNoEx(fname); String val = localParams.get(QueryParsing.V); BytesRef term = new BytesRef(); if (ft != null) { ft.readableToIndexed(val, term); } else { term.copyChars(val); } return new TermQuery(new Term(fname, term)); }
// in core/src/java/org/apache/solr/search/QParser.java
public Query getQuery() throws ParseException { if (query==null) { query=parse(); if (localParams != null) { String cacheStr = localParams.get(CommonParams.CACHE); if (cacheStr != null) { if (CommonParams.FALSE.equals(cacheStr)) { extendedQuery().setCache(false); } else if (CommonParams.TRUE.equals(cacheStr)) { extendedQuery().setCache(true); } else if ("sep".equals(cacheStr)) { extendedQuery().setCacheSep(true); } } int cost = localParams.getInt(CommonParams.COST, Integer.MIN_VALUE); if (cost != Integer.MIN_VALUE) { extendedQuery().setCost(cost); } } } return query; }
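getQuery parses lazily and caches the result, so a ParseException surfaces on whichever call site touches the parser first (getSort, getHighlightQuery, and so on) while the expensive parse runs at most once per successful parser. A minimal sketch of the lazy parse-and-cache pattern with a checked exception; Object stands in for Lucene's Query type:

    import java.text.ParseException;

    public class LazyParseSketch {
      private Object query;  // stand-in for org.apache.lucene.search.Query

      public Object getQuery() throws ParseException {
        if (query == null) {
          query = parse();  // first caller pays the cost and sees any failure
        }
        return query;
      }

      protected Object parse() throws ParseException {
        return new Object();
      }
    }

Note that a failed parse caches nothing, so every subsequent call retries and rethrows.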
// in core/src/java/org/apache/solr/search/QParser.java
private void checkRecurse() throws ParseException { if (recurseCount++ >= 100) { throw new ParseException("Infinite Recursion detected parsing query '" + qstr + "'"); } }
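checkRecurse converts unbounded parser recursion into a controlled ParseException once the shared counter reaches 100; subQuery (below) propagates the counter into the nested parser and then decrements it, so sibling sub-queries are measured at the same depth. A stripped-down sketch of the guard, using java.text.ParseException (which requires an error offset) in place of Lucene's:

    import java.text.ParseException;

    public class RecursionGuardSketch {
      private int recurseCount;

      void checkRecurse(String qstr) throws ParseException {
        // fail fast instead of overflowing the stack on self-referencing queries
        if (recurseCount++ >= 100) {
          throw new ParseException(
              "Infinite Recursion detected parsing query '" + qstr + "'", 0);
        }
      }
    }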
// in core/src/java/org/apache/solr/search/QParser.java
public QParser subQuery(String q, String defaultType) throws ParseException { checkRecurse(); if (defaultType == null && localParams != null) { // if not passed, try and get the defaultType from local params defaultType = localParams.get(QueryParsing.DEFTYPE); } QParser nestedParser = getParser(q, defaultType, getReq()); nestedParser.recurseCount = recurseCount; recurseCount--; return nestedParser; }
// in core/src/java/org/apache/solr/search/QParser.java
public ScoreDoc getPaging() throws ParseException { return null; /*** This is not ready for prime-time... see SOLR-1726 String pageScoreS = null; String pageDocS = null; pageScoreS = params.get(CommonParams.PAGESCORE); pageDocS = params.get(CommonParams.PAGEDOC); if (pageScoreS == null || pageDocS == null) return null; int pageDoc = pageDocS != null ? Integer.parseInt(pageDocS) : -1; float pageScore = pageScoreS != null ? new Float(pageScoreS) : -1; if(pageDoc != -1 && pageScore != -1){ return new ScoreDoc(pageDoc, pageScore); } else { return null; } ***/ }
// in core/src/java/org/apache/solr/search/QParser.java
public SortSpec getSort(boolean useGlobalParams) throws ParseException { getQuery(); // ensure query is parsed first String sortStr = null; String startS = null; String rowsS = null; if (localParams != null) { sortStr = localParams.get(CommonParams.SORT); startS = localParams.get(CommonParams.START); rowsS = localParams.get(CommonParams.ROWS); // if any of these parameters are present, don't go back to the global params if (sortStr != null || startS != null || rowsS != null) { useGlobalParams = false; } } if (useGlobalParams) { if (sortStr ==null) { sortStr = params.get(CommonParams.SORT); } if (startS==null) { startS = params.get(CommonParams.START); } if (rowsS==null) { rowsS = params.get(CommonParams.ROWS); } } int start = startS != null ? Integer.parseInt(startS) : 0; int rows = rowsS != null ? Integer.parseInt(rowsS) : 10; Sort sort = null; if( sortStr != null ) { sort = QueryParsing.parseSort(sortStr, req); } return new SortSpec( sort, start, rows ); }
// in core/src/java/org/apache/solr/search/QParser.java
public Query getHighlightQuery() throws ParseException { Query query = getQuery(); return query instanceof WrappedQuery ? ((WrappedQuery)query).getWrappedQuery() : query; }
// in core/src/java/org/apache/solr/search/QParser.java
public static QParser getParser(String qstr, String defaultType, SolrQueryRequest req) throws ParseException { // SolrParams localParams = QueryParsing.getLocalParams(qstr, req.getParams()); String stringIncludingLocalParams = qstr; SolrParams localParams = null; SolrParams globalParams = req.getParams(); boolean valFollowedParams = true; int localParamsEnd = -1; if (qstr != null && qstr.startsWith(QueryParsing.LOCALPARAM_START)) { Map<String, String> localMap = new HashMap<String, String>(); localParamsEnd = QueryParsing.parseLocalParams(qstr, 0, localMap, globalParams); String val = localMap.get(QueryParsing.V); if (val != null) { // val was directly specified in localParams via v=<something> or v=$arg valFollowedParams = false; } else { // use the remainder of the string as the value valFollowedParams = true; val = qstr.substring(localParamsEnd); localMap.put(QueryParsing.V, val); } localParams = new MapSolrParams(localMap); } String type; if (localParams == null) { type = defaultType; } else { type = localParams.get(QueryParsing.TYPE,defaultType); qstr = localParams.get("v"); } type = type==null ? QParserPlugin.DEFAULT_QTYPE : type; QParserPlugin qplug = req.getCore().getQueryPlugin(type); QParser parser = qplug.createParser(qstr, localParams, req.getParams(), req); parser.stringIncludingLocalParams = stringIncludingLocalParams; parser.valFollowedParams = valFollowedParams; parser.localParamsEnd = localParamsEnd; return parser; }
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
@Override public Query parse() throws org.apache.lucene.queryparser.classic.ParseException { SrndQuery sq; String qstr = getString(); if (qstr == null) return null; String mbqparam = getParam(MBQParam); if (mbqparam == null) { this.maxBasicQueries = DEFMAXBASICQUERIES; } else { try { this.maxBasicQueries = Integer.parseInt(mbqparam); } catch (Exception e) { LOG.warn("Couldn't parse maxBasicQueries value " + mbqparam +", using default of 1000"); this.maxBasicQueries = DEFMAXBASICQUERIES; } } // ugh .. colliding ParseExceptions try { sq = org.apache.lucene.queryparser.surround.parser.QueryParser .parse(qstr); } catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); } // so what do we do with the SrndQuery ?? // processing based on example in LIA Ch 9 String defaultField = getParam(CommonParams.DF); if (defaultField == null) { defaultField = getReq().getSchema().getDefaultSearchFieldName(); } BasicQueryFactory bqFactory = new BasicQueryFactory(this.maxBasicQueries); Query lquery = sq.makeLuceneQueryField(defaultField, bqFactory); return lquery; }
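The "colliding ParseExceptions" comment marks a translation between two unrelated exception types that share a simple name: the surround parser's ParseException is caught and rethrown as the classic queryparser's ParseException, keeping only the message (the original stack trace is dropped). A hedged sketch of the same translation that also chains the original via initCause, which is legal here because the String constructor leaves the cause unset; it assumes the Lucene queryparser module is on the classpath:

    public class ExceptionTranslationSketch {
      static org.apache.lucene.queryparser.classic.ParseException translate(
          org.apache.lucene.queryparser.surround.parser.ParseException pe) {
        org.apache.lucene.queryparser.classic.ParseException out =
            new org.apache.lucene.queryparser.classic.ParseException(pe.getMessage());
        out.initCause(pe);  // preserve the original stack trace for debugging
        return out;
      }
    }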
// in core/src/java/org/apache/solr/search/Grouping.java
public void addFieldCommand(String field, SolrQueryRequest request) throws ParseException { SchemaField schemaField = searcher.getSchema().getField(field); // Throws an exception when field doesn't exist. Bad request. FieldType fieldType = schemaField.getType(); ValueSource valueSource = fieldType.getValueSource(schemaField, null); if (!(valueSource instanceof StrFieldSource)) { addFunctionCommand(field, request); return; } Grouping.CommandField gc = new CommandField(); gc.groupSort = groupSort; gc.groupBy = field; gc.key = field; gc.numGroups = limitDefault; gc.docsPerGroup = docsPerGroupDefault; gc.groupOffset = groupOffsetDefault; gc.offset = cmd.getOffset(); gc.sort = sort; gc.format = defaultFormat; gc.totalCount = defaultTotalCount; if (main) { gc.main = true; gc.format = Grouping.Format.simple; } if (gc.format == Grouping.Format.simple) { gc.groupOffset = 0; // doesn't make sense } commands.add(gc); }
// in core/src/java/org/apache/solr/search/Grouping.java
public void addFunctionCommand(String groupByStr, SolrQueryRequest request) throws ParseException { QParser parser = QParser.getParser(groupByStr, "func", request); Query q = parser.getQuery(); final Grouping.Command gc; if (q instanceof FunctionQuery) { ValueSource valueSource = ((FunctionQuery) q).getValueSource(); if (valueSource instanceof StrFieldSource) { String field = ((StrFieldSource) valueSource).getField(); CommandField commandField = new CommandField(); commandField.groupBy = field; gc = commandField; } else { CommandFunc commandFunc = new CommandFunc(); commandFunc.groupBy = valueSource; gc = commandFunc; } } else { CommandFunc commandFunc = new CommandFunc(); commandFunc.groupBy = new QueryValueSource(q, 0.0f); gc = commandFunc; } gc.groupSort = groupSort; gc.key = groupByStr; gc.numGroups = limitDefault; gc.docsPerGroup = docsPerGroupDefault; gc.groupOffset = groupOffsetDefault; gc.offset = cmd.getOffset(); gc.sort = sort; gc.format = defaultFormat; gc.totalCount = defaultTotalCount; if (main) { gc.main = true; gc.format = Grouping.Format.simple; } if (gc.format == Grouping.Format.simple) { gc.groupOffset = 0; // doesn't make sense } commands.add(gc); }
// in core/src/java/org/apache/solr/search/Grouping.java
public void addQueryCommand(String groupByStr, SolrQueryRequest request) throws ParseException { QParser parser = QParser.getParser(groupByStr, null, request); Query gq = parser.getQuery(); Grouping.CommandQuery gc = new CommandQuery(); gc.query = gq; gc.groupSort = groupSort; gc.key = groupByStr; gc.numGroups = limitDefault; gc.docsPerGroup = docsPerGroupDefault; gc.groupOffset = groupOffsetDefault; // these two params will only be used if this is for the main result set gc.offset = cmd.getOffset(); gc.numGroups = limitDefault; gc.format = defaultFormat; if (main) { gc.main = true; gc.format = Grouping.Format.simple; } if (gc.format == Grouping.Format.simple) { gc.docsPerGroup = gc.numGroups; // doesn't make sense to limit to one gc.groupOffset = gc.offset; } commands.add(gc); }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override public Query parse() throws ParseException { SolrParams localParams = getLocalParams(); SolrParams params = getParams(); solrParams = SolrParams.wrapDefaults(localParams, params); final String minShouldMatch = DisMaxQParser.parseMinShouldMatch(req.getSchema(), solrParams); userFields = new UserFields(U.parseFieldBoosts(solrParams.getParams(DMP.UF))); queryFields = SolrPluginUtils.parseFieldBoosts(solrParams.getParams(DisMaxParams.QF)); if (0 == queryFields.size()) { queryFields.put(req.getSchema().getDefaultSearchFieldName(), 1.0f); } // Boosted phrase of the full query string List<FieldParams> phraseFields = U.parseFieldBoostsAndSlop(solrParams.getParams(DMP.PF),0); // Boosted Bi-Term Shingles from the query string List<FieldParams> phraseFields2 = U.parseFieldBoostsAndSlop(solrParams.getParams("pf2"),2); // Boosted Tri-Term Shingles from the query string List<FieldParams> phraseFields3 = U.parseFieldBoostsAndSlop(solrParams.getParams("pf3"),3); float tiebreaker = solrParams.getFloat(DisMaxParams.TIE, 0.0f); int pslop = solrParams.getInt(DisMaxParams.PS, 0); int qslop = solrParams.getInt(DisMaxParams.QS, 0); // remove stopwords from mandatory "matching" component? boolean stopwords = solrParams.getBool("stopwords", true); /* the main query we will execute. we disable the coord because * this query is an artificial construct */ BooleanQuery query = new BooleanQuery(true); /* * * Main User Query * * */ parsedUserQuery = null; String userQuery = getString(); altUserQuery = null; if( userQuery == null || userQuery.trim().length() == 0 ) { // If no query is specified, we may have an alternate String altQ = solrParams.get( DisMaxParams.ALTQ ); if (altQ != null) { altQParser = subQuery(altQ, null); altUserQuery = altQParser.getQuery(); query.add( altUserQuery , BooleanClause.Occur.MUST ); } else { return null; // throw new ParseException("missing query string" ); } } else { // There is a valid query string // userQuery = partialEscape(U.stripUnbalancedQuotes(userQuery)).toString(); boolean lowercaseOperators = solrParams.getBool("lowercaseOperators", true); String mainUserQuery = userQuery; ExtendedSolrQueryParser up = new ExtendedSolrQueryParser(this, IMPOSSIBLE_FIELD_NAME); up.addAlias(IMPOSSIBLE_FIELD_NAME, tiebreaker, queryFields); addAliasesFromRequest(up, tiebreaker); up.setPhraseSlop(qslop); // slop for explicit user phrase queries up.setAllowLeadingWildcard(true); // defer escaping and only do if lucene parsing fails, or we need phrases // parsing fails. Need to sloppy phrase queries anyway though. List<Clause> clauses = null; int numPluses = 0; int numMinuses = 0; int numOR = 0; int numNOT = 0; clauses = splitIntoClauses(userQuery, false); for (Clause clause : clauses) { if (clause.must == '+') numPluses++; if (clause.must == '-') numMinuses++; if (clause.isBareWord()) { String s = clause.val; if ("OR".equals(s)) { numOR++; } else if ("NOT".equals(s)) { numNOT++; } else if (lowercaseOperators && "or".equals(s)) { numOR++; } } } // Always rebuild mainUserQuery from clauses to catch modifications from splitIntoClauses // This was necessary for userFields modifications to get propagated into the query. // Convert lower or mixed case operators to uppercase if we saw them. // only do this for the lucene query part and not for phrase query boosting // since some fields might not be case insensitive. // We don't use a regex for this because it might change and AND or OR in // a phrase query in a case sensitive field. 
StringBuilder sb = new StringBuilder(); for (int i=0; i<clauses.size(); i++) { Clause clause = clauses.get(i); String s = clause.raw; // and and or won't be operators at the start or end if (i>0 && i+1<clauses.size()) { if ("AND".equalsIgnoreCase(s)) { s="AND"; } else if ("OR".equalsIgnoreCase(s)) { s="OR"; } } sb.append(s); sb.append(' '); } mainUserQuery = sb.toString(); // For correct lucene queries, turn off mm processing if there // were explicit operators (except for AND). boolean doMinMatched = (numOR + numNOT + numPluses + numMinuses) == 0; try { up.setRemoveStopFilter(!stopwords); up.exceptions = true; parsedUserQuery = up.parse(mainUserQuery); if (stopwords && isEmpty(parsedUserQuery)) { // if the query was all stop words, remove none of them up.setRemoveStopFilter(true); parsedUserQuery = up.parse(mainUserQuery); } } catch (Exception e) { // ignore failure and reparse later after escaping reserved chars up.exceptions = false; } if (parsedUserQuery != null && doMinMatched) { if (parsedUserQuery instanceof BooleanQuery) { SolrPluginUtils.setMinShouldMatch((BooleanQuery)parsedUserQuery, minShouldMatch); } } if (parsedUserQuery == null) { sb = new StringBuilder(); for (Clause clause : clauses) { boolean doQuote = clause.isPhrase; String s=clause.val; if (!clause.isPhrase && ("OR".equals(s) || "AND".equals(s) || "NOT".equals(s))) { doQuote=true; } if (clause.must != 0) { sb.append(clause.must); } if (clause.field != null) { sb.append(clause.field); sb.append(':'); } if (doQuote) { sb.append('"'); } sb.append(clause.val); if (doQuote) { sb.append('"'); } if (clause.field != null) { // Add the default user field boost, if any Float boost = userFields.getBoost(clause.field); if(boost != null) sb.append("^").append(boost); } sb.append(' '); } String escapedUserQuery = sb.toString(); parsedUserQuery = up.parse(escapedUserQuery); if (parsedUserQuery instanceof BooleanQuery) { BooleanQuery t = new BooleanQuery(); SolrPluginUtils.flattenBooleanQuery(t, (BooleanQuery)parsedUserQuery); SolrPluginUtils.setMinShouldMatch(t, minShouldMatch); parsedUserQuery = t; } } query.add(parsedUserQuery, BooleanClause.Occur.MUST); // sloppy phrase queries for proximity List<FieldParams> allPhraseFields = new ArrayList<FieldParams>(); allPhraseFields.addAll(phraseFields); allPhraseFields.addAll(phraseFields2); allPhraseFields.addAll(phraseFields3); if (allPhraseFields.size() > 0) { // find non-field clauses List<Clause> normalClauses = new ArrayList<Clause>(clauses.size()); for (Clause clause : clauses) { if (clause.field != null || clause.isPhrase) continue; // check for keywords "AND,OR,TO" if (clause.isBareWord()) { String s = clause.val.toString(); // avoid putting explicit operators in the phrase query if ("OR".equals(s) || "AND".equals(s) || "NOT".equals(s) || "TO".equals(s)) continue; } normalClauses.add(clause); } // full phrase and shingles for (FieldParams phraseField: allPhraseFields) { int slop = (phraseField.getSlop() == 0) ? 
pslop : phraseField.getSlop(); Map<String,Float> pf = new HashMap<String,Float>(1); pf.put(phraseField.getField(),phraseField.getBoost()); addShingledPhraseQueries(query, normalClauses, pf, phraseField.getWordGrams(),tiebreaker, slop); } } } /* * * Boosting Query * * */ boostParams = solrParams.getParams(DisMaxParams.BQ); boostQueries=null; if (boostParams!=null && boostParams.length>0) { Map<String,Float> bqBoosts = SolrPluginUtils.parseFieldBoosts(boostParams); boostQueries = new ArrayList<Query>(); for (Map.Entry<String,Float> bqs : bqBoosts.entrySet()) { if (bqs.getKey().trim().length()==0) continue; Query q = subQuery(bqs.getKey(), null).getQuery(); Float b = bqs.getValue(); if(b!=null) { q.setBoost(b); } boostQueries.add(q); } } if (null != boostQueries) { for(Query f : boostQueries) { query.add(f, BooleanClause.Occur.SHOULD); } } /* * * Boosting Functions * * */ String[] boostFuncs = solrParams.getParams(DisMaxParams.BF); if (null != boostFuncs && 0 != boostFuncs.length) { for (String boostFunc : boostFuncs) { if(null == boostFunc || "".equals(boostFunc)) continue; Map<String,Float> ff = SolrPluginUtils.parseFieldBoosts(boostFunc); for (String f : ff.keySet()) { Query fq = subQuery(f, FunctionQParserPlugin.NAME).getQuery(); Float b = ff.get(f); if (null != b) { fq.setBoost(b); } query.add(fq, BooleanClause.Occur.SHOULD); } } } // // create a boosted query (scores multiplied by boosts) // Query topQuery = query; multBoosts = solrParams.getParams("boost"); if (multBoosts!=null && multBoosts.length>0) { List<ValueSource> boosts = new ArrayList<ValueSource>(); for (String boostStr : multBoosts) { if (boostStr==null || boostStr.length()==0) continue; Query boost = subQuery(boostStr, FunctionQParserPlugin.NAME).getQuery(); ValueSource vs; if (boost instanceof FunctionQuery) { vs = ((FunctionQuery)boost).getValueSource(); } else { vs = new QueryValueSource(boost, 1.0f); } boosts.add(vs); } if (boosts.size()>1) { ValueSource prod = new ProductFloatFunction(boosts.toArray(new ValueSource[boosts.size()])); topQuery = new BoostedQuery(query, prod); } else if (boosts.size() == 1) { topQuery = new BoostedQuery(query, boosts.get(0)); } } return topQuery; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private void addAliasesFromRequest(ExtendedSolrQueryParser up, float tiebreaker) throws ParseException { Iterator<String> it = solrParams.getParameterNamesIterator(); while(it.hasNext()) { String param = it.next(); if(param.startsWith("f.") && param.endsWith(".qf")) { // Add the alias String fname = param.substring(2,param.length()-3); String qfReplacement = solrParams.get(param); Map<String,Float> parsedQf = SolrPluginUtils.parseFieldBoosts(qfReplacement); if(parsedQf.size() == 0) return; up.addAlias(fname, tiebreaker, parsedQf); } } }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private void addShingledPhraseQueries(final BooleanQuery mainQuery, final List<Clause> clauses, final Map<String,Float> fields, int shingleSize, final float tiebreaker, final int slop) throws ParseException { if (null == fields || fields.isEmpty() || null == clauses || clauses.size() < shingleSize ) return; if (0 == shingleSize) shingleSize = clauses.size(); final int goat = shingleSize-1; // :TODO: better name for var? StringBuilder userPhraseQuery = new StringBuilder(); for (int i=0; i < clauses.size() - goat; i++) { userPhraseQuery.append('"'); for (int j=0; j <= goat; j++) { userPhraseQuery.append(clauses.get(i + j).val); userPhraseQuery.append(' '); } userPhraseQuery.append('"'); userPhraseQuery.append(' '); } /* for parsing sloppy phrases using DisjunctionMaxQueries */ ExtendedSolrQueryParser pp = new ExtendedSolrQueryParser(this, IMPOSSIBLE_FIELD_NAME); pp.addAlias(IMPOSSIBLE_FIELD_NAME, tiebreaker, fields); pp.setPhraseSlop(slop); pp.setRemoveStopFilter(true); // remove stop filter and keep stopwords /* :TODO: reevaluate using makeDismax=true vs false... * * The DismaxQueryParser always used DisjunctionMaxQueries for the * pf boost, for the same reasons it used them for the qf fields. * When Yonik first wrote the ExtendedDismaxQParserPlugin, he added * the "makeDismax=false" property to use BooleanQueries instead, but * when asked why his response was "I honestly don't recall" ... * * https://issues.apache.org/jira/browse/SOLR-1553?focusedCommentId=12793813#action_12793813 * * so for now, we continue to use dismax style queries because it * seems the most logical and is back compatible, but we should * try to figure out what Yonik was thinking at the time (because he * rarely does things for no reason) */ pp.makeDismax = true; // minClauseSize is independent of the shingleSize because of stop words // (if they are removed from the middle, so be it, but we need at least // two or there shouldn't be a boost) pp.minClauseSize = 2; // TODO: perhaps we shouldn't use synonyms either... Query phrase = pp.parse(userPhraseQuery.toString()); if (phrase != null) { mainQuery.add(phrase, BooleanClause.Occur.SHOULD); } }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override public Query getHighlightQuery() throws ParseException { return parsedUserQuery == null ? altUserQuery : parsedUserQuery; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getBooleanQuery(List clauses, boolean disableCoord) throws ParseException { Query q = super.getBooleanQuery(clauses, disableCoord); if (q != null) { q = QueryUtils.makeQueryable(q); } return q; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getFieldQuery(String field, String val, boolean quoted) throws ParseException { //System.out.println("getFieldQuery: val="+val); this.type = QType.FIELD; this.field = field; this.val = val; this.slop = getPhraseSlop(); // unspecified return getAliasedQuery(); }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getFieldQuery(String field, String val, int slop) throws ParseException { //System.out.println("getFieldQuery: val="+val+" slop="+slop); this.type = QType.PHRASE; this.field = field; this.val = val; this.slop = slop; return getAliasedQuery(); }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getPrefixQuery(String field, String val) throws ParseException { //System.out.println("getPrefixQuery: val="+val); if (val.equals("") && field.equals("*")) { return new MatchAllDocsQuery(); } this.type = QType.PREFIX; this.field = field; this.val = val; return getAliasedQuery(); }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query newFieldQuery(Analyzer analyzer, String field, String queryText, boolean quoted) throws ParseException { Analyzer actualAnalyzer; if (removeStopFilter) { if (nonStopFilterAnalyzerPerField == null) { nonStopFilterAnalyzerPerField = new HashMap<String, Analyzer>(); } actualAnalyzer = nonStopFilterAnalyzerPerField.get(field); if (actualAnalyzer == null) { actualAnalyzer = noStopwordFilterAnalyzer(field); } } else { actualAnalyzer = parser.getReq().getSchema().getFieldType(field).getQueryAnalyzer(); } return super.newFieldQuery(actualAnalyzer, field, queryText, quoted); }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getRangeQuery(String field, String a, String b, boolean startInclusive, boolean endInclusive) throws ParseException { //System.out.println("getRangeQuery:"); this.type = QType.RANGE; this.field = field; this.val = a; this.val2 = b; this.bool = startInclusive; this.bool2 = endInclusive; return getAliasedQuery(); }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getWildcardQuery(String field, String val) throws ParseException { //System.out.println("getWildcardQuery: val="+val); if (val.equals("*")) { if (field.equals("*")) { return new MatchAllDocsQuery(); } else { return getPrefixQuery(field,""); } } this.type = QType.WILDCARD; this.field = field; this.val = val; return getAliasedQuery(); }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
@Override protected Query getFuzzyQuery(String field, String val, float minSimilarity) throws ParseException { //System.out.println("getFuzzyQuery: val="+val); this.type = QType.FUZZY; this.field = field; this.val = val; this.flt = minSimilarity; return getAliasedQuery(); }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
protected Query getAliasedQuery() throws ParseException { Alias a = aliases.get(field); this.validateCyclicAliasing(field); if (a != null) { List<Query> lst = getQueries(a); if (lst == null || lst.size()==0) return getQuery(); // make a DisjunctionMaxQuery in this case too... it will stop // the "mm" processing from making everything required in the case // that the query expanded to multiple clauses. // DisMaxQuery.rewrite() removes itself if there is just a single clause anyway. // if (lst.size()==1) return lst.get(0); if (makeDismax) { DisjunctionMaxQuery q = new DisjunctionMaxQuery(lst, a.tie); return q; } else { // should we disable coord? BooleanQuery q = new BooleanQuery(disableCoord); for (Query sub : lst) { q.add(sub, BooleanClause.Occur.SHOULD); } return q; } } else { // verify that a fielded query is actually on a field that exists... if not, // then throw an exception to get us out of here, and we'll treat it like a // literal when we try the escape+re-parse. if (exceptions) { FieldType ft = schema.getFieldTypeNoEx(field); if (ft == null && null == MagicFieldName.get(field)) { throw unknownField; } } return getQuery(); } }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private void validateCyclicAliasing(String field) throws ParseException { Set<String> set = new HashSet<String>(); set.add(field); if(validateField(field, set)) { throw new ParseException("Field aliases lead to a cycle"); } }
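validateCyclicAliasing turns a configuration mistake (field aliases that eventually reach themselves) into a ParseException before getAliasedQuery can recurse forever. A minimal sketch of the same path-based cycle check over a plain alias map; Map<String, List<String>> is a stand-in for the parser's actual Alias structure:

    import java.text.ParseException;
    import java.util.*;

    public class AliasCycleSketch {
      static void validate(String field, Map<String, List<String>> aliases,
                           Set<String> path) throws ParseException {
        if (!path.add(field)) {  // field is already on the current expansion path
          throw new ParseException("Field aliases lead to a cycle", 0);
        }
        for (String next : aliases.getOrDefault(field, Collections.emptyList())) {
          validate(next, aliases, new HashSet<>(path));  // copy the path per branch
        }
      }
    }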
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
protected List<Query> getQueries(Alias a) throws ParseException { if (a == null) return null; if (a.fields.size()==0) return null; List<Query> lst= new ArrayList<Query>(4); for (String f : a.fields.keySet()) { this.field = f; Query sub = getAliasedQuery(); if (sub != null) { Float boost = a.fields.get(f); if (boost != null) { sub.setBoost(boost); } lst.add(sub); } } return lst; }
// in core/src/java/org/apache/solr/search/ExtendedDismaxQParserPlugin.java
private Query getQuery() throws ParseException { try { switch (type) { case FIELD: // fallthrough case PHRASE: Query query = super.getFieldQuery(field, val, type == QType.PHRASE); if (query instanceof PhraseQuery) { PhraseQuery pq = (PhraseQuery)query; if (minClauseSize > 1 && pq.getTerms().length < minClauseSize) return null; ((PhraseQuery)query).setSlop(slop); } else if (query instanceof MultiPhraseQuery) { MultiPhraseQuery pq = (MultiPhraseQuery)query; if (minClauseSize > 1 && pq.getTermArrays().size() < minClauseSize) return null; ((MultiPhraseQuery)query).setSlop(slop); } else if (minClauseSize > 1) { // if it's not a type of phrase query, it doesn't meet the minClauseSize requirements return null; } return query; case PREFIX: return super.getPrefixQuery(field, val); case WILDCARD: return super.getWildcardQuery(field, val); case FUZZY: return super.getFuzzyQuery(field, val, flt); case RANGE: return super.getRangeQuery(field, val, val2, bool, bool2); } return null; } catch (Exception e) { // an exception here is due to the field query not being compatible with the input text // for example, passing a string to a numeric field. return null; } }
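This getQuery deliberately swallows every exception and returns null: a failure here normally just means the input text is incompatible with the field (for example, a word typed against a numeric field), and the caller falls back to escaping and re-parsing the clause as a literal. The cost of the catch-all-return-null style is that genuine bugs are silenced along with expected failures; a common refinement, sketched below with an illustrative logger rather than Solr's, is to record the exception at a low level before returning:

    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class SwallowSketch {
      private static final Logger LOG = Logger.getLogger("sketch");  // illustrative

      static Integer tryParse(String val) {
        try {
          return Integer.parseInt(val);
        } catch (Exception e) {
          // incompatible input is expected here; record it quietly and move on
          LOG.log(Level.FINE, "ignoring unparseable value: " + val, e);
          return null;
        }
      }
    }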
// in core/src/java/org/apache/solr/search/ReturnFields.java
String getFieldName(QueryParsing.StrParser sp) throws ParseException { sp.eatws(); int id_start = sp.pos; char ch; if (sp.pos < sp.end && (ch = sp.val.charAt(sp.pos)) != '$' && Character.isJavaIdentifierStart(ch)) { sp.pos++; while (sp.pos < sp.end) { ch = sp.val.charAt(sp.pos); if (!Character.isJavaIdentifierPart(ch) && ch != '.' && ch != '-') { break; } sp.pos++; } return sp.val.substring(id_start, sp.pos); } return null; }
// in core/src/java/org/apache/solr/search/NestedQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { QParser baseParser; ValueSource vs; String b; @Override public Query parse() throws ParseException { baseParser = subQuery(localParams.get(QueryParsing.V), null); return baseParser.getQuery(); } @Override public String[] getDefaultHighlightFields() { return baseParser.getDefaultHighlightFields(); } @Override public Query getHighlightQuery() throws ParseException { return baseParser.getHighlightQuery(); } @Override public void addDebugInfo(NamedList<Object> debugInfo) { // encapsulate base debug info in a sub-list? baseParser.addDebugInfo(debugInfo); } }; }
// in core/src/java/org/apache/solr/search/NestedQParserPlugin.java
@Override public Query parse() throws ParseException { baseParser = subQuery(localParams.get(QueryParsing.V), null); return baseParser.getQuery(); }
// in core/src/java/org/apache/solr/search/NestedQParserPlugin.java
@Override public Query getHighlightQuery() throws ParseException { return baseParser.getHighlightQuery(); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static int parseLocalParams(String txt, int start, Map<String, String> target, SolrParams params) throws ParseException { return parseLocalParams(txt, start, target, params, LOCALPARAM_START, LOCALPARAM_END); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static int parseLocalParams(String txt, int start, Map<String, String> target, SolrParams params, String startString, char endChar) throws ParseException { int off = start; if (!txt.startsWith(startString, off)) return start; StrParser p = new StrParser(txt, start, txt.length()); p.pos += startString.length(); // skip over "{!" for (; ;) { /* if (p.pos>=txt.length()) { throw new ParseException("Missing '}' parsing local params '" + txt + '"'); } */ char ch = p.peek(); if (ch == endChar) { return p.pos + 1; } String id = p.getId(); if (id.length() == 0) { throw new ParseException("Expected ending character '" + endChar + "' parsing local params '" + txt + '"'); } String val = null; ch = p.peek(); if (ch != '=') { // single word... treat {!func} as type=func for easy lookup val = id; id = TYPE; } else { // saw equals, so read value p.pos++; ch = p.peek(); boolean deref = false; if (ch == '$') { p.pos++; ch = p.peek(); deref = true; // dereference whatever value is read by treating it as a variable name } if (ch == '\"' || ch == '\'') { val = p.getQuotedString(); } else { // read unquoted literal ended by whitespace or endChar (normally '}') // there is no escaping. int valStart = p.pos; for (; ;) { if (p.pos >= p.end) { throw new ParseException("Missing end to unquoted value starting at " + valStart + " str='" + txt + "'"); } char c = p.val.charAt(p.pos); if (c == endChar || Character.isWhitespace(c)) { val = p.val.substring(valStart, p.pos); break; } p.pos++; } } if (deref) { // dereference parameter if (params != null) { val = params.get(val); } } } if (target != null) target.put(id, val); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static SolrParams getLocalParams(String txt, SolrParams params) throws ParseException { if (txt == null || !txt.startsWith(LOCALPARAM_START)) { return null; } Map<String, String> localParams = new HashMap<String, String>(); int start = QueryParsing.parseLocalParams(txt, 0, localParams, params); String val = localParams.get(V); if (val == null) { val = txt.substring(start); localParams.put(V, val); } else { // localParams.put(VAL_EXPLICIT, "true"); } return new MapSolrParams(localParams); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
void expect(String s) throws ParseException { eatws(); int slen = s.length(); if (val.regionMatches(pos, s, 0, slen)) { pos += slen; } else { throw new ParseException("Expected '" + s + "' at position " + pos + " in '" + val + "'"); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
float getFloat() throws ParseException { eatws(); char[] arr = new char[end - pos]; int i; for (i = 0; i < arr.length; i++) { char ch = val.charAt(pos); if ((ch >= '0' && ch <= '9') || ch == '+' || ch == '-' || ch == '.' || ch == 'e' || ch == 'E' ) { pos++; arr[i] = ch; } else { break; } } return Float.parseFloat(new String(arr, 0, i)); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
Number getNumber() throws ParseException { eatws(); int start = pos; boolean flt = false; while (pos < end) { char ch = val.charAt(pos); if ((ch >= '0' && ch <= '9') || ch == '+' || ch == '-') { pos++; } else if (ch == '.' || ch =='e' || ch=='E') { flt = true; pos++; } else { break; } } String v = val.substring(start,pos); if (flt) { return Double.parseDouble(v); } else { return Long.parseLong(v); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
double getDouble() throws ParseException { eatws(); char[] arr = new char[end - pos]; int i; for (i = 0; i < arr.length; i++) { char ch = val.charAt(pos); if ((ch >= '0' && ch <= '9') || ch == '+' || ch == '-' || ch == '.' || ch == 'e' || ch == 'E' ) { pos++; arr[i] = ch; } else { break; } } return Double.parseDouble(new String(arr, 0, i)); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
int getInt() throws ParseException { eatws(); char[] arr = new char[end - pos]; int i; for (i = 0; i < arr.length; i++) { char ch = val.charAt(pos); if ((ch >= '0' && ch <= '9') || ch == '+' || ch == '-' ) { pos++; arr[i] = ch; } else { break; } } return Integer.parseInt(new String(arr, 0, i)); }
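Note that getFloat, getNumber, getDouble and getInt all declare throws ParseException but never throw it themselves: after scanning candidate characters they hand off to Float.parseFloat and friends, whose NumberFormatException is unchecked and bypasses the declared contract entirely (an empty scan, for instance, passes "" straight to the parser). A tiny demonstration of that mismatch:

    import java.text.ParseException;

    public class DeclaredVsActualSketch {
      // Declares ParseException like the methods above, but the failure that
      // actually escapes is the unchecked NumberFormatException from parseInt.
      static int getInt(String scanned) throws ParseException {
        return Integer.parseInt(scanned);
      }

      public static void main(String[] args) throws ParseException {
        try {
          getInt("");  // nothing was scanned
        } catch (NumberFormatException e) {
          System.out.println("escaped as unchecked: " + e);
        }
      }
    }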
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getId() throws ParseException { return getId("Expected identifier"); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getId(String errMessage) throws ParseException { eatws(); int id_start = pos; char ch; if (pos < end && (ch = val.charAt(pos)) != '$' && Character.isJavaIdentifierStart(ch)) { pos++; while (pos < end) { ch = val.charAt(pos); // if (!Character.isJavaIdentifierPart(ch) && ch != '.' && ch != ':') { if (!Character.isJavaIdentifierPart(ch) && ch != '.') { break; } pos++; } return val.substring(id_start, pos); } if (errMessage != null) { throw new ParseException(errMessage + " at pos " + pos + " str='" + val + "'"); } return null; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public String getGlobbedId(String errMessage) throws ParseException { eatws(); int id_start = pos; char ch; if (pos < end && (ch = val.charAt(pos)) != '$' && (Character.isJavaIdentifierStart(ch) || ch=='?' || ch=='*')) { pos++; while (pos < end) { ch = val.charAt(pos); if (!(Character.isJavaIdentifierPart(ch) || ch=='?' || ch=='*') && ch != '.') { break; } pos++; } return val.substring(id_start, pos); } if (errMessage != null) { throw new ParseException(errMessage + " at pos " + pos + " str='" + val + "'"); } return null; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
Boolean getSortDirection() throws ParseException { final int startPos = pos; final String order = getId(null); Boolean top = null; if (null != order) { if ("desc".equals(order) || "top".equals(order)) { top = true; } else if ("asc".equals(order) || "bottom".equals(order)) { top = false; } // it's not a legal direction if more stuff comes after it eatws(); final char c = ch(); if (0 == c) { // :NOOP } else if (',' == c) { pos++; } else { top = null; } } if (null == top) pos = startPos; // no direction, reset return top; }
// in core/src/java/org/apache/solr/search/QueryParsing.java
String getQuotedString() throws ParseException { eatws(); char delim = peekChar(); if (!(delim == '\"' || delim == '\'')) { return null; } int val_start = ++pos; StringBuilder sb = new StringBuilder(); // needed for escaping for (; ;) { if (pos >= end) { throw new ParseException("Missing end quote for string at pos " + (val_start - 1) + " str='" + val + "'"); } char ch = val.charAt(pos); if (ch == '\\') { pos++; if (pos >= end) break; ch = val.charAt(pos); switch (ch) { case 'n': ch = '\n'; break; case 't': ch = '\t'; break; case 'r': ch = '\r'; break; case 'b': ch = '\b'; break; case 'f': ch = '\f'; break; case 'u': if (pos + 4 >= end) { throw new ParseException("bad unicode escape \\uxxxx at pos" + (val_start - 1) + " str='" + val + "'"); } ch = (char) Integer.parseInt(val.substring(pos + 1, pos + 5), 16); pos += 4; break; } } else if (ch == delim) { pos++; // skip over the quote break; } sb.append(ch); pos++; } return sb.toString(); }
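getQuotedString distinguishes three failure modes (missing end quote, truncated escape, malformed \uxxxx) and reports each with a position relative to the opening quote, which is what makes the eventual BAD_REQUEST message actionable. A condensed sketch of just the \uxxxx branch and its bounds check, with java.text.ParseException standing in for Lucene's:

    import java.text.ParseException;

    public class UnicodeEscapeSketch {
      // Decode a unicode escape; pos points at the 'u' inside val.
      static char decode(String val, int pos) throws ParseException {
        if (pos + 4 >= val.length()) {  // need four hex digits after the 'u'
          throw new ParseException("bad unicode escape", pos);
        }
        // NumberFormatException (unchecked) still escapes on non-hex digits
        return (char) Integer.parseInt(val.substring(pos + 1, pos + 5), 16);
      }
    }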
// in core/src/java/org/apache/solr/search/grouping/distributed/command/QueryCommand.java
public Builder setQuery(String groupQueryString, SolrQueryRequest request) throws ParseException { QParser parser = QParser.getParser(groupQueryString, null, request); this.queryString = groupQueryString; return setQuery(parser.getQuery()); }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { public Query parse() throws ParseException { String fromField = getParam("from"); String fromIndex = getParam("fromIndex"); String toField = getParam("to"); String v = localParams.get("v"); Query fromQuery; long fromCoreOpenTime = 0; if (fromIndex != null && !fromIndex.equals(req.getCore().getCoreDescriptor().getName()) ) { CoreContainer container = req.getCore().getCoreDescriptor().getCoreContainer(); final SolrCore fromCore = container.getCore(fromIndex); RefCounted<SolrIndexSearcher> fromHolder = null; if (fromCore == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cross-core join: no such core " + fromIndex); } LocalSolrQueryRequest otherReq = new LocalSolrQueryRequest(fromCore, params); try { QParser parser = QParser.getParser(v, "lucene", otherReq); fromQuery = parser.getQuery(); fromHolder = fromCore.getRegisteredSearcher(); if (fromHolder != null) fromCoreOpenTime = fromHolder.get().getOpenTime(); } finally { otherReq.close(); fromCore.close(); if (fromHolder != null) fromHolder.decref(); } } else { QParser fromQueryParser = subQuery(v, null); fromQuery = fromQueryParser.getQuery(); } JoinQuery jq = new JoinQuery(fromField, toField, fromIndex, fromQuery); jq.fromCoreOpenTime = fromCoreOpenTime; return jq; } }; }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public Query parse() throws ParseException { String fromField = getParam("from"); String fromIndex = getParam("fromIndex"); String toField = getParam("to"); String v = localParams.get("v"); Query fromQuery; long fromCoreOpenTime = 0; if (fromIndex != null && !fromIndex.equals(req.getCore().getCoreDescriptor().getName()) ) { CoreContainer container = req.getCore().getCoreDescriptor().getCoreContainer(); final SolrCore fromCore = container.getCore(fromIndex); RefCounted<SolrIndexSearcher> fromHolder = null; if (fromCore == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cross-core join: no such core " + fromIndex); } LocalSolrQueryRequest otherReq = new LocalSolrQueryRequest(fromCore, params); try { QParser parser = QParser.getParser(v, "lucene", otherReq); fromQuery = parser.getQuery(); fromHolder = fromCore.getRegisteredSearcher(); if (fromHolder != null) fromCoreOpenTime = fromHolder.get().getOpenTime(); } finally { otherReq.close(); fromCore.close(); if (fromHolder != null) fromHolder.decref(); } } else { QParser fromQueryParser = subQuery(v, null); fromQuery = fromQueryParser.getQuery(); } JoinQuery jq = new JoinQuery(fromField, toField, fromIndex, fromQuery); jq.fromCoreOpenTime = fromCoreOpenTime; return jq; }
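The cross-core branch above acquires three resources (the other core, a local request against it, and a ref-counted searcher) and releases all of them in a finally block, so a ParseException from the nested QParser cannot leak a core reference. A generic sketch of that acquire/guard/release shape; Closeable stands in for SolrCore and the request, and close() for decref():

    import java.io.Closeable;
    import java.io.IOException;

    public class FinallyCleanupSketch {
      static void withResources(Closeable core) throws IOException {
        Closeable searcher = null;
        try {
          searcher = acquire();  // the work in between may throw
        } finally {
          // release in reverse order; runs on success and failure alike
          if (searcher != null) searcher.close();
          core.close();
        }
      }

      static Closeable acquire() {
        return () -> { /* no-op close */ };
      }
    }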
// in core/src/java/org/apache/solr/search/PrefixQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { @Override public Query parse() throws ParseException { return new PrefixQuery(new Term(localParams.get(QueryParsing.F), localParams.get(QueryParsing.V))); } }; }
// in core/src/java/org/apache/solr/search/PrefixQParserPlugin.java
@Override public Query parse() throws ParseException { return new PrefixQuery(new Term(localParams.get(QueryParsing.F), localParams.get(QueryParsing.V))); }
// in core/src/java/org/apache/solr/search/LuceneQParserPlugin.java
@Override public Query parse() throws ParseException { String qstr = getString(); if (qstr == null || qstr.length()==0) return null; String defaultField = getParam(CommonParams.DF); if (defaultField==null) { defaultField = getReq().getSchema().getDefaultSearchFieldName(); } lparser = new SolrQueryParser(this, defaultField); lparser.setDefaultOperator (QueryParsing.getQueryParserDefaultOperator(getReq().getSchema(), getParam(QueryParsing.OP))); return lparser.parse(qstr); }
// in core/src/java/org/apache/solr/search/LuceneQParserPlugin.java
@Override public Query parse() throws ParseException { // handle legacy "query;sort" syntax if (getLocalParams() == null) { String qstr = getString(); if (qstr == null || qstr.length() == 0) return null; sortStr = getParams().get(CommonParams.SORT); if (sortStr == null) { // sort may be legacy form, included in the query string List<String> commands = StrUtils.splitSmart(qstr,';'); if (commands.size() == 2) { qstr = commands.get(0); sortStr = commands.get(1); } else if (commands.size() == 1) { // This is needed to support the case where someone sends: "q=query;" qstr = commands.get(0); } else if (commands.size() > 2) { throw new ParseException("If you want to use multiple ';' in the query, use the 'sort' param."); } } setString(qstr); } return super.parse(); }
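The legacy split relies on StrUtils.splitSmart, so the behavior is easy to check in isolation; a minimal sketch:

    import java.util.List;
    import org.apache.solr.common.util.StrUtils;

    public class LegacySortSyntaxDemo {
      public static void main(String[] args) {
        // "ipod; price asc" -> two parts: query "ipod", sort " price asc";
        // "ipod;" -> one part; three or more parts raise the ParseException above.
        List<String> parts = StrUtils.splitSmart("ipod; price asc", ';');
        System.out.println(parts.size() + " " + parts);
      }
    }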
// in core/src/java/org/apache/solr/search/LuceneQParserPlugin.java
@Override public SortSpec getSort(boolean useGlobal) throws ParseException { SortSpec sort = super.getSort(useGlobal); if (sortStr != null && sortStr.length()>0 && sort.getSort()==null) { Sort oldSort = QueryParsing.parseSort(sortStr, getReq()); if( oldSort != null ) { sort.sort = oldSort; } } return sort; }
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getFieldQuery(String field, String queryText, boolean quoted) throws ParseException { checkNullField(field); // intercept magic field name of "_" to use as a hook for our // own functions. if (field.charAt(0) == '_' && parser != null) { MagicFieldName magic = MagicFieldName.get(field); if (null != magic) { QParser nested = parser.subQuery(queryText, magic.subParser); return nested.getQuery(); } } SchemaField sf = schema.getFieldOrNull(field); if (sf != null) { FieldType ft = sf.getType(); // delegate to type for everything except tokenized fields if (ft.isTokenized()) { return super.getFieldQuery(field, queryText, quoted || (ft instanceof TextField && ((TextField)ft).getAutoGeneratePhraseQueries())); } else { return sf.getType().getFieldQuery(parser, sf, queryText); } } // default to a normal field query return super.getFieldQuery(field, queryText, quoted); }
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getRangeQuery(String field, String part1, String part2, boolean startInclusive, boolean endInclusive) throws ParseException { checkNullField(field); SchemaField sf = schema.getField(field); return sf.getType().getRangeQuery(parser, sf, part1, part2, startInclusive, endInclusive); }
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getPrefixQuery(String field, String termStr) throws ParseException { checkNullField(field); termStr = analyzeIfMultitermTermText(field, termStr, schema.getFieldType(field)); // Solr has always used constant scoring for prefix queries. This should return constant scoring by default. return newPrefixQuery(new Term(field, termStr)); }
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getWildcardQuery(String field, String termStr) throws ParseException { // *:* -> MatchAllDocsQuery if ("*".equals(field) && "*".equals(termStr)) { return newMatchAllDocsQuery(); } FieldType fieldType = schema.getFieldType(field); termStr = analyzeIfMultitermTermText(field, termStr, fieldType); // can we use reversed wildcards in this field? ReversedWildcardFilterFactory factory = getReversedWildcardFilterFactory(fieldType); if (factory != null) { Term term = new Term(field, termStr); // fsa representing the query Automaton automaton = WildcardQuery.toAutomaton(term); // TODO: we should likely use the automaton to calculate shouldReverse, too. if (factory.shouldReverse(termStr)) { automaton = BasicOperations.concatenate(automaton, BasicAutomata.makeChar(factory.getMarkerChar())); SpecialOperations.reverse(automaton); } else { // reverse wildcardfilter is active: remove false positives // fsa representing false positives (markerChar*) Automaton falsePositives = BasicOperations.concatenate( BasicAutomata.makeChar(factory.getMarkerChar()), BasicAutomata.makeAnyString()); // subtract these away automaton = BasicOperations.minus(automaton, falsePositives); } return new AutomatonQuery(term, automaton) { // override toString so its completely transparent @Override public String toString(String field) { StringBuilder buffer = new StringBuilder(); if (!getField().equals(field)) { buffer.append(getField()); buffer.append(":"); } buffer.append(term.text()); buffer.append(ToStringUtils.boost(getBoost())); return buffer.toString(); } }; } // Solr has always used constant scoring for wildcard queries. This should return constant scoring by default. return newWildcardQuery(new Term(field, termStr)); }
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
@Override protected Query getRegexpQuery(String field, String termStr) throws ParseException { termStr = analyzeIfMultitermTermText(field, termStr, schema.getFieldType(field)); return newRegexpQuery(new Term(field, termStr)); }
// in core/src/java/org/apache/solr/search/BoostQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { QParser baseParser; ValueSource vs; String b; @Override public Query parse() throws ParseException { b = localParams.get(BOOSTFUNC); baseParser = subQuery(localParams.get(QueryParsing.V), null); Query q = baseParser.getQuery(); if (b == null) return q; Query bq = subQuery(b, FunctionQParserPlugin.NAME).getQuery(); if (bq instanceof FunctionQuery) { vs = ((FunctionQuery)bq).getValueSource(); } else { vs = new QueryValueSource(bq, 0.0f); } return new BoostedQuery(q, vs); } @Override public String[] getDefaultHighlightFields() { return baseParser.getDefaultHighlightFields(); } @Override public Query getHighlightQuery() throws ParseException { return baseParser.getHighlightQuery(); } @Override public void addDebugInfo(NamedList<Object> debugInfo) { // encapsulate base debug info in a sub-list? baseParser.addDebugInfo(debugInfo); debugInfo.add("boost_str",b); debugInfo.add("boost_parsed",vs); } }; }
// in core/src/java/org/apache/solr/search/BoostQParserPlugin.java
@Override public Query parse() throws ParseException { b = localParams.get(BOOSTFUNC); baseParser = subQuery(localParams.get(QueryParsing.V), null); Query q = baseParser.getQuery(); if (b == null) return q; Query bq = subQuery(b, FunctionQParserPlugin.NAME).getQuery(); if (bq instanceof FunctionQuery) { vs = ((FunctionQuery)bq).getValueSource(); } else { vs = new QueryValueSource(bq, 0.0f); } return new BoostedQuery(q, vs); }
// in core/src/java/org/apache/solr/search/BoostQParserPlugin.java
@Override public Query getHighlightQuery() throws ParseException { return baseParser.getHighlightQuery(); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { // TODO: dispatch through SpatialQueriable in the future? List<ValueSource> sources = fp.parseValueSourceList(); // "m" is a multi-value source, "x" is a single-value source // allow (m,m) (m,x,x) (x,x,m) (x,x,x,x) // if not enough points are present, "pt" will be checked first, followed by "sfield". MultiValueSource mv1 = null; MultiValueSource mv2 = null; if (sources.size() == 0) { // nothing to do now } else if (sources.size() == 1) { ValueSource vs = sources.get(0); if (!(vs instanceof MultiValueSource)) { throw new ParseException("geodist - invalid parameters:" + sources); } mv1 = (MultiValueSource)vs; } else if (sources.size() == 2) { ValueSource vs1 = sources.get(0); ValueSource vs2 = sources.get(1); if (vs1 instanceof MultiValueSource && vs2 instanceof MultiValueSource) { mv1 = (MultiValueSource)vs1; mv2 = (MultiValueSource)vs2; } else { mv1 = makeMV(sources, sources); } } else if (sources.size()==3) { ValueSource vs1 = sources.get(0); ValueSource vs2 = sources.get(1); if (vs1 instanceof MultiValueSource) { // (m,x,x) mv1 = (MultiValueSource)vs1; mv2 = makeMV(sources.subList(1,3), sources); } else { // (x,x,m) mv1 = makeMV(sources.subList(0,2), sources); vs1 = sources.get(2); if (!(vs1 instanceof MultiValueSource)) { throw new ParseException("geodist - invalid parameters:" + sources); } mv2 = (MultiValueSource)vs1; } } else if (sources.size()==4) { mv1 = makeMV(sources.subList(0,2), sources); mv2 = makeMV(sources.subList(2,4), sources); } else if (sources.size() > 4) { throw new ParseException("geodist - invalid parameters:" + sources); } if (mv1 == null) { mv1 = parsePoint(fp); mv2 = parseSfield(fp); } else if (mv2 == null) { mv2 = parsePoint(fp); if (mv2 == null) mv2 = parseSfield(fp); } if (mv1 == null || mv2 == null) { throw new ParseException("geodist - not enough parameters:" + sources); } // We have all the parameters at this point, now check if one of the points is constant double[] constants; constants = getConstants(mv1); MultiValueSource other = mv2; if (constants == null) { constants = getConstants(mv2); other = mv1; } if (constants != null && other instanceof VectorValueSource) { return new HaversineConstFunction(constants[0], constants[1], (VectorValueSource)other); } return new HaversineFunction(mv1, mv2, DistanceUtils.EARTH_MEAN_RADIUS_KM, true); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static VectorValueSource makeMV(List<ValueSource> sources, List<ValueSource> orig) throws ParseException { ValueSource vs1 = sources.get(0); ValueSource vs2 = sources.get(1); if (vs1 instanceof MultiValueSource || vs2 instanceof MultiValueSource) { throw new ParseException("geodist - invalid parameters:" + orig); } return new VectorValueSource(sources); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static MultiValueSource parsePoint(FunctionQParser fp) throws ParseException { String pt = fp.getParam(SpatialParams.POINT); if (pt == null) return null; double[] point = null; try { point = ParseUtils.parseLatitudeLongitude(pt); } catch (InvalidShapeException e) { throw new ParseException("Bad spatial pt:" + pt); } return new VectorValueSource(Arrays.<ValueSource>asList(new DoubleConstValueSource(point[0]),new DoubleConstValueSource(point[1]))); }
// in core/src/java/org/apache/solr/search/function/distance/HaversineConstFunction.java
private static MultiValueSource parseSfield(FunctionQParser fp) throws ParseException { String sfield = fp.getParam(SpatialParams.FIELD); if (sfield == null) return null; SchemaField sf = fp.getReq().getSchema().getField(sfield); ValueSource vs = sf.getType().getValueSource(sf, fp); if (!(vs instanceof MultiValueSource)) { throw new ParseException("Spatial field must implement MultiValueSource:" + sf); } return (MultiValueSource)vs; }
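Together, parse, makeMV, parsePoint and parseSfield give geodist three ways to obtain its two points: function arguments first, then the pt param, then the sfield param. A sketch of the matching request shapes, where the field name store is hypothetical and must resolve to a MultiValueSource (e.g. a point-typed field):

    import org.apache.solr.client.solrj.SolrQuery;

    public class GeodistDemo {
      public static void main(String[] args) {
        // Points passed as arguments: (sfield, constant lat, constant lon).
        SolrQuery byArgs = new SolrQuery("{!func}geodist(store,45.15,-93.85)");
        // No usable arguments: parse() falls back to the pt and sfield params.
        SolrQuery byParams = new SolrQuery("{!func}geodist()");
        byParams.set("pt", "45.15,-93.85");
        byParams.set("sfield", "store");
        System.out.println(byArgs + "\n" + byParams);
      }
    }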
// in core/src/java/org/apache/solr/search/FunctionRangeQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { ValueSource vs; String funcStr; @Override public Query parse() throws ParseException { funcStr = localParams.get(QueryParsing.V, null); Query funcQ = subQuery(funcStr, FunctionQParserPlugin.NAME).getQuery(); if (funcQ instanceof FunctionQuery) { vs = ((FunctionQuery)funcQ).getValueSource(); } else { vs = new QueryValueSource(funcQ, 0.0f); } String l = localParams.get("l"); String u = localParams.get("u"); boolean includeLower = localParams.getBool("incl",true); boolean includeUpper = localParams.getBool("incu",true); // TODO: add a score=val option to allow score to be the value ValueSourceRangeFilter rf = new ValueSourceRangeFilter(vs, l, u, includeLower, includeUpper); FunctionRangeQuery frq = new FunctionRangeQuery(rf); return frq; } }; }
// in core/src/java/org/apache/solr/search/FunctionRangeQParserPlugin.java
@Override public Query parse() throws ParseException { funcStr = localParams.get(QueryParsing.V, null); Query funcQ = subQuery(funcStr, FunctionQParserPlugin.NAME).getQuery(); if (funcQ instanceof FunctionQuery) { vs = ((FunctionQuery)funcQ).getValueSource(); } else { vs = new QueryValueSource(funcQ, 0.0f); } String l = localParams.get("l"); String u = localParams.get("u"); boolean includeLower = localParams.getBool("incl",true); boolean includeUpper = localParams.getBool("incu",true); // TODO: add a score=val option to allow score to be the value ValueSourceRangeFilter rf = new ValueSourceRangeFilter(vs, l, u, includeLower, includeUpper); FunctionRangeQuery frq = new FunctionRangeQuery(rf); return frq; }
// in core/src/java/org/apache/solr/search/RawQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { @Override public Query parse() throws ParseException { return new TermQuery(new Term(localParams.get(QueryParsing.F), localParams.get(QueryParsing.V))); } }; }
// in core/src/java/org/apache/solr/search/RawQParserPlugin.java
@Override public Query parse() throws ParseException { return new TermQuery(new Term(localParams.get(QueryParsing.F), localParams.get(QueryParsing.V))); }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
@Override public Query parse() throws ParseException { sp = new QueryParsing.StrParser(getString()); ValueSource vs = null; List<ValueSource> lst = null; for(;;) { ValueSource valsource = parseValueSource(false); sp.eatws(); if (!parseMultipleSources) { vs = valsource; break; } else { if (lst != null) { lst.add(valsource); } else { vs = valsource; } } // check if there is a "," separator if (sp.peek() != ',') break; consumeArgumentDelimiter(); if (lst == null) { lst = new ArrayList<ValueSource>(2); lst.add(valsource); } } if (parseToEnd && sp.pos < sp.end) { throw new ParseException("Unexpected text after function: " + sp.val.substring(sp.pos, sp.end)); } if (lst != null) { vs = new VectorValueSource(lst); } return new FunctionQuery(vs); }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public boolean hasMoreArguments() throws ParseException { int ch = sp.peek(); /* determine whether the function is ending with a paren or end of str */ return (! (ch == 0 || ch == ')') ); }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public String parseId() throws ParseException { String value = parseArg(); if (argWasQuoted) throw new ParseException("Expected identifier instead of quoted string:" + value); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public Float parseFloat() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected float instead of quoted string:" + str); float value = Float.parseFloat(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public double parseDouble() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected double instead of quoted string:" + str); double value = Double.parseDouble(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public int parseInt() throws ParseException { String str = parseArg(); if (argWasQuoted()) throw new ParseException("Expected int instead of quoted string:" + str); int value = Integer.parseInt(str); return value; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public String parseArg() throws ParseException { argWasQuoted = false; sp.eatws(); char ch = sp.peek(); String val = null; switch (ch) { case ')': return null; case '$': sp.pos++; String param = sp.getId(); val = getParam(param); break; case '\'': case '"': val = sp.getQuotedString(); argWasQuoted = true; break; default: // read unquoted literal ended by whitespace ',' or ')' // there is no escaping. int valStart = sp.pos; for (;;) { if (sp.pos >= sp.end) { throw new ParseException("Missing end to unquoted value starting at " + valStart + " str='" + sp.val +"'"); } char c = sp.val.charAt(sp.pos); if (c==')' || c==',' || Character.isWhitespace(c)) { val = sp.val.substring(valStart, sp.pos); break; } sp.pos++; } } sp.eatws(); consumeArgumentDelimiter(); return val; }
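parseArg therefore accepts four lexical shapes: a quoted literal, a $param dereference, a bare literal terminated by ',' ')' or whitespace, and ')' which yields null. A sketch using functions from the listings below (the field text and the request param qq are hypothetical):

    public class ParseArgShapesDemo {
      public static void main(String[] args) {
        String quoted   = "termfreq(text, 'solr')"; // quoted literal, spaces allowed
        String deref    = "termfreq(text, $qq)";    // value read from request param qq
        String unquoted = "termfreq(text, solr)";   // bare literal, no escaping
        String empty    = "numdocs()";              // ')' -> parseArg returns null
        System.out.println(quoted + "\n" + deref + "\n" + unquoted + "\n" + empty);
      }
    }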
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public List<ValueSource> parseValueSourceList() throws ParseException { List<ValueSource> sources = new ArrayList<ValueSource>(3); while (hasMoreArguments()) { sources.add(parseValueSource(true)); } return sources; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public ValueSource parseValueSource() throws ParseException { /* consume the delimiter afterward for an external call to parseValueSource */ return parseValueSource(true); }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
public Query parseNestedQuery() throws ParseException { Query nestedQuery; if (sp.opt("$")) { String param = sp.getId(); String qstr = getParam(param); qstr = qstr==null ? "" : qstr; nestedQuery = subQuery(qstr, null).getQuery(); } else { int start = sp.pos; String v = sp.val; String qs = v; HashMap nestedLocalParams = new HashMap<String,String>(); int end = QueryParsing.parseLocalParams(qs, start, nestedLocalParams, getParams()); QParser sub; if (end>start) { if (nestedLocalParams.get(QueryParsing.V) != null) { // value specified directly in local params... so the end of the // query should be the end of the local params. sub = subQuery(qs.substring(start, end), null); } else { // value here is *after* the local params... ask the parser. sub = subQuery(qs, null); // int subEnd = sub.findEnd(')'); // TODO.. implement functions to find the end of a nested query throw new ParseException("Nested local params must have value in v parameter. got '" + qs + "'"); } } else { throw new ParseException("Nested function query must use $param or {!v=value} forms. got '" + qs + "'"); } sp.pos += end-start; // advance past nested query nestedQuery = sub.getQuery(); } consumeArgumentDelimiter(); return nestedQuery; }
// in core/src/java/org/apache/solr/search/FunctionQParser.java
protected ValueSource parseValueSource(boolean doConsumeDelimiter) throws ParseException { ValueSource valueSource; int ch = sp.peek(); if (ch>='0' && ch<='9' || ch=='.' || ch=='+' || ch=='-') { Number num = sp.getNumber(); if (num instanceof Long) { valueSource = new LongConstValueSource(num.longValue()); } else if (num instanceof Double) { valueSource = new DoubleConstValueSource(num.doubleValue()); } else { // shouldn't happen valueSource = new ConstValueSource(num.floatValue()); } } else if (ch == '"' || ch == '\''){ valueSource = new LiteralValueSource(sp.getQuotedString()); } else if (ch == '$') { sp.pos++; String param = sp.getId(); String val = getParam(param); if (val == null) { throw new ParseException("Missing param " + param + " while parsing function '" + sp.val + "'"); } QParser subParser = subQuery(val, "func"); if (subParser instanceof FunctionQParser) { ((FunctionQParser)subParser).setParseMultipleSources(true); } Query subQuery = subParser.getQuery(); if (subQuery instanceof FunctionQuery) { valueSource = ((FunctionQuery) subQuery).getValueSource(); } else { valueSource = new QueryValueSource(subQuery, 0.0f); } /*** // dereference *simple* argument (i.e., can't currently be a function) // In the future we could support full function dereferencing via a stack of ValueSource (or StringParser) objects ch = val.length()==0 ? '\0' : val.charAt(0); if (ch>='0' && ch<='9' || ch=='.' || ch=='+' || ch=='-') { QueryParsing.StrParser sp = new QueryParsing.StrParser(val); Number num = sp.getNumber(); if (num instanceof Long) { valueSource = new LongConstValueSource(num.longValue()); } else if (num instanceof Double) { valueSource = new DoubleConstValueSource(num.doubleValue()); } else { // shouldn't happen valueSource = new ConstValueSource(num.floatValue()); } } else if (ch == '"' || ch == '\'') { QueryParsing.StrParser sp = new QueryParsing.StrParser(val); val = sp.getQuotedString(); valueSource = new LiteralValueSource(val); } else { if (val.length()==0) { valueSource = new LiteralValueSource(val); } else { String id = val; SchemaField f = req.getSchema().getField(id); valueSource = f.getType().getValueSource(f, this); } } ***/ } else { String id = sp.getId(); if (sp.opt("(")) { // a function... look it up. ValueSourceParser argParser = req.getCore().getValueSourceParser(id); if (argParser==null) { throw new ParseException("Unknown function " + id + " in FunctionQuery(" + sp + ")"); } valueSource = argParser.parse(this); sp.expect(")"); } else { if ("true".equals(id)) { valueSource = new BoolConstValueSource(true); } else if ("false".equals(id)) { valueSource = new BoolConstValueSource(false); } else { SchemaField f = req.getSchema().getField(id); valueSource = f.getType().getValueSource(f, this); } } } if (doConsumeDelimiter) consumeArgumentDelimiter(); return valueSource; }
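parseValueSource is the central dispatch for function-query syntax: a leading digit or sign means a numeric constant, a quote means a literal, '$' dereferences a request parameter, an identifier followed by '(' is looked up as a registered function, true/false become boolean constants, and any other bare identifier is a schema field. One sketch per branch (popularity, name and myboost are hypothetical names):

    public class ValueSourceDispatchDemo {
      public static void main(String[] args) {
        String number  = "sum(popularity, 1.5)";          // digit -> numeric const source
        String literal = "strdist(\"ipod\", name, edit)"; // quote -> LiteralValueSource
        String deref   = "abs($myboost)";                 // $param -> param parsed as nested func
        String func    = "abs(sum(popularity, 1))";       // id( -> SolrCore.getValueSourceParser(id)
        String field   = "popularity";                    // bare id -> schema field value source
        String bool    = "not(true)";                     // true/false -> BoolConstValueSource
        System.out.println(String.join("\n", number, literal, deref, func, field, bool));
      }
    }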
// in core/src/java/org/apache/solr/search/FunctionQParser.java
protected boolean consumeArgumentDelimiter() throws ParseException { /* if a list of args is ending, don't expect the comma */ if (hasMoreArguments()) { sp.expect(","); return true; } return false; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { final ValueSource source = fp.parseValueSource(); return new TestValueSource(source); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { String field = fp.parseId(); return new OrdFieldSource(field); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new LiteralValueSource(fp.getString()); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { String field = fp.parseId(); return new ReverseOrdFieldSource(field); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { // top(vs) is now a no-op ValueSource source = fp.parseValueSource(); return source; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); float slope = fp.parseFloat(); float intercept = fp.parseFloat(); return new LinearFloatFunction(source, slope, intercept); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); float m = fp.parseFloat(); float a = fp.parseFloat(); float b = fp.parseFloat(); return new ReciprocalFloatFunction(source, m, a, b); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); float min = fp.parseFloat(); float max = fp.parseFloat(); return new ScaleFloatFunction(source, min, max); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource a = fp.parseValueSource(); ValueSource b = fp.parseValueSource(); return new DivFloatFunction(a, b); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); float min = fp.parseFloat(); float max = fp.parseFloat(); float target = fp.parseFloat(); Float def = fp.hasMoreArguments() ? fp.parseFloat() : null; return new RangeMapFloatFunction(source, min, max, target, def); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource source = fp.parseValueSource(); return new SimpleFloatFunction(source) { @Override protected String name() { return "abs"; } @Override protected float func(int doc, FunctionValues vals) { return Math.abs(vals.floatVal(doc)); } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new SumFloatFunction(sources.toArray(new ValueSource[sources.size()])); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new ProductFloatFunction(sources.toArray(new ValueSource[sources.size()])); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource a = fp.parseValueSource(); ValueSource b = fp.parseValueSource(); return new DualFloatFunction(a, b) { @Override protected String name() { return "sub"; } @Override protected float func(int doc, FunctionValues aVals, FunctionValues bVals) { return aVals.floatVal(doc) - bVals.floatVal(doc); } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new VectorValueSource(fp.parseValueSourceList()); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { Query q = fp.parseNestedQuery(); float defVal = 0.0f; if (fp.hasMoreArguments()) { defVal = fp.parseFloat(); } return new QueryValueSource(q, defVal); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { Query q = fp.parseNestedQuery(); ValueSource vs = fp.parseValueSource(); BoostedQuery bq = new BoostedQuery(q, vs); return new QueryValueSource(bq, 0.0f); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { String f0 = fp.parseArg(); String qf = fp.parseArg(); return new JoinDocFreqValueSource( f0, qf ); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { double radius = fp.parseDouble(); //SOLR-2114, make the convert flag required, since the parser doesn't support much in the way of lookahead or the ability to convert a String into a ValueSource boolean convert = Boolean.parseBoolean(fp.parseArg()); MultiValueSource pv1; MultiValueSource pv2; ValueSource one = fp.parseValueSource(); ValueSource two = fp.parseValueSource(); if (fp.hasMoreArguments()) { List<ValueSource> s1 = new ArrayList<ValueSource>(); s1.add(one); s1.add(two); pv1 = new VectorValueSource(s1); ValueSource x2 = fp.parseValueSource(); ValueSource y2 = fp.parseValueSource(); List<ValueSource> s2 = new ArrayList<ValueSource>(); s2.add(x2); s2.add(y2); pv2 = new VectorValueSource(s2); } else { //check to see if we have multiValue source if (one instanceof MultiValueSource && two instanceof MultiValueSource){ pv1 = (MultiValueSource) one; pv2 = (MultiValueSource) two; } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Input must either be 2 MultiValueSources, or there must be 4 ValueSources"); } } return new HaversineFunction(pv1, pv2, radius, convert); }
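A sketch of the two hsin() shapes this parse accepts (store, dest, lat1 and friends are hypothetical fields; 6371 approximates the Earth radius in km, and true requests degree-to-radian conversion):

    public class HsinShapesDemo {
      public static void main(String[] args) {
        // radius, convert flag, then either two MultiValueSources...
        String twoPoints = "hsin(6371, true, store, dest)";
        // ...or four single-valued sources forming two (x,y) pairs.
        String fourCoords = "hsin(6371, true, lat1, lon1, lat2, lon2)";
        System.out.println(twoPoints + "\n" + fourCoords);
      }
    }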
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { double radius = fp.parseDouble(); ValueSource gh1 = fp.parseValueSource(); ValueSource gh2 = fp.parseValueSource(); return new GeohashHaversineFunction(gh1, gh2, radius); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource lat = fp.parseValueSource(); ValueSource lon = fp.parseValueSource(); return new GeohashFunction(lat, lon); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource str1 = fp.parseValueSource(); ValueSource str2 = fp.parseValueSource(); String distClass = fp.parseArg(); StringDistance dist = null; if (distClass.equalsIgnoreCase("jw")) { dist = new JaroWinklerDistance(); } else if (distClass.equalsIgnoreCase("edit")) { dist = new LevensteinDistance(); } else if (distClass.equalsIgnoreCase("ngram")) { int ngram = 2; if (fp.hasMoreArguments()) { ngram = fp.parseInt(); } dist = new NGramDistance(ngram); } else { dist = fp.req.getCore().getResourceLoader().newInstance(distClass, StringDistance.class); } return new StringDistanceFunction(str1, str2, dist); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { String fieldName = fp.parseArg(); SchemaField f = fp.getReq().getSchema().getField(fieldName); return f.getType().getValueSource(f, fp); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MaxFloatFunction(sources.toArray(new ValueSource[sources.size()])); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MinFloatFunction(sources.toArray(new ValueSource[sources.size()])); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); MVResult mvr = getMultiValueSources(sources); return new SquaredEuclideanFunction(mvr.mv1, mvr.mv2); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { float power = fp.parseFloat(); List<ValueSource> sources = fp.parseValueSourceList(); MVResult mvr = getMultiValueSources(sources); return new VectorDistanceFunction(power, mvr.mv1, mvr.mv2); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new DoubleConstValueSource(Math.PI); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new DoubleConstValueSource(Math.E); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new DocFreqValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new TotalTermFreqValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { String field = fp.parseArg(); return new SumTotalTermFreqValueSource(field); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new IDFValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new TermFreqValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { TInfo tinfo = parseTerm(fp); return new TFValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { String field = fp.parseArg(); return new NormValueSource(field); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new MaxDocValueSource(); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new NumDocsValueSource(); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new BoolConstValueSource(true); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new BoolConstValueSource(false); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource vs = fp.parseValueSource(); return new SimpleBoolFunction(vs) { @Override protected String name() { return "exists"; } @Override protected boolean func(int doc, FunctionValues vals) { return vals.exists(doc); } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource vs = fp.parseValueSource(); return new SimpleBoolFunction(vs) { @Override protected boolean func(int doc, FunctionValues vals) { return !vals.boolVal(doc); } @Override protected String name() { return "not"; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MultiBoolFunction(sources) { @Override protected String name() { return "and"; } @Override protected boolean func(int doc, FunctionValues[] vals) { for (FunctionValues dv : vals) if (!dv.boolVal(doc)) return false; return true; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MultiBoolFunction(sources) { @Override protected String name() { return "or"; } @Override protected boolean func(int doc, FunctionValues[] vals) { for (FunctionValues dv : vals) if (dv.boolVal(doc)) return true; return false; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { List<ValueSource> sources = fp.parseValueSourceList(); return new MultiBoolFunction(sources) { @Override protected String name() { return "xor"; } @Override protected boolean func(int doc, FunctionValues[] vals) { int nTrue=0, nFalse=0; for (FunctionValues dv : vals) { if (dv.boolVal(doc)) nTrue++; else nFalse++; } return nTrue != 0 && nFalse != 0; } }; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { ValueSource ifValueSource = fp.parseValueSource(); ValueSource trueValueSource = fp.parseValueSource(); ValueSource falseValueSource = fp.parseValueSource(); return new IfFunction(ifValueSource, trueValueSource, falseValueSource); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new DefFunction(fp.parseValueSourceList()); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
private static TInfo parseTerm(FunctionQParser fp) throws ParseException { TInfo tinfo = new TInfo(); tinfo.indexedField = tinfo.field = fp.parseArg(); tinfo.val = fp.parseArg(); tinfo.indexedBytes = new BytesRef(); FieldType ft = fp.getReq().getSchema().getFieldTypeNoEx(tinfo.field); if (ft == null) ft = new StrField(); if (ft instanceof TextField) { // need to do analysis on the term String indexedVal = tinfo.val; Query q = ft.getFieldQuery(fp, fp.getReq().getSchema().getFieldOrNull(tinfo.field), tinfo.val); if (q instanceof TermQuery) { Term term = ((TermQuery)q).getTerm(); tinfo.indexedField = term.field(); indexedVal = term.text(); } UnicodeUtil.UTF16toUTF8(indexedVal, 0, indexedVal.length(), tinfo.indexedBytes); } else { ft.readableToIndexed(tinfo.val, tinfo.indexedBytes); } return tinfo; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { String first = fp.parseArg(); String second = fp.parseArg(); if (first == null) first = "NOW"; Date d1 = getDate(fp, first); ValueSource v1 = d1 == null ? getValueSource(fp, first) : null; Date d2 = getDate(fp, second); ValueSource v2 = d2 == null ? getValueSource(fp, second) : null; // d constant // v field // dd constant // dv subtract field from constant // vd subtract constant from field // vv subtract fields final long ms1 = (d1 == null) ? 0 : d1.getTime(); final long ms2 = (d2 == null) ? 0 : d2.getTime(); // "d,dd" handle both constant cases if (d1 != null && v2 == null) { return new LongConstValueSource(ms1 - ms2); } // "v" just the date field if (v1 != null && v2 == null && d2 == null) { return v1; } // "dv" if (d1 != null && v2 != null) return new DualFloatFunction(new LongConstValueSource(ms1), v2) { @Override protected String name() { return "ms"; } @Override protected float func(int doc, FunctionValues aVals, FunctionValues bVals) { return ms1 - bVals.longVal(doc); } }; // "vd" if (v1 != null && d2 != null) return new DualFloatFunction(v1, new LongConstValueSource(ms2)) { @Override protected String name() { return "ms"; } @Override protected float func(int doc, FunctionValues aVals, FunctionValues bVals) { return aVals.longVal(doc) - ms2; } }; // "vv" if (v1 != null && v2 != null) return new DualFloatFunction(v1, v2) { @Override protected String name() { return "ms"; } @Override protected float func(int doc, FunctionValues aVals, FunctionValues bVals) { return aVals.longVal(doc) - bVals.longVal(doc); } }; return null; // shouldn't happen }
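The d/v matrix in the comments maps directly onto query shapes; a sketch (mydate, datefield1 and datefield2 are hypothetical date fields):

    public class MsFunctionDemo {
      public static void main(String[] args) {
        String nowMillis  = "ms()";                           // no args -> first defaults to NOW
        String constDiff  = "ms(NOW, 2000-01-01T00:00:00Z)";  // dd -> constant difference
        String fieldValue = "ms(mydate)";                     // v  -> the field's epoch millis
        String nowMinus   = "ms(NOW, mydate)";                // dv -> NOW minus the field value
        String fieldDiff  = "ms(datefield1, datefield2)";     // vv -> difference of two fields
        System.out.println(String.join("\n", nowMillis, constDiff, fieldValue, nowMinus, fieldDiff));
      }
    }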
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new Function(fp.parseValueSource()); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { return new Function(fp.parseValueSource(), fp.parseValueSource()); }
// in core/src/java/org/apache/solr/search/FieldQParserPlugin.java
@Override public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) { return new QParser(qstr, localParams, params, req) { @Override public Query parse() throws ParseException { String field = localParams.get(QueryParsing.F); String queryText = localParams.get(QueryParsing.V); SchemaField sf = req.getSchema().getField(field); FieldType ft = sf.getType(); return ft.getFieldQuery(this, sf, queryText); } }; }
// in core/src/java/org/apache/solr/search/FieldQParserPlugin.java
@Override public Query parse() throws ParseException { String field = localParams.get(QueryParsing.F); String queryText = localParams.get(QueryParsing.V); SchemaField sf = req.getSchema().getField(field); FieldType ft = sf.getType(); return ft.getFieldQuery(this, sf, queryText); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
@Override protected Query getFieldQuery(String field, String queryText, boolean quoted) throws ParseException { if (aliases.containsKey(field)) { Alias a = aliases.get(field); DisjunctionMaxQuery q = new DisjunctionMaxQuery(a.tie); /* we might not get any valid queries from delegation, * in which case we should return null */ boolean ok = false; for (String f : a.fields.keySet()) { Query sub = getFieldQuery(f,queryText,quoted); if (null != sub) { if (null != a.fields.get(f)) { sub.setBoost(a.fields.get(f)); } q.add(sub); ok = true; } } return ok ? q : null; } else { try { return super.getFieldQuery(field, queryText, quoted); } catch (Exception e) { return null; } } }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static List<Query> parseQueryStrings(SolrQueryRequest req, String[] queries) throws ParseException { if (null == queries || 0 == queries.length) return null; List<Query> out = new ArrayList<Query>(queries.length); for (String q : queries) { if (null != q && 0 != q.trim().length()) { out.add(QParser.getParser(q, null, req).getQuery()); } } return out; }
// in core/src/java/org/apache/solr/util/DateMathParser.java
public Date parseMath(String math) throws ParseException { Calendar cal = Calendar.getInstance(zone, loc); cal.setTime(getNow()); /* check for No-Op */ if (0==math.length()) { return cal.getTime(); } String[] ops = splitter.split(math); int pos = 0; while ( pos < ops.length ) { if (1 != ops[pos].length()) { throw new ParseException ("Multi character command found: \"" + ops[pos] + "\"", pos); } char command = ops[pos++].charAt(0); switch (command) { case '/': if (ops.length < pos + 1) { throw new ParseException ("Need a unit after command: \"" + command + "\"", pos); } try { round(cal, ops[pos++]); } catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); } break; case '+': /* fall through */ case '-': if (ops.length < pos + 2) { throw new ParseException ("Need a value and unit for command: \"" + command + "\"", pos); } int val = 0; try { val = Integer.valueOf(ops[pos++]); } catch (NumberFormatException e) { throw new ParseException ("Not a Number: \"" + ops[pos-1] + "\"", pos-1); } if ('-' == command) { val = 0 - val; } try { String unit = ops[pos++]; add(cal, val, unit); } catch (IllegalArgumentException e) { throw new ParseException ("Unit not recognized: \"" + ops[pos-1] + "\"", pos-1); } break; default: throw new ParseException ("Unrecognized command: \"" + command + "\"", pos-1); } } return cal.getTime(); }
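A sketch of driving parseMath directly, assuming the two-argument DateMathParser constructor used by this version of the class:

    import java.util.Date;
    import java.util.Locale;
    import java.util.TimeZone;
    import org.apache.solr.util.DateMathParser;

    public class DateMathDemo {
      public static void main(String[] args) throws java.text.ParseException {
        // Round down to the day, then add six months and three days.
        DateMathParser p = new DateMathParser(TimeZone.getTimeZone("UTC"), Locale.US);
        Date d = p.parseMath("/DAY+6MONTHS+3DAYS");
        System.out.println(d);
        // Unknown units or commands raise ParseException with the failing
        // position, as in the '/', '+' and '-' branches above.
      }
    }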
ParseException is caught 28 times, in the following catch blocks:

// in solrj/src/java/org/apache/solr/common/util/DateUtil.java
catch (ParseException pe) { // ignore this exception, we will try the next format }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (ParseException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid value for fetchMailSince: " + s, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (ParseException exp) { wrapAndThrow(SEVERE, exp, "Invalid expression for date"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
catch (ParseException e) { wrapAndThrow(SEVERE, e, "Invalid expression for date"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DateFormatTransformer.java
catch (ParseException e) { LOG.warn("Could not parse a Date field ", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (ParseException e) { throw new SolrException(ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch( ParseException ex ) { throw new RuntimeException( ex ); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (ParseException e) { // invalid rules throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { // try again, simple rules for a field name with no whitespace sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { // OK, it was an oddly named field fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname", e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); }
ParseException is caught with a throw (the catch block itself throws) 24 times, in the following catch blocks:

// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
catch (ParseException e) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid value for fetchMailSince: " + s, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/NumberFormatTransformer.java
catch (ParseException e) { throw new DataImportHandlerException( DataImportHandlerException.SEVERE, "Failed to apply NumberFormat on column: " + column, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileListEntityProcessor.java
catch (ParseException exp) { throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Invalid expression for date", exp); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (ParseException e) { throw new SolrException(ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch( ParseException ex ) { throw new RuntimeException( ex ); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (ParseException e) { // invalid rules throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/search/SurroundQParserPlugin.java
catch (org.apache.lucene.queryparser.surround.parser.ParseException pe) { throw new org.apache.lucene.queryparser.classic.ParseException( pe.getMessage()); }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { // try again, simple rules for a field name with no whitespace sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { // OK, it was an oddly named field fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname", e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); }
unknown (Lib) ParserConfigurationException 0 0 16
            
// in core/src/java/org/apache/solr/core/SolrCore.java
public SolrCore reload(SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException { // TODO - what if indexwriter settings have changed SolrConfig config = new SolrConfig(resourceLoader, getSolrConfig().getName(), null); IndexSchema schema = new IndexSchema(config, getSchema().getResourceName(), null); updateHandler.incref(); SolrCore core = new SolrCore(getName(), null, config, schema, coreDescriptor, updateHandler); return core; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public CoreContainer initialize() throws IOException, ParserConfigurationException, SAXException { CoreContainer cores = null; String solrHome = SolrResourceLoader.locateSolrHome(); File fconf = new File(solrHome, containerConfigFilename == null ? "solr.xml" : containerConfigFilename); log.info("looking for solr.xml: " + fconf.getAbsolutePath()); cores = new CoreContainer(solrHome); if (fconf.exists()) { cores.load(solrHome, fconf); } else { log.info("no solr.xml file found - using default"); cores.load(solrHome, new InputSource(new ByteArrayInputStream(DEF_SOLR_XML.getBytes("UTF-8")))); cores.configFile = fconf; } containerConfigFilename = cores.getConfigFile().getName(); return cores; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, File configFile ) throws ParserConfigurationException, IOException, SAXException { this.configFile = configFile; this.load(dir, new InputSource(configFile.toURI().toASCIIString())); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, InputSource cfgis) throws ParserConfigurationException, IOException, SAXException { if (null == dir) { // don't rely on SolrResourceLoader(), determine explicitly first dir = SolrResourceLoader.locateSolrHome(); } log.info("Loading CoreContainer using Solr Home: '{}'", dir); this.loader = new SolrResourceLoader(dir); solrHome = loader.getInstanceDir(); Config cfg = new Config(loader, null, cfgis, null, false); // keep orig config for persist to consult try { this.cfg = new Config(loader, null, copyDoc(cfg.getDocument())); } catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); } cfg.substituteProperties(); // Initialize Logging if(cfg.getBool("solr/logging/@enabled",true)) { String slf4jImpl = null; String fname = cfg.get("solr/logging/watcher/@class", null); try { slf4jImpl = StaticLoggerBinder.getSingleton().getLoggerFactoryClassStr(); if(fname==null) { if( slf4jImpl.indexOf("Log4j") > 0) { log.warn("Log watching is not yet implemented for log4j" ); } else if( slf4jImpl.indexOf("JDK") > 0) { fname = "JUL"; } } } catch(Throwable ex) { log.warn("Unable to read SLF4J version. LogWatcher will be disabled: "+ex); } // Now load the framework if(fname!=null) { if("JUL".equalsIgnoreCase(fname)) { logging = new JulWatcher(slf4jImpl); } // else if( "Log4j".equals(fname) ) { // logging = new Log4jWatcher(slf4jImpl); // } else { try { logging = loader.newInstance(fname, LogWatcher.class); } catch (Throwable e) { log.warn("Unable to load LogWatcher", e); } } if( logging != null ) { ListenerConfig v = new ListenerConfig(); v.size = cfg.getInt("solr/logging/watcher/@size",50); v.threshold = cfg.get("solr/logging/watcher/@threshold",null); if(v.size>0) { log.info("Registering Log Listener"); logging.registerListener(v, this); } } } } String dcoreName = cfg.get("solr/cores/@defaultCoreName", null); if(dcoreName != null && !dcoreName.isEmpty()) { defaultCoreName = dcoreName; } persistent = cfg.getBool("solr/@persistent", false); libDir = cfg.get("solr/@sharedLib", null); zkHost = cfg.get("solr/@zkHost" , null); adminPath = cfg.get("solr/cores/@adminPath", null); shareSchema = cfg.getBool("solr/cores/@shareSchema", DEFAULT_SHARE_SCHEMA); zkClientTimeout = cfg.getInt("solr/cores/@zkClientTimeout", DEFAULT_ZK_CLIENT_TIMEOUT); hostPort = cfg.get("solr/cores/@hostPort", DEFAULT_HOST_PORT); hostContext = cfg.get("solr/cores/@hostContext", DEFAULT_HOST_CONTEXT); host = cfg.get("solr/cores/@host", null); if(shareSchema){ indexSchemaCache = new ConcurrentHashMap<String ,IndexSchema>(); } adminHandler = cfg.get("solr/cores/@adminHandler", null ); managementPath = cfg.get("solr/cores/@managementPath", null ); zkClientTimeout = Integer.parseInt(System.getProperty("zkClientTimeout", Integer.toString(zkClientTimeout))); initZooKeeper(zkHost, zkClientTimeout); if (libDir != null) { File f = FileUtils.resolvePath(new File(dir), libDir); log.info( "loading shared library: "+f.getAbsolutePath() ); libLoader = SolrResourceLoader.createClassLoader(f, null); } if (adminPath != null) { if (adminHandler == null) { coreAdminHandler = new CoreAdminHandler(this); } else { coreAdminHandler = this.createMultiCoreHandler(adminHandler); } } try { containerProperties = readProperties(cfg, ((NodeList) cfg.evaluate(DEFAULT_HOST_CONTEXT, XPathConstants.NODESET)).item(0)); } catch (Throwable e) { SolrException.log(log,null,e); } NodeList nodes = (NodeList)cfg.evaluate("solr/cores/core", XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); try { String rawName = DOMUtil.getAttr(node, "name", null); if (null == rawName) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Each core in solr.xml must have a 'name'"); } String name = rawName; CoreDescriptor p = new CoreDescriptor(this, name, DOMUtil.getAttr(node, "instanceDir", null)); // deal with optional settings String opt = DOMUtil.getAttr(node, "config", null); if (opt != null) { p.setConfigName(opt); } opt = DOMUtil.getAttr(node, "schema", null); if (opt != null) { p.setSchemaName(opt); } if (zkController != null) { opt = DOMUtil.getAttr(node, "shard", null); if (opt != null && opt.length() > 0) { p.getCloudDescriptor().setShardId(opt); } opt = DOMUtil.getAttr(node, "collection", null); if (opt != null) { p.getCloudDescriptor().setCollectionName(opt); } opt = DOMUtil.getAttr(node, "roles", null); if(opt != null){ p.getCloudDescriptor().setRoles(opt); } } opt = DOMUtil.getAttr(node, "properties", null); if (opt != null) { p.setPropertiesName(opt); } opt = DOMUtil.getAttr(node, CoreAdminParams.DATA_DIR, null); if (opt != null) { p.setDataDir(opt); } p.setCoreProperties(readProperties(cfg, node)); SolrCore core = create(p); register(name, core, false); // track original names coreToOrigName.put(core, rawName); } catch (Throwable ex) { SolrException.log(log,null,ex); } } }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore create(CoreDescriptor dcore) throws ParserConfigurationException, IOException, SAXException { // Make the instanceDir relative to the cores instanceDir if not absolute File idir = new File(dcore.getInstanceDir()); if (!idir.isAbsolute()) { idir = new File(solrHome, dcore.getInstanceDir()); } String instanceDir = idir.getPath(); log.info("Creating SolrCore '{}' using instanceDir: {}", dcore.getName(), instanceDir); // Initialize the solr config SolrResourceLoader solrLoader = null; SolrConfig config = null; String zkConfigName = null; if(zkController == null) { solrLoader = new SolrResourceLoader(instanceDir, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties())); config = new SolrConfig(solrLoader, dcore.getConfigName(), null); } else { try { String collection = dcore.getCloudDescriptor().getCollectionName(); zkController.createCollectionZkNode(dcore.getCloudDescriptor()); zkConfigName = zkController.readConfigName(collection); if (zkConfigName == null) { log.error("Could not find config name for collection:" + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection); } solrLoader = new ZkSolrResourceLoader(instanceDir, zkConfigName, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties()), zkController); config = getSolrConfigFromZk(zkConfigName, dcore.getConfigName(), solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } IndexSchema schema = null; if (indexSchemaCache != null) { if (zkController != null) { File schemaFile = new File(dcore.getSchemaName()); if (!schemaFile.isAbsolute()) { schemaFile = new File(solrLoader.getInstanceDir() + "conf" + File.separator + dcore.getSchemaName()); } if (schemaFile.exists()) { String key = schemaFile.getAbsolutePath() + ":" + new SimpleDateFormat("yyyyMMddHHmmss", Locale.US).format(new Date( schemaFile.lastModified())); schema = indexSchemaCache.get(key); if (schema == null) { log.info("creating new schema object for core: " + dcore.name); schema = new IndexSchema(config, dcore.getSchemaName(), null); indexSchemaCache.put(key, schema); } else { log.info("re-using schema object for core: " + dcore.name); } } } else { // TODO: handle caching from ZooKeeper - perhaps using ZooKeepers versioning // Don't like this cache though - how does it empty as last modified changes? } } if(schema == null){ if(zkController != null) { try { schema = getSchemaFromZk(zkConfigName, dcore.getSchemaName(), config, solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } else { schema = new IndexSchema(config, dcore.getSchemaName(), null); } } SolrCore core = new SolrCore(dcore.getName(), null, config, schema, dcore); if (zkController == null && core.getUpdateHandler().getUpdateLog() != null) { // always kick off recovery if we are in standalone mode. core.getUpdateHandler().getUpdateLog().recoverFromLog(); } return core; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void reload(String name) throws ParserConfigurationException, IOException, SAXException { name= checkDefault(name); SolrCore core; synchronized(cores) { core = cores.get(name); } if (core == null) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name ); CoreDescriptor cd = core.getCoreDescriptor(); File instanceDir = new File(cd.getInstanceDir()); if (!instanceDir.isAbsolute()) { instanceDir = new File(getSolrHome(), cd.getInstanceDir()); } log.info("Reloading SolrCore '{}' using instanceDir: {}", cd.getName(), instanceDir.getAbsolutePath()); SolrResourceLoader solrLoader; if(zkController == null) { solrLoader = new SolrResourceLoader(instanceDir.getAbsolutePath(), libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties())); } else { try { String collection = cd.getCloudDescriptor().getCollectionName(); zkController.createCollectionZkNode(cd.getCloudDescriptor()); String zkConfigName = zkController.readConfigName(collection); if (zkConfigName == null) { log.error("Could not find config name for collection:" + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection); } solrLoader = new ZkSolrResourceLoader(instanceDir.getAbsolutePath(), zkConfigName, libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties()), zkController); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } SolrCore newCore = core.reload(solrLoader); // keep core to orig name link String origName = coreToOrigName.remove(core); if (origName != null) { coreToOrigName.put(newCore, origName); } register(name, newCore, false); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private SolrConfig getSolrConfigFromZk(String zkConfigName, String solrConfigFileName, SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException, KeeperException, InterruptedException { byte[] config = zkController.getConfigFileData(zkConfigName, solrConfigFileName); InputSource is = new InputSource(new ByteArrayInputStream(config)); is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(solrConfigFileName)); SolrConfig cfg = solrConfigFileName == null ? new SolrConfig( resourceLoader, SolrConfig.DEFAULT_CONF_FILE, is) : new SolrConfig( resourceLoader, solrConfigFileName, is); return cfg; }
2
            
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (ParserConfigurationException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
2
            
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (ParserConfigurationException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
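The two call sites above show the two standard treatments of this checked exception: CurrencyField.java wraps it into the domain SolrException, while Config.java logs it through the domain helper and rethrows the original unchanged (the variant counted as "Catch with Throw" in the statistics). A minimal self-contained sketch of the two variants, assuming an SLF4J logger; class and method names here are illustrative, not taken from Solr:

    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.parsers.ParserConfigurationException;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    class ParserIdioms {
      private static final Logger log = LoggerFactory.getLogger(ParserIdioms.class);

      // Variant 1 (CurrencyField style): wrap into an unchecked domain exception,
      // so callers only ever see the domain type.
      static void wrapIntoDomainException() {
        try {
          DocumentBuilderFactory.newInstance().newDocumentBuilder();
        } catch (ParserConfigurationException e) {
          throw new RuntimeException("Error parsing config", e); // stands in for SolrException
        }
      }

      // Variant 2 (Config style): log for diagnostics, then rethrow the original
      // checked exception unchanged so the signature still advertises it.
      static void logAndRethrow() throws ParserConfigurationException {
        try {
          DocumentBuilderFactory.newInstance().newDocumentBuilder();
        } catch (ParserConfigurationException e) {
          log.error("Exception during parsing", e);
          throw e;
        }
      }
    }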
1
unknown (Lib) PatternSyntaxException 0 0 0 2
            
// in core/src/java/org/apache/solr/update/processor/RegexReplaceProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid regex: " + patternParam, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException (SERVER_ERROR, "Invalid 'fieldRegex' pattern: " + s, e); }
2
            
// in core/src/java/org/apache/solr/update/processor/RegexReplaceProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid regex: " + patternParam, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException (SERVER_ERROR, "Invalid 'fieldRegex' pattern: " + s, e); }
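Both factories validate the configured pattern at initialization time, so an invalid regex fails once at startup as a SERVER_ERROR instead of failing per document at run time. A hedged sketch of that init-time validation (RegexConfig is a hypothetical name, and RuntimeException stands in for SolrException):

    import java.util.regex.Pattern;
    import java.util.regex.PatternSyntaxException;

    class RegexConfig {
      private final Pattern pattern;

      RegexConfig(String patternParam) {
        try {
          // compile once at configuration time; per-document code reuses the Pattern
          this.pattern = Pattern.compile(patternParam);
        } catch (PatternSyntaxException e) {
          // echo the offending input in the domain exception, as both factories do
          throw new RuntimeException("Invalid regex: " + patternParam, e);
        }
      }

      boolean matches(String input) {
        return pattern.matcher(input).matches();
      }
    }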
2
unknown (Lib) ProtocolException 0 0 0 1
            
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (ProtocolException e) { fatal("Shouldn't happen: HttpURLConnection doesn't support POST??"+e); }
0 0
unknown (Domain) ReplicationHandlerException
private static class ReplicationHandlerException extends InterruptedException {
    public ReplicationHandlerException(String message) {
      super(message);
    }
  }
1
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private int fetchPackets(FastInputStream fis) throws Exception {
  byte[] intbytes = new byte[4];
  byte[] longbytes = new byte[8];
  try {
    while (true) {
      if (stop) {
        stop = false;
        aborted = true;
        throw new ReplicationHandlerException("User aborted replication");
      }
      long checkSumServer = -1;
      fis.readFully(intbytes);
      // read the size of the packet
      int packetSize = readInt(intbytes);
      if (packetSize <= 0) {
        LOG.warn("No content received for file: " + currentFile);
        return NO_CONTENT;
      }
      if (buf.length < packetSize) buf = new byte[packetSize];
      if (checksum != null) {
        // read the checksum
        fis.readFully(longbytes);
        checkSumServer = readLong(longbytes);
      }
      // then read the packet of bytes
      fis.readFully(buf, 0, packetSize);
      // compare the checksum as sent from the master
      if (includeChecksum) {
        checksum.reset();
        checksum.update(buf, 0, packetSize);
        long checkSumClient = checksum.getValue();
        if (checkSumClient != checkSumServer) {
          LOG.error("Checksum not matched between client and server for: " + currentFile);
          // if the checksum is wrong it is a problem; return for retry
          return 1;
        }
      }
      // if everything is fine, write the packet down to the file
      fileChannel.write(ByteBuffer.wrap(buf, 0, packetSize));
      bytesDownloaded += packetSize;
      if (bytesDownloaded >= size) return 0;
      // errorCount is always set to zero after a successful packet
      errorCount = 0;
    }
  } catch (ReplicationHandlerException e) {
    throw e;
  } catch (Exception e) {
    LOG.warn("Error in fetching packets ", e);
    // for any failure, increment the error count
    errorCount++;
    // if it fails for the same packet for MAX_RETRIES, fail and come out
    if (errorCount > MAX_RETRIES) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e);
    }
    return ERR;
  }
}
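fetchPackets is the clearest use of ReplicationHandlerException as a control-flow sentinel: the user-abort exception is rethrown untouched, while every other failure only escalates to SolrException after MAX_RETRIES consecutive errors. A condensed sketch of that retry ladder (all names hypothetical; InterruptedException plays the role of the sentinel, which in SnapPuller is its subclass ReplicationHandlerException):

    class RetryLoop {
      private static final int MAX_RETRIES = 5; // illustrative limit
      private int errorCount = 0;

      private void readPacket() throws Exception {
        // ... read and verify one packet; any failure is thrown ...
      }

      int fetchOnce() throws Exception {
        try {
          readPacket();
          errorCount = 0;   // reset after every successful packet
          return 0;
        } catch (InterruptedException abort) {
          throw abort;      // sentinel (user abort): never retried, never wrapped
        } catch (Exception e) {
          if (++errorCount > MAX_RETRIES) {
            // escalate only after repeated failures for the same packet
            throw new RuntimeException("Fetch failed", e);
          }
          return -1;        // tell the caller to retry
        }
      }
    }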
0 0 2
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ReplicationHandlerException e) { LOG.error("User aborted Replication"); return false; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ReplicationHandlerException e) { throw e; }
1
            
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (ReplicationHandlerException e) { throw e; }
0
runtime (Lib) RuntimeException 145
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public void printLayout(String path, int indent, StringBuilder string) throws KeeperException, InterruptedException {
  byte[] data = getData(path, null, null, true);
  List<String> children = getChildren(path, null, true);
  StringBuilder dent = new StringBuilder();
  for (int i = 0; i < indent; i++) {
    dent.append(" ");
  }
  string.append(dent + path + " (" + children.size() + ")" + NEWL);
  if (data != null) {
    try {
      String dataString = new String(data, "UTF-8");
      if ((!path.endsWith(".txt") && !path.endsWith(".xml")) || path.endsWith(ZkStateReader.CLUSTER_STATE)) {
        if (path.endsWith(".xml")) {
          // this is the cluster state in xml format - let's pretty print
          dataString = prettyPrint(dataString);
        }
        string.append(dent + "DATA:\n" + dent + " " + dataString.replaceAll("\n", "\n" + dent + " ") + NEWL);
      } else {
        string.append(dent + "DATA: ...suppressed..." + NEWL);
      }
    } catch (UnsupportedEncodingException e) {
      // can't happen - UTF-8
      throw new RuntimeException(e);
    }
  }
  for (String child : children) {
    if (!child.equals("quota")) {
      try {
        printLayout(path + (path.equals("/") ? "" : "/") + child, indent + 1, string);
      } catch (NoNodeException e) {
        // must have gone away
      }
    }
  }
}
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
public static String prettyPrint(String input, int indent) { try { Source xmlInput = new StreamSource(new StringReader(input)); StringWriter stringWriter = new StringWriter(); StreamResult xmlOutput = new StreamResult(stringWriter); TransformerFactory transformerFactory = TransformerFactory.newInstance(); transformerFactory.setAttribute("indent-number", indent); Transformer transformer = transformerFactory.newTransformer(); transformer.setOutputProperty(OutputKeys.INDENT, "yes"); transformer.transform(xmlInput, xmlOutput); return xmlOutput.getWriter().toString(); } catch (Exception e) { throw new RuntimeException("Problem pretty printing XML", e); } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public static Object fromJSON(byte[] utf8) { // convert directly from bytes to chars // and parse directly from that instead of going through // intermediate strings or readers CharArr chars = new CharArr(); ByteUtils.UTF8toUTF16(utf8, 0, utf8.length, chars); JSONParser parser = new JSONParser(chars.getArray(), chars.getStart(), chars.length()); try { return ObjectBuilder.getVal(parser); } catch (IOException e) { throw new RuntimeException(e); // should never happen w/o using real IO } }
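fromJSON illustrates an idiom that accounts for many of the 145 RuntimeException throws: an API declared with a checked IOException is driven from a purely in-memory source, so the exception "can't happen" and is wrapped unchecked rather than polluting every caller's signature. A small self-contained sketch of the idiom (InMemoryIo is a hypothetical name):

    import java.io.IOException;
    import java.io.StringReader;

    class InMemoryIo {
      // A Reader over a String performs no real IO, so an IOException here would
      // indicate a bug, not an environmental failure -- wrap instead of declaring it.
      static int firstChar(String s) {
        try {
          return new StringReader(s).read();
        } catch (IOException e) {
          throw new RuntimeException(e); // should never happen without real IO
        }
      }
    }

The same reasoning appears in MapSolrParams.toString(), MultiMapSolrParams.toString(), and CharArr.flush() below.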
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public ZkNodeProps getLeaderProps(String collection, String shard, int timeout) throws InterruptedException { long timeoutAt = System.currentTimeMillis() + timeout; while (System.currentTimeMillis() < timeoutAt) { if (cloudState != null) { final CloudState currentState = cloudState; final ZkNodeProps nodeProps = currentState.getLeader(collection, shard); if (nodeProps != null) { return nodeProps; } } Thread.sleep(50); } throw new RuntimeException("No registered leader was found, collection:" + collection + " slice:" + shard); }
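getLeaderProps polls shared state until a deadline and converts expiry into an unchecked exception, so callers need no checked TimeoutException handling. A generic sketch of that poll-until-deadline loop (hypothetical names; it uses java.util.function.Supplier, which is newer than the Java version this code targets):

    import java.util.function.Supplier;

    class Polling {
      // Poll the supplier every 50 ms until it yields non-null or the deadline passes.
      static <T> T pollUntil(Supplier<T> source, long timeoutMs) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
          T value = source.get();
          if (value != null) {
            return value;
          }
          Thread.sleep(50);
        }
        throw new RuntimeException("No value became available within " + timeoutMs + " ms");
      }
    }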
// in solrj/src/java/org/apache/solr/common/util/IteratorChain.java
public void addIterator(Iterator<E> it) { if(itit!=null) throw new RuntimeException("all Iterators must be added before calling hasNext()"); iterators.add(it); }
// in solrj/src/java/org/apache/solr/common/util/IteratorChain.java
public E next() { if(current==null) { throw new RuntimeException("For an IteratorChain, hasNext() MUST be called before calling next()"); } return current.next(); }
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Object unmarshal(InputStream is) throws IOException {
  FastInputStream dis = FastInputStream.wrap(is);
  version = dis.readByte();
  if (version != VERSION) {
    throw new RuntimeException("Invalid version (expected " + VERSION + ", but " + version + ") or the data is not in 'javabin' format");
  }
  return readVal(dis);
}
// in solrj/src/java/org/apache/solr/common/util/JavaBinCodec.java
public Object readVal(FastInputStream dis) throws IOException { tagByte = dis.readByte(); // if ((tagByte & 0xe0) == 0) { // if top 3 bits are clear, this is a normal tag // OK, try type + size in single byte switch (tagByte >>> 5) { case STR >>> 5: return readStr(dis); case SINT >>> 5: return readSmallInt(dis); case SLONG >>> 5: return readSmallLong(dis); case ARR >>> 5: return readArray(dis); case ORDERED_MAP >>> 5: return readOrderedMap(dis); case NAMED_LST >>> 5: return readNamedList(dis); case EXTERN_STRING >>> 5: return readExternString(dis); } switch (tagByte) { case NULL: return null; case DATE: return new Date(dis.readLong()); case INT: return dis.readInt(); case BOOL_TRUE: return Boolean.TRUE; case BOOL_FALSE: return Boolean.FALSE; case FLOAT: return dis.readFloat(); case DOUBLE: return dis.readDouble(); case LONG: return dis.readLong(); case BYTE: return dis.readByte(); case SHORT: return dis.readShort(); case MAP: return readMap(dis); case SOLRDOC: return readSolrDocument(dis); case SOLRDOCLST: return readSolrDocumentList(dis); case BYTEARR: return readByteArray(dis); case ITERATOR: return readIterator(dis); case END: return END_OBJ; case SOLRINPUTDOC: return readSolrInputDocument(dis); } throw new RuntimeException("Unknown type " + tagByte); }
// in solrj/src/java/org/apache/solr/common/params/MapSolrParams.java
@Override public String toString() { StringBuilder sb = new StringBuilder(128); try { boolean first=true; for (Map.Entry<String,String> entry : map.entrySet()) { String key = entry.getKey(); String val = entry.getValue(); if (!first) sb.append('&'); first=false; sb.append(key); sb.append('='); StrUtils.partialURLEncodeVal(sb, val==null ? "" : val); } } catch (IOException e) {throw new RuntimeException(e);} // can't happen return sb.toString(); }
// in solrj/src/java/org/apache/solr/common/params/MultiMapSolrParams.java
@Override public String toString() { StringBuilder sb = new StringBuilder(128); try { boolean first=true; for (Map.Entry<String,String[]> entry : map.entrySet()) { String key = entry.getKey(); String[] valarr = entry.getValue(); for (String val : valarr) { if (!first) sb.append('&'); first=false; sb.append(key); sb.append('='); StrUtils.partialURLEncodeVal(sb, val==null ? "" : val); } } } catch (IOException e) {throw new RuntimeException(e);} // can't happen return sb.toString(); }
// in solrj/src/java/org/apache/solr/common/params/ModifiableSolrParams.java
@Override public String toString() { StringBuilder sb = new StringBuilder(128); try { boolean first=true; for (Map.Entry<String,String[]> entry : vals.entrySet()) { String key = entry.getKey(); String[] valarr = entry.getValue(); for (String val : valarr) { if (!first) sb.append('&'); first=false; sb.append(key); sb.append('='); if( val != null ) { sb.append( URLEncoder.encode( val, "UTF-8" ) ); } } } } catch (IOException e) {throw new RuntimeException(e);} // can't happen return sb.toString(); }
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
private ContentStream getDelegate() { if (contentStream == null) { try { contentStream = getContentStream(req); } catch (IOException e) { throw new RuntimeException("Unable to write xml into a stream", e); } } return contentStream; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); if( action.equals(CoreAdminAction.CREATE) ) { params.set( CoreAdminParams.NAME, core ); } else { params.set( CoreAdminParams.CORE, core ); } params.set( CoreAdminParams.INSTANCE_DIR, instanceDir); if (configName != null) { params.set( CoreAdminParams.CONFIG, configName); } if (schemaName != null) { params.set( CoreAdminParams.SCHEMA, schemaName); } if (dataDir != null) { params.set( CoreAdminParams.DATA_DIR, dataDir); } if (collection != null) { params.set( CoreAdminParams.COLLECTION, collection); } if (numShards != null) { params.set( ZkStateReader.NUM_SHARDS_PROP, numShards); } if (shardId != null) { params.set( CoreAdminParams.SHARD, shardId); } if (roles != null) { params.set( CoreAdminParams.ROLES, roles); } return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); params.set( CoreAdminParams.CORE, core ); if (nodeName != null) { params.set( "nodeName", nodeName); } if (coreNodeName != null) { params.set( "coreNodeName", coreNodeName); } if (state != null) { params.set( "state", state); } if (checkLive != null) { params.set( "checkLive", checkLive); } if (pauseFor != null) { params.set( "pauseFor", pauseFor); } return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); params.set( CoreAdminParams.CORE, core ); return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); if (fileName != null) { params.set( CoreAdminParams.FILE, fileName); } return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if (action == null) { throw new RuntimeException("no action specified!"); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set(CoreAdminParams.ACTION, action.toString()); params.set(CoreAdminParams.CORE, core); if (indexDirs != null) { for (String indexDir : indexDirs) { params.set(CoreAdminParams.INDEX_DIR, indexDir); } } if (srcCores != null) { for (String srcCore : srcCores) { params.set(CoreAdminParams.SRC_CORE, srcCore); } } return params; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public SolrParams getParams() { if( action == null ) { throw new RuntimeException( "no action specified!" ); } ModifiableSolrParams params = new ModifiableSolrParams(); params.set( CoreAdminParams.ACTION, action.toString() ); params.set( CoreAdminParams.CORE, core ); if (other != null) { params.set(CoreAdminParams.OTHER, other); } return params; }
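The six CoreAdminRequest.getParams() overrides above all repeat the same null-action guard before building their parameter maps. A hedged sketch of hoisting that precondition into one shared helper (Preconditions and requireAction are hypothetical names, not Solr API):

    class Preconditions {
      // Shared guard: every getParams() variant fails the same way on a missing action.
      static <T> T requireAction(T action) {
        if (action == null) {
          throw new RuntimeException("no action specified!");
        }
        return action;
      }
    }

Each override could then start with requireAction(action) instead of restating the if/throw.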
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
@Override public NamedList<Object> processResponse(Reader reader) { throw new RuntimeException("Cannot handle character stream"); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryRequestWriter.java
public Reader getReader() throws IOException { throw new RuntimeException("No reader available. This is a binary stream"); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
@Override public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException { SolrDocumentList solrDocs = new SolrDocumentList(); List list = (List) readVal(dis); solrDocs.setNumFound((Long) list.get(0)); solrDocs.setStart((Long) list.get(1)); solrDocs.setMaxScore((Float) list.get(2)); callback.streamDocListInfo( solrDocs.getNumFound(), solrDocs.getStart(), solrDocs.getMaxScore() ); // Read the Array tagByte = dis.readByte(); if( (tagByte >>> 5) != (ARR >>> 5) ) { throw new RuntimeException( "doclist must have an array" ); } int sz = readSize(dis); for (int i = 0; i < sz; i++) { // must be a SolrDocument readVal( dis ); } return solrDocs; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public String removeSolrServer(String server) { try { server = new URL(server).toExternalForm(); } catch (MalformedURLException e) { throw new RuntimeException(e); } if (server.endsWith("/")) { server = server.substring(0, server.length() - 1); } // there is a small race condition here - if the server is in the process of being moved between // lists, we could fail to remove it. removeFromAlive(server); zombieServers.remove(server); return null; }
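removeSolrServer treats a malformed server URL as a caller bug: new URL(...).toExternalForm() both validates and canonicalizes the string, and the checked MalformedURLException is converted to an unchecked one. A sketch of that normalize-or-fail step in isolation (hypothetical class and method names):

    import java.net.MalformedURLException;
    import java.net.URL;

    class Urls {
      // Validate and canonicalize; a bad URL is a programming error, so fail unchecked.
      static String normalize(String server) {
        try {
          server = new URL(server).toExternalForm();
        } catch (MalformedURLException e) {
          throw new RuntimeException(e);
        }
        return server.endsWith("/") ? server.substring(0, server.length() - 1) : server;
      }
    }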
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected NamedList<Object> readNamedList( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } StringBuilder builder = new StringBuilder(); NamedList<Object> nl = new SimpleOrderedMap<Object>(); KnownType type = null; String name = null; // just eat up the events... int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); // reset the text type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } /** The name in a NamedList can actually be null if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } **/ if( !type.isLeaf ) { switch( type ) { case LST: nl.add( name, readNamedList( parser ) ); depth--; continue; case ARR: nl.add( name, readArray( parser ) ); depth--; continue; case RESULT: nl.add( name, readDocuments( parser ) ); depth--; continue; case DOC: nl.add( name, readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return nl; } //System.out.println( "NL:ELEM:"+type+"::"+name+"::"+builder ); nl.add( name, type.read( builder.toString().trim() ) ); break; case XMLStreamConstants.SPACE: // TODO? should this be trimmed? make sure it only gets one/two space? case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected List<Object> readArray( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"arr".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'arr', not: "+parser.getLocalName() ); } StringBuilder builder = new StringBuilder(); KnownType type = null; List<Object> vals = new ArrayList<Object>(); int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; KnownType t = KnownType.get( parser.getLocalName() ); if( t == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } if( type == null ) { type = t; } /*** actually, there is no rule that arrays need the same type else if( type != t && !(t == KnownType.NULL || type == KnownType.NULL)) { throw new RuntimeException( "arrays must have the same type! ("+type+"!="+t+") "+parser.getLocalName() ); } ***/ type = t; builder.setLength( 0 ); // reset the text if( !type.isLeaf ) { switch( type ) { case LST: vals.add( readNamedList( parser ) ); depth--; continue; case ARR: vals.add( readArray( parser ) ); depth--; continue; case RESULT: vals.add( readDocuments( parser ) ); depth--; continue; case DOC: vals.add( readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return vals; // the last element is itself } //System.out.println( "ARR:"+type+"::"+builder ); Object val = type.read( builder.toString().trim() ); if( val == null && type != KnownType.NULL) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } vals.add( val ); break; case XMLStreamConstants.SPACE: // TODO? should this be trimmed? make sure it only gets one/two space? case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocumentList readDocuments( XMLStreamReader parser ) throws XMLStreamException { SolrDocumentList docs = new SolrDocumentList(); // Parse the attributes for( int i=0; i<parser.getAttributeCount(); i++ ) { String n = parser.getAttributeLocalName( i ); String v = parser.getAttributeValue( i ); if( "numFound".equals( n ) ) { docs.setNumFound( Long.parseLong( v ) ); } else if( "start".equals( n ) ) { docs.setStart( Long.parseLong( v ) ); } else if( "maxScore".equals( n ) ) { docs.setMaxScore( Float.parseFloat( v ) ); } } // Read through each document int event; while( true ) { event = parser.next(); if( XMLStreamConstants.START_ELEMENT == event ) { if( !"doc".equals( parser.getLocalName() ) ) { throw new RuntimeException( "should be doc! "+parser.getLocalName() + " :: " + parser.getLocation() ); } docs.add( readDocument( parser ) ); } else if ( XMLStreamConstants.END_ELEMENT == event ) { return docs; // only happens once } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocument readDocument( XMLStreamReader parser ) throws XMLStreamException {
  if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) {
    throw new RuntimeException( "must be start element, not: "+parser.getEventType() );
  }
  if( !"doc".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) {
    throw new RuntimeException( "must be 'doc', not: "+parser.getLocalName() );
  }
  SolrDocument doc = new SolrDocument();
  StringBuilder builder = new StringBuilder();
  KnownType type = null;
  String name = null;
  // just eat up the events...
  int depth = 0;
  while( true ) {
    switch (parser.next()) {
      case XMLStreamConstants.START_ELEMENT:
        depth++;
        builder.setLength( 0 ); // reset the text
        type = KnownType.get( parser.getLocalName() );
        if( type == null ) {
          throw new RuntimeException( "this must be a known type! not: "+parser.getLocalName() );
        }
        name = null;
        int cnt = parser.getAttributeCount();
        for( int i=0; i<cnt; i++ ) {
          if( "name".equals( parser.getAttributeLocalName( i ) ) ) {
            name = parser.getAttributeValue( i );
            break;
          }
        }
        if( name == null ) {
          throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() );
        }
        // Handle multi-valued fields
        if( type == KnownType.ARR ) {
          for( Object val : readArray( parser ) ) {
            doc.addField( name, val );
          }
          depth--; // the array reading clears out the 'endElement'
        } else if( type == KnownType.LST ) {
          doc.addField( name, readNamedList( parser ) );
          depth--;
        } else if( !type.isLeaf ) {
          throw new XMLStreamException( "must be value or array", parser.getLocation() );
        }
        break;
      case XMLStreamConstants.END_ELEMENT:
        if( --depth < 0 ) {
          return doc;
        }
        Object val = type.read( builder.toString().trim() );
        if( val == null ) {
          throw new XMLStreamException( "error reading value:"+type, parser.getLocation() );
        }
        doc.addField( name, val );
        break;
      case XMLStreamConstants.SPACE: // TODO? should this be trimmed? make sure it only gets one/two space?
      case XMLStreamConstants.CDATA:
      case XMLStreamConstants.CHARACTERS:
        builder.append( parser.getText() );
        break;
    }
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
public static String toQueryString( SolrParams params, boolean xml ) { StringBuilder sb = new StringBuilder(128); try { String amp = xml ? "&amp;" : "&"; boolean first=true; Iterator<String> names = params.getParameterNamesIterator(); while( names.hasNext() ) { String key = names.next(); String[] valarr = params.getParams( key ); if( valarr == null ) { sb.append( first?"?":amp ); sb.append(key); first=false; } else { for (String val : valarr) { sb.append( first? "?":amp ); sb.append(key); if( val != null ) { sb.append('='); sb.append( URLEncoder.encode( val, "UTF-8" ) ); } first=false; } } } } catch (IOException e) {throw new RuntimeException(e);} // can't happen return sb.toString(); }
// in solrj/src/java/org/apache/noggit/CharArr.java
@Override public void flush() { try { sink.write(buf, start, end-start); } catch (IOException e) { throw new RuntimeException(e); } start = end = 0; }
// in solrj/src/java/org/apache/noggit/CharArr.java
@Override public void write(char b[], int off, int len) { int space = buf.length - end; if (len < space) { unsafeWrite(b, off, len); } else if (len < buf.length) { unsafeWrite(b, off, space); flush(); unsafeWrite(b, off+space, len-space); } else { flush(); try { sink.write(b, off, len); } catch (IOException e) { throw new RuntimeException(e); } } }
// in solrj/src/java/org/apache/noggit/CharArr.java
@Override public void write(String s, int stringOffset, int len) { int space = buf.length - end; if (len < space) { s.getChars(stringOffset, stringOffset+len, buf, end); end += len; } else if (len < buf.length) { // if the data to write is small enough, buffer it. s.getChars(stringOffset, stringOffset+space, buf, end); flush(); s.getChars(stringOffset+space, stringOffset+len, buf, 0); end = len-space; } else { flush(); // don't buffer, just write to sink try { sink.write(s, stringOffset, len); } catch (IOException e) { throw new RuntimeException(e); } } }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
private Collator createFromRules(String fileName, ResourceLoader loader) { InputStream input = null; try { input = loader.openResource(fileName); String rules = IOUtils.toString(input, "UTF-8"); return new RuleBasedCollator(rules); } catch (Exception e) { // io error or invalid rules throw new RuntimeException(e); } finally { IOUtils.closeQuietly(input); } }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
private BytesRef analyzeRangePart(String field, String part) { TokenStream source; try { source = analyzer.tokenStream(field, new StringReader(part)); source.reset(); } catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); } TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class); BytesRef bytes = termAtt.getBytesRef(); // we control the analyzer here: most errors are impossible try { if (!source.incrementToken()) throw new IllegalArgumentException("analyzer returned no terms for range part: " + part); termAtt.fillBytesRef(); assert !source.incrementToken(); } catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); } try { source.end(); source.close(); } catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); } return BytesRef.deepCopyOf(bytes); }
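analyzeRangePart wraps each stage of the TokenStream lifecycle (reset, incrementToken, end/close) in its own try block, so the RuntimeException message records which stage failed on which input. A reduced sketch of that per-stage wrapping, independent of the Lucene API (Stage and runStage are hypothetical names):

    import java.io.IOException;

    class StagedWrapping {
      interface Stage { void run() throws IOException; }

      // Run one lifecycle stage; the wrapper's message names the failing stage.
      static void runStage(String stageName, Stage stage) {
        try {
          stage.run();
        } catch (IOException e) {
          throw new RuntimeException("Error during stage: " + stageName, e);
        }
      }
    }

With a helper like this, the three separate try blocks above collapse into runStage("reset", ...), runStage("increment", ...), and runStage("end & close", ...) while keeping the same diagnostic detail.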
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
@Override public void run() { try { xpathReader.streamRecords(data, new XPathRecordReader.Handler() { @SuppressWarnings("unchecked") public void handle(Map<String, Object> record, String xpath) { if (isEnd.get()) { throwExp.set(false); // end the streaming; otherwise the parsing will go on forever even though the consumer has gone away throw new RuntimeException("BREAK"); } Map<String, Object> row; try { row = readRow(record, xpath); } catch (Exception e) { isEnd.set(true); return; } offer(row); } });
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
static File getFile(String basePath, String query) { try { File file0 = new File(query); File file = file0; if (!file.isAbsolute()) file = new File(basePath + query); if (file.isFile() && file.canRead()) { LOG.debug("Accessing File: " + file.toString()); return file; } else if (file != file0) if (file0.isFile() && file0.canRead()) { LOG.debug("Accessing File0: " + file0.toString()); return file0; } throw new FileNotFoundException("Could not find file: " + query); } catch (FileNotFoundException e) { throw new RuntimeException(e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldStreamDataSource.java
Override public InputStream getData(String query) { Object o = wrapper.getVariableResolver().resolve(dataField); if (o == null) { throw new DataImportHandlerException(SEVERE, "No field available for name : " + dataField); } if (o instanceof Blob) { Blob blob = (Blob) o; try { //Most of the JDBC drivers have getBinaryStream defined as public // so let us just check it Method m = blob.getClass().getDeclaredMethod("getBinaryStream"); if (Modifier.isPublic(m.getModifiers())) { return (InputStream) m.invoke(blob); } else { // force invoke m.setAccessible(true); return (InputStream) m.invoke(blob); } } catch (Exception e) { LOG.info("Unable to get data from BLOB"); return null; } } else if (o instanceof byte[]) { byte[] bytes = (byte[]) o; return new ByteArrayInputStream(bytes); } else { throw new RuntimeException("unsupported type : " + o.getClass()); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void addField0(String xpath, String name, boolean multiValued, boolean isRecord, int flags) {
  if (!xpath.startsWith("/"))
    throw new RuntimeException("xpath must start with '/' : " + xpath);
  List<String> paths = splitEscapeQuote(xpath);
  // deal with how split behaves when the separator starts the string!
  if ("".equals(paths.get(0).trim()))
    paths.remove(0);
  rootNode.build(paths, name, multiValued, isRecord, flags);
  rootNode.buildOptimise(null);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
public void streamRecords(Reader r, Handler handler) { try { XMLStreamReader parser = factory.createXMLStreamReader(r); rootNode.parse(parser, handler, new HashMap<String, Object>(), new Stack<Set<String>>(), false); } catch (Exception e) { throw new RuntimeException(e); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
private void buildDocument(VariableResolverImpl vr, DocWrapper doc, Map<String,Object> pk, EntityProcessorWrapper epw, boolean isRoot, ContextImpl parentCtx) { List<EntityProcessorWrapper> entitiesToDestroy = new ArrayList<EntityProcessorWrapper>(); try { buildDocument(vr, doc, pk, epw, isRoot, parentCtx, entitiesToDestroy); } catch (Exception e) { throw new RuntimeException(e); } finally { for (EntityProcessorWrapper entityWrapper : entitiesToDestroy) { entityWrapper.destroy(); } resetEntity(epw); } }
// in contrib/velocity/src/java/org/apache/solr/response/SolrParamResourceLoader.java
Override public InputStream getResourceStream(String s) throws ResourceNotFoundException { String template = templates.get(s); try { return template == null ? null : new ByteArrayInputStream(template.getBytes("UTF-8")); } catch (UnsupportedEncodingException e) { throw new RuntimeException(e); // may not happen } }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
private VelocityEngine getEngine(SolrQueryRequest request) { VelocityEngine engine = new VelocityEngine(); String template_root = request.getParams().get("v.base_dir"); File baseDir = new File(request.getCore().getResourceLoader().getConfigDir(), "velocity"); if (template_root != null) { baseDir = new File(template_root); } engine.setProperty(RuntimeConstants.FILE_RESOURCE_LOADER_PATH, baseDir.getAbsolutePath()); engine.setProperty("params.resource.loader.instance", new SolrParamResourceLoader(request)); SolrVelocityResourceLoader resourceLoader = new SolrVelocityResourceLoader(request.getCore().getSolrConfig().getResourceLoader()); engine.setProperty("solr.resource.loader.instance", resourceLoader); // TODO: Externalize Velocity properties engine.setProperty(RuntimeConstants.RESOURCE_LOADER, "params,file,solr"); String propFile = request.getParams().get("v.properties"); try { if (propFile == null) engine.init(); else { InputStream is = null; try { is = resourceLoader.getResourceStream(propFile); Properties props = new Properties(); props.load(is); engine.init(props); } finally { if (is != null) is.close(); } } } catch (Exception e) { throw new RuntimeException(e); } return engine; }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
protected Set<BytesRef> getQueryTokenSet(String query, Analyzer analyzer) {
  try {
    final Set<BytesRef> tokens = new HashSet<BytesRef>();
    final TokenStream tokenStream = analyzer.tokenStream("", new StringReader(query));
    final TermToBytesRefAttribute bytesAtt = tokenStream.getAttribute(TermToBytesRefAttribute.class);
    final BytesRef bytes = bytesAtt.getBytesRef();
    tokenStream.reset();
    while (tokenStream.incrementToken()) {
      bytesAtt.fillBytesRef();
      tokens.add(BytesRef.deepCopyOf(bytes));
    }
    tokenStream.end();
    tokenStream.close();
    return tokens;
  } catch (IOException ioe) {
    throw new RuntimeException("Error occurred while iterating over tokenstream", ioe);
  }
}
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
private List<AttributeSource> analyzeTokenStream(TokenStream tokenStream) {
  final List<AttributeSource> tokens = new ArrayList<AttributeSource>();
  final PositionIncrementAttribute posIncrAtt = tokenStream.addAttribute(PositionIncrementAttribute.class);
  final TokenTrackingAttribute trackerAtt = tokenStream.addAttribute(TokenTrackingAttribute.class);
  // for backwards compatibility, add all "common" attributes
  tokenStream.addAttribute(OffsetAttribute.class);
  tokenStream.addAttribute(TypeAttribute.class);
  try {
    tokenStream.reset();
    int position = 0;
    while (tokenStream.incrementToken()) {
      position += posIncrAtt.getPositionIncrement();
      trackerAtt.setActPosition(position);
      tokens.add(tokenStream.cloneAttributes());
    }
  } catch (IOException ioe) {
    throw new RuntimeException("Error occurred while iterating over tokenstream", ioe);
  }
  return tokens;
}
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { throw new RuntimeException("This is a binary writer; cannot write to a character stream"); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
Override public URL[] getDocs() { try { return new URL[]{ new URL("http://wiki.apache.org/solr/QueryElevationComponent") }; } catch (MalformedURLException e) { throw new RuntimeException(e); } }
// in core/src/java/org/apache/solr/handler/component/SpellCheckComponent.java
public void inform(SolrCore core) {
  if (initParams != null) {
    LOG.info("Initializing spell checkers");
    boolean hasDefault = false;
    for (int i = 0; i < initParams.size(); i++) {
      if (initParams.getName(i).equals("spellchecker")) {
        NamedList spellchecker = (NamedList) initParams.getVal(i);
        String className = (String) spellchecker.get("classname");
        // TODO: this is a little bit sneaky: warn if the class isn't supplied
        // so that it's mandatory in a future release?
        if (className == null) className = IndexBasedSpellChecker.class.getName();
        SolrResourceLoader loader = core.getResourceLoader();
        SolrSpellChecker checker = loader.newInstance(className, SolrSpellChecker.class);
        if (checker != null) {
          String dictionary = checker.init(spellchecker, core);
          if (dictionary != null) {
            boolean isDefault = dictionary.equals(SolrSpellChecker.DEFAULT_DICTIONARY_NAME);
            if (isDefault && !hasDefault) {
              hasDefault = true;
            } else if (isDefault && hasDefault) {
              throw new RuntimeException("More than one dictionary is missing a name.");
            }
            spellCheckers.put(dictionary, checker);
          } else {
            if (!hasDefault) {
              spellCheckers.put(SolrSpellChecker.DEFAULT_DICTIONARY_NAME, checker);
              hasDefault = true;
            } else {
              throw new RuntimeException("More than one dictionary is missing a name.");
            }
          }
          // Register event listeners for this SpellChecker
          core.registerFirstSearcherListener(new SpellCheckerListener(core, checker, false, false));
          boolean buildOnCommit = Boolean.parseBoolean((String) spellchecker.get("buildOnCommit"));
          boolean buildOnOptimize = Boolean.parseBoolean((String) spellchecker.get("buildOnOptimize"));
          if (buildOnCommit || buildOnOptimize) {
            LOG.info("Registering newSearcher listener for spellchecker: " + checker.getDictionaryName());
            core.registerNewSearcherListener(new SpellCheckerListener(core, checker, buildOnCommit, buildOnOptimize));
          }
        } else {
          throw new RuntimeException("Can't load spell checker: " + className);
        }
      }
    }
    Map<String, QueryConverter> queryConverters = new HashMap<String, QueryConverter>();
    core.initPlugins(queryConverters, QueryConverter.class);
    // ensure that there is at least one query converter defined
    if (queryConverters.size() == 0) {
      LOG.info("No queryConverter defined, using default converter");
      queryConverters.put("queryConverter", new SpellingQueryConverter());
    }
    // there should only be one
    if (queryConverters.size() == 1) {
      queryConverter = queryConverters.values().iterator().next();
      IndexSchema schema = core.getSchema();
      String fieldTypeName = (String) initParams.get("queryAnalyzerFieldType");
      FieldType fieldType = schema.getFieldTypes().get(fieldTypeName);
      Analyzer analyzer = fieldType == null
          ? new WhitespaceAnalyzer(core.getSolrConfig().luceneMatchVersion)
          : fieldType.getQueryAnalyzer();
      // TODO: There's got to be a better way! Where's Spring when you need it?
      queryConverter.setAnalyzer(analyzer);
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/ShardDoc.java
Comparator getCachedComparator(String fieldname, SortField.Type type, FieldComparatorSource factory) { Comparator comparator = null; switch (type) { case SCORE: comparator = comparatorScore(fieldname); break; case STRING: comparator = comparatorNatural(fieldname); break; case CUSTOM: if (factory instanceof MissingStringLastComparatorSource){ comparator = comparatorMissingStringLast(fieldname); } else { // TODO: support other types such as random... is there a way to // support generically? Perhaps just comparing Object comparator = comparatorNatural(fieldname); // throw new RuntimeException("Custom sort not supported factory is "+factory.getClass()); } break; case DOC: // TODO: we can support this! throw new RuntimeException("Doc sort not supported"); default: comparator = comparatorNatural(fieldname); break; } return comparator; }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
public NamedList<?> getFieldCacheStats(String fieldName, String[] facet ) { SchemaField sf = searcher.getSchema().getField(fieldName); FieldCache.DocTermsIndex si; try { si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), fieldName); } catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); } StatsValues allstats = StatsValuesFactory.createStatsValues(sf); final int nTerms = si.numOrd(); if ( nTerms <= 0 || docs.size() <= 0 ) return allstats.getStatsValues(); // don't worry about faceting if no documents match... List<FieldFacetStats> facetStats = new ArrayList<FieldFacetStats>(); FieldCache.DocTermsIndex facetTermsIndex; for( String facetField : facet ) { SchemaField fsf = searcher.getSchema().getField(facetField); FieldType facetFieldType = fsf.getType(); if (facetFieldType.isTokenized() || facetFieldType.isMultiValued()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Stats can only facet on single-valued fields, not: " + facetField + "[" + facetFieldType + "]"); } try { facetTermsIndex = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), facetField); } catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); } facetStats.add(new FieldFacetStats(facetField, facetTermsIndex, sf, fsf, nTerms)); } final BytesRef tempBR = new BytesRef(); DocIterator iter = docs.iterator(); while (iter.hasNext()) { int docID = iter.nextDoc(); BytesRef raw = si.lookup(si.getOrd(docID), tempBR); if( raw.length > 0 ) { allstats.accumulate(raw); } else { allstats.missing(); } // now update the facets for (FieldFacetStats f : facetStats) { f.facet(docID, raw); } } for (FieldFacetStats f : facetStats) { allstats.addFacet(f.name, f.facetStatsValues); } return allstats.getStatsValues(); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
private static int getDocFreq(IndexReader reader, String field, BytesRef term) { int result = 1; try { result = reader.docFreq(field, term); } catch (IOException e) { throw new RuntimeException(e); } return result; }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { String path = request.getPath(); if( path == null || !path.startsWith( "/" ) ) { path = "/select"; } // Check for cores action SolrCore core = coreContainer.getCore( coreName ); if( core == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName ); } SolrParams params = request.getParams(); if( params == null ) { params = new ModifiableSolrParams(); } // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } // Perhaps the path is to manage the cores if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) { handler = coreContainer.getMultiCoreHandler(); } } if( handler == null ) { core.close(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } SolrQueryRequest req = null; try { req = _parser.buildRequestFrom( core, params, request.getContentStreams() ); req.getContext().put( "path", path ); SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); core.execute( handler, req, rsp ); if( rsp.getException() != null ) { if(rsp.getException() instanceof SolrException) { throw rsp.getException(); } throw new SolrServerException( rsp.getException() ); } // Check if this should stream results if( request.getStreamingResponseCallback() != null ) { try { final StreamingResponseCallback callback = request.getStreamingResponseCallback(); BinaryResponseWriter.Resolver resolver = new BinaryResponseWriter.Resolver( req, rsp.getReturnFields()) { @Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { // write an empty list... 
SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); // This will transform writeResultsBody( ctx, codec ); } }; ByteArrayOutputStream out = new ByteArrayOutputStream(); new JavaBinCodec(resolver) { @Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); } @Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); } }.marshal(rsp.getValues(), out); InputStream in = new ByteArrayInputStream(out.toByteArray()); return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in); } catch (Exception ex) { throw new RuntimeException(ex); } } // Now write it out NamedList<Object> normalized = getParsedResponse(req, rsp); return normalized; } catch( IOException iox ) { throw iox; } catch( SolrException sx ) { throw sx; } catch( Exception ex ) { throw new SolrServerException( ex ); } finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); } }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
public void start(boolean waitForSolr) throws Exception { // if started before, make a new server if (startedBefore) { waitOnSolr = false; init(solrHome, context, lastPort, stopAtShutdown); } else { startedBefore = true; } if( dataDir != null) { System.setProperty("solr.data.dir", dataDir); } if(shards != null) { System.setProperty("shard", shards); } if (!server.isRunning()) { server.start(); } synchronized (JettySolrRunner.this) { int cnt = 0; while (!waitOnSolr) { this.wait(100); if (cnt++ == 5) { throw new RuntimeException("Jetty/Solr unresponsive"); } } } System.clearProperty("shard"); System.clearProperty("solr.data.dir"); }
// in core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java
private int getFirstConnectorPort() { Connector[] conns = server.getConnectors(); if (0 == conns.length) { throw new RuntimeException("Jetty Server has no Connectors"); } return conns[0].getLocalPort(); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
public String getContentType(SolrQueryRequest request, SolrQueryResponse response) { Transformer t = null; try { t = getTransformer(request); } catch(Exception e) { // TODO should our parent interface throw (IO)Exception? throw new RuntimeException("getTransformer fails in getContentType",e); } String mediaType = t.getOutputProperty("media-type"); if (mediaType == null || mediaType.length()==0) { // This did not happen in my tests, mediaTypeFromXslt is set to "text/xml" // if the XSLT transform does not contain an xsl:output element. Not sure // if this is standard behavior or if it's just my JVM/libraries mediaType = DEFAULT_CONTENT_TYPE; } if (!mediaType.contains("charset")) { String encoding = t.getOutputProperty("encoding"); if (encoding == null || encoding.length()==0) { encoding = "UTF-8"; } mediaType = mediaType + "; charset=" + encoding; } return mediaType; }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
public void write(Writer writer, SolrQueryRequest request, SolrQueryResponse response) throws IOException { throw new RuntimeException("This is a binary writer; cannot write to a character stream"); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
public StatsValues getStats(SolrIndexSearcher searcher, DocSet baseDocs, String[] facet) throws IOException {
  //this function is ripped off nearly wholesale from the getCounts function to use
  //for multiValued fields within the StatsComponent. may be useful to find common
  //functionality between the two and refactor code somewhat
  use.incrementAndGet();
  SchemaField sf = searcher.getSchema().getField(field);
  // FieldType ft = sf.getType();
  StatsValues allstats = StatsValuesFactory.createStatsValues(sf);
  DocSet docs = baseDocs;
  int baseSize = docs.size();
  int maxDoc = searcher.maxDoc();
  if (baseSize <= 0) return allstats;
  DocSet missing = docs.andNot( searcher.getDocSet(new TermRangeQuery(field, null, null, false, false)) );
  int i = 0;
  final FieldFacetStats[] finfo = new FieldFacetStats[facet.length];
  //Initialize facetstats, if facets have been passed in
  FieldCache.DocTermsIndex si;
  for (String f : facet) {
    SchemaField facet_sf = searcher.getSchema().getField(f);
    try {
      si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), f);
    } catch (IOException e) {
      throw new RuntimeException("failed to open field cache for: " + f, e);
    }
    finfo[i] = new FieldFacetStats(f, si, sf, facet_sf, numTermsInField);
    i++;
  }
  final int[] index = this.index;
  final int[] counts = new int[numTermsInField]; // keep track of the number of times we see each word in the field for all the documents in the docset
  TermsEnum te = getOrdTermsEnum(searcher.getAtomicReader());
  boolean doNegative = false;
  if (finfo.length == 0) {
    //if we're collecting statistics with a facet field, can't do inverted counting
    doNegative = baseSize > maxDoc >> 1 && termInstances > 0 && docs instanceof BitDocSet;
  }
  if (doNegative) {
    OpenBitSet bs = (OpenBitSet) ((BitDocSet) docs).getBits().clone();
    bs.flip(0, maxDoc);
    // TODO: when iterator across negative elements is available, use that
    // instead of creating a new bitset and inverting.
    docs = new BitDocSet(bs, maxDoc - baseSize);
    // simply negating will mean that we have deleted docs in the set.
    // that should be OK, as their entries in our table should be empty.
  }
  // For the biggest terms, do straight set intersections
  for (TopTerm tt : bigTerms.values()) {
    // TODO: counts could be deferred if sorted==false
    if (tt.termNum >= 0 && tt.termNum < numTermsInField) {
      final Term t = new Term(field, tt.term);
      if (finfo.length == 0) {
        counts[tt.termNum] = searcher.numDocs(new TermQuery(t), docs);
      } else {
        //COULD BE VERY SLOW
        //if we're collecting stats for facet fields, we need to iterate on all matching documents
        DocSet bigTermDocSet = searcher.getDocSet(new TermQuery(t)).intersection(docs);
        DocIterator iter = bigTermDocSet.iterator();
        while (iter.hasNext()) {
          int doc = iter.nextDoc();
          counts[tt.termNum]++;
          for (FieldFacetStats f : finfo) {
            f.facetTermNum(doc, tt.termNum);
          }
        }
      }
    }
  }
  if (termInstances > 0) {
    DocIterator iter = docs.iterator();
    while (iter.hasNext()) {
      int doc = iter.nextDoc();
      int code = index[doc];
      if ((code & 0xff) == 1) {
        int pos = code >>> 8;
        int whichArray = (doc >>> 16) & 0xff;
        byte[] arr = tnums[whichArray];
        int tnum = 0;
        for (; ;) {
          int delta = 0;
          for (; ;) {
            byte b = arr[pos++];
            delta = (delta << 7) | (b & 0x7f);
            if ((b & 0x80) == 0) break;
          }
          if (delta == 0) break;
          tnum += delta - TNUM_OFFSET;
          counts[tnum]++;
          for (FieldFacetStats f : finfo) {
            f.facetTermNum(doc, tnum);
          }
        }
      } else {
        int tnum = 0;
        int delta = 0;
        for (; ;) {
          delta = (delta << 7) | (code & 0x7f);
          if ((code & 0x80) == 0) {
            if (delta == 0) break;
            tnum += delta - TNUM_OFFSET;
            counts[tnum]++;
            for (FieldFacetStats f : finfo) {
              f.facetTermNum(doc, tnum);
            }
            delta = 0;
          }
          code >>>= 8;
        }
      }
    }
  }
  // add results in index order
  for (i = 0; i < numTermsInField; i++) {
    int c = doNegative ? maxTermCounts[i] - counts[i] : counts[i];
    if (c == 0) continue;
    BytesRef value = getTermValue(te, i);
    allstats.accumulate(value, c);
    //as we've parsed the termnum into a value, lets also accumulate fieldfacet statistics
    for (FieldFacetStats f : finfo) {
      f.accumulateTermNum(i, value);
    }
  }
  int c = missing.size();
  allstats.addMissing(c);
  if (finfo.length > 0) {
    for (FieldFacetStats f : finfo) {
      Map<String, StatsValues> facetStatsValues = f.facetStatsValues;
      FieldType facetType = searcher.getSchema().getFieldType(f.name);
      for (Map.Entry<String,StatsValues> entry : facetStatsValues.entrySet()) {
        String termLabel = entry.getKey();
        int missingCount = searcher.numDocs(new TermQuery(new Term(f.name, facetType.toInternal(termLabel))), missing);
        entry.getValue().addMissing(missingCount);
      }
      allstats.addFacet(f.name, facetStatsValues);
    }
  }
  return allstats;
}
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public String calcEtag(final long currentIndexVersion) { if (currentIndexVersion != indexVersionCache) { indexVersionCache=currentIndexVersion; try { etagCache = "\"" + new String(Base64.encodeBase64((Long.toHexString (Long.reverse(indexVersionCache)) + etagSeed).getBytes()), "US-ASCII") + "\""; } catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ } } return etagCache; }
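The guarded encoding above can never actually fail: US-ASCII is a charset every JVM must support. A minimal sketch of the same etag derivation, substituting java.util.Base64 and StandardCharsets (assumptions of this sketch, not the commons-codec Base64 used in the snippet) so that the impossible checked exception disappears entirely:

    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public final class EtagSketch {
      // Same derivation as calcEtag above: reverse the index version, hex-encode,
      // append the seed, Base64 the ASCII bytes, and wrap the result in quotes.
      static String etag(long indexVersion, String etagSeed) {
        String payload = Long.toHexString(Long.reverse(indexVersion)) + etagSeed;
        byte[] b64 = Base64.getEncoder().encode(payload.getBytes(StandardCharsets.US_ASCII));
        return "\"" + new String(b64, StandardCharsets.US_ASCII) + "\"";
      }

      public static void main(String[] args) {
        System.out.println(etag(42L, "Solr")); // prints a quoted Base64 token
      }
    }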
// in core/src/java/org/apache/solr/schema/IndexSchema.java
private SchemaField getIndexedField(String fname) { SchemaField f = getFields().get(fname); if (f==null) { throw new RuntimeException("unknown field '" + fname + "'"); } if (!f.indexed()) { throw new RuntimeException("'"+fname+"' is not an indexed field:" + f); } return f; }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
private void readSchema(InputSource is) {
  log.info("Reading Solr Schema");
  try {
    // pass the config resource loader to avoid building an empty one for no reason:
    // in the current case though, the stream is valid so we won't load the resource by name
    Config schemaConf = new Config(loader, "schema", is, "/schema/");
    Document document = schemaConf.getDocument();
    final XPath xpath = schemaConf.getXPath();
    final List<SchemaAware> schemaAware = new ArrayList<SchemaAware>();
    Node nd = (Node) xpath.evaluate("/schema/@name", document, XPathConstants.NODE);
    if (nd==null) {
      log.warn("schema has no name!");
    } else {
      name = nd.getNodeValue();
      log.info("Schema name=" + name);
    }
    version = schemaConf.getFloat("/schema/@version", 1.0f);
    // load the Field Types
    final FieldTypePluginLoader typeLoader = new FieldTypePluginLoader(this, fieldTypes, schemaAware);
    String expression = "/schema/types/fieldtype | /schema/types/fieldType";
    NodeList nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET);
    typeLoader.load( loader, nodes );
    // load the Fields
    // Hang on to the fields that say if they are required -- this lets us set a reasonable default for the unique key
    Map<String,Boolean> explicitRequiredProp = new HashMap<String, Boolean>();
    ArrayList<DynamicField> dFields = new ArrayList<DynamicField>();
    expression = "/schema/fields/field | /schema/fields/dynamicField";
    nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET);
    for (int i=0; i<nodes.getLength(); i++) {
      Node node = nodes.item(i);
      NamedNodeMap attrs = node.getAttributes();
      String name = DOMUtil.getAttr(attrs,"name","field definition");
      log.trace("reading field def "+name);
      String type = DOMUtil.getAttr(attrs,"type","field " + name);
      FieldType ft = fieldTypes.get(type);
      if (ft==null) {
        throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Unknown fieldtype '" + type + "' specified on field " + name);
      }
      Map<String,String> args = DOMUtil.toMapExcept(attrs, "name", "type");
      if( args.get( "required" ) != null ) {
        explicitRequiredProp.put( name, Boolean.valueOf( args.get( "required" ) ) );
      }
      SchemaField f = SchemaField.create(name,ft,args);
      if (node.getNodeName().equals("field")) {
        SchemaField old = fields.put(f.getName(),f);
        if( old != null ) {
          String msg = "[schema.xml] Duplicate field definition for '" + f.getName() + "' [[["+old.toString()+"]]] and [[["+f.toString()+"]]]";
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, msg );
        }
        log.debug("field defined: " + f);
        if( f.getDefaultValue() != null ) {
          log.debug(name+" contains default value: " + f.getDefaultValue());
          fieldsWithDefaultValue.add( f );
        }
        if (f.isRequired()) {
          log.debug(name+" is required in this schema");
          requiredFields.add(f);
        }
      } else if (node.getNodeName().equals("dynamicField")) {
        // make sure nothing else has the same path
        addDynamicField(dFields, f);
      } else {
        // we should never get here
        throw new RuntimeException("Unknown field type");
      }
    }
    //fields with default values are by definition required
    //add them to required fields, and we only have to loop once
    // in DocumentBuilder.getDoc()
    requiredFields.addAll(getFieldsWithDefaultValue());
    // OK, now sort the dynamic fields largest to smallest size so we don't get
    // any false matches. We want to act like a compiler tool and try and match
    // the largest string possible.
    Collections.sort(dFields);
    log.trace("Dynamic Field Ordering:" + dFields);
    // stuff it in a normal array for faster access
    dynamicFields = dFields.toArray(new DynamicField[dFields.size()]);
    Node node = (Node) xpath.evaluate("/schema/similarity", document, XPathConstants.NODE);
    SimilarityFactory simFactory = readSimilarity(loader, node);
    if (simFactory == null) {
      simFactory = new DefaultSimilarityFactory();
    }
    if (simFactory instanceof SchemaAware) {
      ((SchemaAware)simFactory).inform(this);
    }
    similarity = simFactory.getSimilarity();
    node = (Node) xpath.evaluate("/schema/defaultSearchField/text()", document, XPathConstants.NODE);
    if (node==null) {
      log.warn("no default search field specified in schema.");
    } else {
      defaultSearchFieldName=node.getNodeValue().trim();
      // throw exception if specified, but not found or not indexed
      if (defaultSearchFieldName!=null) {
        SchemaField defaultSearchField = getFields().get(defaultSearchFieldName);
        if ((defaultSearchField == null) || !defaultSearchField.indexed()) {
          String msg = "default search field '" + defaultSearchFieldName + "' not defined or not indexed" ;
          throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, msg );
        }
      }
      log.info("default search field is "+defaultSearchFieldName);
    }
    node = (Node) xpath.evaluate("/schema/solrQueryParser/@defaultOperator", document, XPathConstants.NODE);
    if (node==null) {
      log.debug("using default query parser operator (OR)");
    } else {
      queryParserDefaultOperator=node.getNodeValue().trim();
      log.info("query parser default operator is "+queryParserDefaultOperator);
    }
    node = (Node) xpath.evaluate("/schema/uniqueKey/text()", document, XPathConstants.NODE);
    if (node==null) {
      log.warn("no uniqueKey specified in schema.");
    } else {
      uniqueKeyField=getIndexedField(node.getNodeValue().trim());
      if (!uniqueKeyField.stored()) {
        log.error("uniqueKey is not stored - distributed search will not work");
      }
      if (uniqueKeyField.multiValued()) {
        log.error("uniqueKey should not be multivalued");
      }
      uniqueKeyFieldName=uniqueKeyField.getName();
      uniqueKeyFieldType=uniqueKeyField.getType();
      log.info("unique key field: "+uniqueKeyFieldName);
      // Unless the uniqueKeyField is marked 'required=false' then make sure it exists
      if( Boolean.FALSE != explicitRequiredProp.get( uniqueKeyFieldName ) ) {
        uniqueKeyField.required = true;
        requiredFields.add(uniqueKeyField);
      }
    }
    /////////////// parse out copyField commands ///////////////
    // Map<String,ArrayList<SchemaField>> cfields = new HashMap<String,ArrayList<SchemaField>>();
    // expression = "/schema/copyField";
    dynamicCopyFields = new DynamicCopy[] {};
    expression = "//copyField";
    nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET);
    for (int i=0; i<nodes.getLength(); i++) {
      node = nodes.item(i);
      NamedNodeMap attrs = node.getAttributes();
      String source = DOMUtil.getAttr(attrs,"source","copyField definition");
      String dest = DOMUtil.getAttr(attrs,"dest", "copyField definition");
      String maxChars = DOMUtil.getAttr(attrs, "maxChars");
      int maxCharsInt = CopyField.UNLIMITED;
      if (maxChars != null) {
        try {
          maxCharsInt = Integer.parseInt(maxChars);
        } catch (NumberFormatException e) {
          log.warn("Couldn't parse maxChars attribute for copyField from " + source + " to " + dest + " as integer. The whole field will be copied.");
        }
      }
      registerCopyField(source, dest, maxCharsInt);
    }
    for (Map.Entry<SchemaField, Integer> entry : copyFieldTargetCounts.entrySet()) {
      if (entry.getValue() > 1 && !entry.getKey().multiValued()) {
        log.warn("Field " + entry.getKey().name + " is not multivalued "+ "and destination for multiple copyFields ("+ entry.getValue()+")");
      }
    }
    //Run the callbacks on SchemaAware now that everything else is done
    for (SchemaAware aware : schemaAware) {
      aware.inform(this);
    }
  } catch (SolrException e) {
    throw e;
  } catch(Exception e) {
    // unexpected exception...
    throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Schema Parsing Failed: " + e.getMessage(), e);
  }
  // create the field analyzers
  refreshAnalyzers();
}
// in core/src/java/org/apache/solr/schema/DateField.java
@Override public Date toObject(IndexableField f) { try { return parseDate( toExternal(f) ); } catch( ParseException ex ) { throw new RuntimeException( ex ); } }
// in core/src/java/org/apache/solr/schema/BinaryField.java
@Override public SortField getSortField(SchemaField field, boolean top) { throw new RuntimeException("Cannot sort on a Binary field"); }
// in core/src/java/org/apache/solr/schema/SimplePreAnalyzedParser.java
static final int charToNibble(char c) { if (c >= '0' && c <= '9') { return c - '0'; } else if (c >= 'a' && c <= 'f') { return 0xa + (c - 'a'); } else if (c >= 'A' && c <= 'F') { return 0xA + (c - 'A'); } else { throw new RuntimeException("Not a hex character: '" + c + "'"); } }
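For illustration, a minimal sketch of how a charToNibble-style decoder is typically driven; hexToBytes is a hypothetical caller invented for this example, not part of Solr:

    public final class NibbleSketch {
      // Copy of the charToNibble above, so the example is self-contained.
      static int charToNibble(char c) {
        if (c >= '0' && c <= '9') return c - '0';
        if (c >= 'a' && c <= 'f') return 0xa + (c - 'a');
        if (c >= 'A' && c <= 'F') return 0xA + (c - 'A');
        throw new RuntimeException("Not a hex character: '" + c + "'");
      }

      // Hypothetical caller: two nibbles per byte, high nibble first.
      static byte[] hexToBytes(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
          out[i] = (byte) ((charToNibble(hex.charAt(2 * i)) << 4) | charToNibble(hex.charAt(2 * i + 1)));
        }
        return out;
      }

      public static void main(String[] args) {
        System.out.println(hexToBytes("4F6b")[0]); // 79, i.e. 'O'
      }
    }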
// in core/src/java/org/apache/solr/schema/SchemaField.java
static int calcProps(String name, FieldType ft, Map<String, String> props) { int trueProps = parseProperties(props,true); int falseProps = parseProperties(props,false); int p = ft.properties; // // If any properties were explicitly turned off, then turn off other properties // that depend on that. // if (on(falseProps,STORED)) { int pp = STORED | BINARY; if (on(pp,trueProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting stored field options:" + props); } p &= ~pp; } if (on(falseProps,INDEXED)) { int pp = (INDEXED | STORE_TERMVECTORS | STORE_TERMPOSITIONS | STORE_TERMOFFSETS | SORT_MISSING_FIRST | SORT_MISSING_LAST); if (on(pp,trueProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting 'true' field options for non-indexed field:" + props); } p &= ~pp; } if (on(falseProps,INDEXED)) { int pp = (OMIT_NORMS | OMIT_TF_POSITIONS | OMIT_POSITIONS); if (on(pp,falseProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting 'false' field options for non-indexed field:" + props); } p &= ~pp; } if (on(trueProps,OMIT_TF_POSITIONS)) { int pp = (OMIT_POSITIONS | OMIT_TF_POSITIONS); if (on(pp, falseProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting tf and position field options:" + props); } p &= ~pp; } if (on(falseProps,STORE_TERMVECTORS)) { int pp = (STORE_TERMVECTORS | STORE_TERMPOSITIONS | STORE_TERMOFFSETS); if (on(pp,trueProps)) { throw new RuntimeException("SchemaField: " + name + " conflicting termvector field options:" + props); } p &= ~pp; } // override sort flags if (on(trueProps,SORT_MISSING_FIRST)) { p &= ~SORT_MISSING_LAST; } if (on(trueProps,SORT_MISSING_LAST)) { p &= ~SORT_MISSING_FIRST; } p &= ~falseProps; p |= trueProps; return p; }
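calcProps relies on an on(bits, mask) helper that is not shown in this sheet. A sketch of its presumed shape (an any-bit-set test, which is symmetric and therefore consistent with the two argument orders seen above), with one worked property resolution:

    public final class PropsSketch {
      static final int STORED = 1, BINARY = 2, INDEXED = 4;

      // Presumed shape of the on(...) test used by calcProps: true when the
      // two bit masks share at least one set bit.
      static boolean on(int bits, int mask) {
        return (bits & mask) != 0;
      }

      public static void main(String[] args) {
        int p = STORED | INDEXED;     // type defaults
        int falseProps = STORED;      // user wrote stored="false"
        if (on(falseProps, STORED)) {
          p &= ~(STORED | BINARY);    // turning STORED off also drops BINARY
        }
        System.out.println(on(p, INDEXED) + " " + on(p, STORED)); // true false
      }
    }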
// in core/src/java/org/apache/solr/schema/TextField.java
public static BytesRef analyzeMultiTerm(String field, String part, Analyzer analyzerIn) { if (part == null) return null; TokenStream source; try { source = analyzerIn.tokenStream(field, new StringReader(part)); source.reset(); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); } TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class); BytesRef bytes = termAtt.getBytesRef(); try { if (!source.incrementToken()) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"analyzer returned no terms for multiTerm term: " + part); termAtt.fillBytesRef(); if (source.incrementToken()) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"analyzer returned too many terms for multiTerm term: " + part); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); } try { source.end(); source.close(); } catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); } return BytesRef.deepCopyOf(bytes); }
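This snippet, and the analyzeRangePart variants below, all follow the same Lucene TokenStream lifecycle, and every step of it can throw IOException, which Solr wraps as shown. A minimal sketch of that contract, assuming the Lucene 4.x-era API used throughout this sheet:

    import java.io.IOException;
    import java.io.StringReader;
    import org.apache.lucene.analysis.Analyzer;
    import org.apache.lucene.analysis.TokenStream;

    class TokenStreamSketch {
      // reset() before the first incrementToken(), then end() and close();
      // each call may throw IOException, which the snippets above wrap.
      static void consume(Analyzer analyzer) throws IOException {
        TokenStream ts = analyzer.tokenStream("field", new StringReader("some text"));
        try {
          ts.reset();
          while (ts.incrementToken()) {
            // read attributes (e.g. TermToBytesRefAttribute) here
          }
          ts.end();
        } finally {
          ts.close();
        }
      }
    }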
// in core/src/java/org/apache/solr/schema/TextField.java
static Query parseFieldQuery(QParser parser, Analyzer analyzer, String field, String queryText) {
  int phraseSlop = 0;
  boolean enablePositionIncrements = true;
  // most of the following code is taken from the Lucene QueryParser
  // Use the analyzer to get all the tokens, and then build a TermQuery,
  // PhraseQuery, or nothing based on the term count
  TokenStream source;
  try {
    source = analyzer.tokenStream(field, new StringReader(queryText));
    source.reset();
  } catch (IOException e) {
    throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e);
  }
  CachingTokenFilter buffer = new CachingTokenFilter(source);
  CharTermAttribute termAtt = null;
  PositionIncrementAttribute posIncrAtt = null;
  int numTokens = 0;
  try {
    buffer.reset();
  } catch (IOException e) {
    throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e);
  }
  if (buffer.hasAttribute(CharTermAttribute.class)) {
    termAtt = buffer.getAttribute(CharTermAttribute.class);
  }
  if (buffer.hasAttribute(PositionIncrementAttribute.class)) {
    posIncrAtt = buffer.getAttribute(PositionIncrementAttribute.class);
  }
  int positionCount = 0;
  boolean severalTokensAtSamePosition = false;
  boolean hasMoreTokens = false;
  if (termAtt != null) {
    try {
      hasMoreTokens = buffer.incrementToken();
      while (hasMoreTokens) {
        numTokens++;
        int positionIncrement = (posIncrAtt != null) ? posIncrAtt.getPositionIncrement() : 1;
        if (positionIncrement != 0) {
          positionCount += positionIncrement;
        } else {
          severalTokensAtSamePosition = true;
        }
        hasMoreTokens = buffer.incrementToken();
      }
    } catch (IOException e) {
      // ignore
    }
  }
  try {
    // rewind the buffer stream
    buffer.reset();
    // close original stream - all tokens buffered
    source.close();
  } catch (IOException e) {
    // ignore
  }
  if (numTokens == 0)
    return null;
  else if (numTokens == 1) {
    String term = null;
    try {
      boolean hasNext = buffer.incrementToken();
      assert hasNext == true;
      term = termAtt.toString();
    } catch (IOException e) {
      // safe to ignore, because we know the number of tokens
    }
    // return newTermQuery(new Term(field, term));
    return new TermQuery(new Term(field, term));
  } else {
    if (severalTokensAtSamePosition) {
      if (positionCount == 1) {
        // no phrase query:
        // BooleanQuery q = newBooleanQuery(true);
        BooleanQuery q = new BooleanQuery(true);
        for (int i = 0; i < numTokens; i++) {
          String term = null;
          try {
            boolean hasNext = buffer.incrementToken();
            assert hasNext == true;
            term = termAtt.toString();
          } catch (IOException e) {
            // safe to ignore, because we know the number of tokens
          }
          // Query currentQuery = newTermQuery(new Term(field, term));
          Query currentQuery = new TermQuery(new Term(field, term));
          q.add(currentQuery, BooleanClause.Occur.SHOULD);
        }
        return q;
      } else {
        // phrase query:
        // MultiPhraseQuery mpq = newMultiPhraseQuery();
        MultiPhraseQuery mpq = new MultiPhraseQuery();
        mpq.setSlop(phraseSlop);
        List multiTerms = new ArrayList();
        int position = -1;
        for (int i = 0; i < numTokens; i++) {
          String term = null;
          int positionIncrement = 1;
          try {
            boolean hasNext = buffer.incrementToken();
            assert hasNext == true;
            term = termAtt.toString();
            if (posIncrAtt != null) {
              positionIncrement = posIncrAtt.getPositionIncrement();
            }
          } catch (IOException e) {
            // safe to ignore, because we know the number of tokens
          }
          if (positionIncrement > 0 && multiTerms.size() > 0) {
            if (enablePositionIncrements) {
              mpq.add((Term[])multiTerms.toArray(new Term[0]),position);
            } else {
              mpq.add((Term[])multiTerms.toArray(new Term[0]));
            }
            multiTerms.clear();
          }
          position += positionIncrement;
          multiTerms.add(new Term(field, term));
        }
        if (enablePositionIncrements) {
          mpq.add((Term[])multiTerms.toArray(new Term[0]),position);
        } else {
          mpq.add((Term[])multiTerms.toArray(new Term[0]));
        }
        return mpq;
      }
    } else {
      // PhraseQuery pq = newPhraseQuery();
      PhraseQuery pq = new PhraseQuery();
      pq.setSlop(phraseSlop);
      int position = -1;
      for (int i = 0; i < numTokens; i++) {
        String term = null;
        int positionIncrement = 1;
        try {
          boolean hasNext = buffer.incrementToken();
          assert hasNext == true;
          term = termAtt.toString();
          if (posIncrAtt != null) {
            positionIncrement = posIncrAtt.getPositionIncrement();
          }
        } catch (IOException e) {
          // safe to ignore, because we know the number of tokens
        }
        if (enablePositionIncrements) {
          position += positionIncrement;
          pq.add(new Term(field, term),position);
        } else {
          pq.add(new Term(field, term));
        }
      }
      return pq;
    }
  }
}
// in core/src/java/org/apache/solr/schema/CollationField.java
private Collator createFromRules(String fileName, ResourceLoader loader) { InputStream input = null; try { input = loader.openResource(fileName); String rules = IOUtils.toString(input, "UTF-8"); return new RuleBasedCollator(rules); } catch (IOException e) { /* io error */ throw new RuntimeException(e); } catch (ParseException e) { /* invalid rules */ throw new RuntimeException(e); } finally { IOUtils.closeQuietly(input); } }
// in core/src/java/org/apache/solr/schema/CollationField.java
private BytesRef analyzeRangePart(String field, String part) { TokenStream source; try { source = analyzer.tokenStream(field, new StringReader(part)); source.reset(); } catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); } TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class); BytesRef bytes = termAtt.getBytesRef(); /* we control the analyzer here: most errors are impossible */ try { if (!source.incrementToken()) throw new IllegalArgumentException("analyzer returned no terms for range part: " + part); termAtt.fillBytesRef(); assert !source.incrementToken(); } catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); } try { source.end(); source.close(); } catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); } return BytesRef.deepCopyOf(bytes); }
// in core/src/java/org/apache/solr/schema/FieldType.java
void setArgs(IndexSchema schema, Map<String,String> args) { /* default to STORED, INDEXED, OMIT_TF_POSITIONS and MULTIVALUED depending on schema version */ properties = (STORED | INDEXED); float schemaVersion = schema.getVersion(); if (schemaVersion < 1.1f) properties |= MULTIVALUED; if (schemaVersion > 1.1f) properties |= OMIT_TF_POSITIONS; if (schemaVersion < 1.3) { args.remove("compressThreshold"); } this.args=args; Map<String,String> initArgs = new HashMap<String,String>(args); trueProperties = FieldProperties.parseProperties(initArgs,true); falseProperties = FieldProperties.parseProperties(initArgs,false); properties &= ~falseProperties; properties |= trueProperties; for (String prop : FieldProperties.propertyNames) initArgs.remove(prop); init(schema, initArgs); String positionInc = initArgs.get("positionIncrementGap"); if (positionInc != null) { Analyzer analyzer = getAnalyzer(); if (analyzer instanceof SolrAnalyzer) { ((SolrAnalyzer)analyzer).setPositionIncrementGap(Integer.parseInt(positionInc)); } else { throw new RuntimeException("Can't set positionIncrementGap on custom analyzer " + analyzer.getClass()); } analyzer = getQueryAnalyzer(); if (analyzer instanceof SolrAnalyzer) { ((SolrAnalyzer)analyzer).setPositionIncrementGap(Integer.parseInt(positionInc)); } else { throw new RuntimeException("Can't set positionIncrementGap on custom analyzer " + analyzer.getClass()); } initArgs.remove("positionIncrementGap"); } final String postingsFormat = initArgs.get("postingsFormat"); if (postingsFormat != null) { this.postingsFormat = postingsFormat; initArgs.remove("postingsFormat"); } if (initArgs.size() > 0) { throw new RuntimeException("schema fieldtype " + typeName + "("+ this.getClass().getName() + ")" + " invalid arguments:" + initArgs); } }
// in core/src/java/org/apache/solr/schema/FieldType.java
protected void restrictProps(int props) { if ((properties & props) != 0) { throw new RuntimeException("schema fieldtype " + typeName + "("+ this.getClass().getName() + ")" + " invalid properties:" + propertiesToString(properties & props)); } }
// in core/src/java/org/apache/solr/internal/csv/CSVStrategy.java
public Object clone() { try { return super.clone(); } catch (CloneNotSupportedException e) { throw new RuntimeException(e); /* impossible */ } }
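A common alternative to wrapping an impossible CloneNotSupportedException in RuntimeException is AssertionError, which signals "unreachable" rather than "recoverable". An editorial sketch, not Solr's code:

    class CloneSketch implements Cloneable {
      @Override
      public Object clone() {
        try {
          return super.clone();
        } catch (CloneNotSupportedException e) {
          // this class implements Cloneable, so this path is unreachable;
          // AssertionError makes that claim explicit
          throw new AssertionError(e);
        }
      }
    }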
// in core/src/java/org/apache/solr/search/similarities/IBSimilarityFactory.java
private Distribution parseDistribution(String expr) { if ("LL".equals(expr)) { return new DistributionLL(); } else if ("SPL".equals(expr)) { return new DistributionSPL(); } else { throw new RuntimeException("Invalid distribution: " + expr); } }
// in core/src/java/org/apache/solr/search/similarities/IBSimilarityFactory.java
private Lambda parseLambda(String expr) { if ("DF".equals(expr)) { return new LambdaDF(); } else if ("TTF".equals(expr)) { return new LambdaTTF(); } else { throw new RuntimeException("Invalid lambda: " + expr); } }
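The parse* dispatchers in these similarity factories all share one shape: match a string, construct a strategy object, or throw RuntimeException. An equivalent registry-based sketch (editorial; assumes Java 9's Map.of and the Lucene Lambda classes named in the snippet above):

    import java.util.Map;
    import java.util.function.Supplier;
    import org.apache.lucene.search.similarities.Lambda;
    import org.apache.lucene.search.similarities.LambdaDF;
    import org.apache.lucene.search.similarities.LambdaTTF;

    class LambdaRegistrySketch {
      // Registry equivalent of the if/else chain above.
      static final Map<String, Supplier<Lambda>> LAMBDAS =
          Map.of("DF", LambdaDF::new, "TTF", LambdaTTF::new);

      static Lambda parseLambda(String expr) {
        Supplier<Lambda> s = LAMBDAS.get(expr);
        if (s == null) throw new RuntimeException("Invalid lambda: " + expr);
        return s.get();
      }
    }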
// in core/src/java/org/apache/solr/search/similarities/DFRSimilarityFactory.java
private BasicModel parseBasicModel(String expr) { if ("Be".equals(expr)) { return new BasicModelBE(); } else if ("D".equals(expr)) { return new BasicModelD(); } else if ("G".equals(expr)) { return new BasicModelG(); } else if ("I(F)".equals(expr)) { return new BasicModelIF(); } else if ("I(n)".equals(expr)) { return new BasicModelIn(); } else if ("I(ne)".equals(expr)) { return new BasicModelIne(); } else if ("P".equals(expr)) { return new BasicModelP(); } else { throw new RuntimeException("Invalid basicModel: " + expr); } }
// in core/src/java/org/apache/solr/search/similarities/DFRSimilarityFactory.java
private AfterEffect parseAfterEffect(String expr) { if ("B".equals(expr)) { return new AfterEffectB(); } else if ("L".equals(expr)) { return new AfterEffectL(); } else if ("none".equals(expr)) { return new AfterEffect.NoAfterEffect(); } else { throw new RuntimeException("Invalid afterEffect: " + expr); } }
// in core/src/java/org/apache/solr/search/similarities/DFRSimilarityFactory.java
static Normalization parseNormalization(String expr, String c, String mu, String z) { if (mu != null && !"H3".equals(expr)) { throw new RuntimeException( "parameter mu only makes sense for normalization H3"); } if (z != null && !"Z".equals(expr)) { throw new RuntimeException( "parameter z only makes sense for normalization Z"); } if (c != null && !("H1".equals(expr) || "H2".equals(expr))) { throw new RuntimeException( "parameter c only makes sense for normalizations H1 and H2"); } if ("H1".equals(expr)) { return (c != null) ? new NormalizationH1(Float.parseFloat(c)) : new NormalizationH1(); } else if ("H2".equals(expr)) { return (c != null) ? new NormalizationH2(Float.parseFloat(c)) : new NormalizationH2(); } else if ("H3".equals(expr)) { return (mu != null) ? new NormalizationH3(Float.parseFloat(mu)) : new NormalizationH3(); } else if ("Z".equals(expr)) { return (z != null) ? new NormalizationZ(Float.parseFloat(z)) : new NormalizationZ(); } else if ("none".equals(expr)) { return new Normalization.NoNormalization(); } else { throw new RuntimeException("Invalid normalization: " + expr); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static String toString(Query query, IndexSchema schema) { try { StringBuilder sb = new StringBuilder(); toString(query, schema, sb, 0); return sb.toString(); } catch (Exception e) { throw new RuntimeException(e); } }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
@Override public String init(NamedList config, SolrCore core) { super.init(config, core); indexDir = (String) config.get(INDEX_DIR); String accuracy = (String) config.get(ACCURACY); /* If indexDir is relative then create index inside core.getDataDir() */ if (indexDir != null) { if (!new File(indexDir).isAbsolute()) { indexDir = core.getDataDir() + File.separator + indexDir; } } sourceLocation = (String) config.get(LOCATION); String compClass = (String) config.get(COMPARATOR_CLASS); Comparator<SuggestWord> comp = null; if (compClass != null){ if (compClass.equalsIgnoreCase(SCORE_COMP)){ comp = SuggestWordQueue.DEFAULT_COMPARATOR; } else if (compClass.equalsIgnoreCase(FREQ_COMP)){ comp = new SuggestWordFrequencyComparator(); } else { /* must be a FQCN */ comp = (Comparator<SuggestWord>) core.getResourceLoader().newInstance(compClass, Comparator.class); } } else { comp = SuggestWordQueue.DEFAULT_COMPARATOR; } String strDistanceName = (String)config.get(STRING_DISTANCE); if (strDistanceName != null) { sd = core.getResourceLoader().newInstance(strDistanceName, StringDistance.class); /* TODO: Figure out how to configure options. Where's Spring when you need it? Or at least BeanUtils... */ } else { sd = new LevensteinDistance(); } try { initIndex(); spellChecker = new SpellChecker(index, sd, comp); } catch (IOException e) { throw new RuntimeException(e); } if (accuracy != null) { try { this.accuracy = Float.parseFloat(accuracy); spellChecker.setAccuracy(this.accuracy); } catch (NumberFormatException e) { throw new RuntimeException( "Unparseable accuracy given for dictionary: " + name, e); } } return name; }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
@Override public void build(SolrCore core, SolrIndexSearcher searcher) { try { loadExternalFileDictionary(core); spellChecker.clearIndex(); /* TODO: you should be able to specify the IWC params? */ /* TODO: if we enable this, codec gets angry since field won't exist in the schema */ /* config.setCodec(core.getCodec()); */ spellChecker.indexDictionary(dictionary, new IndexWriterConfig(core.getSolrConfig().luceneMatchVersion, null), false); } catch (IOException e) { throw new RuntimeException(e); } }
// in core/src/java/org/apache/solr/spelling/SuggestQueryConverter.java
@Override public Collection<Token> convert(String original) { if (original == null) { /* this can happen with q.alt = and no query */ return Collections.emptyList(); } Collection<Token> result = new ArrayList<Token>(); try { analyze(result, new StringReader(original), 0); } catch (IOException e) { throw new RuntimeException(e); } return result; }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
private void initSourceReader() { if (sourceLocation != null) { try { FSDirectory luceneIndexDir = FSDirectory.open(new File(sourceLocation)); this.reader = DirectoryReader.open(luceneIndexDir); } catch (IOException e) { throw new RuntimeException(e); } } }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
@Override public void build(SolrCore core, SolrIndexSearcher searcher) { IndexReader reader = null; try { if (sourceLocation == null) { /* Load from Solr's index */ reader = searcher.getIndexReader(); } else { /* Load from Lucene index at given sourceLocation */ reader = this.reader; } /* Create the dictionary */ dictionary = new HighFrequencyDictionary(reader, field, threshold); /* TODO: maybe whether or not to clear the index should be configurable? an incremental update is faster (just adds new terms), but if you 'expunged' old terms I think they might hang around. */ spellChecker.clearIndex(); /* TODO: you should be able to specify the IWC params? */ /* TODO: if we enable this, codec gets angry since field won't exist in the schema */ /* config.setCodec(core.getCodec()); */ spellChecker.indexDictionary(dictionary, new IndexWriterConfig(core.getSolrConfig().luceneMatchVersion, null), false); } catch (IOException e) { throw new RuntimeException(e); } }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
@Override public InputStream openResource(String resource) { InputStream is = null; String file = collectionZkPath + "/" + resource; try { if (zkController.pathExists(file)) { byte[] bytes = zkController.getZkClient().getData(collectionZkPath + "/" + resource, null, null, true); return new ByteArrayInputStream(bytes); } } catch (Exception e) { throw new RuntimeException("Error opening " + file, e); } try { /* delegate to the class loader (looking into $INSTANCE_DIR/lib jars) */ is = classLoader.getResourceAsStream(resource); } catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); } if (is == null) { throw new RuntimeException("Can't find resource '" + resource + "' in classpath or '" + collectionZkPath + "', cwd=" + System.getProperty("user.dir")); } return is; }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public void setClientPort(int clientPort) { if (clientPortAddress != null) { try { this.clientPortAddress = new InetSocketAddress( InetAddress.getByName(clientPortAddress.getHostName()), clientPort); } catch (UnknownHostException e) { throw new RuntimeException(e); } } else { this.clientPortAddress = new InetSocketAddress(clientPort); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private ZkCoreNodeProps getLeaderProps(final String collection, final String slice) throws KeeperException, InterruptedException { int iterCount = 60; while (iterCount-- > 0) try { byte[] data = zkClient.getData( ZkStateReader.getShardLeadersPath(collection, slice), null, null, true); ZkCoreNodeProps leaderProps = new ZkCoreNodeProps( ZkNodeProps.load(data)); return leaderProps; } catch (NoNodeException e) { Thread.sleep(500); } throw new RuntimeException("Could not get leader props"); }
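The loop above is a bounded retry: up to 60 attempts, 500 ms apart, ending in an unchecked failure once the budget is exhausted. A generic sketch of that shape (names and signature are this sketch's own, not Solr's):

    import java.util.concurrent.Callable;

    class RetrySketch {
      // Bounded attempts with a fixed sleep between tries; interruption is
      // propagated, any other failure is retried until attempts run out.
      static <T> T retry(int attempts, long sleepMs, Callable<T> body) throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
          try {
            return body.call();
          } catch (InterruptedException e) {
            throw e;               // don't swallow interruption
          } catch (Exception e) {
            Thread.sleep(sleepMs); // transient failure: wait and retry
          }
        }
        throw new RuntimeException("retries exhausted");
      }
    }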
// in core/src/java/org/apache/solr/update/UpdateLog.java
public Future<RecoveryInfo> applyBufferedUpdates() { /* recovery trips this assert under some race - even when it checks the state first */ /* assert state == State.BUFFERING; */ /* block all updates to eliminate race conditions reading state and acting on it in the update processor */ versionInfo.blockUpdates(); try { cancelApplyBufferUpdate = false; if (state != State.BUFFERING) return null; /* handle case when no log was even created because no updates were received. */ if (tlog == null) { state = State.ACTIVE; return null; } tlog.incref(); state = State.APPLYING_BUFFERED; operationFlags &= ~FLAG_GAP; } finally { versionInfo.unblockUpdates(); } if (recoveryExecutor.isShutdown()) { tlog.decref(); throw new RuntimeException("executor is not running..."); } ExecutorCompletionService<RecoveryInfo> cs = new ExecutorCompletionService<RecoveryInfo>(recoveryExecutor); LogReplayer replayer = new LogReplayer(Arrays.asList(new TransactionLog[]{tlog}), true); return cs.submit(replayer, recoveryInfo); }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
@Override protected MessageDigest initialValue() { try { return MessageDigest.getInstance("MD5"); } catch (NoSuchAlgorithmException e) { throw new RuntimeException(e); } }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
@Override public void add(String content) { try { digester.update(content.getBytes("UTF-8")); } catch (UnsupportedEncodingException e) { /* won't happen */ log.error("UTF-8 not supported", e); throw new RuntimeException(e); } }
// in core/src/java/org/apache/solr/core/Config.java
public Node getNode(String path, boolean errIfMissing) { XPath xpath = xpathFactory.newXPath(); Node nd = null; String xstr = normalize(path); try { nd = (Node)xpath.evaluate(xstr, doc, XPathConstants.NODE); if (nd==null) { if (errIfMissing) { throw new RuntimeException(name + " missing "+path); } else { log.debug(name + " missing optional " + path); return null; } } log.trace(name + ":" + path + "=" + nd); return nd; } catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); } catch (SolrException e) { throw(e); } catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); } }
// in core/src/java/org/apache/solr/core/SolrConfig.java
private void initLibs() { NodeList nodes = (NodeList) evaluate("lib", XPathConstants.NODESET); if (nodes==null || nodes.getLength()==0) return; log.info("Adding specified lib dirs to ClassLoader"); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); String baseDir = DOMUtil.getAttr(node, "dir"); String path = DOMUtil.getAttr(node, "path"); if (null != baseDir) { /* :TODO: add support for a simpler 'glob' mutually exclusive of regex */ String regex = DOMUtil.getAttr(node, "regex"); FileFilter filter = (null == regex) ? null : new RegexFileFilter(regex); getResourceLoader().addToClassLoader(baseDir, filter); } else if (null != path) { getResourceLoader().addToClassLoader(path); } else { throw new RuntimeException ("lib: missing mandatory attributes: 'dir' or 'path'"); } } }
// in core/src/java/org/apache/solr/core/SolrCore.java
void initIndex() { try { String indexDir = getNewIndexDir(); boolean indexExists = getDirectoryFactory().exists(indexDir); boolean firstTime; synchronized (SolrCore.class) { firstTime = dirs.add(new File(indexDir).getCanonicalPath()); } boolean removeLocks = solrConfig.unlockOnStartup; initIndexReaderFactory(); if (indexExists && firstTime) { /* to remove locks, the directory must already exist... so we create it if it didn't exist already... */ Directory dir = directoryFactory.get(indexDir, getSolrConfig().indexConfig.lockType); if (dir != null) { if (IndexWriter.isLocked(dir)) { if (removeLocks) { log.warn(logid + "WARNING: Solr index directory '{}' is locked. Unlocking...", indexDir); IndexWriter.unlock(dir); } else { log.error(logid + "Solr index directory '{}' is locked. Throwing exception", indexDir); throw new LockObtainFailedException("Index locked for write for core " + name); } } directoryFactory.release(dir); } } /* Create the index if it doesn't exist. */ if(!indexExists) { log.warn(logid+"Solr index directory '" + new File(indexDir) + "' doesn't exist." + " Creating new index..."); SolrIndexWriter writer = new SolrIndexWriter("SolrCore.initIndex", indexDir, getDirectoryFactory(), true, schema, solrConfig.indexConfig, solrDelPolicy, codec, false); writer.close(); } } catch (IOException e) { throw new RuntimeException(e); } }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public InputStream openResource(String resource) { InputStream is=null; try { File f0 = new File(resource); File f = f0; if (!f.isAbsolute()) { /* try $CWD/$configDir/$resource */ f = new File(getConfigDir() + resource); } if (f.isFile() && f.canRead()) { return new FileInputStream(f); } else if (f != f0) { /* no success with $CWD/$configDir/$resource */ if (f0.isFile() && f0.canRead()) return new FileInputStream(f0); } /* delegate to the class loader (looking into $INSTANCE_DIR/lib jars) */ is = classLoader.getResourceAsStream(resource); if (is == null) is = classLoader.getResourceAsStream(getConfigDir() + resource); } catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); } if (is==null) { throw new RuntimeException("Can't find resource '" + resource + "' in classpath or '" + getConfigDir() + "', cwd="+System.getProperty("user.dir")); } return is; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore register(String name, SolrCore core, boolean returnPrevNotClosed) { if( core == null ) { throw new RuntimeException( "Can not register a null core." ); } if( name == null || name.indexOf( '/' ) >= 0 || name.indexOf( '\\' ) >= 0 ){ throw new RuntimeException( "Invalid core name: "+name ); } if (zkController != null) { /* this happens before we can receive requests */ zkController.preRegister(core.getCoreDescriptor()); } SolrCore old = null; synchronized (cores) { if (isShutDown) { core.close(); throw new IllegalStateException("This CoreContainer has been shutdown"); } old = cores.put(name, core); /* set both the name of the descriptor and the name of the core, since the descriptors name is used for persisting. */ core.setName(name); core.getCoreDescriptor().name = name; } if( old == null || old == core) { log.info( "registering core: "+name ); registerInZk(core); return null; } else { log.info( "replacing core: "+name ); if (!returnPrevNotClosed) { old.close(); } registerInZk(core); return old; } }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
private OpenType determineType(Class type) { try { for (Field field : SimpleType.class.getFields()) { if (field.getType().equals(SimpleType.class)) { SimpleType candidate = (SimpleType) field.get(SimpleType.class); if (candidate.getTypeName().equals(type.getName())) { return candidate; } } } } catch (Exception e) { throw new RuntimeException(e); } return null; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static void invokeSetters(Object bean, NamedList initArgs) { if (initArgs == null) return; Class clazz = bean.getClass(); Method[] methods = clazz.getMethods(); Iterator<Map.Entry<String, Object>> iterator = initArgs.iterator(); while (iterator.hasNext()) { Map.Entry<String, Object> entry = iterator.next(); String key = entry.getKey(); String setterName = "set" + String.valueOf(Character.toUpperCase(key.charAt(0))) + key.substring(1); Method method = null; try { for (Method m : methods) { if (m.getName().equals(setterName) && m.getParameterTypes().length == 1) { method = m; break; } } if (method == null) { throw new RuntimeException("no setter corresponding to '" + key + "' in " + clazz.getName()); } Class pClazz = method.getParameterTypes()[0]; Object val = entry.getValue(); method.invoke(bean, val); } catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); } catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); } } }
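A hypothetical usage sketch of invokeSetters: each key in the NamedList is matched by name to a one-argument setter and invoked reflectively. ExampleBean is invented for illustration and is not a Solr class:

    import org.apache.solr.common.util.NamedList;
    import org.apache.solr.util.SolrPluginUtils;

    public class InvokeSettersSketch {
      public static class ExampleBean {
        private int rows;
        public void setRows(int rows) { this.rows = rows; } // matched by key "rows"
      }

      public static void main(String[] args) {
        NamedList<Object> init = new NamedList<Object>();
        init.add("rows", 10);
        ExampleBean bean = new ExampleBean();
        SolrPluginUtils.invokeSetters(bean, init); // reflectively calls bean.setRows(10)
      }
    }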
// in core/src/java/org/apache/solr/util/DOMUtil.java
public static String getAttr(NamedNodeMap attrs, String name, String missing_err) { Node attr = attrs==null? null : attrs.getNamedItem(name); if (attr==null) { if (missing_err==null) return null; throw new RuntimeException(missing_err + ": missing mandatory attribute '" + name + "'"); } String val = attr.getNodeValue(); return val; }
// in core/src/java/org/apache/solr/util/DOMUtil.java
private static void parsePropertyString(String value, List<String> fragments, List<String> propertyRefs) { int prev = 0; int pos; /* search for the next instance of $ from the 'prev' position */ while ((pos = value.indexOf("$", prev)) >= 0) { /* if there was any text before this, add it as a fragment */ /* TODO, this check could be modified to go if pos>prev; seems like this current version could stick empty strings into the list */ if (pos > 0) { fragments.add(value.substring(prev, pos)); } /* if we are at the end of the string, we tack on a $ then move past it */ if (pos == (value.length() - 1)) { fragments.add("$"); prev = pos + 1; } else if (value.charAt(pos + 1) != '{') { /* peek ahead to see if the next char is a property or not; not a property: insert the char as a literal */ /* fragments.addElement(value.substring(pos + 1, pos + 2)); prev = pos + 2; */ if (value.charAt(pos + 1) == '$') { /* backwards compatibility: two $ map to one mode */ fragments.add("$"); prev = pos + 2; } else { /* new behaviour: $X maps to $X for all values of X!='$' */ fragments.add(value.substring(pos, pos + 2)); prev = pos + 2; } } else { /* property found, extract its name or bail on a typo */ int endName = value.indexOf('}', pos); if (endName < 0) { throw new RuntimeException("Syntax error in property: " + value); } String propertyName = value.substring(pos + 2, endName); fragments.add(null); propertyRefs.add(propertyName); prev = endName + 1; } } /* no more $ signs found; if there is any tail to the string, append it */ if (prev < value.length()) { fragments.add(value.substring(prev)); } }
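The contract of parsePropertyString is easiest to see on a worked input. Tracing the code above by hand on "a${x}b$$c" gives the lists below (the call itself is commented out because the method is private to DOMUtil):

    import java.util.ArrayList;
    import java.util.List;

    List<String> fragments = new ArrayList<String>();
    List<String> propertyRefs = new ArrayList<String>();
    // parsePropertyString("a${x}b$$c", fragments, propertyRefs);
    // fragments    -> ["a", null, "b", "$", "c"]  (null marks where ${x} is substituted)
    // propertyRefs -> ["x"]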
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (UnsupportedEncodingException e) { /* can't happen - UTF-8 */ throw new RuntimeException(e); }
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (Exception e) { throw new RuntimeException("Problem pretty printing XML", e); }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (IOException e) { throw new RuntimeException(e); /* should never happen w/o using real IO */ }
// in solrj/src/java/org/apache/solr/common/cloud/ZkCmdExecutor.java
catch (KeeperException.ConnectionLossException e) { if (exception == null) { exception = e; } if (Thread.currentThread().isInterrupted()) { Thread.currentThread().interrupt(); throw new InterruptedException(); } if (Thread.currentThread() instanceof SafeStopThread) { if (((SafeStopThread) Thread.currentThread()).isClosed()) { throw new RuntimeException("Interrupted"); } } retryDelay(i); }
// in solrj/src/java/org/apache/solr/common/params/MapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/MultiMapSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/common/params/ModifiableSolrParams.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/solr/client/solrj/request/RequestWriter.java
catch (IOException e) { throw new RuntimeException("Unable to write xml into a stream", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/util/ClientUtils.java
catch (IOException e) {throw new RuntimeException(e);}
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in solrj/src/java/org/apache/noggit/CharArr.java
catch (IOException e) { throw new RuntimeException(e); }
// in contrib/langid/src/java/org/apache/solr/update/processor/LangDetectLanguageIdentifierUpdateProcessorFactory.java
catch (Exception e) { throw new RuntimeException("Couldn't load profile data, will return empty languages always!", e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (Exception e) { // io error or invalid rules throw new RuntimeException(e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
catch (FileNotFoundException e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch(Exception e) { throw new RuntimeException(e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Exception e) { throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/SolrParamResourceLoader.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in contrib/velocity/src/java/org/apache/solr/response/VelocityResponseWriter.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException ioe) { throw new RuntimeException("Error occured while iterating over tokenstream", ioe); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (MalformedURLException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); }
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); }
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(Exception e) { /* TODO should our parent interface throw (IO)Exception? */ throw new RuntimeException("getTransformer fails in getContentType",e); }
// in core/src/java/org/apache/solr/response/BinaryResponseWriter.java
catch (Exception ex) { throw new RuntimeException(ex); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IOException e) { throw new RuntimeException("failed to open field cache for: " + f, e); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in core/src/java/org/apache/solr/schema/DateField.java
catch( ParseException ex ) { throw new RuntimeException( ex ); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze query text", e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { /* io error */ throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (ParseException e) { /* invalid rules */ throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to initialize TokenStream to analyze range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/CollationField.java
catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/internal/csv/CSVStrategy.java
catch (CloneNotSupportedException e) { throw new RuntimeException(e); /* impossible */ }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/search/SolrCacheBase.java
catch (Exception e) { throw new RuntimeException("Can't parse autoWarm value: " + configValue, e); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/AbstractLuceneSpellChecker.java
catch (NumberFormatException e) { throw new RuntimeException( "Unparseable accuracy given for dictionary: " + name, e); }
// in core/src/java/org/apache/solr/spelling/FileBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/SuggestQueryConverter.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/spelling/IndexBasedSpellChecker.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + file, e); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (UnknownHostException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (NoSuchAlgorithmException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (UnsupportedEncodingException e) { /* won't happen */ log.error("UTF-8 not supported", e); throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (IOException e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new RuntimeException("Error opening " + resource, e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { /* Release the reference */ server = null; throw new RuntimeException("Could not start JMX monitoring ", e); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (InvocationTargetException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (IllegalAccessException e1) { throw new RuntimeException("Error invoking setter " + setterName + " on class : " + clazz.getName(), e1); }
// in core/src/java/org/apache/solr/util/SuggestMissingFactories.java
catch (MalformedURLException e) { throw new RuntimeException ("WTF, how can JarFile.toURL() be malformed?", e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (RuntimeException e) { log.debug("Resource not found in Solr's config: " + resourceName + ". Using the default " + resource + " from Carrot JAR."); return new IResource[] {}; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (RuntimeException e) { throw new DataImportHandlerException(SEVERE, "Exception while reading xpaths for fields", e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (RuntimeException e) { rsp.add("exception", DebugLogger.getStacktraceString(e)); importer = null; return; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (RuntimeException e) { LOG.error( "Exception while adding: " + document, e); return false; }
// in core/src/java/org/apache/solr/schema/FieldType.java
catch (RuntimeException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error while creating field '" + field + "' from value '" + value + "'", e); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch (RuntimeException e) { /* getField() throws a SolrException, but it arrives as a RuntimeException */ log.warn("Field \"" + fieldName + "\" found in index, but not defined in schema."); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch( RuntimeException ex ) { log.warn("Odd RuntimeException while testing for JNDI: " + ex.getMessage()); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (RuntimeException e) { LOG.warn( "Failed to unregister info bean: " + key, e); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (RuntimeException re) { /* unfortunately XInclude fallback only works with IOException, but openResource() never throws that one */ throw (IOException) (new IOException(re.getMessage()).initCause(re)); }
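Since Java 6, IOException has a (String, Throwable) constructor, so the cast-and-initCause idiom above can be written directly; an editorial fragment in the style of this listing, not the shipped code:

    catch (RuntimeException re) { /* XInclude fallback still only works with IOException */ throw new IOException(re.getMessage(), re); }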
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch(RuntimeException e) { e.printStackTrace(); fatal("RuntimeException " + e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (RuntimeException e) { throw new DataImportHandlerException(SEVERE, "Exception while reading xpaths for fields", e); }
// in core/src/java/org/apache/solr/schema/FieldType.java
catch (RuntimeException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error while creating field '" + field + "' from value '" + value + "'", e); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (RuntimeException re) { /* unfortunately XInclude fallback only works with IOException, but openResource() never throws that one */ throw (IOException) (new IOException(re.getMessage()).initCause(re)); }
unknown (Lib) SAXException: 0 0 24
            
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
public void error(SAXParseException e) throws SAXException { throw e; }
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
public void fatalError(SAXParseException e) throws SAXException { throw e; }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/SolrContentHandler.java
@Override public void startDocument() throws SAXException { document.clear(); catchAllBuilder.setLength(0); for (StringBuilder builder : fieldBuilders.values()) { builder.setLength(0); } bldrStack.clear(); bldrStack.add(catchAllBuilder); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/SolrContentHandler.java
@Override public void startElement(String uri, String localName, String qName, Attributes attributes) throws SAXException { StringBuilder theBldr = fieldBuilders.get(localName); if (theBldr != null) { /* we need to switch the currentBuilder */ bldrStack.add(theBldr); } if (captureAttribs == true) { for (int i = 0; i < attributes.getLength(); i++) { addField(localName, attributes.getValue(i), null); } } else { for (int i = 0; i < attributes.getLength(); i++) { bldrStack.getLast().append(attributes.getValue(i)).append(' '); } } bldrStack.getLast().append(' '); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/SolrContentHandler.java
@Override public void endElement(String uri, String localName, String qName) throws SAXException {
  StringBuilder theBldr = fieldBuilders.get(localName);
  if (theBldr != null) {
    //pop the stack
    bldrStack.removeLast();
    assert (bldrStack.size() >= 1);
  }
  bldrStack.getLast().append(' ');
}
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/SolrContentHandler.java
@Override public void characters(char[] chars, int offset, int length) throws SAXException { bldrStack.getLast().append(chars, offset, length); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
private static ContentHandler getHtmlHandler(Writer writer) throws TransformerConfigurationException { SAXTransformerFactory factory = (SAXTransformerFactory) TransformerFactory.newInstance(); TransformerHandler handler = factory.newTransformerHandler(); handler.getTransformer().setOutputProperty(OutputKeys.METHOD, "html"); handler.setResult(new StreamResult(writer)); return new ContentHandlerDecorator(handler) { @Override public void startElement( String uri, String localName, String name, Attributes atts) throws SAXException { if (XHTMLContentHandler.XHTML.equals(uri)) { uri = null; } if (!"head".equals(localName)) { super.startElement(uri, localName, name, atts); } } @Override public void endElement(String uri, String localName, String name) throws SAXException { if (XHTMLContentHandler.XHTML.equals(uri)) { uri = null; } if (!"head".equals(localName)) { super.endElement(uri, localName, name); } } @Override public void startPrefixMapping(String prefix, String uri) {/*no op*/ } @Override public void endPrefixMapping(String prefix) {/*no op*/ } }; }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
@Override public void startElement( String uri, String localName, String name, Attributes atts) throws SAXException { if (XHTMLContentHandler.XHTML.equals(uri)) { uri = null; } if (!"head".equals(localName)) { super.startElement(uri, localName, name, atts); } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
@Override public void endElement(String uri, String localName, String name) throws SAXException { if (XHTMLContentHandler.XHTML.equals(uri)) { uri = null; } if (!"head".equals(localName)) { super.endElement(uri, localName, name); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
public SolrCore reload(SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException {
  // TODO - what if indexwriter settings have changed
  SolrConfig config = new SolrConfig(resourceLoader, getSolrConfig().getName(), null);
  IndexSchema schema = new IndexSchema(config, getSchema().getResourceName(), null);
  updateHandler.incref();
  SolrCore core = new SolrCore(getName(), null, config, schema, coreDescriptor, updateHandler);
  return core;
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
public CoreContainer initialize() throws IOException, ParserConfigurationException, SAXException { CoreContainer cores = null; String solrHome = SolrResourceLoader.locateSolrHome(); File fconf = new File(solrHome, containerConfigFilename == null ? "solr.xml" : containerConfigFilename); log.info("looking for solr.xml: " + fconf.getAbsolutePath()); cores = new CoreContainer(solrHome); if (fconf.exists()) { cores.load(solrHome, fconf); } else { log.info("no solr.xml file found - using default"); cores.load(solrHome, new InputSource(new ByteArrayInputStream(DEF_SOLR_XML.getBytes("UTF-8")))); cores.configFile = fconf; } containerConfigFilename = cores.getConfigFile().getName(); return cores; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, File configFile ) throws ParserConfigurationException, IOException, SAXException { this.configFile = configFile; this.load(dir, new InputSource(configFile.toURI().toASCIIString())); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, InputSource cfgis) throws ParserConfigurationException, IOException, SAXException { if (null == dir) { // don't rely on SolrResourceLoader(), determine explicitly first dir = SolrResourceLoader.locateSolrHome(); } log.info("Loading CoreContainer using Solr Home: '{}'", dir); this.loader = new SolrResourceLoader(dir); solrHome = loader.getInstanceDir(); Config cfg = new Config(loader, null, cfgis, null, false); // keep orig config for persist to consult try { this.cfg = new Config(loader, null, copyDoc(cfg.getDocument())); } catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); } cfg.substituteProperties(); // Initialize Logging if(cfg.getBool("solr/logging/@enabled",true)) { String slf4jImpl = null; String fname = cfg.get("solr/logging/watcher/@class", null); try { slf4jImpl = StaticLoggerBinder.getSingleton().getLoggerFactoryClassStr(); if(fname==null) { if( slf4jImpl.indexOf("Log4j") > 0) { log.warn("Log watching is not yet implemented for log4j" ); } else if( slf4jImpl.indexOf("JDK") > 0) { fname = "JUL"; } } } catch(Throwable ex) { log.warn("Unable to read SLF4J version. LogWatcher will be disabled: "+ex); } // Now load the framework if(fname!=null) { if("JUL".equalsIgnoreCase(fname)) { logging = new JulWatcher(slf4jImpl); } // else if( "Log4j".equals(fname) ) { // logging = new Log4jWatcher(slf4jImpl); // } else { try { logging = loader.newInstance(fname, LogWatcher.class); } catch (Throwable e) { log.warn("Unable to load LogWatcher", e); } } if( logging != null ) { ListenerConfig v = new ListenerConfig(); v.size = cfg.getInt("solr/logging/watcher/@size",50); v.threshold = cfg.get("solr/logging/watcher/@threshold",null); if(v.size>0) { log.info("Registering Log Listener"); logging.registerListener(v, this); } } } } String dcoreName = cfg.get("solr/cores/@defaultCoreName", null); if(dcoreName != null && !dcoreName.isEmpty()) { defaultCoreName = dcoreName; } persistent = cfg.getBool("solr/@persistent", false); libDir = cfg.get("solr/@sharedLib", null); zkHost = cfg.get("solr/@zkHost" , null); adminPath = cfg.get("solr/cores/@adminPath", null); shareSchema = cfg.getBool("solr/cores/@shareSchema", DEFAULT_SHARE_SCHEMA); zkClientTimeout = cfg.getInt("solr/cores/@zkClientTimeout", DEFAULT_ZK_CLIENT_TIMEOUT); hostPort = cfg.get("solr/cores/@hostPort", DEFAULT_HOST_PORT); hostContext = cfg.get("solr/cores/@hostContext", DEFAULT_HOST_CONTEXT); host = cfg.get("solr/cores/@host", null); if(shareSchema){ indexSchemaCache = new ConcurrentHashMap<String ,IndexSchema>(); } adminHandler = cfg.get("solr/cores/@adminHandler", null ); managementPath = cfg.get("solr/cores/@managementPath", null ); zkClientTimeout = Integer.parseInt(System.getProperty("zkClientTimeout", Integer.toString(zkClientTimeout))); initZooKeeper(zkHost, zkClientTimeout); if (libDir != null) { File f = FileUtils.resolvePath(new File(dir), libDir); log.info( "loading shared library: "+f.getAbsolutePath() ); libLoader = SolrResourceLoader.createClassLoader(f, null); } if (adminPath != null) { if (adminHandler == null) { coreAdminHandler = new CoreAdminHandler(this); } else { coreAdminHandler = this.createMultiCoreHandler(adminHandler); } } try { containerProperties = readProperties(cfg, ((NodeList) cfg.evaluate(DEFAULT_HOST_CONTEXT, XPathConstants.NODESET)).item(0)); } catch (Throwable e) { SolrException.log(log,null,e); } NodeList nodes = (NodeList)cfg.evaluate("solr/cores/core", XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = 
nodes.item(i); try { String rawName = DOMUtil.getAttr(node, "name", null); if (null == rawName) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Each core in solr.xml must have a 'name'"); } String name = rawName; CoreDescriptor p = new CoreDescriptor(this, name, DOMUtil.getAttr(node, "instanceDir", null)); // deal with optional settings String opt = DOMUtil.getAttr(node, "config", null); if (opt != null) { p.setConfigName(opt); } opt = DOMUtil.getAttr(node, "schema", null); if (opt != null) { p.setSchemaName(opt); } if (zkController != null) { opt = DOMUtil.getAttr(node, "shard", null); if (opt != null && opt.length() > 0) { p.getCloudDescriptor().setShardId(opt); } opt = DOMUtil.getAttr(node, "collection", null); if (opt != null) { p.getCloudDescriptor().setCollectionName(opt); } opt = DOMUtil.getAttr(node, "roles", null); if(opt != null){ p.getCloudDescriptor().setRoles(opt); } } opt = DOMUtil.getAttr(node, "properties", null); if (opt != null) { p.setPropertiesName(opt); } opt = DOMUtil.getAttr(node, CoreAdminParams.DATA_DIR, null); if (opt != null) { p.setDataDir(opt); } p.setCoreProperties(readProperties(cfg, node)); SolrCore core = create(p); register(name, core, false); // track original names coreToOrigName.put(core, rawName); } catch (Throwable ex) { SolrException.log(log,null,ex); } } }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore create(CoreDescriptor dcore) throws ParserConfigurationException, IOException, SAXException { // Make the instanceDir relative to the cores instanceDir if not absolute File idir = new File(dcore.getInstanceDir()); if (!idir.isAbsolute()) { idir = new File(solrHome, dcore.getInstanceDir()); } String instanceDir = idir.getPath(); log.info("Creating SolrCore '{}' using instanceDir: {}", dcore.getName(), instanceDir); // Initialize the solr config SolrResourceLoader solrLoader = null; SolrConfig config = null; String zkConfigName = null; if(zkController == null) { solrLoader = new SolrResourceLoader(instanceDir, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties())); config = new SolrConfig(solrLoader, dcore.getConfigName(), null); } else { try { String collection = dcore.getCloudDescriptor().getCollectionName(); zkController.createCollectionZkNode(dcore.getCloudDescriptor()); zkConfigName = zkController.readConfigName(collection); if (zkConfigName == null) { log.error("Could not find config name for collection:" + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection); } solrLoader = new ZkSolrResourceLoader(instanceDir, zkConfigName, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties()), zkController); config = getSolrConfigFromZk(zkConfigName, dcore.getConfigName(), solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } IndexSchema schema = null; if (indexSchemaCache != null) { if (zkController != null) { File schemaFile = new File(dcore.getSchemaName()); if (!schemaFile.isAbsolute()) { schemaFile = new File(solrLoader.getInstanceDir() + "conf" + File.separator + dcore.getSchemaName()); } if (schemaFile.exists()) { String key = schemaFile.getAbsolutePath() + ":" + new SimpleDateFormat("yyyyMMddHHmmss", Locale.US).format(new Date( schemaFile.lastModified())); schema = indexSchemaCache.get(key); if (schema == null) { log.info("creating new schema object for core: " + dcore.name); schema = new IndexSchema(config, dcore.getSchemaName(), null); indexSchemaCache.put(key, schema); } else { log.info("re-using schema object for core: " + dcore.name); } } } else { // TODO: handle caching from ZooKeeper - perhaps using ZooKeepers versioning // Don't like this cache though - how does it empty as last modified changes? } } if(schema == null){ if(zkController != null) { try { schema = getSchemaFromZk(zkConfigName, dcore.getSchemaName(), config, solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } else { schema = new IndexSchema(config, dcore.getSchemaName(), null); } } SolrCore core = new SolrCore(dcore.getName(), null, config, schema, dcore); if (zkController == null && core.getUpdateHandler().getUpdateLog() != null) { // always kick off recovery if we are in standalone mode. core.getUpdateHandler().getUpdateLog().recoverFromLog(); } return core; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void reload(String name) throws ParserConfigurationException, IOException, SAXException {
  name= checkDefault(name);
  SolrCore core;
  synchronized(cores) {
    core = cores.get(name);
  }
  if (core == null)
    throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name );
  CoreDescriptor cd = core.getCoreDescriptor();
  File instanceDir = new File(cd.getInstanceDir());
  if (!instanceDir.isAbsolute()) {
    instanceDir = new File(getSolrHome(), cd.getInstanceDir());
  }
  log.info("Reloading SolrCore '{}' using instanceDir: {}", cd.getName(), instanceDir.getAbsolutePath());
  SolrResourceLoader solrLoader;
  if(zkController == null) {
    solrLoader = new SolrResourceLoader(instanceDir.getAbsolutePath(), libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties()));
  } else {
    try {
      String collection = cd.getCloudDescriptor().getCollectionName();
      zkController.createCollectionZkNode(cd.getCloudDescriptor());
      String zkConfigName = zkController.readConfigName(collection);
      if (zkConfigName == null) {
        log.error("Could not find config name for collection:" + collection);
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection);
      }
      solrLoader = new ZkSolrResourceLoader(instanceDir.getAbsolutePath(), zkConfigName, libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties()), zkController);
    } catch (KeeperException e) {
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
  SolrCore newCore = core.reload(solrLoader);
  // keep core to orig name link
  String origName = coreToOrigName.remove(core);
  if (origName != null) {
    coreToOrigName.put(newCore, origName);
  }
  register(name, newCore, false);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
private SolrConfig getSolrConfigFromZk(String zkConfigName, String solrConfigFileName, SolrResourceLoader resourceLoader) throws IOException, ParserConfigurationException, SAXException, KeeperException, InterruptedException { byte[] config = zkController.getConfigFileData(zkConfigName, solrConfigFileName); InputSource is = new InputSource(new ByteArrayInputStream(config)); is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(solrConfigFileName)); SolrConfig cfg = solrConfigFileName == null ? new SolrConfig( resourceLoader, SolrConfig.DEFAULT_CONF_FILE, is) : new SolrConfig( resourceLoader, solrConfigFileName, is); return cfg; }
3
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (SAXException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
3
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (SAXException e) { SolrException.log(log, "Exception during parsing file: " + name, e); throw e; }
2
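All three SAXException catch sites convert the checked parser exception into an unchecked SolrException, using BAD_REQUEST when the user supplied the malformed XML and SERVER_ERROR otherwise. A sketch of that conversion using only the JDK parser; parseConfig and the message text are hypothetical:

import java.io.File;
import java.io.IOException;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import org.w3c.dom.Document;
import org.xml.sax.SAXException;
import org.apache.solr.common.SolrException;

public class ConfigParseSketch {
  public static Document parseConfig(File f) {
    try {
      return DocumentBuilderFactory.newInstance().newDocumentBuilder().parse(f);
    } catch (SAXException e) {
      // malformed XML is the caller's fault
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing config.", e);
    } catch (ParserConfigurationException e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    } catch (IOException e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    }
  }
}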
unknown (Lib) SQLException 0 0 2
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
private List<String> readFieldNames(ResultSetMetaData metaData) throws SQLException { List<String> colNames = new ArrayList<String>(); int count = metaData.getColumnCount(); for (int i = 0; i < count; i++) { colNames.add(metaData.getColumnLabel(i + 1)); } return colNames; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
private Reader getReader(Blob blob) throws SQLException, UnsupportedEncodingException { if (encoding == null) { return (new InputStreamReader(blob.getBinaryStream())); } else { return (new InputStreamReader(blob.getBinaryStream(), encoding)); } }
3
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) {
  // DriverManager does not allow you to use a driver which is not loaded through
  // the class loader of the class which is trying to make the connection.
  // This is a workaround for cases where the user puts the driver jar in the
  // solr.home/lib or solr.home/core/lib directories.
  Driver d = (Driver) DocBuilder.loadClass(driver, context.getSolrCore()).newInstance();
  c = d.connect(url, initProps);
}
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { logError("Error reading data ", e); wrapAndThrow(SEVERE, e, "Error reading data from database"); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
catch (SQLException e) { close(); wrapAndThrow(SEVERE,e); return false; }
0 0
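The first JdbcDataSource catch is unusual in that the SQLException triggers a fallback instead of a report: DriverManager refuses drivers loaded by a class loader other than the caller's, so the driver found in solr.home/lib is instantiated directly. A sketch of that workaround; connectWithFallback and the explicit loader argument are hypothetical, with Class.forName standing in for DocBuilder.loadClass:

import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;

public class DriverFallbackSketch {
  public static Connection connectWithFallback(String driverClass, String url,
      Properties initProps, ClassLoader solrLoader) throws Exception {
    try {
      return DriverManager.getConnection(url, initProps);
    } catch (SQLException e) {
      // DriverManager only accepts drivers visible to the calling class loader,
      // so instantiate the driver ourselves and connect directly
      Driver d = (Driver) Class.forName(driverClass, true, solrLoader).newInstance();
      return d.connect(url, initProps);
    }
  }
}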
unknown (Lib) ScriptException 0 0 0 1
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ScriptTransformer.java
catch (ScriptException e) { wrapAndThrow(SEVERE, e, "'eval' failed with language: " + scriptLang + " and script: \n" + scriptText); }
0 0
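wrapAndThrow(SEVERE, e, msg) is the DataImportHandler helper that converts a checked exception into a DataImportHandlerException. A sketch of the same eval-and-report shape using only javax.script; the RuntimeException here stands in for DataImportHandlerException:

import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class ScriptEvalSketch {
  public static Object eval(String scriptLang, String scriptText) {
    ScriptEngine engine = new ScriptEngineManager().getEngineByName(scriptLang);
    if (engine == null) throw new RuntimeException("no engine for: " + scriptLang);
    try {
      return engine.eval(scriptText);
    } catch (ScriptException e) {
      // report the language and the full script so the failure is reproducible
      throw new RuntimeException("'eval' failed with language: " + scriptLang
          + " and script: \n" + scriptText, e);
    }
  }
}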
unknown (Lib) SecurityException 0 0 1
            
// in core/src/java/org/apache/solr/logging/jul/RecordHandler.java
@Override public void close() throws SecurityException {
  //history.reset();
}
1
            
// in core/src/java/org/apache/solr/util/VersionedFile.java
catch (SecurityException e) { if (!df.exists()) { deleted.add(df); } }
0 0
unknown (Lib) ServletException 1
            
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void init(ServletConfig config) throws ServletException {
  super.init(config);
  destination = config.getInitParameter("destination");
  if(destination==null) {
    throw new ServletException("RedirectServlet missing destination configuration");
  }
  if( "false".equals(config.getInitParameter("permanent") )) {
    code = HttpServletResponse.SC_MOVED_TEMPORARILY;
  }
  // Replace the context key
  if(destination.startsWith(CONTEXT_KEY)) {
    destination = config.getServletContext().getContextPath() +destination.substring(CONTEXT_KEY.length());
  }
}
0 10
            
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
public void init(FilterConfig config) throws ServletException {
  log.info("SolrDispatchFilter.init()");
  CoreContainer.Initializer init = createInitializer();
  try {
    // web.xml configuration
    this.pathPrefix = config.getInitParameter( "path-prefix" );
    this.cores = init.initialize();
    log.info("user.dir=" + System.getProperty("user.dir"));
  } catch( Throwable t ) {
    // catch this so our filter still works
    log.error( "Could not start Solr. Check solr/home property and the logs");
    SolrCore.log( t );
  }
  log.info("SolrDispatchFilter.init() done");
}
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException { if( abortErrorMessage != null ) { ((HttpServletResponse)response).sendError( 500, abortErrorMessage ); return; } if (this.cores == null) { ((HttpServletResponse)response).sendError( 403, "Server is shutting down" ); return; } CoreContainer cores = this.cores; SolrCore core = null; SolrQueryRequest solrReq = null; if( request instanceof HttpServletRequest) { HttpServletRequest req = (HttpServletRequest)request; HttpServletResponse resp = (HttpServletResponse)response; SolrRequestHandler handler = null; String corename = ""; try { // put the core container in request attribute req.setAttribute("org.apache.solr.CoreContainer", cores); String path = req.getServletPath(); if( req.getPathInfo() != null ) { // this lets you handle /update/commit when /update is a servlet path += req.getPathInfo(); } if( pathPrefix != null && path.startsWith( pathPrefix ) ) { path = path.substring( pathPrefix.length() ); } // check for management path String alternate = cores.getManagementPath(); if (alternate != null && path.startsWith(alternate)) { path = path.substring(0, alternate.length()); } // unused feature ? int idx = path.indexOf( ':' ); if( idx > 0 ) { // save the portion after the ':' for a 'handler' path parameter path = path.substring( 0, idx ); } // Check for the core admin page if( path.equals( cores.getAdminPath() ) ) { handler = cores.getMultiCoreHandler(); solrReq = adminRequestParser.parse(null,path, req); handleAdminRequest(req, response, handler, solrReq); return; } else { //otherwise, we should find a core from the path idx = path.indexOf( "/", 1 ); if( idx > 1 ) { // try to get the corename as a request parameter first corename = path.substring( 1, idx ); core = cores.getCore(corename); if (core != null) { path = path.substring( idx ); } } if (core == null) { if (!cores.isZooKeeperAware() ) { core = cores.getCore(""); } } } if (core == null && cores.isZooKeeperAware()) { // we couldn't find the core - lets make sure a collection was not specified instead core = getCoreByCollection(cores, corename, path); if (core != null) { // we found a core, update the path path = path.substring( idx ); } else { // try the default core core = cores.getCore(""); } // TODO: if we couldn't find it locally, look on other nodes } // With a valid core... 
if( core != null ) { final SolrConfig config = core.getSolrConfig(); // get or create/cache the parser for the core SolrRequestParsers parser = null; parser = parsers.get(config); if( parser == null ) { parser = new SolrRequestParsers(config); parsers.put(config, parser ); } // Determine the handler from the url path if not set // (we might already have selected the cores handler) if( handler == null && path.length() > 1 ) { // don't match "" or "/" as valid path handler = core.getRequestHandler( path ); // no handler yet but allowed to handle select; let's check if( handler == null && parser.isHandleSelect() ) { if( "/select".equals( path ) || "/select/".equals( path ) ) { solrReq = parser.parse( core, path, req ); String qt = solrReq.getParams().get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } if( qt != null && qt.startsWith("/") && (handler instanceof ContentStreamHandlerBase)) { //For security reasons it's a bad idea to allow a leading '/', ex: /select?qt=/update see SOLR-3161 //There was no restriction from Solr 1.4 thru 3.5 and it's not supported for update handlers. throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid query type. Do not use /select to access: "+qt); } } } } // With a valid handler and a valid core... if( handler != null ) { // if not a /select, create the request if( solrReq == null ) { solrReq = parser.parse( core, path, req ); } final Method reqMethod = Method.getMethod(req.getMethod()); HttpCacheHeaderUtil.setCacheControlHeader(config, resp, reqMethod); // unless we have been explicitly told not to, do cache validation // if we fail cache validation, execute the query if (config.getHttpCachingConfig().isNever304() || !HttpCacheHeaderUtil.doCacheHeaderValidation(solrReq, req, reqMethod, resp)) { SolrQueryResponse solrRsp = new SolrQueryResponse(); /* even for HEAD requests, we need to execute the handler to * ensure we don't get an error (and to make sure the correct * QueryResponseWriter is selected and we get the correct * Content-Type) */ SolrRequestInfo.setRequestInfo(new SolrRequestInfo(solrReq, solrRsp)); this.execute( req, handler, solrReq, solrRsp ); HttpCacheHeaderUtil.checkHttpCachingVeto(solrRsp, resp, reqMethod); // add info to http headers //TODO: See SOLR-232 and SOLR-267. /*try { NamedList solrRspHeader = solrRsp.getResponseHeader(); for (int i=0; i<solrRspHeader.size(); i++) { ((javax.servlet.http.HttpServletResponse) response).addHeader(("Solr-" + solrRspHeader.getName(i)), String.valueOf(solrRspHeader.getVal(i))); } } catch (ClassCastException cce) { log.log(Level.WARNING, "exception adding response header log information", cce); }*/ QueryResponseWriter responseWriter = core.getQueryResponseWriter(solrReq); writeResponse(solrRsp, response, responseWriter, solrReq, reqMethod); } return; // we are done with a valid handler } } log.debug("no handler or core retrieved for " + path + ", follow through..."); } catch (Throwable ex) { sendError( core, solrReq, request, (HttpServletResponse)response, ex ); return; } finally { if( solrReq != null ) { solrReq.close(); } if (core != null) { core.close(); } SolrRequestInfo.clearRequestInfo(); } } // Otherwise let the webapp handle the request chain.doFilter(request, response); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override public void init() throws ServletException { }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException {
  response.setCharacterEncoding("UTF-8");
  response.setContentType("application/json");
  // This attribute is set by the SolrDispatchFilter
  CoreContainer cores = (CoreContainer) request.getAttribute("org.apache.solr.CoreContainer");
  String path = request.getParameter("path");
  String addr = request.getParameter("addr");
  if (addr != null && addr.length() == 0) {
    addr = null;
  }
  String detailS = request.getParameter("detail");
  boolean detail = detailS != null && detailS.equals("true");
  String dumpS = request.getParameter("dump");
  boolean dump = dumpS != null && dumpS.equals("true");
  PrintWriter out = response.getWriter();
  ZKPrinter printer = new ZKPrinter(response, out, cores.getZkController(), addr);
  printer.detail = detail;
  printer.dump = dump;
  try {
    printer.print(path);
  } finally {
    printer.close();
  }
}
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
@Override public void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException { doGet(request, response); }
// in core/src/java/org/apache/solr/servlet/LoadAdminUiServlet.java
@Override public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException {
  response.setCharacterEncoding("UTF-8");
  response.setContentType("text/html");
  PrintWriter out = response.getWriter();
  InputStream in = getServletContext().getResourceAsStream("/admin.html");
  if(in != null) {
    try {
      // This attribute is set by the SolrDispatchFilter
      CoreContainer cores = (CoreContainer) request.getAttribute("org.apache.solr.CoreContainer");
      String html = IOUtils.toString(in, "UTF-8");
      String[] search = new String[] { "${contextPath}", "${adminPath}" };
      String[] replace = new String[] { StringEscapeUtils.escapeJavaScript(request.getContextPath()), StringEscapeUtils.escapeJavaScript(cores.getAdminPath()) };
      out.println( StringUtils.replaceEach(html, search, replace) );
    } finally {
      IOUtils.closeQuietly(in);
    }
  } else {
    out.println("solr");
  }
}
// in core/src/java/org/apache/solr/servlet/LoadAdminUiServlet.java
@Override public void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException { doGet(request, response); }
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void init(ServletConfig config) throws ServletException {
  super.init(config);
  destination = config.getInitParameter("destination");
  if(destination==null) {
    throw new ServletException("RedirectServlet missing destination configuration");
  }
  if( "false".equals(config.getInitParameter("permanent") )) {
    code = HttpServletResponse.SC_MOVED_TEMPORARILY;
  }
  // Replace the context key
  if(destination.startsWith(CONTEXT_KEY)) {
    destination = config.getServletContext().getContextPath() +destination.substring(CONTEXT_KEY.length());
  }
}
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void doGet(HttpServletRequest req, HttpServletResponse res) throws ServletException,IOException { res.setStatus(code); res.setHeader("Location", destination); }
// in core/src/java/org/apache/solr/servlet/RedirectServlet.java
public void doPost(HttpServletRequest req, HttpServletResponse res) throws ServletException,IOException { doGet(req,res); }
0 0 0
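Two opposite init styles appear above: RedirectServlet fails fast, turning a missing init-param into a ServletException that aborts deployment, while SolrDispatchFilter deliberately catches Throwable so the filter still deploys and can answer later requests with an error status. A sketch of the tolerant variant; TolerantFilter and initialize() are hypothetical:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class TolerantFilter implements Filter {
  private volatile Object cores; // stand-in for CoreContainer

  static Object initialize() { return new Object(); } // hypothetical startup work

  public void init(FilterConfig config) throws ServletException {
    try {
      cores = initialize();
    } catch (Throwable t) {
      // catch this so our filter still works; cores stays null
      cores = null;
    }
  }

  public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
      throws IOException, ServletException {
    if (cores == null) {
      ((HttpServletResponse) response).sendError(403, "Server is shutting down");
      return;
    }
    chain.doFilter(request, response);
  }

  public void destroy() {}
}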
unknown (Lib) SessionExpiredException 0 0 0 1
            
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
1
            
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.SessionExpiredException e) { throw e; }
0
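The one-line catch-rethrow above is deliberate: a ZooKeeper session expiry is terminal for the client, so it must escape any broader KeeperException handler that would otherwise retry. A sketch of the surrounding shape this implies; attemptElection and joinElection are hypothetical:

import org.apache.zookeeper.KeeperException;

public class SelectiveRethrowSketch {
  // Hypothetical stand-in for the guarded ZooKeeper election step.
  static void attemptElection() throws KeeperException {
  }

  public static void joinElection(int retries) throws KeeperException {
    for (int i = 0; i < retries; i++) {
      try {
        attemptElection();
        return;
      } catch (KeeperException.SessionExpiredException e) {
        // the session is gone for good; retrying under it cannot succeed
        throw e;
      } catch (KeeperException e) {
        // other ZooKeeper errors are treated as transient: loop and retry
      }
    }
  }
}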
unknown (Lib) SocketException 0 0 0 2
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketException e) { ex = addZombie(server, e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketException e) { ex = e; }
0 0
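In LBHttpSolrServer a SocketException does not fail the request: addZombie records the dead server and the loop moves on to the next one, only surfacing the last exception once every server has failed. A minimal sketch of that failover loop; query and the collection parameters are stand-ins:

import java.net.SocketException;
import java.util.List;
import java.util.Set;

public class FailoverSketch {
  static byte[] query(String server) throws SocketException {
    throw new SocketException("connection reset"); // stand-in for the HTTP call
  }

  public static byte[] request(List<String> servers, Set<String> zombies) throws Exception {
    Exception ex = null;
    for (String server : servers) {
      try {
        return query(server);
      } catch (SocketException e) {
        zombies.add(server); // remember the dead server and try the next one
        ex = e;
      }
    }
    throw ex != null ? ex : new SocketException("no servers to try");
  }
}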
unknown (Lib) SocketTimeoutException 0 0 0 3
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (SocketTimeoutException e) { throw new SolrServerException( "Timeout occured while waiting response from server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketTimeoutException e) { ex = addZombie(server, e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SocketTimeoutException e) { ex = e; }
1
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (SocketTimeoutException e) { throw new SolrServerException( "Timeout occured while waiting response from server at: " + getBaseURL(), e); }
0
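HttpSolrServer upgrades the bare SocketTimeoutException into a SolrServerException that names the server being waited on, which is what solrj clients actually see. A sketch under that assumption; executeRequest is a hypothetical stand-in for the HTTP call:

import java.net.SocketTimeoutException;
import org.apache.solr.client.solrj.SolrServerException;

public class TimeoutWrapSketch {
  // Hypothetical stand-in for the low-level HTTP exchange.
  static byte[] executeRequest(String url) throws SocketTimeoutException {
    throw new SocketTimeoutException("read timed out");
  }

  public static byte[] request(String baseUrl) throws SolrServerException {
    try {
      return executeRequest(baseUrl);
    } catch (SocketTimeoutException e) {
      throw new SolrServerException(
          "Timeout occurred while waiting response from server at: " + baseUrl, e);
    }
  }
}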
runtime (Domain) SolrException
public class SolrException extends RuntimeException {

  /**
   * @since solr 1.2
   */
  public enum ErrorCode {
    BAD_REQUEST( 400 ),
    UNAUTHORIZED( 401 ),
    FORBIDDEN( 403 ),
    NOT_FOUND( 404 ),
    CONFLICT( 409 ),
    SERVER_ERROR( 500 ),
    SERVICE_UNAVAILABLE( 503 ),
    UNKNOWN(0);
    public final int code;
    
    private ErrorCode( int c )
    {
      code = c;
    }
    public static ErrorCode getErrorCode(int c){
      for (ErrorCode err : values()) {
        if(err.code == c) return err;
      }
      return UNKNOWN;
    }
  };

  public SolrException(ErrorCode code, String msg) {
    super(msg);
    this.code = code.code;
  }
  public SolrException(ErrorCode code, String msg, Throwable th) {
    super(msg, th);
    this.code = code.code;
  }

  public SolrException(ErrorCode code, Throwable th) {
    super(th);
    this.code = code.code;
  }
  
  int code=0;
  public int code() { return code; }


  public void log(Logger log) { log(log,this); }
  public static void log(Logger log, Throwable e) {
    if (e instanceof SolrException
        && ((SolrException) e).code() == ErrorCode.SERVICE_UNAVAILABLE.code) {
      return;
    }
    String stackTrace = toStr(e);
    String ignore = doIgnore(e, stackTrace);
    if (ignore != null) {
      log.info(ignore);
      return;
    }
    log.error(stackTrace);

  }

  public static void log(Logger log, String msg, Throwable e) {
    if (e instanceof SolrException
        && ((SolrException) e).code() == ErrorCode.SERVICE_UNAVAILABLE.code) {
      log(log, msg);
    }
    String stackTrace = msg + ':' + toStr(e);
    String ignore = doIgnore(e, stackTrace);
    if (ignore != null) {
      log.info(ignore);
      return;
    }
    log.error(stackTrace);
  }
  
  public static void log(Logger log, String msg) {
    String stackTrace = msg;
    String ignore = doIgnore(null, stackTrace);
    if (ignore != null) {
      log.info(ignore);
      return;
    }
    log.error(stackTrace);
  }

  // public String toString() { return toStr(this); }  // oops, inf loop
  @Override
  public String toString() { return super.toString(); }

  public static String toStr(Throwable e) {   
    CharArrayWriter cw = new CharArrayWriter();
    PrintWriter pw = new PrintWriter(cw);
    e.printStackTrace(pw);
    pw.flush();
    return cw.toString();

/** This doesn't work for some reason!!!!!
    StringWriter sw = new StringWriter();
    PrintWriter pw = new PrintWriter(sw);
    e.printStackTrace(pw);
    pw.flush();
    System.out.println("The STRING:" + sw.toString());
    return sw.toString();
**/
  }


  /** For test code - do not log exceptions that match any of the regular expressions in ignorePatterns */
  public static Set<String> ignorePatterns;

  /** Returns null if this exception does not match any ignore patterns, or a message string to use if it does. */
  public static String doIgnore(Throwable t, String m) {
    if (ignorePatterns == null || m == null) return null;
    if (t != null && t instanceof AssertionError) return null;

    for (String regex : ignorePatterns) {
      Pattern pattern = Pattern.compile(regex);
      Matcher matcher = pattern.matcher(m);
      
      if (matcher.find()) return "Ignoring exception matching " + regex;
    }

    return null;
  }
  
  public static Throwable getRootCause(Throwable t) {
    while (true) {
      Throwable cause = t.getCause();
      if (cause!=null) {
        t = cause;
      } else {
        break;
      }
    }
    return t;
  }

}
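A small usage sketch for the class above: the numeric code round-trips through ErrorCode.getErrorCode, unmapped codes collapse to UNKNOWN, and getRootCause walks the cause chain to the innermost throwable:

import org.apache.solr.common.SolrException;
import org.apache.solr.common.SolrException.ErrorCode;

public class SolrExceptionDemo {
  public static void main(String[] args) {
    SolrException e = new SolrException(ErrorCode.BAD_REQUEST, "bad param",
        new RuntimeException(new NumberFormatException("x")));
    System.out.println(e.code());                          // 400
    System.out.println(ErrorCode.getErrorCode(e.code()));  // BAD_REQUEST
    System.out.println(ErrorCode.getErrorCode(999));       // UNKNOWN
    System.out.println(SolrException.getRootCause(e).getClass().getSimpleName()); // NumberFormatException
  }
}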
504
            
// in solrj/src/java/org/apache/solr/common/cloud/CloudState.java
private RangeInfo addRangeInfo(String collection) { List<Range> ranges; RangeInfo rangeInfo; rangeInfo = new RangeInfo(); Map<String,Slice> slices = getSlices(collection); if (slices == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Can not find collection " + collection + " in " + this); } Set<String> shards = slices.keySet(); ArrayList<String> shardList = new ArrayList<String>(shards.size()); shardList.addAll(shards); Collections.sort(shardList); ranges = hp.partitionRange(shards.size()); rangeInfo.ranges = ranges; rangeInfo.shardList = shardList; rangeInfos.put(collection, rangeInfo); return rangeInfo; }
// in solrj/src/java/org/apache/solr/common/util/StrUtils.java
public static boolean parseBool(String s) { if( s != null ) { if( s.startsWith("true") || s.startsWith("on") || s.startsWith("yes") ) { return true; } if( s.startsWith("false") || s.startsWith("off") || s.equals("no") ) { return false; } } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "invalid boolean value: "+s ); }
// in solrj/src/java/org/apache/solr/common/params/RequiredSolrParams.java
@Override public String get(String param) { String val = params.get(param); if( val == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Missing required parameter: "+param ); } return val; }
// in solrj/src/java/org/apache/solr/common/params/RequiredSolrParams.java
@Override public String getFieldParam(final String field, final String param) {
  final String fpname = fpname(field,param);
  String val = params.get(fpname);
  if (null == val) {
    // don't call this.get, we want a specified exception message
    val = params.get(param);
    if (null == val) {
      throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Missing required parameter: "+fpname+ " (or default: "+param+")" );
    }
  }
  return val;
}
// in solrj/src/java/org/apache/solr/common/params/RequiredSolrParams.java
@Override public String[] getFieldParams(final String field, final String param) {
  final String fpname = fpname(field,param);
  String[] val = params.getParams(fpname);
  if (null == val) {
    // don't call this.getParams, we want a specified exception message
    val = params.getParams(param);
    if (null == val) {
      throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Missing required parameter: "+fpname+ " (or default: "+param+")" );
    }
  }
  return val;
}
// in solrj/src/java/org/apache/solr/common/params/RequiredSolrParams.java
@Override public String[] getParams(String param) { String[] vals = params.getParams(param); if( vals == null || vals.length == 0 ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Missing required parameter: "+param ); } return vals; }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
public static FacetRangeOther get(String label) { try { return valueOf(label.toUpperCase()); } catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); } }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
public static FacetDateOther get(String label) { try { return valueOf(label.toUpperCase()); } catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); } }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
public static FacetRangeInclude get(String label) { try { return valueOf(label.toUpperCase(Locale.ENGLISH)); } catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of for range 'include' information",e); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Integer getInt(String param) { String val = get(param); try { return val==null ? null : Integer.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public int getInt(String param, int def) { String val = get(param); try { return val==null ? def : Integer.parseInt(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Integer getFieldInt(String field, String param) { String val = getFieldParam(field, param); try { return val==null ? null : Integer.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public int getFieldInt(String field, String param, int def) { String val = getFieldParam(field, param); try { return val==null ? def : Integer.parseInt(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Float getFloat(String param) { String val = get(param); try { return val==null ? null : Float.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public float getFloat(String param, float def) { String val = get(param); try { return val==null ? def : Float.parseFloat(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Double getDouble(String param) { String val = get(param); try { return val==null ? null : Double.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public double getDouble(String param, double def) { String val = get(param); try { return val==null ? def : Double.parseDouble(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Float getFieldFloat(String field, String param) { String val = getFieldParam(field, param); try { return val==null ? null : Float.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public float getFieldFloat(String field, String param, float def) { String val = getFieldParam(field, param); try { return val==null ? def : Float.parseFloat(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public Double getFieldDouble(String field, String param) { String val = getFieldParam(field, param); try { return val==null ? null : Double.valueOf(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
public double getFieldDouble(String field, String param, double def) { String val = getFieldParam(field, param); try { return val==null ? def : Double.parseDouble(val); } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
@Override public NamedList<Object> processResponse(InputStream body, String encoding) { try { return (NamedList<Object>) new JavaBinCodec().unmarshal(body); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
@Override public NamedList<Object> processResponse(InputStream body, String encoding) {
  try {
    JavaBinCodec codec = new JavaBinCodec() {
      @Override public SolrDocument readSolrDocument(FastInputStream dis) throws IOException {
        SolrDocument doc = super.readSolrDocument(dis);
        callback.streamSolrDocument( doc );
        return null;
      }
      @Override public SolrDocumentList readSolrDocumentList(FastInputStream dis) throws IOException {
        SolrDocumentList solrDocs = new SolrDocumentList();
        List list = (List) readVal(dis);
        solrDocs.setNumFound((Long) list.get(0));
        solrDocs.setStart((Long) list.get(1));
        solrDocs.setMaxScore((Float) list.get(2));
        callback.streamDocListInfo( solrDocs.getNumFound(), solrDocs.getStart(), solrDocs.getMaxScore() );
        // Read the Array
        tagByte = dis.readByte();
        if( (tagByte >>> 5) != (ARR >>> 5) ) {
          throw new RuntimeException( "doclist must have an array" );
        }
        int sz = readSize(dis);
        for (int i = 0; i < sz; i++) {
          // must be a SolrDocument
          readVal( dis );
        }
        return solrDocs;
      }
    };
    return (NamedList<Object>) codec.unmarshal(body);
  } catch (IOException e) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e);
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public NamedList<Object> request(final SolrRequest request, final ResponseParser processor) throws SolrServerException, IOException { HttpRequestBase method = null; InputStream is = null; SolrParams params = request.getParams(); Collection<ContentStream> streams = requestWriter.getContentStreams(request); String path = requestWriter.getPath(request); if (path == null || !path.startsWith("/")) { path = DEFAULT_PATH; } ResponseParser parser = request.getResponseParser(); if (parser == null) { parser = this.parser; } // The parser 'wt=' and 'version=' params are used instead of the original // params ModifiableSolrParams wparams = new ModifiableSolrParams(params); wparams.set(CommonParams.WT, parser.getWriterType()); wparams.set(CommonParams.VERSION, parser.getVersion()); if (invariantParams != null) { wparams.add(invariantParams); } params = wparams; int tries = maxRetries + 1; try { while( tries-- > 0 ) { // Note: since we aren't do intermittent time keeping // ourselves, the potential non-timeout latency could be as // much as tries-times (plus scheduling effects) the given // timeAllowed. try { if( SolrRequest.METHOD.GET == request.getMethod() ) { if( streams != null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "GET can't send streams!" ); } method = new HttpGet( baseUrl + path + ClientUtils.toQueryString( params, false ) ); } else if( SolrRequest.METHOD.POST == request.getMethod() ) { String url = baseUrl + path; boolean isMultipart = ( streams != null && streams.size() > 1 ); LinkedList<NameValuePair> postParams = new LinkedList<NameValuePair>(); if (streams == null || isMultipart) { HttpPost post = new HttpPost(url); post.setHeader("Content-Charset", "UTF-8"); if (!this.useMultiPartPost && !isMultipart) { post.addHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8"); } List<FormBodyPart> parts = new LinkedList<FormBodyPart>(); Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String p = iter.next(); String[] vals = params.getParams(p); if (vals != null) { for (String v : vals) { if (this.useMultiPartPost || isMultipart) { parts.add(new FormBodyPart(p, new StringBody(v, Charset.forName("UTF-8")))); } else { postParams.add(new BasicNameValuePair(p, v)); } } } } if (isMultipart) { for (ContentStream content : streams) { String contentType = content.getContentType(); if(contentType==null) { contentType = "application/octet-stream"; // default } parts.add(new FormBodyPart(content.getName(), new InputStreamBody( content.getStream(), contentType, content.getName()))); } } if (parts.size() > 0) { MultipartEntity entity = new MultipartEntity(HttpMultipartMode.STRICT); for(FormBodyPart p: parts) { entity.addPart(p); } post.setEntity(entity); } else { //not using multipart post.setEntity(new UrlEncodedFormEntity(postParams, "UTF-8")); } method = post; } // It is has one stream, it is the post body, put the params in the URL else { String pstr = ClientUtils.toQueryString(params, false); HttpPost post = new HttpPost(url + pstr); // Single stream as body // Using a loop just to get the first one final ContentStream[] contentStream = new ContentStream[1]; for (ContentStream content : streams) { contentStream[0] = content; break; } if (contentStream[0] instanceof RequestWriter.LazyContentStream) { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() 
{ return false; } }); } else { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } method = post; } } else { throw new SolrServerException("Unsupported method: "+request.getMethod() ); } } catch( NoHttpResponseException r ) { method = null; if(is != null) { is.close(); } // If out of tries then just rethrow (as normal error). if (tries < 1) { throw r; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
@Override public NamedList<Object> processResponse(Reader in) { XMLStreamReader parser = null; try { parser = factory.createXMLStreamReader(in); } catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); } return processResponse(parser); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
@Override public NamedList<Object> processResponse(InputStream in, String encoding) { XMLStreamReader parser = null; try { parser = factory.createXMLStreamReader(in, encoding); } catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); } return processResponse(parser); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
private NamedList<Object> processResponse(XMLStreamReader parser) {
  try {
    NamedList<Object> response = null;
    for (int event = parser.next(); event != XMLStreamConstants.END_DOCUMENT; event = parser.next()) {
      switch (event) {
        case XMLStreamConstants.START_ELEMENT:
          if( response != null ) {
            throw new Exception( "already read the response!" );
          }
          // only top-level element is "response"
          String name = parser.getLocalName();
          if( name.equals( "response" ) || name.equals( "result" ) ) {
            response = readNamedList( parser );
          } else if( name.equals( "solr" ) ) {
            return new SimpleOrderedMap<Object>();
          } else {
            throw new Exception( "really needs to be response or result. " + "not:"+parser.getLocalName() );
          }
          break;
      }
    }
    return response;
  } catch( Exception ex ) {
    throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex );
  } finally {
    try { parser.close(); } catch( Exception ex ){}
  }
}
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { Parser parser = null; String streamType = req.getParams().get(ExtractingParams.STREAM_TYPE, null); if (streamType != null) { //Cache? Parsers are lightweight to construct and thread-safe, so I'm told MediaType mt = MediaType.parse(streamType.trim().toLowerCase(Locale.ENGLISH)); parser = new DefaultParser(config.getMediaTypeRegistry()).getParsers().get(mt); } else { parser = autoDetectParser; } if (parser != null) { Metadata metadata = new Metadata(); // If you specify the resource name (the filename, roughly) with this parameter, // then Tika can make use of it in guessing the appropriate MIME type: String resourceName = req.getParams().get(ExtractingParams.RESOURCE_NAME, null); if (resourceName != null) { metadata.add(TikaMetadataKeys.RESOURCE_NAME_KEY, resourceName); } // Provide stream's content type as hint for auto detection if(stream.getContentType() != null) { metadata.add(HttpHeaders.CONTENT_TYPE, stream.getContentType()); } InputStream inputStream = null; try { inputStream = stream.getStream(); metadata.add(ExtractingMetadataConstants.STREAM_NAME, stream.getName()); metadata.add(ExtractingMetadataConstants.STREAM_SOURCE_INFO, stream.getSourceInfo()); metadata.add(ExtractingMetadataConstants.STREAM_SIZE, String.valueOf(stream.getSize())); metadata.add(ExtractingMetadataConstants.STREAM_CONTENT_TYPE, stream.getContentType()); // HtmlParser and TXTParser regard Metadata.CONTENT_ENCODING in metadata String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); if(charset != null){ metadata.add(HttpHeaders.CONTENT_ENCODING, charset); } String xpathExpr = params.get(ExtractingParams.XPATH_EXPRESSION); boolean extractOnly = params.getBool(ExtractingParams.EXTRACT_ONLY, false); SolrContentHandler handler = factory.createSolrContentHandler(metadata, params, schema); ContentHandler parsingHandler = handler; StringWriter writer = null; BaseMarkupSerializer serializer = null; if (extractOnly == true) { String extractFormat = params.get(ExtractingParams.EXTRACT_FORMAT, "xml"); writer = new StringWriter(); if (extractFormat.equals(TEXT_FORMAT)) { serializer = new TextSerializer(); serializer.setOutputCharStream(writer); serializer.setOutputFormat(new OutputFormat("Text", "UTF-8", true)); } else { serializer = new XMLSerializer(writer, new OutputFormat("XML", "UTF-8", true)); } if (xpathExpr != null) { Matcher matcher = PARSER.parse(xpathExpr); serializer.startDocument();//The MatchingContentHandler does not invoke startDocument. See http://tika.markmail.org/message/kknu3hw7argwiqin parsingHandler = new MatchingContentHandler(serializer, matcher); } else { parsingHandler = serializer; } } else if (xpathExpr != null) { Matcher matcher = PARSER.parse(xpathExpr); parsingHandler = new MatchingContentHandler(handler, matcher); } //else leave it as is try{ //potentially use a wrapper handler for parsing, but we still need the SolrContentHandler for getting the document. ParseContext context = new ParseContext();//TODO: should we design a way to pass in parse context? parser.parse(inputStream, parsingHandler, metadata, context); } catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". 
metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } if (extractOnly == false) { addDoc(handler); } else { //serializer is not null, so we need to call endDoc on it if using xpath if (xpathExpr != null){ serializer.endDocument(); } rsp.add(stream.getName(), writer.toString()); writer.close(); String[] names = metadata.names(); NamedList metadataNL = new NamedList(); for (int i = 0; i < names.length; i++) { String[] vals = metadata.getValues(names[i]); metadataNL.add(names[i], vals); } rsp.add(stream.getName() + "_metadata", metadataNL); } } catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } finally { IOUtils.closeQuietly(inputStream); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Stream type of " + streamType + " didn't match any known parsers. Please supply the " + ExtractingParams.STREAM_TYPE + " parameter."); } }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
public void inform(SolrCore core) {
  if (initArgs != null) {
    //if relative,then relative to config dir, otherwise, absolute path
    String tikaConfigLoc = (String) initArgs.get(CONFIG_LOCATION);
    if (tikaConfigLoc != null) {
      File configFile = new File(tikaConfigLoc);
      if (configFile.isAbsolute() == false) {
        configFile = new File(core.getResourceLoader().getConfigDir(), configFile.getPath());
      }
      try {
        config = new TikaConfig(configFile);
      } catch (Exception e) {
        throw new SolrException(ErrorCode.SERVER_ERROR, e);
      }
    }
    NamedList configDateFormats = (NamedList) initArgs.get(DATE_FORMATS);
    if (configDateFormats != null && configDateFormats.size() > 0) {
      dateFormats = new HashSet<String>();
      Iterator<Map.Entry> it = configDateFormats.iterator();
      while (it.hasNext()) {
        String format = (String) it.next().getValue();
        log.info("Adding Date Format: " + format);
        dateFormats.add(format);
      }
    }
  }
  if (config == null) {
    try {
      config = getDefaultConfig(core.getResourceLoader().getClassLoader());
    } catch (MimeTypeException e) {
      throw new SolrException(ErrorCode.SERVER_ERROR, e);
    } catch (IOException e) {
      throw new SolrException(ErrorCode.SERVER_ERROR, e);
    }
  }
  factory = createFactory();
}
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException {
  String text = null;
  try {
    /* get Solr document */
    SolrInputDocument solrInputDocument = cmd.getSolrInputDocument();
    /* get the fields to analyze */
    String[] texts = getTextsToAnalyze(solrInputDocument);
    for (int i = 0; i < texts.length; i++) {
      text = texts[i];
      if (text != null && text.length()>0) {
        /* process the text value */
        JCas jcas = processText(text);
        UIMAToSolrMapper uimaToSolrMapper = new UIMAToSolrMapper(solrInputDocument, jcas);
        /* get field mapping from config */
        Map<String, Map<String, MapField>> typesAndFeaturesFieldsMap = solrUIMAConfiguration.getTypesFeaturesFieldsMapping();
        /* map type features on fields */
        for (String typeFQN : typesAndFeaturesFieldsMap.keySet()) {
          uimaToSolrMapper.map(typeFQN, typesAndFeaturesFieldsMap.get(typeFQN));
        }
      }
    }
  } catch (Exception e) {
    String logField = solrUIMAConfiguration.getLogField();
    if(logField == null){
      SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField();
      if(uniqueKeyField != null){
        logField = uniqueKeyField.getName();
      }
    }
    String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=").append((String)cmd.getSolrInputDocument().getField(logField).getValue()).append(", ").toString();
    int len = Math.min(text.length(), 100);
    if (solrUIMAConfiguration.isIgnoreErrors()) {
      log.warn(new StringBuilder("skip the text processing due to ").append(e.getLocalizedMessage()).append(optionalFieldInfo).append(" text=\"").append(text.substring(0, len)).append("...\"").toString());
    } else {
      throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ").append(e.getLocalizedMessage()).append(optionalFieldInfo).append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e);
    }
  }
  super.processAdd(cmd);
}
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
@Override public Object cluster(Query query, SolrDocumentList solrDocList, Map<SolrDocument, Integer> docIds, SolrQueryRequest sreq) { try { // Prepare attributes for Carrot2 clustering call Map<String, Object> attributes = new HashMap<String, Object>(); List<Document> documents = getDocuments(solrDocList, docIds, query, sreq); attributes.put(AttributeNames.DOCUMENTS, documents); attributes.put(AttributeNames.QUERY, query.toString()); // Pass the fields on which clustering runs to the // SolrStopwordsCarrot2LexicalDataFactory attributes.put("solrFieldNames", getFieldsForClustering(sreq)); // Pass extra overriding attributes from the request, if any extractCarrotAttributes(sreq.getParams(), attributes); // Perform clustering and convert to named list // Carrot2 uses current thread's context class loader to get // certain classes (e.g. custom tokenizer/stemmer) at runtime. // To make sure classes from contrib JARs are available, // we swap the context class loader for the time of clustering. Thread ct = Thread.currentThread(); ClassLoader prev = ct.getContextClassLoader(); try { ct.setContextClassLoader(core.getResourceLoader().getClassLoader()); return clustersToNamedList(controller.process(attributes, clusteringAlgorithmClass).getClusters(), sreq.getParams()); } finally { ct.setContextClassLoader(prev); } } catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); } }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
public String init(NamedList config, final SolrCore core) { this.core = core; String result = super.init(config, core); final SolrParams initParams = SolrParams.toSolrParams(config); // Initialize Carrot2 controller. Pass initialization attributes, if any. HashMap<String, Object> initAttributes = new HashMap<String, Object>(); extractCarrotAttributes(initParams, initAttributes); // Customize the stemmer and tokenizer factories. The implementations we provide here // are included in the code base of Solr, so that it's possible to refactor // the Lucene APIs the factories rely on if needed. // Additionally, we set a custom lexical resource factory for Carrot2 that // will use both Carrot2 default stop words as well as stop words from // the StopFilter defined on the field. final AttributeBuilder attributeBuilder = BasicPreprocessingPipelineDescriptor.attributeBuilder(initAttributes); attributeBuilder.lexicalDataFactory(SolrStopwordsCarrot2LexicalDataFactory.class); if (!initAttributes.containsKey(BasicPreprocessingPipelineDescriptor.Keys.TOKENIZER_FACTORY)) { attributeBuilder.tokenizerFactory(LuceneCarrot2TokenizerFactory.class); } if (!initAttributes.containsKey(BasicPreprocessingPipelineDescriptor.Keys.STEMMER_FACTORY)) { attributeBuilder.stemmerFactory(LuceneCarrot2StemmerFactory.class); } // Pass the schema to SolrStopwordsCarrot2LexicalDataFactory. initAttributes.put("solrIndexSchema", core.getSchema()); // Customize Carrot2's resource lookup to first look for resources // using Solr's resource loader. If that fails, try loading from the classpath. DefaultLexicalDataFactoryDescriptor.attributeBuilder(initAttributes).resourceLookup( new ResourceLookup( // Solr-specific resource loading. new SolrResourceLocator(core, initParams), // Using the class loader directly because this time we want to omit the prefix new ClassLoaderLocator(core.getResourceLoader().getClassLoader()))); // Carrot2 uses current thread's context class loader to get // certain classes (e.g. custom tokenizer/stemmer) at initialization time. // To make sure classes from contrib JARs are available, // we swap the context class loader for the time of clustering. Thread ct = Thread.currentThread(); ClassLoader prev = ct.getContextClassLoader(); try { ct.setContextClassLoader(core.getResourceLoader().getClassLoader()); this.controller.init(initAttributes); } finally { ct.setContextClassLoader(prev); } SchemaField uniqueField = core.getSchema().getUniqueKeyField(); if (uniqueField == null) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, CarrotClusteringEngine.class.getSimpleName() + " requires the schema to have a uniqueKeyField"); } this.idFieldName = uniqueField.getName(); // Make sure the requested Carrot2 clustering algorithm class is available String carrotAlgorithmClassName = initParams.get(CarrotParams.ALGORITHM); this.clusteringAlgorithmClass = core.getResourceLoader().findClass(carrotAlgorithmClassName, IClusteringAlgorithm.class); return result; }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
private Set<String> getFieldsForClustering(SolrQueryRequest sreq) { SolrParams solrParams = sreq.getParams(); String titleFieldSpec = solrParams.get(CarrotParams.TITLE_FIELD_NAME, "title"); String snippetFieldSpec = solrParams.get(CarrotParams.SNIPPET_FIELD_NAME, titleFieldSpec); if (StringUtils.isBlank(snippetFieldSpec)) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, CarrotParams.SNIPPET_FIELD_NAME + " must not be blank."); } final Set<String> fields = Sets.newHashSet(); fields.addAll(Arrays.asList(titleFieldSpec.split("[, ]"))); fields.addAll(Arrays.asList(snippetFieldSpec.split("[, ]"))); return fields; }
// in contrib/langid/src/java/org/apache/solr/update/processor/LanguageIdentifierUpdateProcessor.java
private void initParams(SolrParams params) { if (params != null) { // Document-centric langId params setEnabled(params.getBool(LANGUAGE_ID, true)); if(params.get(FIELDS_PARAM, "").length() > 0) { inputFields = params.get(FIELDS_PARAM, "").split(","); } langField = params.get(LANG_FIELD, DOCID_LANGFIELD_DEFAULT); langsField = params.get(LANGS_FIELD, DOCID_LANGSFIELD_DEFAULT); docIdField = params.get(DOCID_PARAM, DOCID_FIELD_DEFAULT); fallbackValue = params.get(FALLBACK); if(params.get(FALLBACK_FIELDS, "").length() > 0) { fallbackFields = params.get(FALLBACK_FIELDS).split(","); } overwrite = params.getBool(OVERWRITE, false); langWhitelist = new HashSet<String>(); threshold = params.getDouble(THRESHOLD, DOCID_THRESHOLD_DEFAULT); if(params.get(LANG_WHITELIST, "").length() > 0) { for(String lang : params.get(LANG_WHITELIST, "").split(",")) { langWhitelist.add(lang); } } // Mapping params (field centric) enableMapping = params.getBool(MAP_ENABLE, false); if(params.get(MAP_FL, "").length() > 0) { mapFields = params.get(MAP_FL, "").split(","); } else { mapFields = inputFields; } mapKeepOrig = params.getBool(MAP_KEEP_ORIG, false); mapOverwrite = params.getBool(MAP_OVERWRITE, false); mapIndividual = params.getBool(MAP_INDIVIDUAL, false); // Process individual fields String[] mapIndividualFields = {}; if(params.get(MAP_INDIVIDUAL_FL, "").length() > 0) { mapIndividualFields = params.get(MAP_INDIVIDUAL_FL, "").split(","); } else { mapIndividualFields = mapFields; } mapIndividualFieldsSet = new HashSet<String>(Arrays.asList(mapIndividualFields)); // Compile a union of the lists of fields to map allMapFieldsSet = new HashSet<String>(Arrays.asList(mapFields)); if(Arrays.equals(mapFields, mapIndividualFields)) { allMapFieldsSet.addAll(mapIndividualFieldsSet); } // Language Code mapping lcMap = new HashMap<String,String>(); if(params.get(MAP_LCMAP) != null) { for(String mapping : params.get(MAP_LCMAP).split("[, ]")) { String[] keyVal = mapping.split(":"); if(keyVal.length == 2) { lcMap.put(keyVal[0], keyVal[1]); } else { log.error("Unsupported format for langid.map.lcmap: "+mapping+". Skipping this mapping."); } } } enforceSchema = params.getBool(ENFORCE_SCHEMA, true); mapPattern = Pattern.compile(params.get(MAP_PATTERN, MAP_PATTERN_DEFAULT)); mapReplaceStr = params.get(MAP_REPLACE, MAP_REPLACE_DEFAULT); } log.debug("LangId configured"); if (inputFields.length == 0) { throw new SolrException(ErrorCode.BAD_REQUEST, "Missing or faulty configuration of LanguageIdentifierUpdateProcessor. Input fields must be specified as a comma separated list"); } }
// in contrib/langid/src/java/org/apache/solr/update/processor/LanguageIdentifierUpdateProcessor.java
protected SolrInputDocument process(SolrInputDocument doc) { String docLang = null; HashSet<String> docLangs = new HashSet<String>(); String fallbackLang = getFallbackLang(doc, fallbackFields, fallbackValue); if(langField == null || !doc.containsKey(langField) || (doc.containsKey(langField) && overwrite)) { String allText = concatFields(doc, inputFields); List<DetectedLanguage> languagelist = detectLanguage(allText); docLang = resolveLanguage(languagelist, fallbackLang); docLangs.add(docLang); log.debug("Detected main document language from fields "+inputFields+": "+docLang); if(doc.containsKey(langField) && overwrite) { log.debug("Overwritten old value "+doc.getFieldValue(langField)); } if(langField != null && langField.length() != 0) { doc.setField(langField, docLang); } } else { // langField is set, we sanity check it against whitelist and fallback docLang = resolveLanguage((String) doc.getFieldValue(langField), fallbackLang); docLangs.add(docLang); log.debug("Field "+langField+" already contained value "+docLang+", not overwriting."); } if(enableMapping) { for (String fieldName : allMapFieldsSet) { if(doc.containsKey(fieldName)) { String fieldLang; if(mapIndividual && mapIndividualFieldsSet.contains(fieldName)) { String text = (String) doc.getFieldValue(fieldName); List<DetectedLanguage> languagelist = detectLanguage(text); fieldLang = resolveLanguage(languagelist, docLang); docLangs.add(fieldLang); log.debug("Mapping field "+fieldName+" using individually detected language "+fieldLang); } else { fieldLang = docLang; log.debug("Mapping field "+fieldName+" using document global language "+fieldLang); } String mappedOutputField = getMappedField(fieldName, fieldLang); if(enforceSchema && schema.getFieldOrNull(fieldName) == null) { log.warn("Unsuccessful field name mapping to {}, field {} does not exist, skipping mapping.", mappedOutputField, fieldName); mappedOutputField = fieldName; } if (mappedOutputField != null) { log.debug("Mapping field {} to {}", doc.getFieldValue(docIdField), fieldLang); SolrInputField inField = doc.getField(fieldName); doc.setField(mappedOutputField, inField.getValue(), inField.getBoost()); if(!mapKeepOrig) { log.debug("Removing old field {}", fieldName); doc.removeField(fieldName); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid output field mapping for " + fieldName + " field and language: " + fieldLang); } } else { log.warn("Document {} does not contain input field {}. Skipping this field.", doc.getFieldValue(docIdField), fieldName); } } } // Set the languages field to an array of all detected languages if(langsField != null && langsField.length() != 0) { doc.setField(langsField, docLangs.toArray()); } return doc; }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/ICUNormalizer2FilterFactory.java
@Override public void init(Map<String,String> args) { super.init(args); String name = args.get("name"); if (name == null) name = "nfkc_cf"; String mode = args.get("mode"); if (mode == null) mode = "compose"; if (mode.equals("compose")) normalizer = Normalizer2.getInstance(null, name, Normalizer2.Mode.COMPOSE); else if (mode.equals("decompose")) normalizer = Normalizer2.getInstance(null, name, Normalizer2.Mode.DECOMPOSE); else throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid mode: " + mode); String filter = args.get("filter"); if (filter != null) { UnicodeSet set = new UnicodeSet(filter); if (!set.isEmpty()) { set.freeze(); normalizer = new FilteredNormalizer2(normalizer, set); } } }
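The init above reads three optional args (name, defaulting to nfkc_cf; mode, either compose or decompose; and filter) and rejects any other mode with SERVER_ERROR. A minimal sketch of an args map this factory would accept; the concrete values are hypothetical:

    Map<String,String> args = new HashMap<String,String>();
    args.put("name", "nfc");         // optional, defaults to "nfkc_cf"
    args.put("mode", "decompose");   // "compose" or "decompose"; any other value
                                     // throws SolrException(SERVER_ERROR, "Invalid mode: ...")
    args.put("filter", "[a-zA-Z]");  // optional UnicodeSet; a non-empty set wraps the
                                     // normalizer in a FilteredNormalizer2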
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/StempelPolishStemFilterFactory.java
public void inform(ResourceLoader loader) { try { stemmer = StempelStemmer.load(loader.openResource(STEMTABLE)); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE); } }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/ICUTransformFilterFactory.java
@Override public void init(Map<String,String> args) { super.init(args); String id = args.get("id"); if (id == null) { throw new SolrException(ErrorCode.SERVER_ERROR, "id is required."); } int dir; String direction = args.get("direction"); if (direction == null || direction.equalsIgnoreCase("forward")) dir = Transliterator.FORWARD; else if (direction.equalsIgnoreCase("reverse")) dir = Transliterator.REVERSE; else throw new SolrException(ErrorCode.SERVER_ERROR, "invalid direction: " + direction); transliterator = Transliterator.getInstance(id, dir); }
// in contrib/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java
private void setup(ResourceLoader loader, Map<String,String> args) { String custom = args.remove("custom"); String localeID = args.remove("locale"); String strength = args.remove("strength"); String decomposition = args.remove("decomposition"); String alternate = args.remove("alternate"); String caseLevel = args.remove("caseLevel"); String caseFirst = args.remove("caseFirst"); String numeric = args.remove("numeric"); String variableTop = args.remove("variableTop"); if (custom == null && localeID == null) throw new SolrException(ErrorCode.SERVER_ERROR, "Either custom or locale is required."); if (custom != null && localeID != null) throw new SolrException(ErrorCode.SERVER_ERROR, "Cannot specify both locale and custom. " + "To tailor rules for a built-in language, see the javadocs for RuleBasedCollator. " + "Then save the entire customized ruleset to a file, and use with the custom parameter"); final Collator collator; if (localeID != null) { // create from a system collator, based on Locale. collator = createFromLocale(localeID); } else { // create from a custom ruleset collator = createFromRules(custom, loader); } // set the strength flag, otherwise it will be the default. if (strength != null) { if (strength.equalsIgnoreCase("primary")) collator.setStrength(Collator.PRIMARY); else if (strength.equalsIgnoreCase("secondary")) collator.setStrength(Collator.SECONDARY); else if (strength.equalsIgnoreCase("tertiary")) collator.setStrength(Collator.TERTIARY); else if (strength.equalsIgnoreCase("quaternary")) collator.setStrength(Collator.QUATERNARY); else if (strength.equalsIgnoreCase("identical")) collator.setStrength(Collator.IDENTICAL); else throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid strength: " + strength); } // set the decomposition flag, otherwise it will be the default. if (decomposition != null) { if (decomposition.equalsIgnoreCase("no")) collator.setDecomposition(Collator.NO_DECOMPOSITION); else if (decomposition.equalsIgnoreCase("canonical")) collator.setDecomposition(Collator.CANONICAL_DECOMPOSITION); else throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid decomposition: " + decomposition); } // expert options: concrete subclasses are always a RuleBasedCollator RuleBasedCollator rbc = (RuleBasedCollator) collator; if (alternate != null) { if (alternate.equalsIgnoreCase("shifted")) { rbc.setAlternateHandlingShifted(true); } else if (alternate.equalsIgnoreCase("non-ignorable")) { rbc.setAlternateHandlingShifted(false); } else { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid alternate: " + alternate); } } if (caseLevel != null) { rbc.setCaseLevel(Boolean.parseBoolean(caseLevel)); } if (caseFirst != null) { if (caseFirst.equalsIgnoreCase("lower")) { rbc.setLowerCaseFirst(true); } else if (caseFirst.equalsIgnoreCase("upper")) { rbc.setUpperCaseFirst(true); } else { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid caseFirst: " + caseFirst); } } if (numeric != null) { rbc.setNumericCollation(Boolean.parseBoolean(numeric)); } if (variableTop != null) { rbc.setVariableTop(variableTop); } // we use 4.0 because it ensures we just encode the pure byte[] keys. analyzer = new ICUCollationKeyAnalyzer(Version.LUCENE_40, collator); }
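As a hedged illustration of the options consumed above: exactly one of locale or custom must be supplied, and each remaining option maps onto a Collator flag, with any unrecognized value throwing SERVER_ERROR. For example (the values are hypothetical):

    Map<String,String> args = new HashMap<String,String>();
    args.put("locale", "de");               // or "custom"; supplying both, or neither, throws
    args.put("strength", "primary");        // -> Collator.PRIMARY
    args.put("decomposition", "canonical"); // -> Collator.CANONICAL_DECOMPOSITION
    args.put("caseFirst", "lower");         // expert option on the RuleBasedCollator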
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
SolrInputDocument readDocument(XMLStreamReader reader, IndexSchema schema) throws XMLStreamException { SolrInputDocument doc = new SolrInputDocument(); String uniqueKeyField = schema.getUniqueKeyField().getName(); StringBuilder text = new StringBuilder(); String fieldName = null; boolean hasId = false; while (true) { int event = reader.next(); switch (event) { // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(reader.getText()); break; case XMLStreamConstants.END_ELEMENT: if ("doc".equals(reader.getLocalName())) { if (!hasId) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "All documents must contain a unique key value: '" + doc.toString() + "'"); } return doc; } else if ("field".equals(reader.getLocalName())) { doc.addField(fieldName, text.toString(), DEFAULT_BOOST); if (uniqueKeyField.equals(fieldName)) { hasId = true; } } break; case XMLStreamConstants.START_ELEMENT: text.setLength(0); String localName = reader.getLocalName(); if (!"field".equals(localName)) { log.warn("unexpected XML tag doc/" + localName); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag doc/" + localName); } for (int i = 0; i < reader.getAttributeCount(); i++) { String attrName = reader.getAttributeLocalName(i); if ("name".equals(attrName)) { fieldName = reader.getAttributeValue(i); } } break; } } }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
private ContentStream extractSingleContentStream(SolrQueryRequest req) { Iterable<ContentStream> streams = req.getContentStreams(); String exceptionMsg = "DocumentAnalysisRequestHandler expects a single content stream with documents to analyze"; if (streams == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, exceptionMsg); } Iterator<ContentStream> iter = streams.iterator(); if (!iter.hasNext()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, exceptionMsg); } ContentStream stream = iter.next(); if (iter.hasNext()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, exceptionMsg); } return stream; }
// in core/src/java/org/apache/solr/handler/UpdateRequestHandler.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { String type = req.getParams().get(UpdateParams.ASSUME_CONTENT_TYPE); if(type == null) { type = stream.getContentType(); } if( type == null ) { // Normal requests will not get here. throw new SolrException(ErrorCode.BAD_REQUEST, "Missing ContentType"); } int idx = type.indexOf(';'); if(idx>0) { type = type.substring(0,idx); } ContentStreamLoader loader = loaders.get(type); if(loader==null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unsupported ContentType: " +type+ " Not in: "+loaders.keySet()); } if(loader.getDefaultWT()!=null) { setDefaultWT(req,loader); } loader.load(req, rsp, stream, processor); }
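To make the dispatch above concrete, a hedged walk-through with hypothetical values:

    // 1. ASSUME_CONTENT_TYPE param absent, so type = stream.getContentType(),
    //    e.g. "application/xml; charset=UTF-8"
    // 2. everything from the first ';' onward is cut off, leaving "application/xml"
    // 3. loaders.get("application/xml") selects the ContentStreamLoader; a type with
    //    no registered loader throws SolrException(BAD_REQUEST, "Unsupported ContentType: ...")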
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
protected NamedList<? extends Object> analyzeValue(String value, AnalysisContext context) { Analyzer analyzer = context.getAnalyzer(); if (!TokenizerChain.class.isInstance(analyzer)) { TokenStream tokenStream = null; try { tokenStream = analyzer.tokenStream(context.getFieldName(), new StringReader(value)); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } NamedList<List<NamedList>> namedList = new NamedList<List<NamedList>>(); namedList.add(tokenStream.getClass().getName(), convertTokensToNamedLists(analyzeTokenStream(tokenStream), context)); return namedList; } TokenizerChain tokenizerChain = (TokenizerChain) analyzer; CharFilterFactory[] cfiltfacs = tokenizerChain.getCharFilterFactories(); TokenizerFactory tfac = tokenizerChain.getTokenizerFactory(); TokenFilterFactory[] filtfacs = tokenizerChain.getTokenFilterFactories(); NamedList<Object> namedList = new NamedList<Object>(); if( cfiltfacs != null ){ String source = value; for(CharFilterFactory cfiltfac : cfiltfacs ){ CharStream reader = CharReader.get(new StringReader(source)); reader = cfiltfac.create(reader); source = writeCharStream(namedList, reader); } } TokenStream tokenStream = tfac.create(tokenizerChain.initReader(new StringReader(value))); List<AttributeSource> tokens = analyzeTokenStream(tokenStream); namedList.add(tokenStream.getClass().getName(), convertTokensToNamedLists(tokens, context)); ListBasedTokenStream listBasedTokenStream = new ListBasedTokenStream(tokens); for (TokenFilterFactory tokenFilterFactory : filtfacs) { for (final AttributeSource tok : tokens) { tok.getAttribute(TokenTrackingAttribute.class).freezeStage(); } tokenStream = tokenFilterFactory.create(listBasedTokenStream); tokens = analyzeTokenStream(tokenStream); namedList.add(tokenStream.getClass().getName(), convertTokensToNamedLists(tokens, context)); listBasedTokenStream = new ListBasedTokenStream(tokens); } return namedList; }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
private String writeCharStream(NamedList<Object> out, CharStream input ){ final int BUFFER_SIZE = 1024; char[] buf = new char[BUFFER_SIZE]; int len = 0; StringBuilder sb = new StringBuilder(); do { try { len = input.read( buf, 0, BUFFER_SIZE ); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } if( len > 0 ) sb.append(buf, 0, len); } while( len == BUFFER_SIZE ); out.add( input.getClass().getName(), sb.toString()); return sb.toString(); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private NamedList<?> getNamedListResponse(HttpPost method) throws IOException { InputStream input = null; NamedList<?> result = null; try { HttpResponse response = myHttpClient.execute(method); int status = response.getStatusLine().getStatusCode(); if (status != HttpStatus.SC_OK) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "Request failed for the url " + method); } input = response.getEntity().getContent(); result = (NamedList<?>)new JavaBinCodec().unmarshal(input); } finally { try { if (input != null) { input.close(); } } catch (Exception e) { } } return result; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
boolean fetchLatestIndex(SolrCore core, boolean force) throws IOException, InterruptedException { successfulInstall = false; replicationStartTime = System.currentTimeMillis(); try { //get the current 'replicatable' index version in the master NamedList response = null; try { response = getLatestVersion(); } catch (Exception e) { LOG.error("Master at: " + masterUrl + " is not available. Index fetch failed. Exception: " + e.getMessage()); return false; } long latestVersion = (Long) response.get(CMD_INDEX_VERSION); long latestGeneration = (Long) response.get(GENERATION); IndexCommit commit; RefCounted<SolrIndexSearcher> searcherRefCounted = null; try { searcherRefCounted = core.getNewestSearcher(false); if (searcherRefCounted == null) { SolrException.log(LOG, "No open searcher found - fetch aborted"); return false; } commit = searcherRefCounted.get().getIndexReader().getIndexCommit(); } finally { if (searcherRefCounted != null) searcherRefCounted.decref(); } if (latestVersion == 0L) { if (force && commit.getGeneration() != 0) { // since we won't get the files for an empty index, // we just clear ours and commit core.getUpdateHandler().getSolrCoreState().getIndexWriter(core).deleteAll(); SolrQueryRequest req = new LocalSolrQueryRequest(core, new ModifiableSolrParams()); core.getUpdateHandler().commit(new CommitUpdateCommand(req, false)); } //there is nothing to be replicated successfulInstall = true; return true; } if (!force && IndexDeletionPolicyWrapper.getCommitTimestamp(commit) == latestVersion) { //master and slave are already in sync just return LOG.info("Slave in sync with master."); successfulInstall = true; return true; } LOG.info("Master's generation: " + latestGeneration); LOG.info("Slave's generation: " + commit.getGeneration()); LOG.info("Starting replication process"); // get the list of files first fetchFileList(latestGeneration); // this can happen if the commit point is deleted before we fetch the file list. 
if(filesToDownload.isEmpty()) return false; LOG.info("Number of files in latest index in master: " + filesToDownload.size()); // Create the sync service fsyncService = Executors.newSingleThreadExecutor(); // use a synchronized list because the list is read by other threads (to show details) filesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>()); // if the generation of master is older than that of the slave, it means they are not compatible to be copied // then a new index directory needs to be created and all the files copied boolean isFullCopyNeeded = IndexDeletionPolicyWrapper.getCommitTimestamp(commit) >= latestVersion || force; File tmpIndexDir = createTempindexDir(core); if (isIndexStale()) isFullCopyNeeded = true; successfulInstall = false; boolean deleteTmpIdxDir = true; File indexDir = null; try { indexDir = new File(core.getIndexDir()); downloadIndexFiles(isFullCopyNeeded, tmpIndexDir, latestGeneration); LOG.info("Total time taken for download : " + ((System.currentTimeMillis() - replicationStartTime) / 1000) + " secs"); Collection<Map<String, Object>> modifiedConfFiles = getModifiedConfFiles(confFilesToDownload); if (!modifiedConfFiles.isEmpty()) { downloadConfFiles(confFilesToDownload, latestGeneration); if (isFullCopyNeeded) { successfulInstall = modifyIndexProps(tmpIndexDir.getName()); deleteTmpIdxDir = false; } else { successfulInstall = copyIndexFiles(tmpIndexDir, indexDir); } if (successfulInstall) { LOG.info("Configuration files are modified, core will be reloaded"); logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall);//write to a file time of replication and conf files. reloadCore(); } } else { terminateAndWaitFsyncService(); if (isFullCopyNeeded) { successfulInstall = modifyIndexProps(tmpIndexDir.getName()); deleteTmpIdxDir = false; } else { successfulInstall = copyIndexFiles(tmpIndexDir, indexDir); } if (successfulInstall) { logReplicationTimeAndConfFiles(modifiedConfFiles, successfulInstall); doCommit(); } } replicationStartTime = 0; return successfulInstall; } catch (ReplicationHandlerException e) { LOG.error("User aborted Replication"); return false; } catch (SolrException e) { throw e; } catch (InterruptedException e) { throw new InterruptedException("Index fetch interrupted"); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); } finally { if (deleteTmpIdxDir) delTree(tmpIndexDir); else delTree(indexDir); } } finally { if (!successfulInstall) { logReplicationTimeAndConfFiles(null, successfulInstall); } filesToDownload = filesDownloaded = confFilesDownloaded = confFilesToDownload = null; replicationStartTime = 0; fileFetcher = null; if (fsyncService != null && !fsyncService.isShutdown()) fsyncService.shutdownNow(); fsyncService = null; stop = false; fsyncException = null; } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void downloadConfFiles(List<Map<String, Object>> confFilesToDownload, long latestGeneration) throws Exception { LOG.info("Starting download of configuration files from master: " + confFilesToDownload); confFilesDownloaded = Collections.synchronizedList(new ArrayList<Map<String, Object>>()); File tmpconfDir = new File(solrCore.getResourceLoader().getConfigDir(), "conf." + getDateAsStr(new Date())); try { boolean status = tmpconfDir.mkdirs(); if (!status) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to create temporary config folder: " + tmpconfDir.getName()); } for (Map<String, Object> file : confFilesToDownload) { String saveAs = (String) (file.get(ALIAS) == null ? file.get(NAME) : file.get(ALIAS)); fileFetcher = new FileFetcher(tmpconfDir, file, saveAs, true, latestGeneration); currentFile = file; fileFetcher.fetchFile(); confFilesDownloaded.add(new HashMap<String, Object>(file)); } // this is called before copying the files to the original conf dir // so that if there is an exception avoid corrupting the original files. terminateAndWaitFsyncService(); copyTmpConfFiles2Conf(tmpconfDir); } finally { delTree(tmpconfDir); } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void copyTmpConfFiles2Conf(File tmpconfDir) throws IOException { File confDir = new File(solrCore.getResourceLoader().getConfigDir()); for (File file : tmpconfDir.listFiles()) { File oldFile = new File(confDir, file.getName()); if (oldFile.exists()) { File backupFile = new File(confDir, oldFile.getName() + "." + getDateAsStr(new Date(oldFile.lastModified()))); boolean status = oldFile.renameTo(backupFile); if (!status) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to rename: " + oldFile + " to: " + backupFile); } } boolean status = file.renameTo(oldFile); if (!status) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to rename: " + file + " to: " + oldFile); } } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private boolean modifyIndexProps(String tmpIdxDirName) { LOG.info("New index installed. Updating index properties..."); File idxprops = new File(solrCore.getDataDir() + "index.properties"); Properties p = new Properties(); if (idxprops.exists()) { InputStream is = null; try { is = new FileInputStream(idxprops); p.load(is); } catch (Exception e) { LOG.error("Unable to load index.properties"); } finally { IOUtils.closeQuietly(is); } } p.put("index", tmpIdxDirName); FileOutputStream os = null; try { os = new FileOutputStream(idxprops); p.store(os, "index properties"); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write index.properties", e); } finally { IOUtils.closeQuietly(os); } return true; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private int fetchPackets(FastInputStream fis) throws Exception { byte[] intbytes = new byte[4]; byte[] longbytes = new byte[8]; try { while (true) { if (stop) { stop = false; aborted = true; throw new ReplicationHandlerException("User aborted replication"); } long checkSumServer = -1; fis.readFully(intbytes); //read the size of the packet int packetSize = readInt(intbytes); if (packetSize <= 0) { LOG.warn("No content received for file: " + currentFile); return NO_CONTENT; } if (buf.length < packetSize) buf = new byte[packetSize]; if (checksum != null) { //read the checksum fis.readFully(longbytes); checkSumServer = readLong(longbytes); } //then read the packet of bytes fis.readFully(buf, 0, packetSize); //compare the checksum as sent from the master if (includeChecksum) { checksum.reset(); checksum.update(buf, 0, packetSize); long checkSumClient = checksum.getValue(); if (checkSumClient != checkSumServer) { LOG.error("Checksum not matched between client and server for: " + currentFile); //if checksum is wrong it is a problem return for retry return 1; } } //if everything is fine, write down the packet to the file fileChannel.write(ByteBuffer.wrap(buf, 0, packetSize)); bytesDownloaded += packetSize; if (bytesDownloaded >= size) return 0; //errorcount is always set to zero after a successful packet errorCount = 0; } } catch (ReplicationHandlerException e) { throw e; } catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure, increment the error count errorCount++; //if it fails for the same packet for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
private void cleanup() { try { //close the FileOutputStream (which also closes the Channel) fileOutputStream.close(); } catch (Exception e) {/* noop */ LOG.error("Error closing the file stream: "+ this.saveAs ,e); } if (bytesDownloaded != size) { //if the download is not complete then //delete the file being downloaded try { file.delete(); } catch (Exception e) { LOG.error("Error deleting file in cleanup" + e.getMessage()); } //if the failure is due to a user abort it is returned normally else an exception is thrown if (!aborted) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to download " + fileName + " completely. Downloaded " + bytesDownloaded + "!=" + size); } }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
static Integer readInterval(String interval) { if (interval == null) return null; int result = 0; Matcher m = INTERVAL_PATTERN.matcher(interval.trim()); if (m.find()) { String hr = m.group(1); String min = m.group(2); String sec = m.group(3); try { if (sec != null && sec.length() > 0) result += Integer.parseInt(sec); if (min != null && min.length() > 0) result += (60 * Integer.parseInt(min)); if (hr != null && hr.length() > 0) result += (60 * 60 * Integer.parseInt(hr)); result *= 1000; } catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); } } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); } return result; }
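Read together with INTERVAL_PATTERN (not shown here), readInterval converts an hh:mm:ss-style string into milliseconds. A small usage sketch, assuming the three regex groups are hours, minutes and seconds:

    Integer ms = readInterval("00:00:10");   // 10 * 1000 = 10000
    Integer ms2 = readInterval("01:30:00");  // (1*3600 + 30*60) * 1000 = 5400000
    Integer none = readInterval(null);       // null in, null out, no exception
    // a string the pattern cannot match, or a non-numeric group, throws
    // SolrException(ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG)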
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws Exception { final String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); InputStream is = null; XMLStreamReader parser = null; String tr = req.getParams().get(CommonParams.TR,null); if(tr!=null) { Transformer t = getTransformer(tr,req); final DOMResult result = new DOMResult(); // first step: read XML and build DOM using Transformer (this is no overhead, as XSL always produces // an internal result DOM tree, we just access it directly as input for StAX): try { is = stream.getStream(); final InputSource isrc = new InputSource(is); isrc.setEncoding(charset); final SAXSource source = new SAXSource(isrc); t.transform(source, result); } catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); } finally { IOUtils.closeQuietly(is); } // second step: feed the intermediate DOM tree into StAX parser: try { parser = inputFactory.createXMLStreamReader(new DOMSource(result.getNode())); this.processUpdate(req, processor, parser); } catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); } finally { if (parser != null) parser.close(); } } // Normal XML Loader else { try { is = stream.getStream(); if (UpdateRequestHandler.log.isTraceEnabled()) { final byte[] body = IOUtils.toByteArray(is); // TODO: The charset may be wrong, as the real charset is later // determined by the XML parser, the content-type is only used as a hint! UpdateRequestHandler.log.trace("body", new String(body, (charset == null) ? ContentStreamBase.DEFAULT_CHARSET : charset)); IOUtils.closeQuietly(is); is = new ByteArrayInputStream(body); } parser = (charset == null) ? inputFactory.createXMLStreamReader(is) : inputFactory.createXMLStreamReader(is, charset); this.processUpdate(req, processor, parser); } catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); } finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processUpdate(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException, FactoryConfigurationError, InstantiationException, IllegalAccessException, TransformerConfigurationException { AddUpdateCommand addCmd = null; SolrParams params = req.getParams(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.END_DOCUMENT: parser.close(); return; case XMLStreamConstants.START_ELEMENT: String currTag = parser.getLocalName(); if (currTag.equals(UpdateRequestHandler.ADD)) { log.trace("SolrCore.update(add)"); addCmd = new AddUpdateCommand(req); // First look for commitWithin parameter on the request, will be overwritten for individual <add>'s addCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); addCmd.overwrite = params.getBool(UpdateParams.OVERWRITE, true); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.OVERWRITE.equals(attrName)) { addCmd.overwrite = StrUtils.parseBoolean(attrVal); } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { addCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("Unknown attribute id in add:" + attrName); } } } else if ("doc".equals(currTag)) { if(addCmd != null) { log.trace("adding doc..."); addCmd.clear(); addCmd.solrDoc = readDoc(parser); processor.processAdd(addCmd); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unexpected <doc> tag without an <add> tag surrounding it."); } } else if (UpdateRequestHandler.COMMIT.equals(currTag) || UpdateRequestHandler.OPTIMIZE.equals(currTag)) { log.trace("parsing " + currTag); CommitUpdateCommand cmd = new CommitUpdateCommand(req, UpdateRequestHandler.OPTIMIZE.equals(currTag)); ModifiableSolrParams mp = new ModifiableSolrParams(); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); mp.set(attrName, attrVal); } RequestHandlerUtils.validateCommitParams(mp); SolrParams p = SolrParams.wrapDefaults(mp, req.getParams()); // default to the normal request params for commit options RequestHandlerUtils.updateCommit(cmd, p); processor.processCommit(cmd); } // end commit else if (UpdateRequestHandler.ROLLBACK.equals(currTag)) { log.trace("parsing " + currTag); RollbackUpdateCommand cmd = new RollbackUpdateCommand(req); processor.processRollback(cmd); } // end rollback else if (UpdateRequestHandler.DELETE.equals(currTag)) { log.trace("parsing delete"); processDelete(req, processor, parser); } // end delete break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processDelete(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException { // Parse the command DeleteUpdateCommand deleteCmd = new DeleteUpdateCommand(req); // First look for commitWithin parameter on the request, will be overwritten for individual <delete>'s SolrParams params = req.getParams(); deleteCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if ("fromPending".equals(attrName)) { // deprecated } else if ("fromCommitted".equals(attrName)) { // deprecated } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { deleteCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("unexpected attribute delete/@" + attrName); } } StringBuilder text = new StringBuilder(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.START_ELEMENT: String mode = parser.getLocalName(); if (!("id".equals(mode) || "query".equals(mode))) { log.warn("unexpected XML tag /delete/" + mode); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + mode); } text.setLength(0); if ("id".equals(mode)) { for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.VERSION.equals(attrName)) { deleteCmd.setVersion(Long.parseLong(attrVal)); } } } break; case XMLStreamConstants.END_ELEMENT: String currTag = parser.getLocalName(); if ("id".equals(currTag)) { deleteCmd.setId(text.toString()); } else if ("query".equals(currTag)) { deleteCmd.setQuery(text.toString()); } else if ("delete".equals(currTag)) { return; } else { log.warn("unexpected XML tag /delete/" + currTag); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + currTag); } processor.processDelete(deleteCmd); deleteCmd.clear(); break; // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(parser.getText()); break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
public SolrInputDocument readDoc(XMLStreamReader parser) throws XMLStreamException { SolrInputDocument doc = new SolrInputDocument(); String attrName = ""; for (int i = 0; i < parser.getAttributeCount(); i++) { attrName = parser.getAttributeLocalName(i); if ("boost".equals(attrName)) { doc.setDocumentBoost(Float.parseFloat(parser.getAttributeValue(i))); } else { log.warn("Unknown attribute doc/@" + attrName); } } StringBuilder text = new StringBuilder(); String name = null; float boost = 1.0f; boolean isNull = false; String update = null; while (true) { int event = parser.next(); switch (event) { // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(parser.getText()); break; case XMLStreamConstants.END_ELEMENT: if ("doc".equals(parser.getLocalName())) { return doc; } else if ("field".equals(parser.getLocalName())) { Object v = isNull ? null : text.toString(); if (update != null) { Map<String,Object> extendedValue = new HashMap<String,Object>(1); extendedValue.put(update, v); v = extendedValue; } doc.addField(name, v, boost); boost = 1.0f; } break; case XMLStreamConstants.START_ELEMENT: text.setLength(0); String localName = parser.getLocalName(); if (!"field".equals(localName)) { log.warn("unexpected XML tag doc/" + localName); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag doc/" + localName); } boost = 1.0f; update = null; String attrVal = ""; for (int i = 0; i < parser.getAttributeCount(); i++) { attrName = parser.getAttributeLocalName(i); attrVal = parser.getAttributeValue(i); if ("name".equals(attrName)) { name = attrVal; } else if ("boost".equals(attrName)) { boost = Float.parseFloat(attrVal); } else if ("null".equals(attrName)) { isNull = StrUtils.parseBoolean(attrVal); } else if ("update".equals(attrName)) { update = attrVal; } else { log.warn("Unknown attribute doc/field/@" + attrName); } } break; } } }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
@Override public void update(SolrInputDocument document, UpdateRequest updateRequest) { if (document == null) { // Perhaps commit from the parameters try { RequestHandlerUtils.handleCommit(req, processor, updateRequest.getParams(), false); RequestHandlerUtils.handleRollback(req, processor, updateRequest.getParams(), false); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback"); } return; } if (addCmd == null) { addCmd = getAddCommand(req, updateRequest.getParams()); } addCmd.solrDoc = document; try { processor.processAdd(addCmd); addCmd.clear(); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document); } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
DeleteUpdateCommand parseDelete() throws IOException { assertNextEvent( JSONParser.OBJECT_START ); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.commitWithin = commitWithin; while( true ) { int ev = parser.nextEvent(); if( ev == JSONParser.STRING ) { String key = parser.getString(); if( parser.wasKey() ) { if( "id".equals( key ) ) { cmd.setId(parser.getString()); } else if( "query".equals(key) ) { cmd.setQuery(parser.getString()); } else if( "commitWithin".equals(key) ) { cmd.commitWithin = Integer.parseInt(parser.getString()); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown key: "+key+" ["+parser.getPosition()+"]" ); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "invalid string: " + key +" at ["+parser.getPosition()+"]" ); } } else if( ev == JSONParser.OBJECT_END ) { if( cmd.getId() == null && cmd.getQuery() == null ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Missing id or query for delete ["+parser.getPosition()+"]" ); } return cmd; } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Got: "+JSONParser.getEventString( ev ) +" at ["+parser.getPosition()+"]" ); } } }
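The payload shapes parseDelete accepts can be read off the key checks above; a hedged summary of inputs it would accept or reject:

    // accepted: {"id":"doc1"}                       -> cmd.setId("doc1")
    // accepted: {"query":"type:old"}                -> cmd.setQuery("type:old")
    // accepted: {"id":"doc1","commitWithin":"5000"} -> also sets cmd.commitWithin
    // rejected: {}            -> BAD_REQUEST "Missing id or query for delete [...]"
    // rejected: {"foo":"bar"} -> BAD_REQUEST "Unknown key: foo [...]"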
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
AddUpdateCommand parseAdd() throws IOException { AddUpdateCommand cmd = new AddUpdateCommand(req); cmd.commitWithin = commitWithin; cmd.overwrite = overwrite; float boost = 1.0f; while( true ) { int ev = parser.nextEvent(); if( ev == JSONParser.STRING ) { if( parser.wasKey() ) { String key = parser.getString(); if( "doc".equals( key ) ) { if( cmd.solrDoc != null ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "multiple docs in same add command" ); } ev = assertNextEvent( JSONParser.OBJECT_START ); cmd.solrDoc = parseDoc( ev ); } else if( UpdateRequestHandler.OVERWRITE.equals( key ) ) { cmd.overwrite = parser.getBoolean(); // reads next boolean } else if( UpdateRequestHandler.COMMIT_WITHIN.equals( key ) ) { cmd.commitWithin = (int)parser.getLong(); } else if( "boost".equals( key ) ) { boost = Float.parseFloat( parser.getNumberChars().toString() ); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown key: "+key+" ["+parser.getPosition()+"]" ); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Should be a key " +" at ["+parser.getPosition()+"]" ); } } else if( ev == JSONParser.OBJECT_END ) { if( cmd.solrDoc == null ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"missing solr document. "+parser.getPosition() ); } cmd.solrDoc.setDocumentBoost( boost ); return cmd; } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Got: "+JSONParser.getEventString( ev ) +" at ["+parser.getPosition()+"]" ); } } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
void assertEvent(int ev, int expected) { if( ev != expected ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Expected: "+JSONParser.getEventString( expected ) +" but got "+JSONParser.getEventString( ev ) +" at ["+parser.getPosition()+"]" ); } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private void parseExtendedFieldValue(SolrInputField sif, int ev) throws IOException { assert ev == JSONParser.OBJECT_START; float boost = 1.0f; Object normalFieldValue = null; Map<String, Object> extendedInfo = null; for (;;) { ev = parser.nextEvent(); switch (ev) { case JSONParser.STRING: String label = parser.getString(); if ("boost".equals(label)) { ev = parser.nextEvent(); if( ev != JSONParser.NUMBER && ev != JSONParser.LONG && ev != JSONParser.BIGNUMBER ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "boost should have number! "+JSONParser.getEventString(ev) ); } boost = (float)parser.getDouble(); } else if ("value".equals(label)) { normalFieldValue = parseNormalFieldValue(parser.nextEvent()); } else { // If we encounter other unknown map keys, then use a map if (extendedInfo == null) { extendedInfo = new HashMap<String, Object>(2); } // for now, the only extended info will be field values // we could either store this as an Object or a SolrInputField Object val = parseNormalFieldValue(parser.nextEvent()); extendedInfo.put(label, val); } break; case JSONParser.OBJECT_END: if (extendedInfo != null) { if (normalFieldValue != null) { extendedInfo.put("value",normalFieldValue); } sif.setValue(extendedInfo, boost); } else { sif.setValue(normalFieldValue, boost); } return; default: throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing JSON extended field value. Unexpected "+JSONParser.getEventString(ev) ); } } }
// in core/src/java/org/apache/solr/handler/loader/JsonLoader.java
private Object parseSingleFieldValue(int ev) throws IOException { switch (ev) { case JSONParser.STRING: return parser.getString(); case JSONParser.LONG: case JSONParser.NUMBER: case JSONParser.BIGNUMBER: return parser.getNumberChars().toString(); case JSONParser.BOOLEAN: return Boolean.toString(parser.getBoolean()); // for legacy reasons, single values are expected to be strings case JSONParser.NULL: parser.getNull(); return null; case JSONParser.ARRAY_START: return parseArrayFieldValue(ev); default: throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing JSON field value. Unexpected "+JSONParser.getEventString(ev) ); } }
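Summarizing the coercions above (numbers and booleans are deliberately returned as strings, as the comment notes):

    // JSON string           -> the String itself
    // number/long/bignumber -> the numeric literal as a String
    // boolean               -> "true" / "false"
    // null                  -> null (after consuming the event)
    // array start           -> delegated to parseArrayFieldValue(ev)
    // anything else         -> SolrException(BAD_REQUEST, "Error parsing JSON field value. ...")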
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
@Override void add(SolrInputDocument doc, int line, int column, String val) { CSVParser parser = new CSVParser(new StringReader(val), strategy); try { String[] vals = parser.getLine(); if (vals!=null) { for (String v: vals) base.add(doc,line,column,v); } else { base.add(doc,line,column,val); } } catch (IOException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,e); } }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
void prepareFields() { // Possible future optimization: for really rapid incremental indexing // from a POST, one could cache all of this setup info based on the params. // The link from FieldAdder to this would need to be severed for that to happen. fields = new SchemaField[fieldnames.length]; adders = new CSVLoaderBase.FieldAdder[fieldnames.length]; String skipStr = params.get(SKIP); List<String> skipFields = skipStr==null ? null : StrUtils.splitSmart(skipStr,','); CSVLoaderBase.FieldAdder adder = new CSVLoaderBase.FieldAdder(); CSVLoaderBase.FieldAdder adderKeepEmpty = new CSVLoaderBase.FieldAdderEmpty(); for (int i=0; i<fields.length; i++) { String fname = fieldnames[i]; // to skip a field, leave the entries in fields and addrs null if (fname.length()==0 || (skipFields!=null && skipFields.contains(fname))) continue; fields[i] = schema.getField(fname); boolean keepEmpty = params.getFieldBool(fname,EMPTY,false); adders[i] = keepEmpty ? adderKeepEmpty : adder; // Order that operations are applied: split -> trim -> map -> add // so create in reverse order. // Creation of FieldAdders could be optimized and shared among fields String[] fmap = params.getFieldParams(fname,MAP); if (fmap!=null) { for (String mapRule : fmap) { String[] mapArgs = colonSplit.split(mapRule,-1); if (mapArgs.length!=2) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Map rules must be of the form 'from:to' ,got '"+mapRule+"'"); adders[i] = new CSVLoaderBase.FieldMapperSingle(mapArgs[0], mapArgs[1], adders[i]); } } if (params.getFieldBool(fname,TRIM,false)) { adders[i] = new CSVLoaderBase.FieldTrimmer(adders[i]); } if (params.getFieldBool(fname,SPLIT,false)) { String sepStr = params.getFieldParam(fname,SEPARATOR); char fsep = sepStr==null || sepStr.length()==0 ? ',' : sepStr.charAt(0); String encStr = params.getFieldParam(fname,ENCAPSULATOR); char fenc = encStr==null || encStr.length()==0 ? (char)-2 : encStr.charAt(0); String escStr = params.getFieldParam(fname,ESCAPE); char fesc = escStr==null || escStr.length()==0 ? CSVStrategy.ESCAPE_DISABLED : escStr.charAt(0); CSVStrategy fstrat = new CSVStrategy(fsep,fenc,CSVStrategy.COMMENTS_DISABLED,fesc, false, false, false, false); adders[i] = new CSVLoaderBase.FieldSplitter(fstrat, adders[i]); } } // look for any literal fields - literal.foo=xyzzy Iterator<String> paramNames = params.getParameterNamesIterator(); while (paramNames.hasNext()) { String pname = paramNames.next(); if (!pname.startsWith(LITERALS_PREFIX)) continue; String name = pname.substring(LITERALS_PREFIX.length()); SchemaField sf = schema.getFieldOrNull(name); if(sf == null) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid field name for literal:'"+ name +"'"); literals.put(sf, params.get(pname)); } }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
private void input_err(String msg, String[] line, int lineno) { StringBuilder sb = new StringBuilder(); sb.append(errHeader).append(", line=").append(lineno).append(",").append(msg).append("\n\tvalues={"); for (String val: line) { sb.append("'").append(val).append("',"); } sb.append('}'); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,sb.toString()); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
private void input_err(String msg, String[] lines, int lineNo, Throwable e) { StringBuilder sb = new StringBuilder(); sb.append(errHeader).append(", line=").append(lineNo).append(",").append(msg).append("\n\tvalues={"); if (lines != null) { for (String val : lines) { sb.append("'").append(val).append("',"); } } else { sb.append("NO LINES AVAILABLE"); } sb.append('}'); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,sb.toString(), e); }
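Both input_err overloads build the same message shape before throwing BAD_REQUEST; a hypothetical example of the resulting text, with errHeader as set by the load method below:

    // input_err("expected 2 values but got 3", new String[]{"a","b","c"}, 7)
    // throws SolrException(BAD_REQUEST) with a message like:
    //   CSVLoader: input=<source>, line=7,expected 2 values but got 3
    //       values={'a','b','c',}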
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
@Override public void load(SolrQueryRequest req, SolrQueryResponse rsp, ContentStream stream, UpdateRequestProcessor processor) throws IOException { errHeader = "CSVLoader: input=" + stream.getSourceInfo(); Reader reader = null; try { reader = stream.getReader(); if (skipLines>0) { if (!(reader instanceof BufferedReader)) { reader = new BufferedReader(reader); } BufferedReader r = (BufferedReader)reader; for (int i=0; i<skipLines; i++) { r.readLine(); } } CSVParser parser = new CSVParser(reader, strategy); // parse the fieldnames from the header of the file if (fieldnames==null) { fieldnames = parser.getLine(); if (fieldnames==null) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Expected fieldnames in CSV input"); } prepareFields(); } // read the rest of the CSV file for(;;) { int line = parser.getLineNumber(); // for error reporting in MT mode String[] vals = null; try { vals = parser.getLine(); } catch (IOException e) { //Catch the exception and rethrow it with more line information input_err("can't read line: " + line, null, line, e); } if (vals==null) break; if (vals.length != fields.length) { input_err("expected "+fields.length+" values but got "+vals.length, vals, line); } addDoc(line,vals); } } finally{ if (reader != null) { IOUtils.closeQuietly(reader); } } }
// in core/src/java/org/apache/solr/handler/ReplicationHandler.java
private void doSnapShoot(SolrParams params, SolrQueryResponse rsp, SolrQueryRequest req) { try { int numberToKeep = params.getInt(NUMBER_BACKUPS_TO_KEEP_REQUEST_PARAM, 0); if (numberToKeep > 0 && numberBackupsToKeep > 0) { throw new SolrException(ErrorCode.BAD_REQUEST, "Cannot use " + NUMBER_BACKUPS_TO_KEEP_REQUEST_PARAM + " if " + NUMBER_BACKUPS_TO_KEEP_INIT_PARAM + " was specified in the configuration."); } numberToKeep = Math.max(numberToKeep, numberBackupsToKeep); if (numberToKeep < 1) { numberToKeep = Integer.MAX_VALUE; } IndexDeletionPolicyWrapper delPolicy = core.getDeletionPolicy(); IndexCommit indexCommit = delPolicy.getLatestCommit(); if (indexCommit == null) { indexCommit = req.getSearcher().getIndexReader().getIndexCommit(); } // small race here before the commit point is saved new SnapShooter(core, params.get("location")).createSnapAsync( indexCommit, numberToKeep, this); } catch (Exception e) { LOG.warn("Exception during creating a snapshot", e); rsp.add("exception", e); } }
// in core/src/java/org/apache/solr/handler/RequestHandlerUtils.java
public static void validateCommitParams(SolrParams params) { Iterator<String> i = params.getParameterNamesIterator(); while (i.hasNext()) { String key = i.next(); if (!commitParams.contains(key)) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown commit parameter '" + key + "'"); } } }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); // Set field flags ReturnFields returnFields = new ReturnFields( req ); rsp.setReturnFields( returnFields ); int flags = 0; if (returnFields.wantsScore()) { flags |= SolrIndexSearcher.GET_SCORES; } String defType = params.get(QueryParsing.DEFTYPE, QParserPlugin.DEFAULT_QTYPE); String q = params.get( CommonParams.Q ); Query query = null; SortSpec sortSpec = null; List<Query> filters = null; try { if (q != null) { QParser parser = QParser.getParser(q, defType, req); query = parser.getQuery(); sortSpec = parser.getSort(true); } String[] fqs = req.getParams().getParams(CommonParams.FQ); if (fqs!=null && fqs.length!=0) { filters = new ArrayList<Query>(); for (String fq : fqs) { if (fq != null && fq.trim().length()!=0) { QParser fqp = QParser.getParser(fq, null, req); filters.add(fqp.getQuery()); } } } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } SolrIndexSearcher searcher = req.getSearcher(); MoreLikeThisHelper mlt = new MoreLikeThisHelper( params, searcher ); // Hold on to the interesting terms if relevant TermStyle termStyle = TermStyle.get( params.get( MoreLikeThisParams.INTERESTING_TERMS ) ); List<InterestingTerm> interesting = (termStyle == TermStyle.NONE ) ? null : new ArrayList<InterestingTerm>( mlt.mlt.getMaxQueryTerms() ); DocListAndSet mltDocs = null; // Parse Required Params // This will either have a single Reader or valid query Reader reader = null; try { if (q == null || q.trim().length() < 1) { Iterable<ContentStream> streams = req.getContentStreams(); if (streams != null) { Iterator<ContentStream> iter = streams.iterator(); if (iter.hasNext()) { reader = iter.next().getReader(); } if (iter.hasNext()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "MoreLikeThis does not support multiple ContentStreams"); } } } int start = params.getInt(CommonParams.START, 0); int rows = params.getInt(CommonParams.ROWS, 10); // Find documents MoreLikeThis - either with a reader or a query // -------------------------------------------------------------------------------- if (reader != null) { mltDocs = mlt.getMoreLikeThis(reader, start, rows, filters, interesting, flags); } else if (q != null) { // Matching options boolean includeMatch = params.getBool(MoreLikeThisParams.MATCH_INCLUDE, true); int matchOffset = params.getInt(MoreLikeThisParams.MATCH_OFFSET, 0); // Find the base match DocList match = searcher.getDocList(query, null, null, matchOffset, 1, flags); // only get the first one... 
if (includeMatch) { rsp.add("match", match); } // This is an iterator, but we only handle the first match DocIterator iterator = match.iterator(); if (iterator.hasNext()) { // do a MoreLikeThis query for each document in results int id = iterator.nextDoc(); mltDocs = mlt.getMoreLikeThis(id, start, rows, filters, interesting, flags); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "MoreLikeThis requires either a query (?q=) or text to find similar documents."); } } finally { if (reader != null) { reader.close(); } } if( mltDocs == null ) { mltDocs = new DocListAndSet(); // avoid NPE } rsp.add( "response", mltDocs.docList ); if( interesting != null ) { if( termStyle == TermStyle.DETAILS ) { NamedList<Float> it = new NamedList<Float>(); for( InterestingTerm t : interesting ) { it.add( t.term.toString(), t.boost ); } rsp.add( "interestingTerms", it ); } else { List<String> it = new ArrayList<String>( interesting.size() ); for( InterestingTerm t : interesting ) { it.add( t.term.text()); } rsp.add( "interestingTerms", it ); } } // maybe facet the results if (params.getBool(FacetParams.FACET,false)) { if( mltDocs.docSet == null ) { rsp.add( "facet_counts", null ); } else { SimpleFacets f = new SimpleFacets(req, mltDocs.docSet, params ); rsp.add( "facet_counts", f.getFacetCounts() ); } } boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false); boolean dbgQuery = false, dbgResults = false; if (dbg == false){//if it's true, we are doing everything anyway. String[] dbgParams = req.getParams().getParams(CommonParams.DEBUG); if (dbgParams != null) { for (int i = 0; i < dbgParams.length; i++) { if (dbgParams[i].equals(CommonParams.QUERY)){ dbgQuery = true; } else if (dbgParams[i].equals(CommonParams.RESULTS)){ dbgResults = true; } } } } else { dbgQuery = true; dbgResults = true; } // Copied from StandardRequestHandler... perhaps it should be added to doStandardDebug? if (dbg == true) { try { NamedList<Object> dbgInfo = SolrPluginUtils.doStandardDebug(req, q, mlt.getRawMLTQuery(), mltDocs.docList, dbgQuery, dbgResults); if (null != dbgInfo) { if (null != filters) { dbgInfo.add("filter_queries",req.getParams().getParams(CommonParams.FQ)); List<String> fqs = new ArrayList<String>(filters.size()); for (Query fq : filters) { fqs.add(QueryParsing.toString(fq, req.getSchema())); } dbgInfo.add("parsed_filter_queries",fqs); } rsp.add("debug", dbgInfo); } } catch (Exception e) { SolrException.log(SolrCore.log, "Exception during debug", e); rsp.add("exception_during_debug", SolrException.toStr(e)); } } }
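Both catch blocks in handleRequestBody above are catch-translate sites: a checked ParseException becomes the unchecked SolrException carrying an HTTP status, while debug failures are merely logged and reported. A self-contained sketch of the translation half, where AppException is a hypothetical stand-in for SolrException:

public class TranslateDemo {
    // Hypothetical stand-in for SolrException: unchecked, carries an HTTP status code.
    static class AppException extends RuntimeException {
        final int code;
        AppException(int code, Throwable cause) {
            super(cause);
            this.code = code;
        }
    }

    static int parseQuery(String q) {
        try {
            return Integer.parseInt(q); // stand-in for QParser.getParser(...).getQuery()
        } catch (NumberFormatException e) {
            throw new AppException(400, e); // 400 == ErrorCode.BAD_REQUEST
        }
    }

    public static void main(String[] args) {
        System.out.println(parseQuery("42"));
        try {
            parseQuery("not-a-query");
        } catch (AppException e) {
            System.out.println("HTTP " + e.code + " caused by " + e.getCause());
        }
    }
}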
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
@Override
public void prepare(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  if (params.getBool(TermsParams.TERMS, false)) {
    rb.doTerms = true;
  }
  // TODO: temporary... this should go in a different component.
  String shards = params.get(ShardParams.SHARDS);
  if (shards != null) {
    rb.isDistrib = true;
    if (params.get(ShardParams.SHARDS_QT) == null) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No shards.qt parameter specified");
    }
    List<String> lst = StrUtils.splitSmart(shards, ",", true);
    rb.shards = lst.toArray(new String[lst.size()]);
  }
}
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
int resolveRegexpFlags(SolrParams params) { String[] flagParams = params.getParams(TermsParams.TERMS_REGEXP_FLAG); if (flagParams == null) { return 0; } int flags = 0; for (String flagParam : flagParams) { try { flags |= TermsParams.TermsRegexpFlag.valueOf(flagParam.toUpperCase(Locale.ENGLISH)).getValue(); } catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + flagParam + "'"); } } return flags; }
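resolveRegexpFlags enriches before rethrowing: Enum.valueOf's terse IllegalArgumentException is replaced by one that names the offending request parameter. A sketch of the same idiom (names are illustrative):

import java.util.Locale;
import java.util.regex.Pattern;

public class FlagDemo {
    enum RegexFlag {
        CASE_INSENSITIVE(Pattern.CASE_INSENSITIVE), COMMENTS(Pattern.COMMENTS);
        final int value;
        RegexFlag(int v) { value = v; }
    }

    static int resolve(String... flagParams) {
        int flags = 0;
        for (String p : flagParams) {
            try {
                flags |= RegexFlag.valueOf(p.toUpperCase(Locale.ENGLISH)).value;
            } catch (IllegalArgumentException iae) {
                // valueOf's message names the enum class; this one names the bad input instead
                throw new IllegalArgumentException("Unknown terms regex flag '" + p + "'");
            }
        }
        return flags;
    }

    public static void main(String[] args) {
        System.out.println(resolve("case_insensitive")); // 2
        try {
            resolve("bogus");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}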
// in core/src/java/org/apache/solr/handler/component/StatsValuesFactory.java
public static StatsValues createStatsValues(SchemaField sf) { FieldType fieldType = sf.getType(); if (DoubleField.class.isInstance(fieldType) || IntField.class.isInstance(fieldType) || LongField.class.isInstance(fieldType) || ShortField.class.isInstance(fieldType) || FloatField.class.isInstance(fieldType) || ByteField.class.isInstance(fieldType) || TrieField.class.isInstance(fieldType) || SortableDoubleField.class.isInstance(fieldType) || SortableIntField.class.isInstance(fieldType) || SortableLongField.class.isInstance(fieldType) || SortableFloatField.class.isInstance(fieldType)) { return new NumericStatsValues(sf); } else if (DateField.class.isInstance(fieldType)) { return new DateStatsValues(sf); } else if (StrField.class.isInstance(fieldType)) { return new StringStatsValues(sf); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Field type " + fieldType + " is not currently supported"); } }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
public ShardResponse call() throws Exception { ShardResponse srsp = new ShardResponse(); srsp.setShardRequest(sreq); srsp.setShard(shard); SimpleSolrResponse ssr = new SimpleSolrResponse(); srsp.setSolrResponse(ssr); long startTime = System.currentTimeMillis(); try { params.remove(CommonParams.WT); // use default (currently javabin) params.remove(CommonParams.VERSION); // SolrRequest req = new QueryRequest(SolrRequest.METHOD.POST, "/select"); // use generic request to avoid extra processing of queries QueryRequest req = new QueryRequest(params); req.setMethod(SolrRequest.METHOD.POST); // no need to set the response parser as binary is the default // req.setResponseParser(new BinaryResponseParser()); // if there are no shards available for a slice, urls.size()==0 if (urls.size()==0) { // TODO: what's the right error code here? We should use the same thing when // all of the servers for a shard are down. throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "no servers hosting shard: " + shard); } if (urls.size() <= 1) { String url = urls.get(0); srsp.setShardAddress(url); SolrServer server = new HttpSolrServer(url, httpClient); ssr.nl = server.request(req); } else { LBHttpSolrServer.Rsp rsp = httpShardHandlerFactory.loadbalancer.request(new LBHttpSolrServer.Req(req, urls)); ssr.nl = rsp.getResponse(); srsp.setShardAddress(rsp.getServer()); } } catch( ConnectException cex ) { srsp.setException(cex); //???? } catch (Throwable th) { srsp.setException(th); if (th instanceof SolrException) { srsp.setResponseCode(((SolrException)th).code()); } else { srsp.setResponseCode(-1); } } ssr.elapsedTime = System.currentTimeMillis() - startTime; return srsp; }
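call() shows the store-don't-throw policy of the shard fan-out: even Throwable is caught and recorded on the ShardResponse, so a worker thread never lets an exception escape. A reduced sketch with a hypothetical ShardResponse:

public class StoreDontThrowDemo {
    static class ShardResponse {
        Throwable exception;
        int responseCode;
        Object payload;
    }

    static ShardResponse callShard(boolean fail) {
        ShardResponse srsp = new ShardResponse();
        try {
            if (fail) throw new IllegalStateException("no servers hosting shard");
            srsp.payload = "ok";
        } catch (Throwable th) {
            srsp.exception = th;    // recorded, never propagated out of the worker
            srsp.responseCode = -1; // -1 when no domain error code is available
        }
        return srsp;
    }

    public static void main(String[] args) {
        System.out.println(callShard(false).payload);               // ok
        System.out.println(callShard(true).exception.getMessage()); // no servers hosting shard
    }
}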
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
private ShardResponse take(boolean bailOnError) { while (pending.size() > 0) { try { Future<ShardResponse> future = completionService.take(); pending.remove(future); ShardResponse rsp = future.get(); if (bailOnError && rsp.getException() != null) return rsp; // if exception, return immediately // add response to the response list... we do this after the take() and // not after the completion of "call" so we know when the last response // for a request was received. Otherwise we might return the same // request more than once. rsp.getShardRequest().responses.add(rsp); if (rsp.getShardRequest().responses.size() == rsp.getShardRequest().actualShards.length) { return rsp; } } catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } catch (ExecutionException e) { // should be impossible... the problem with catching the exception // at this level is we don't know what ShardRequest it applied to throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Impossible Exception",e); } } return null; }
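take() converts both checked exceptions of Future.get into unchecked SERVER_ERRORs. A sketch of the same unwrapping; note that the sketch additionally restores the interrupt flag, which the original omits (stand-in names throughout):

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FutureUnwrapDemo {
    static String takeResult(ExecutorService pool) {
        Callable<String> task = () -> { throw new IllegalStateException("boom"); };
        Future<String> future = pool.submit(task);
        try {
            return future.get();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // the sketch restores the flag; take() itself does not
            throw new RuntimeException(e);      // stand-in for SolrException(SERVER_ERROR, e)
        } catch (ExecutionException e) {
            // mirrors take(): at this level we no longer know which shard request failed
            throw new RuntimeException("Impossible Exception", e);
        }
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            takeResult(pool);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage() + " <- " + e.getCause());
        } finally {
            pool.shutdown();
        }
    }
}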
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
public void checkDistributed(ResponseBuilder rb) { SolrQueryRequest req = rb.req; SolrParams params = req.getParams(); rb.isDistrib = params.getBool("distrib", req.getCore().getCoreDescriptor() .getCoreContainer().isZooKeeperAware()); String shards = params.get(ShardParams.SHARDS); // for back compat, a shards param with URLs like localhost:8983/solr will mean that this // search is distributed. boolean hasShardURL = shards != null && shards.indexOf('/') > 0; rb.isDistrib = hasShardURL | rb.isDistrib; if (rb.isDistrib) { // since the cost of grabbing cloud state is still up in the air, we grab it only // if we need it. CloudState cloudState = null; Map<String,Slice> slices = null; CoreDescriptor coreDescriptor = req.getCore().getCoreDescriptor(); CloudDescriptor cloudDescriptor = coreDescriptor.getCloudDescriptor(); ZkController zkController = coreDescriptor.getCoreContainer().getZkController(); if (shards != null) { List<String> lst = StrUtils.splitSmart(shards, ",", true); rb.shards = lst.toArray(new String[lst.size()]); rb.slices = new String[rb.shards.length]; if (zkController != null) { // figure out which shards are slices for (int i=0; i<rb.shards.length; i++) { if (rb.shards[i].indexOf('/') < 0) { // this is a logical shard rb.slices[i] = rb.shards[i]; rb.shards[i] = null; } } } } else if (zkController != null) { // we weren't provided with a list of slices to query, so find the list that will cover the complete index cloudState = zkController.getCloudState(); // This can be more efficient... we only record the name, even though we // have the shard info we need in the next step of mapping slice->shards // Stores the comma-separated list of specified collections. // Eg: "collection1,collection2,collection3" String collections = params.get("collection"); if (collections != null) { // If there were one or more collections specified in the query, split // each parameter and store as a seperate member of a List. List<String> collectionList = StrUtils.splitSmart(collections, ",", true); // First create an empty HashMap to add the slice info to. slices = new HashMap<String,Slice>(); // In turn, retrieve the slices that cover each collection from the // cloud state and add them to the Map 'slices'. for (int i = 0; i < collectionList.size(); i++) { String collection = collectionList.get(i); ClientUtils.appendMap(collection, slices, cloudState.getSlices(collection)); } } else { // If no collections were specified, default to the collection for // this core. slices = cloudState.getSlices(cloudDescriptor.getCollectionName()); if (slices == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Could not find collection:" + cloudDescriptor.getCollectionName()); } } // Store the logical slices in the ResponseBuilder and create a new // String array to hold the physical shards (which will be mapped // later). 
rb.slices = slices.keySet().toArray(new String[slices.size()]); rb.shards = new String[rb.slices.length]; /*** rb.slices = new String[slices.size()]; for (int i=0; i<rb.slices.length; i++) { rb.slices[i] = slices.get(i).getName(); } ***/ } // // Map slices to shards // if (zkController != null) { for (int i=0; i<rb.shards.length; i++) { if (rb.shards[i] == null) { if (cloudState == null) { cloudState = zkController.getCloudState(); slices = cloudState.getSlices(cloudDescriptor.getCollectionName()); } String sliceName = rb.slices[i]; Slice slice = slices.get(sliceName); if (slice==null) { // Treat this the same as "all servers down" for a slice, and let things continue // if partial results are acceptable rb.shards[i] = ""; continue; // throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "no such shard: " + sliceName); } Map<String, ZkNodeProps> sliceShards = slice.getShards(); // For now, recreate the | delimited list of equivalent servers Set<String> liveNodes = cloudState.getLiveNodes(); StringBuilder sliceShardsStr = new StringBuilder(); boolean first = true; for (ZkNodeProps nodeProps : sliceShards.values()) { ZkCoreNodeProps coreNodeProps = new ZkCoreNodeProps(nodeProps); if (!liveNodes.contains(coreNodeProps.getNodeName()) || !coreNodeProps.getState().equals( ZkStateReader.ACTIVE)) continue; if (first) { first = false; } else { sliceShardsStr.append('|'); } String url = coreNodeProps.getCoreUrl(); if (url.startsWith("http://")) url = url.substring(7); sliceShardsStr.append(url); } rb.shards[i] = sliceShardsStr.toString(); } } } } String shards_rows = params.get(ShardParams.SHARDS_ROWS); if(shards_rows != null) { rb.shards_rows = Integer.parseInt(shards_rows); } String shards_start = params.get(ShardParams.SHARDS_START); if(shards_start != null) { rb.shards_start = Integer.parseInt(shards_start); } }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
public void inform(SolrCore core) { String a = initArgs.get(FIELD_TYPE); if (a != null) { FieldType ft = core.getSchema().getFieldTypes().get(a); if (ft == null) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown FieldType: '" + a + "' used in QueryElevationComponent"); } analyzer = ft.getQueryAnalyzer(); } SchemaField sf = core.getSchema().getUniqueKeyField(); if( sf == null) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "QueryElevationComponent requires the schema to have a uniqueKeyField." ); } idSchemaFT = sf.getType(); idField = sf.getName(); //register the EditorialMarkerFactory String excludeName = initArgs.get(QueryElevationParams.EXCLUDE_MARKER_FIELD_NAME, "excluded"); if (excludeName == null || excludeName.equals("") == true){ excludeName = "excluded"; } ExcludedMarkerFactory excludedMarkerFactory = new ExcludedMarkerFactory(); core.addTransformerFactory(excludeName, excludedMarkerFactory); ElevatedMarkerFactory elevatedMarkerFactory = new ElevatedMarkerFactory(); String markerName = initArgs.get(QueryElevationParams.EDITORIAL_MARKER_FIELD_NAME, "elevated"); if (markerName == null || markerName.equals("") == true) { markerName = "elevated"; } core.addTransformerFactory(markerName, elevatedMarkerFactory); forceElevation = initArgs.getBool(QueryElevationParams.FORCE_ELEVATION, forceElevation); try { synchronized (elevationCache) { elevationCache.clear(); String f = initArgs.get(CONFIG_FILE); if (f == null) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "QueryElevationComponent must specify argument: '" + CONFIG_FILE + "' -- path to elevate.xml"); } boolean exists = false; // check if using ZooKeeper ZkController zkController = core.getCoreDescriptor().getCoreContainer().getZkController(); if (zkController != null) { // TODO : shouldn't have to keep reading the config name when it has been read before exists = zkController.configFileExists(zkController.readConfigName(core.getCoreDescriptor().getCloudDescriptor().getCollectionName()), f); } else { File fC = new File(core.getResourceLoader().getConfigDir(), f); File fD = new File(core.getDataDir(), f); if (fC.exists() == fD.exists()) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "QueryElevationComponent missing config file: '" + f + "\n" + "either: " + fC.getAbsolutePath() + " or " + fD.getAbsolutePath() + " must exist, but not both."); } if (fC.exists()) { exists = true; log.info("Loading QueryElevation from: " + fC.getAbsolutePath()); Config cfg = new Config(core.getResourceLoader(), f); elevationCache.put(null, loadElevationMap(cfg)); } } //in other words, we think this is in the data dir, not the conf dir if (!exists) { // preload the first data RefCounted<SolrIndexSearcher> searchHolder = null; try { searchHolder = core.getNewestSearcher(false); IndexReader reader = searchHolder.get().getIndexReader(); getElevationMap(reader, core); } finally { if (searchHolder != null) searchHolder.decref(); } } } } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error initializing QueryElevationComponent.", ex); } }
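inform() ends with a blanket wrap: whatever fails during initialization is rethrown as a single SERVER_ERROR with one stable, contextual message. A minimal sketch (hypothetical names):

import java.io.FileNotFoundException;

public class InitWrapDemo {
    static void inform() {
        try {
            loadConfig("elevate.xml");
        } catch (Exception ex) {
            // One stable, contextual message for any initialization failure.
            throw new IllegalStateException("Error initializing QueryElevationComponent.", ex);
        }
    }

    static void loadConfig(String name) throws Exception {
        throw new FileNotFoundException(name); // simulated missing config file
    }

    public static void main(String[] args) {
        try {
            inform();
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage() + " <- " + e.getCause());
        }
    }
}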
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
Map<String, ElevationObj> getElevationMap(IndexReader reader, SolrCore core) throws Exception { synchronized (elevationCache) { Map<String, ElevationObj> map = elevationCache.get(null); if (map != null) return map; map = elevationCache.get(reader); if (map == null) { String f = initArgs.get(CONFIG_FILE); if (f == null) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "QueryElevationComponent must specify argument: " + CONFIG_FILE); } log.info("Loading QueryElevation from data dir: " + f); Config cfg; ZkController zkController = core.getCoreDescriptor().getCoreContainer().getZkController(); if (zkController != null) { cfg = new Config(core.getResourceLoader(), f, null, null); } else { InputStream is = VersionedFile.getLatestFile(core.getDataDir(), f); cfg = new Config(core.getResourceLoader(), f, new InputSource(is), null); } map = loadElevationMap(cfg); elevationCache.put(reader, map); } return map; } }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
private Map<String, ElevationObj> loadElevationMap(Config cfg) throws IOException { XPath xpath = XPathFactory.newInstance().newXPath(); Map<String, ElevationObj> map = new HashMap<String, ElevationObj>(); NodeList nodes = (NodeList) cfg.evaluate("elevate/query", XPathConstants.NODESET); for (int i = 0; i < nodes.getLength(); i++) { Node node = nodes.item(i); String qstr = DOMUtil.getAttr(node, "text", "missing query 'text'"); NodeList children = null; try { children = (NodeList) xpath.evaluate("doc", node, XPathConstants.NODESET); } catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "query requires '<doc .../>' child"); } ArrayList<String> include = new ArrayList<String>(); ArrayList<String> exclude = new ArrayList<String>(); for (int j = 0; j < children.getLength(); j++) { Node child = children.item(j); String id = DOMUtil.getAttr(child, "id", "missing 'id'"); String e = DOMUtil.getAttr(child, EXCLUDE, null); if (e != null) { if (Boolean.valueOf(e)) { exclude.add(id); continue; } } include.add(id); } ElevationObj elev = new ElevationObj(qstr, include, exclude); if (map.containsKey(elev.analyzed)) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Boosting query defined twice for query: '" + elev.text + "' (" + elev.analyzed + "')"); } map.put(elev.analyzed, elev); } return map; }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
Override public void prepare(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrParams params = req.getParams(); // A runtime param can skip if (!params.getBool(QueryElevationParams.ENABLE, true)) { return; } boolean exclusive = params.getBool(QueryElevationParams.EXCLUSIVE, false); // A runtime parameter can alter the config value for forceElevation boolean force = params.getBool(QueryElevationParams.FORCE_ELEVATION, forceElevation); boolean markExcludes = params.getBool(QueryElevationParams.MARK_EXCLUDES, false); Query query = rb.getQuery(); String qstr = rb.getQueryString(); if (query == null || qstr == null) { return; } qstr = getAnalyzedQuery(qstr); IndexReader reader = req.getSearcher().getIndexReader(); ElevationObj booster = null; try { booster = getElevationMap(reader, req.getCore()).get(qstr); } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex); } if (booster != null) { rb.req.getContext().put(BOOSTED, booster.ids); // Change the query to insert forced documents if (exclusive == true) { //we only want these results rb.setQuery(booster.include); } else { BooleanQuery newq = new BooleanQuery(true); newq.add(query, BooleanClause.Occur.SHOULD); newq.add(booster.include, BooleanClause.Occur.SHOULD); if (booster.exclude != null) { if (markExcludes == false) { for (TermQuery tq : booster.exclude) { newq.add(new BooleanClause(tq, BooleanClause.Occur.MUST_NOT)); } } else { //we are only going to mark items as excluded, not actually exclude them. This works //with the EditorialMarkerFactory rb.req.getContext().put(EXCLUDED, booster.excludeIds); for (TermQuery tq : booster.exclude) { newq.add(new BooleanClause(tq, BooleanClause.Occur.SHOULD)); } } } rb.setQuery(newq); } ElevationComparatorSource comparator = new ElevationComparatorSource(booster); // if the sort is 'score desc' use a custom sorting method to // insert documents in their proper place SortSpec sortSpec = rb.getSortSpec(); if (sortSpec.getSort() == null) { sortSpec.setSort(new Sort(new SortField[]{ new SortField("_elevate_", comparator, true), new SortField(null, SortField.Type.SCORE, false) })); } else { // Check if the sort is based on score boolean modify = false; SortField[] current = sortSpec.getSort().getSort(); ArrayList<SortField> sorts = new ArrayList<SortField>(current.length + 1); // Perhaps force it to always sort by score if (force && current[0].getType() != SortField.Type.SCORE) { sorts.add(new SortField("_elevate_", comparator, true)); modify = true; } for (SortField sf : current) { if (sf.getType() == SortField.Type.SCORE) { sorts.add(new SortField("_elevate_", comparator, !sf.getReverse())); modify = true; } sorts.add(sf); } if (modify) { sortSpec.setSort(new Sort(sorts.toArray(new SortField[sorts.size()]))); } } } // Add debugging information if (rb.isDebug()) { List<String> match = null; if (booster != null) { // Extract the elevated terms into a list match = new ArrayList<String>(booster.priority.size()); for (Object o : booster.include.clauses()) { TermQuery tq = (TermQuery) ((BooleanClause) o).getQuery(); match.add(tq.getTerm().text()); } } SimpleOrderedMap<Object> dbg = new SimpleOrderedMap<Object>(); dbg.add("q", qstr); dbg.add("match", match); if (rb.isDebugQuery()) { rb.addDebugInfo("queryBoosting", dbg); } } }
// in core/src/java/org/apache/solr/handler/component/SearchHandler.java
Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception, ParseException, InstantiationException, IllegalAccessException { // int sleep = req.getParams().getInt("sleep",0); // if (sleep > 0) {log.error("SLEEPING for " + sleep); Thread.sleep(sleep);} ResponseBuilder rb = new ResponseBuilder(req, rsp, components); if (rb.requestInfo != null) { rb.requestInfo.setResponseBuilder(rb); } boolean dbg = req.getParams().getBool(CommonParams.DEBUG_QUERY, false); rb.setDebug(dbg); if (dbg == false){//if it's true, we are doing everything anyway. SolrPluginUtils.getDebugInterests(req.getParams().getParams(CommonParams.DEBUG), rb); } final RTimer timer = rb.isDebug() ? new RTimer() : null; ShardHandler shardHandler1 = shardHandlerFactory.getShardHandler(); shardHandler1.checkDistributed(rb); if (timer == null) { // non-debugging prepare phase for( SearchComponent c : components ) { c.prepare(rb); } } else { // debugging prepare phase RTimer subt = timer.sub( "prepare" ); for( SearchComponent c : components ) { rb.setTimer( subt.sub( c.getName() ) ); c.prepare(rb); rb.getTimer().stop(); } subt.stop(); } if (!rb.isDistrib) { // a normal non-distributed request // The semantics of debugging vs not debugging are different enough that // it makes sense to have two control loops if(!rb.isDebug()) { // Process for( SearchComponent c : components ) { c.process(rb); } } else { // Process RTimer subt = timer.sub( "process" ); for( SearchComponent c : components ) { rb.setTimer( subt.sub( c.getName() ) ); c.process(rb); rb.getTimer().stop(); } subt.stop(); timer.stop(); // add the timing info if (rb.isDebugTimings()) { rb.addDebugInfo("timing", timer.asNamedList() ); } } } else { // a distributed request if (rb.outgoing == null) { rb.outgoing = new LinkedList<ShardRequest>(); } rb.finished = new ArrayList<ShardRequest>(); int nextStage = 0; do { rb.stage = nextStage; nextStage = ResponseBuilder.STAGE_DONE; // call all components for( SearchComponent c : components ) { // the next stage is the minimum of what all components report nextStage = Math.min(nextStage, c.distributedProcess(rb)); } // check the outgoing queue and send requests while (rb.outgoing.size() > 0) { // submit all current request tasks at once while (rb.outgoing.size() > 0) { ShardRequest sreq = rb.outgoing.remove(0); sreq.actualShards = sreq.shards; if (sreq.actualShards==ShardRequest.ALL_SHARDS) { sreq.actualShards = rb.shards; } sreq.responses = new ArrayList<ShardResponse>(); // TODO: map from shard to address[] for (String shard : sreq.actualShards) { ModifiableSolrParams params = new ModifiableSolrParams(sreq.params); params.remove(ShardParams.SHARDS); // not a top-level request params.set("distrib", "false"); // not a top-level request params.remove("indent"); params.remove(CommonParams.HEADER_ECHO_PARAMS); params.set(ShardParams.IS_SHARD, true); // a sub (shard) request params.set(ShardParams.SHARD_URL, shard); // so the shard knows what was asked if (rb.requestInfo != null) { // we could try and detect when this is needed, but it could be tricky params.set("NOW", Long.toString(rb.requestInfo.getNOW().getTime())); } String shardQt = params.get(ShardParams.SHARDS_QT); if (shardQt == null) { params.remove(CommonParams.QT); } else { params.set(CommonParams.QT, shardQt); } shardHandler1.submit(sreq, shard, params); } } // now wait for replies, but if anyone puts more requests on // the outgoing queue, send them out immediately (by exiting // this loop) boolean tolerant = 
rb.req.getParams().getBool(ShardParams.SHARDS_TOLERANT, false); while (rb.outgoing.size() == 0) { ShardResponse srsp = tolerant ? shardHandler1.takeCompletedIncludingErrors(): shardHandler1.takeCompletedOrError(); if (srsp == null) break; // no more requests to wait for // Was there an exception? if (srsp.getException() != null) { // If things are not tolerant, abort everything and rethrow if(!tolerant) { shardHandler1.cancelAll(); if (srsp.getException() instanceof SolrException) { throw (SolrException)srsp.getException(); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, srsp.getException()); } } } rb.finished.add(srsp.getShardRequest()); // let the components see the responses to the request for(SearchComponent c : components) { c.handleResponses(rb, srsp.getShardRequest()); } } } for(SearchComponent c : components) { c.finishStage(rb); } // we are done when the next stage is MAX_VALUE } while (nextStage != Integer.MAX_VALUE); } }
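The tolerant/intolerant branch above decides the fate of a failed shard: with shards.tolerant=false the first failure cancels outstanding requests and is rethrown, preserving a SolrException as-is and wrapping anything else. A reduced sketch (RuntimeException stands in for SolrException):

public class TolerantDemo {
    static void onShardResponse(boolean tolerant, Throwable shardError) {
        if (shardError != null && !tolerant) {
            cancelAll();
            if (shardError instanceof RuntimeException) {
                throw (RuntimeException) shardError;     // domain exceptions pass through unchanged
            }
            throw new IllegalStateException(shardError); // everything else gets the SERVER_ERROR wrap
        }
        System.out.println("response recorded" + (shardError == null ? "" : " (with error)"));
    }

    static void cancelAll() {
        System.out.println("cancelled remaining shard requests");
    }

    public static void main(String[] args) {
        onShardResponse(true, new RuntimeException("shard down"));  // tolerated, loop continues
        try {
            onShardResponse(false, new RuntimeException("shard down"));
        } catch (RuntimeException e) {
            System.out.println("aborted: " + e.getMessage());
        }
    }
}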
// in core/src/java/org/apache/solr/handler/component/PivotFacetHelper.java
public SimpleOrderedMap<List<NamedList<Object>>> process(ResponseBuilder rb, SolrParams params, String[] pivots) throws IOException { if (!rb.doFacets || pivots == null) return null; int minMatch = params.getInt( FacetParams.FACET_PIVOT_MINCOUNT, 1 ); SimpleOrderedMap<List<NamedList<Object>>> pivotResponse = new SimpleOrderedMap<List<NamedList<Object>>>(); for (String pivot : pivots) { String[] fields = pivot.split(","); // only support two levels for now if( fields.length < 2 ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Pivot Facet needs at least two fields: "+pivot ); } DocSet docs = rb.getResults().docSet; String field = fields[0]; String subField = fields[1]; Deque<String> fnames = new LinkedList<String>(); for( int i=fields.length-1; i>1; i-- ) { fnames.push( fields[i] ); } SimpleFacets sf = getFacetImplementation(rb.req, rb.getResults().docSet, rb.req.getParams()); NamedList<Integer> superFacets = sf.getTermCounts(field); pivotResponse.add(pivot, doPivots(superFacets, field, subField, fnames, rb, docs, minMatch)); } return pivotResponse; }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
Override public void prepare(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } SolrQueryResponse rsp = rb.rsp; // Set field flags ReturnFields returnFields = new ReturnFields( req ); rsp.setReturnFields( returnFields ); int flags = 0; if (returnFields.wantsScore()) { flags |= SolrIndexSearcher.GET_SCORES; } rb.setFieldFlags( flags ); String defType = params.get(QueryParsing.DEFTYPE,QParserPlugin.DEFAULT_QTYPE); // get it from the response builder to give a different component a chance // to set it. String queryString = rb.getQueryString(); if (queryString == null) { // this is the normal way it's set. queryString = params.get( CommonParams.Q ); rb.setQueryString(queryString); } try { QParser parser = QParser.getParser(rb.getQueryString(), defType, req); Query q = parser.getQuery(); if (q == null) { // normalize a null query to a query that matches nothing q = new BooleanQuery(); } rb.setQuery( q ); rb.setSortSpec( parser.getSort(true) ); rb.setQparser(parser); rb.setScoreDoc(parser.getPaging()); String[] fqs = req.getParams().getParams(CommonParams.FQ); if (fqs!=null && fqs.length!=0) { List<Query> filters = rb.getFilters(); if (filters==null) { filters = new ArrayList<Query>(fqs.length); } for (String fq : fqs) { if (fq != null && fq.trim().length()!=0) { QParser fqp = QParser.getParser(fq, null, req); filters.add(fqp.getQuery()); } } // only set the filters if they are not empty otherwise // fq=&someotherParam= will trigger all docs filter for every request // if filter cache is disabled if (!filters.isEmpty()) { rb.setFilters( filters ); } } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } boolean grouping = params.getBool(GroupParams.GROUP, false); if (!grouping) { return; } SolrIndexSearcher.QueryCommand cmd = rb.getQueryCommand(); SolrIndexSearcher searcher = rb.req.getSearcher(); GroupingSpecification groupingSpec = new GroupingSpecification(); rb.setGroupingSpec(groupingSpec); //TODO: move weighting of sort Sort groupSort = searcher.weightSort(cmd.getSort()); if (groupSort == null) { groupSort = Sort.RELEVANCE; } // groupSort defaults to sort String groupSortStr = params.get(GroupParams.GROUP_SORT); //TODO: move weighting of sort Sort sortWithinGroup = groupSortStr == null ? 
groupSort : searcher.weightSort(QueryParsing.parseSort(groupSortStr, req)); if (sortWithinGroup == null) { sortWithinGroup = Sort.RELEVANCE; } groupingSpec.setSortWithinGroup(sortWithinGroup); groupingSpec.setGroupSort(groupSort); String formatStr = params.get(GroupParams.GROUP_FORMAT, Grouping.Format.grouped.name()); Grouping.Format responseFormat; try { responseFormat = Grouping.Format.valueOf(formatStr); } catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); } groupingSpec.setResponseFormat(responseFormat); groupingSpec.setFields(params.getParams(GroupParams.GROUP_FIELD)); groupingSpec.setQueries(params.getParams(GroupParams.GROUP_QUERY)); groupingSpec.setFunctions(params.getParams(GroupParams.GROUP_FUNC)); groupingSpec.setGroupOffset(params.getInt(GroupParams.GROUP_OFFSET, 0)); groupingSpec.setGroupLimit(params.getInt(GroupParams.GROUP_LIMIT, 1)); groupingSpec.setOffset(rb.getSortSpec().getOffset()); groupingSpec.setLimit(rb.getSortSpec().getCount()); groupingSpec.setIncludeGroupCount(params.getBool(GroupParams.GROUP_TOTAL_COUNT, false)); groupingSpec.setMain(params.getBool(GroupParams.GROUP_MAIN, false)); groupingSpec.setNeedScore((cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0); groupingSpec.setTruncateGroups(params.getBool(GroupParams.GROUP_TRUNCATE, false)); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
Override public void process(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } SolrIndexSearcher searcher = req.getSearcher(); if (rb.getQueryCommand().getOffset() < 0) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "'start' parameter cannot be negative"); } // -1 as flag if not set. long timeAllowed = (long)params.getInt( CommonParams.TIME_ALLOWED, -1 ); // Optional: This could also be implemented by the top-level searcher sending // a filter that lists the ids... that would be transparent to // the request handler, but would be more expensive (and would preserve score // too if desired). String ids = params.get(ShardParams.IDS); if (ids != null) { SchemaField idField = req.getSchema().getUniqueKeyField(); List<String> idArr = StrUtils.splitSmart(ids, ",", true); int[] luceneIds = new int[idArr.size()]; int docs = 0; for (int i=0; i<idArr.size(); i++) { int id = req.getSearcher().getFirstMatch( new Term(idField.getName(), idField.getType().toInternal(idArr.get(i)))); if (id >= 0) luceneIds[docs++] = id; } DocListAndSet res = new DocListAndSet(); res.docList = new DocSlice(0, docs, luceneIds, null, docs, 0); if (rb.isNeedDocSet()) { // TODO: create a cache for this! List<Query> queries = new ArrayList<Query>(); queries.add(rb.getQuery()); List<Query> filters = rb.getFilters(); if (filters != null) queries.addAll(filters); res.docSet = searcher.getDocSet(queries); } rb.setResults(res); ResultContext ctx = new ResultContext(); ctx.docs = rb.getResults().docList; ctx.query = null; // anything? rsp.add("response", ctx); return; } SolrIndexSearcher.QueryCommand cmd = rb.getQueryCommand(); cmd.setTimeAllowed(timeAllowed); SolrIndexSearcher.QueryResult result = new SolrIndexSearcher.QueryResult(); // // grouping / field collapsing // GroupingSpecification groupingSpec = rb.getGroupingSpec(); if (groupingSpec != null) { try { boolean needScores = (cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0; if (params.getBool(GroupParams.GROUP_DISTRIBUTED_FIRST, false)) { CommandHandler.Builder topsGroupsActionBuilder = new CommandHandler.Builder() .setQueryCommand(cmd) .setNeedDocSet(false) // Order matters here .setIncludeHitCount(true) .setSearcher(searcher); for (String field : groupingSpec.getFields()) { topsGroupsActionBuilder.addCommandField(new SearchGroupsFieldCommand.Builder() .setField(searcher.getSchema().getField(field)) .setGroupSort(groupingSpec.getGroupSort()) .setTopNGroups(cmd.getOffset() + cmd.getLen()) .setIncludeGroupCount(groupingSpec.isIncludeGroupCount()) .build() ); } CommandHandler commandHandler = topsGroupsActionBuilder.build(); commandHandler.execute(); SearchGroupsResultTransformer serializer = new SearchGroupsResultTransformer(searcher); rsp.add("firstPhase", commandHandler.processResult(result, serializer)); rsp.add("totalHitCount", commandHandler.getTotalHitCount()); rb.setResult(result); return; } else if (params.getBool(GroupParams.GROUP_DISTRIBUTED_SECOND, false)) { CommandHandler.Builder secondPhaseBuilder = new CommandHandler.Builder() .setQueryCommand(cmd) .setTruncateGroups(groupingSpec.isTruncateGroups() && groupingSpec.getFields().length > 0) .setSearcher(searcher); for (String field : groupingSpec.getFields()) { String[] topGroupsParam = params.getParams(GroupParams.GROUP_DISTRIBUTED_TOPGROUPS_PREFIX + field); if (topGroupsParam == null) { topGroupsParam = new String[0]; } List<SearchGroup<BytesRef>> 
topGroups = new ArrayList<SearchGroup<BytesRef>>(topGroupsParam.length); for (String topGroup : topGroupsParam) { SearchGroup<BytesRef> searchGroup = new SearchGroup<BytesRef>(); if (!topGroup.equals(TopGroupsShardRequestFactory.GROUP_NULL_VALUE)) { searchGroup.groupValue = new BytesRef(searcher.getSchema().getField(field).getType().readableToIndexed(topGroup)); } topGroups.add(searchGroup); } secondPhaseBuilder.addCommandField( new TopGroupsFieldCommand.Builder() .setField(searcher.getSchema().getField(field)) .setGroupSort(groupingSpec.getGroupSort()) .setSortWithinGroup(groupingSpec.getSortWithinGroup()) .setFirstPhaseGroups(topGroups) .setMaxDocPerGroup(groupingSpec.getGroupOffset() + groupingSpec.getGroupLimit()) .setNeedScores(needScores) .setNeedMaxScore(needScores) .build() ); } for (String query : groupingSpec.getQueries()) { secondPhaseBuilder.addCommandField(new QueryCommand.Builder() .setDocsToCollect(groupingSpec.getOffset() + groupingSpec.getLimit()) .setSort(groupingSpec.getGroupSort()) .setQuery(query, rb.req) .setDocSet(searcher) .build() ); } CommandHandler commandHandler = secondPhaseBuilder.build(); commandHandler.execute(); TopGroupsResultTransformer serializer = new TopGroupsResultTransformer(rb); rsp.add("secondPhase", commandHandler.processResult(result, serializer)); rb.setResult(result); return; } int maxDocsPercentageToCache = params.getInt(GroupParams.GROUP_CACHE_PERCENTAGE, 0); boolean cacheSecondPassSearch = maxDocsPercentageToCache >= 1 && maxDocsPercentageToCache <= 100; Grouping.TotalCount defaultTotalCount = groupingSpec.isIncludeGroupCount() ? Grouping.TotalCount.grouped : Grouping.TotalCount.ungrouped; int limitDefault = cmd.getLen(); // this is normally from "rows" Grouping grouping = new Grouping(searcher, result, cmd, cacheSecondPassSearch, maxDocsPercentageToCache, groupingSpec.isMain()); grouping.setSort(groupingSpec.getGroupSort()) .setGroupSort(groupingSpec.getSortWithinGroup()) .setDefaultFormat(groupingSpec.getResponseFormat()) .setLimitDefault(limitDefault) .setDefaultTotalCount(defaultTotalCount) .setDocsPerGroupDefault(groupingSpec.getGroupLimit()) .setGroupOffsetDefault(groupingSpec.getGroupOffset()) .setGetGroupedDocSet(groupingSpec.isTruncateGroups()); if (groupingSpec.getFields() != null) { for (String field : groupingSpec.getFields()) { grouping.addFieldCommand(field, rb.req); } } if (groupingSpec.getFunctions() != null) { for (String groupByStr : groupingSpec.getFunctions()) { grouping.addFunctionCommand(groupByStr, rb.req); } } if (groupingSpec.getQueries() != null) { for (String groupByStr : groupingSpec.getQueries()) { grouping.addQueryCommand(groupByStr, rb.req); } } if (rb.doHighlights || rb.isDebug() || params.getBool(MoreLikeThisParams.MLT, false)) { // we need a single list of the returned docs cmd.setFlags(SolrIndexSearcher.GET_DOCLIST); } grouping.execute(); if (grouping.isSignalCacheWarning()) { rsp.add( "cacheWarning", String.format("Cache limit of %d percent relative to maxdoc has exceeded. Please increase cache size or disable caching.", maxDocsPercentageToCache) ); } rb.setResult(result); if (grouping.mainResult != null) { ResultContext ctx = new ResultContext(); ctx.docs = grouping.mainResult; ctx.query = null; // TODO? add the query? rsp.add("response", ctx); rsp.getToLog().add("hits", grouping.mainResult.matches()); } else if (!grouping.getCommands().isEmpty()) { // Can never be empty since grouping.execute() checks for this. 
rsp.add("grouped", result.groupedResults); rsp.getToLog().add("hits", grouping.getCommands().get(0).getMatches()); } return; } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } } // normal search result searcher.search(result,cmd); rb.setResult( result ); ResultContext ctx = new ResultContext(); ctx.docs = rb.getResults().docList; ctx.query = rb.getQuery(); rsp.add("response", ctx); rsp.getToLog().add("hits", rb.getResults().docList.matches()); doFieldSortValues(rb, searcher); doPrefetch(rb); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
@Override
public void prepare(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  rb.doHighlights = highlighter.isHighlightingEnabled(params);
  if (rb.doHighlights) {
    String hlq = params.get(HighlightParams.Q);
    if (hlq != null) {
      try {
        QParser parser = QParser.getParser(hlq, null, rb.req);
        rb.setHighlightQuery(parser.getHighlightQuery());
      } catch (ParseException e) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
      }
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
@Override
public void process(ResponseBuilder rb) throws IOException {
  if (rb.doHighlights) {
    SolrQueryRequest req = rb.req;
    SolrParams params = req.getParams();
    String[] defaultHighlightFields;  //TODO: get from builder by default?
    if (rb.getQparser() != null) {
      defaultHighlightFields = rb.getQparser().getDefaultHighlightFields();
    } else {
      defaultHighlightFields = params.getParams(CommonParams.DF);
    }
    Query highlightQuery = rb.getHighlightQuery();
    if (highlightQuery == null) {
      if (rb.getQparser() != null) {
        try {
          highlightQuery = rb.getQparser().getHighlightQuery();
          rb.setHighlightQuery(highlightQuery);
        } catch (Exception e) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
        }
      } else {
        highlightQuery = rb.getQuery();
        rb.setHighlightQuery(highlightQuery);
      }
    }
    if (highlightQuery != null) {
      boolean rewrite = !(Boolean.valueOf(params.get(HighlightParams.USE_PHRASE_HIGHLIGHTER, "true")) && Boolean.valueOf(params.get(HighlightParams.HIGHLIGHT_MULTI_TERM, "true")));
      highlightQuery = rewrite ? highlightQuery.rewrite(req.getSearcher().getIndexReader()) : highlightQuery;
    }
    // No highlighting if there is no query -- consider q.alt="*:*"
    if (highlightQuery != null) {
      NamedList sumData = highlighter.doHighlighting(rb.getResults().docList, highlightQuery, req, defaultHighlightFields);
      if (sumData != null) {
        // TODO ???? add this directly to the response?
        rb.rsp.add("highlighting", sumData);
      }
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/StatsComponent.java
public NamedList<?> getFieldCacheStats(String fieldName, String[] facet ) { SchemaField sf = searcher.getSchema().getField(fieldName); FieldCache.DocTermsIndex si; try { si = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), fieldName); } catch (IOException e) { throw new RuntimeException( "failed to open field cache for: "+fieldName, e ); } StatsValues allstats = StatsValuesFactory.createStatsValues(sf); final int nTerms = si.numOrd(); if ( nTerms <= 0 || docs.size() <= 0 ) return allstats.getStatsValues(); // don't worry about faceting if no documents match... List<FieldFacetStats> facetStats = new ArrayList<FieldFacetStats>(); FieldCache.DocTermsIndex facetTermsIndex; for( String facetField : facet ) { SchemaField fsf = searcher.getSchema().getField(facetField); FieldType facetFieldType = fsf.getType(); if (facetFieldType.isTokenized() || facetFieldType.isMultiValued()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Stats can only facet on single-valued fields, not: " + facetField + "[" + facetFieldType + "]"); } try { facetTermsIndex = FieldCache.DEFAULT.getTermsIndex(searcher.getAtomicReader(), facetField); } catch (IOException e) { throw new RuntimeException( "failed to open field cache for: " + facetField, e ); } facetStats.add(new FieldFacetStats(facetField, facetTermsIndex, sf, fsf, nTerms)); } final BytesRef tempBR = new BytesRef(); DocIterator iter = docs.iterator(); while (iter.hasNext()) { int docID = iter.nextDoc(); BytesRef raw = si.lookup(si.getOrd(docID), tempBR); if( raw.length > 0 ) { allstats.accumulate(raw); } else { allstats.missing(); } // now update the facets for (FieldFacetStats f : facetStats) { f.facet(docID, raw); } } for (FieldFacetStats f : facetStats) { allstats.addFacet(f.name, f.facetStatsValues); } return allstats.getStatsValues(); }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
Override public void process(ResponseBuilder rb) throws IOException { SolrQueryRequest req = rb.req; SolrQueryResponse rsp = rb.rsp; SolrParams params = req.getParams(); if (!params.getBool(COMPONENT_NAME, true)) { return; } String val = params.get("getVersions"); if (val != null) { processGetVersions(rb); return; } val = params.get("getUpdates"); if (val != null) { processGetUpdates(rb); return; } String id[] = params.getParams("id"); String ids[] = params.getParams("ids"); if (id == null && ids == null) { return; } String[] allIds = id==null ? new String[0] : id; if (ids != null) { List<String> lst = new ArrayList<String>(); for (String s : allIds) { lst.add(s); } for (String idList : ids) { lst.addAll( StrUtils.splitSmart(idList, ",", true) ); } allIds = lst.toArray(new String[lst.size()]); } SchemaField idField = req.getSchema().getUniqueKeyField(); FieldType fieldType = idField.getType(); SolrDocumentList docList = new SolrDocumentList(); UpdateLog ulog = req.getCore().getUpdateHandler().getUpdateLog(); RefCounted<SolrIndexSearcher> searcherHolder = null; DocTransformer transformer = rsp.getReturnFields().getTransformer(); if (transformer != null) { TransformContext context = new TransformContext(); context.req = req; transformer.setContext(context); } try { SolrIndexSearcher searcher = null; BytesRef idBytes = new BytesRef(); for (String idStr : allIds) { fieldType.readableToIndexed(idStr, idBytes); if (ulog != null) { Object o = ulog.lookup(idBytes); if (o != null) { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; assert entry.size() >= 3; int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK; switch (oper) { case UpdateLog.ADD: SolrDocument doc = toSolrDoc((SolrInputDocument)entry.get(entry.size()-1), req.getSchema()); if(transformer!=null) { transformer.transform(doc, -1); // unknown docID } docList.add(doc); break; case UpdateLog.DELETE: break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } continue; } } // didn't find it in the update log, so it should be in the newest searcher opened if (searcher == null) { searcherHolder = req.getCore().getRealtimeSearcher(); searcher = searcherHolder.get(); } // SolrCore.verbose("RealTimeGet using searcher ", searcher); int docid = searcher.getFirstMatch(new Term(idField.getName(), idBytes)); if (docid < 0) continue; Document luceneDocument = searcher.doc(docid); SolrDocument doc = toSolrDoc(luceneDocument, req.getSchema()); if( transformer != null ) { transformer.transform(doc, docid); } docList.add(doc); } } finally { if (searcherHolder != null) { searcherHolder.decref(); } } // if the client specified a single id=foo, then use "doc":{ // otherwise use a standard doclist if (ids == null && allIds.length <= 1) { // if the doc was not found, then use a value of null. rsp.add("doc", docList.size() > 0 ? docList.get(0) : null); } else { docList.setNumFound(docList.size()); rsp.add("response", docList); } }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
public static SolrInputDocument getInputDocument(SolrCore core, BytesRef idBytes) throws IOException { SolrInputDocument sid = null; RefCounted<SolrIndexSearcher> searcherHolder = null; try { SolrIndexSearcher searcher = null; UpdateLog ulog = core.getUpdateHandler().getUpdateLog(); if (ulog != null) { Object o = ulog.lookup(idBytes); if (o != null) { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; assert entry.size() >= 3; int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK; switch (oper) { case UpdateLog.ADD: sid = (SolrInputDocument)entry.get(entry.size()-1); break; case UpdateLog.DELETE: return null; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } } } if (sid == null) { // didn't find it in the update log, so it should be in the newest searcher opened if (searcher == null) { searcherHolder = core.getRealtimeSearcher(); searcher = searcherHolder.get(); } // SolrCore.verbose("RealTimeGet using searcher ", searcher); SchemaField idField = core.getSchema().getUniqueKeyField(); int docid = searcher.getFirstMatch(new Term(idField.getName(), idBytes)); if (docid < 0) return null; Document luceneDocument = searcher.doc(docid); sid = toSolrInputDocument(luceneDocument, core.getSchema()); } } finally { if (searcherHolder != null) { searcherHolder.decref(); } } return sid; }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
private String[] sliceToShards(ResponseBuilder rb, String collection, String slice) { String lookup = collection + '_' + slice; // seems either form may be filled in rb.slices? // We use this since the shard handler already filled in the slice to shards mapping. // A better approach would be to avoid filling out every slice each time, or to cache // the mappings. for (int i=0; i<rb.slices.length; i++) { log.info("LOOKUP_SLICE:" + rb.slices[i] + "=" + rb.shards[i]); if (lookup.equals(rb.slices[i]) || slice.equals(rb.slices[i])) { return new String[]{rb.shards[i]}; } } throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't find shard '" + lookup + "'"); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
public void init(PluginInfo info) { NamedList args = info.initArgs; this.soTimeout = getParameter(args, HttpClientUtil.PROP_SO_TIMEOUT, soTimeout); this.scheme = getParameter(args, INIT_URL_SCHEME, "http://"); this.scheme = (this.scheme.endsWith("://")) ? this.scheme : this.scheme + "://"; this.connectionTimeout = getParameter(args, HttpClientUtil.PROP_CONNECTION_TIMEOUT, connectionTimeout); this.maxConnectionsPerHost = getParameter(args, HttpClientUtil.PROP_MAX_CONNECTIONS_PER_HOST, maxConnectionsPerHost); this.corePoolSize = getParameter(args, INIT_CORE_POOL_SIZE, corePoolSize); this.maximumPoolSize = getParameter(args, INIT_MAX_POOL_SIZE, maximumPoolSize); this.keepAliveTime = getParameter(args, MAX_THREAD_IDLE_TIME, keepAliveTime); this.queueSize = getParameter(args, INIT_SIZE_OF_QUEUE, queueSize); this.accessPolicy = getParameter(args, INIT_FAIRNESS_POLICY, accessPolicy); BlockingQueue<Runnable> blockingQueue = (this.queueSize == -1) ? new SynchronousQueue<Runnable>(this.accessPolicy) : new ArrayBlockingQueue<Runnable>(this.queueSize, this.accessPolicy); this.commExecutor = new ThreadPoolExecutor( this.corePoolSize, this.maximumPoolSize, this.keepAliveTime, TimeUnit.SECONDS, blockingQueue, new DefaultSolrThreadFactory("httpShardExecutor") ); ModifiableSolrParams clientParams = new ModifiableSolrParams(); clientParams.set(HttpClientUtil.PROP_MAX_CONNECTIONS_PER_HOST, maxConnectionsPerHost); clientParams.set(HttpClientUtil.PROP_MAX_CONNECTIONS, 10000); clientParams.set(HttpClientUtil.PROP_SO_TIMEOUT, soTimeout); clientParams.set(HttpClientUtil.PROP_CONNECTION_TIMEOUT, connectionTimeout); clientParams.set(HttpClientUtil.PROP_USE_RETRY, false); this.defaultClient = HttpClientUtil.createClient(clientParams); try { loadbalancer = new LBHttpSolrServer(defaultClient); } catch (MalformedURLException e) { // should be impossible since we're not passing any URLs here throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
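init() uses the "should be impossible" idiom: a checked exception the call site believes cannot occur is still wrapped rather than swallowed, so a wrong assumption surfaces loudly. A sketch with java.net.URL in place of LBHttpSolrServer:

import java.net.MalformedURLException;
import java.net.URL;

public class ImpossibleWrapDemo {
    static URL parseConstant(String s) {
        try {
            return new URL(s);
        } catch (MalformedURLException e) {
            // Only constants reach this method, so the catch "cannot" fire -- but if it
            // ever does, wrapping preserves the stack trace instead of losing it.
            throw new IllegalStateException(e); // stand-in for SolrException(SERVER_ERROR, e)
        }
    }

    public static void main(String[] args) {
        System.out.println(parseConstant("http://localhost:8983/solr"));
    }
}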
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
@Override
public void process(ResponseBuilder rb) throws IOException {
  SolrParams params = rb.req.getParams();
  if (!params.getBool(COMPONENT_NAME, false)) {
    return;
  }
  NamedList<Object> termVectors = new NamedList<Object>();
  rb.rsp.add(TERM_VECTORS, termVectors);
  FieldOptions allFields = new FieldOptions();
  //figure out what options we have, and try to get the appropriate vector
  allFields.termFreq = params.getBool(TermVectorParams.TF, false);
  allFields.positions = params.getBool(TermVectorParams.POSITIONS, false);
  allFields.offsets = params.getBool(TermVectorParams.OFFSETS, false);
  allFields.docFreq = params.getBool(TermVectorParams.DF, false);
  allFields.tfIdf = params.getBool(TermVectorParams.TF_IDF, false);
  //boolean cacheIdf = params.getBool(TermVectorParams.IDF, false);
  //short cut to all values.
  if (params.getBool(TermVectorParams.ALL, false)) {
    allFields.termFreq = true;
    allFields.positions = true;
    allFields.offsets = true;
    allFields.docFreq = true;
    allFields.tfIdf = true;
  }
  String fldLst = params.get(TermVectorParams.FIELDS);
  if (fldLst == null) {
    fldLst = params.get(CommonParams.FL);
  }
  //use this to validate our fields
  IndexSchema schema = rb.req.getSchema();
  //Build up our per field mapping
  Map<String, FieldOptions> fieldOptions = new HashMap<String, FieldOptions>();
  NamedList<List<String>> warnings = new NamedList<List<String>>();
  List<String> noTV = new ArrayList<String>();
  List<String> noPos = new ArrayList<String>();
  List<String> noOff = new ArrayList<String>();
  //we have specific fields to retrieve
  if (fldLst != null) {
    String[] fields = SolrPluginUtils.split(fldLst);
    for (String field : fields) {
      SchemaField sf = schema.getFieldOrNull(field);
      if (sf != null) {
        if (sf.storeTermVector()) {
          FieldOptions option = fieldOptions.get(field);
          if (option == null) {
            option = new FieldOptions();
            option.fieldName = field;
            fieldOptions.put(field, option);
          }
          //get the per field mappings
          option.termFreq = params.getFieldBool(field, TermVectorParams.TF, allFields.termFreq);
          option.docFreq = params.getFieldBool(field, TermVectorParams.DF, allFields.docFreq);
          option.tfIdf = params.getFieldBool(field, TermVectorParams.TF_IDF, allFields.tfIdf);
          //Validate these are even an option
          option.positions = params.getFieldBool(field, TermVectorParams.POSITIONS, allFields.positions);
          if (option.positions && !sf.storeTermPositions()) {
            noPos.add(field);
          }
          option.offsets = params.getFieldBool(field, TermVectorParams.OFFSETS, allFields.offsets);
          if (option.offsets && !sf.storeTermOffsets()) {
            noOff.add(field);
          }
        } else { //field doesn't have term vectors
          noTV.add(field);
        }
      } else { //field doesn't exist
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "undefined field: " + field);
      }
    }
  } //else, deal with all fields
  boolean hasWarnings = false;
  if (!noTV.isEmpty()) {
    warnings.add("noTermVectors", noTV);
    hasWarnings = true;
  }
  if (!noPos.isEmpty()) {
    warnings.add("noPositions", noPos);
    hasWarnings = true;
  }
  if (!noOff.isEmpty()) {
    warnings.add("noOffsets", noOff);
    hasWarnings = true;
  }
  if (hasWarnings) {
    termVectors.add("warnings", warnings);
  }
  DocListAndSet listAndSet = rb.getResults();
  List<Integer> docIds = getInts(params.getParams(TermVectorParams.DOC_IDS));
  Iterator<Integer> iter;
  if (docIds != null && !docIds.isEmpty()) {
    iter = docIds.iterator();
  } else {
    DocList list = listAndSet.docList;
    iter = list.iterator();
  }
  SolrIndexSearcher searcher = rb.req.getSearcher();
  IndexReader reader = searcher.getIndexReader();
  //the TVMapper is a TermVectorMapper which can be used to optimize loading of Term Vectors
  SchemaField keyField = schema.getUniqueKeyField();
  String uniqFieldName = null;
  if (keyField != null) {
    uniqFieldName = keyField.getName();
  }
  //Only load the id field to get the uniqueKey of that field
  final String finalUniqFieldName = uniqFieldName;
  final List<String> uniqValues = new ArrayList<String>();
  // TODO: is this required to be single-valued? if so, we should STOP once we find it...
  final StoredFieldVisitor getUniqValue = new StoredFieldVisitor() {
    @Override
    public void stringField(FieldInfo fieldInfo, String value) throws IOException {
      uniqValues.add(value);
    }
    @Override
    public void intField(FieldInfo fieldInfo, int value) throws IOException {
      uniqValues.add(Integer.toString(value));
    }
    @Override
    public void longField(FieldInfo fieldInfo, long value) throws IOException {
      uniqValues.add(Long.toString(value));
    }
    @Override
    public Status needsField(FieldInfo fieldInfo) throws IOException {
      return (fieldInfo.name.equals(finalUniqFieldName)) ? Status.YES : Status.NO;
    }
  };
  TermsEnum termsEnum = null;
  while (iter.hasNext()) {
    Integer docId = iter.next();
    NamedList<Object> docNL = new NamedList<Object>();
    termVectors.add("doc-" + docId, docNL);
    if (keyField != null) {
      reader.document(docId, getUniqValue);
      String uniqVal = null;
      if (uniqValues.size() != 0) {
        uniqVal = uniqValues.get(0);
        uniqValues.clear();
        docNL.add("uniqueKey", uniqVal);
        termVectors.add("uniqueKeyFieldName", uniqFieldName);
      }
    }
    if (!fieldOptions.isEmpty()) {
      for (Map.Entry<String, FieldOptions> entry : fieldOptions.entrySet()) {
        final String field = entry.getKey();
        final Terms vector = reader.getTermVector(docId, field);
        if (vector != null) {
          termsEnum = vector.iterator(termsEnum);
          mapOneVector(docNL, entry.getValue(), reader, docId, vector.iterator(termsEnum), field);
        }
      }
    } else {
      // extract all fields
      final Fields vectors = reader.getTermVectors(docId);
      final FieldsEnum fieldsEnum = vectors.iterator();
      String field;
      while ((field = fieldsEnum.next()) != null) {
        Terms terms = fieldsEnum.terms();
        if (terms != null) {
          termsEnum = terms.iterator(termsEnum);
          mapOneVector(docNL, allFields, reader, docId, termsEnum, field);
        }
      }
    }
  }
}
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
private List<Integer> getInts(String[] vals) { List<Integer> result = null; if (vals != null && vals.length > 0) { result = new ArrayList<Integer>(vals.length); for (int i = 0; i < vals.length; i++) { try { result.add(new Integer(vals[i])); } catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); } } } return result; }
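getInts wraps the NumberFormatException while keeping it as the cause, so the client sees a BAD_REQUEST but the original stack trace survives. A sketch (IllegalArgumentException stands in for SolrException):

import java.util.ArrayList;
import java.util.List;

public class ParseIdsDemo {
    static List<Integer> getInts(String[] vals) {
        List<Integer> result = new ArrayList<>(vals.length);
        for (String v : vals) {
            try {
                result.add(Integer.valueOf(v)); // valueOf instead of the deprecated new Integer(...)
            } catch (NumberFormatException e) {
                // keep the parse error as the cause so the original stack trace survives
                throw new IllegalArgumentException(e.getMessage(), e);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(getInts(new String[]{"1", "2", "3"}));
        try {
            getInts(new String[]{"not-a-number"});
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage() + " / cause: " + e.getCause().getClass().getSimpleName());
        }
    }
}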
// in core/src/java/org/apache/solr/handler/ContentStreamHandlerBase.java
@Override
public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception {
  SolrParams params = req.getParams();
  UpdateRequestProcessorChain processorChain = req.getCore().getUpdateProcessingChain(params.get(UpdateParams.UPDATE_CHAIN));
  UpdateRequestProcessor processor = processorChain.createProcessor(req, rsp);
  try {
    ContentStreamLoader documentLoader = newLoader(req, processor);
    Iterable<ContentStream> streams = req.getContentStreams();
    if (streams == null) {
      if (!RequestHandlerUtils.handleCommit(req, processor, params, false) && !RequestHandlerUtils.handleRollback(req, processor, params, false)) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "missing content stream");
      }
    } else {
      for (ContentStream stream : streams) {
        documentLoader.load(req, rsp, stream, processor);
      }
      // Perhaps commit from the parameters
      RequestHandlerUtils.handleCommit(req, processor, params, false);
      RequestHandlerUtils.handleRollback(req, processor, params, false);
    }
  } finally {
    // finish the request
    processor.finish();
  }
}
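handleRequestBody relies on try/finally rather than catch: processor.finish() runs on success and on failure alike, so a thrown BAD_REQUEST cannot leak a half-finished processor. A minimal sketch (hypothetical Processor interface):

public class FinallyFinishDemo {
    interface Processor {
        void finish();
    }

    static void handle(boolean hasStreams, Processor processor) {
        try {
            if (!hasStreams) throw new IllegalArgumentException("missing content stream");
            System.out.println("loaded streams");
        } finally {
            processor.finish(); // always runs, even when the BAD_REQUEST above is thrown
        }
    }

    public static void main(String[] args) {
        Processor p = () -> System.out.println("finished");
        handle(true, p);
        try {
            handle(false, p);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}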
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromZooKeeper(SolrQueryRequest req, SolrQueryResponse rsp, CoreContainer coreContainer) throws KeeperException, InterruptedException, UnsupportedEncodingException { String adminFile = null; SolrCore core = req.getCore(); SolrZkClient zkClient = coreContainer.getZkController().getZkClient(); final ZkSolrResourceLoader loader = (ZkSolrResourceLoader) core .getResourceLoader(); String confPath = loader.getCollectionZkPath(); String fname = req.getParams().get("file", null); if (fname == null) { adminFile = confPath; } else { fname = fname.replace('\\', '/'); // normalize slashes if (hiddenFiles.contains(fname.toUpperCase(Locale.ENGLISH))) { throw new SolrException(ErrorCode.FORBIDDEN, "Can not access: " + fname); } if (fname.indexOf("..") >= 0) { throw new SolrException(ErrorCode.FORBIDDEN, "Invalid path: " + fname); } adminFile = confPath + "/" + fname; } // Make sure the file exists, is readable and is not a hidden file if (!zkClient.exists(adminFile, true)) { throw new SolrException(ErrorCode.BAD_REQUEST, "Can not find: " + adminFile); } // Show a directory listing List<String> children = zkClient.getChildren(adminFile, null, true); if (children.size() > 0) { NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>(); for (String f : children) { if (hiddenFiles.contains(f.toUpperCase(Locale.ENGLISH))) { continue; // don't show 'hidden' files } if (f.startsWith(".")) { continue; // skip hidden system files... } SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>(); files.add(f, fileInfo); List<String> fchildren = zkClient.getChildren(adminFile, null, true); if (fchildren.size() > 0) { fileInfo.add("directory", true); } else { // TODO? content type fileInfo.add("size", f.length()); } // TODO: ? // fileInfo.add( "modified", new Date( f.lastModified() ) ); } rsp.add("files", files); } else { // Include the file contents // The file logic depends on RawResponseWriter, so force its use. ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(CommonParams.WT, "raw"); req.setParams(params); ContentStreamBase content = new ContentStreamBase.StringStream( new String(zkClient.getData(adminFile, null, null, true), "UTF-8")); content.setContentType(req.getParams().get(USE_CONTENT_TYPE)); rsp.add(RawResponseWriter.CONTENT, content); } rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromFileSystem(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { File adminFile = null; final SolrResourceLoader loader = req.getCore().getResourceLoader(); File configdir = new File( loader.getConfigDir() ); if (!configdir.exists()) { // TODO: maybe we should just open it this way to start with? try { configdir = new File( loader.getClassLoader().getResource(loader.getConfigDir()).toURI() ); } catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); } } String fname = req.getParams().get("file", null); if( fname == null ) { adminFile = configdir; } else { fname = fname.replace( '\\', '/' ); // normalize slashes if( hiddenFiles.contains( fname.toUpperCase(Locale.ENGLISH) ) ) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access: "+fname ); } if( fname.indexOf( ".." ) >= 0 ) { throw new SolrException( ErrorCode.FORBIDDEN, "Invalid path: "+fname ); } adminFile = new File( configdir, fname ); } // Make sure the file exists, is readable and is not a hidden file if( !adminFile.exists() ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Can not find: "+adminFile.getName() + " ["+adminFile.getAbsolutePath()+"]" ); } if( !adminFile.canRead() || adminFile.isHidden() ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Can not show: "+adminFile.getName() + " ["+adminFile.getAbsolutePath()+"]" ); } // Show a directory listing if( adminFile.isDirectory() ) { int basePath = configdir.getAbsolutePath().length() + 1; NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>(); for( File f : adminFile.listFiles() ) { String path = f.getAbsolutePath().substring( basePath ); path = path.replace( '\\', '/' ); // normalize slashes if( hiddenFiles.contains( path.toUpperCase(Locale.ENGLISH) ) ) { continue; // don't show 'hidden' files } if( f.isHidden() || f.getName().startsWith( "." ) ) { continue; // skip hidden system files... } SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>(); files.add( path, fileInfo ); if( f.isDirectory() ) { fileInfo.add( "directory", true ); } else { // TODO? content type fileInfo.add( "size", f.length() ); } fileInfo.add( "modified", new Date( f.lastModified() ) ); } rsp.add( "files", files ); } else { // Include the file contents //The file logic depends on RawResponseWriter, so force its use. ModifiableSolrParams params = new ModifiableSolrParams( req.getParams() ); params.set( CommonParams.WT, "raw" ); req.setParams(params); ContentStreamBase content = new ContentStreamBase.FileStream( adminFile ); content.setContentType( req.getParams().get( USE_CONTENT_TYPE ) ); rsp.add(RawResponseWriter.CONTENT, content); } rsp.setHttpCaching(false); }
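Both ShowFileRequestHandler variants validate the user-supplied file name and fail fast with FORBIDDEN before touching ZooKeeper or the filesystem. A condensed sketch of those checks (PathGuard is a hypothetical name):

import java.util.Locale;
import java.util.Set;
import org.apache.solr.common.SolrException;
import org.apache.solr.common.SolrException.ErrorCode;

class PathGuard {
  static String normalizeAndCheck(String fname, Set<String> hiddenFiles) {
    fname = fname.replace('\\', '/');  // normalize slashes
    if (hiddenFiles.contains(fname.toUpperCase(Locale.ENGLISH))) {
      throw new SolrException(ErrorCode.FORBIDDEN, "Can not access: " + fname);
    }
    if (fname.indexOf("..") >= 0) {  // block directory traversal
      throw new SolrException(ErrorCode.FORBIDDEN, "Invalid path: " + fname);
    }
    return fname;
  }
}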
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
public static ShowStyle get(String v) { if(v==null) return null; if("schema".equalsIgnoreCase(v)) return SCHEMA; if("index".equalsIgnoreCase(v)) return INDEX; if("doc".equalsIgnoreCase(v)) return DOC; if("all".equalsIgnoreCase(v)) return ALL; throw new SolrException(ErrorCode.BAD_REQUEST, "Unknown Show Style: "+v); }
// in core/src/java/org/apache/solr/handler/admin/LukeRequestHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { IndexSchema schema = req.getSchema(); SolrIndexSearcher searcher = req.getSearcher(); DirectoryReader reader = searcher.getIndexReader(); SolrParams params = req.getParams(); ShowStyle style = ShowStyle.get(params.get("show")); // If no doc is given, show all fields and top terms rsp.add("index", getIndexInfo(reader)); if(ShowStyle.INDEX==style) { return; // that's all we need } Integer docId = params.getInt( DOC_ID ); if( docId == null && params.get( ID ) != null ) { // Look for something with a given solr ID SchemaField uniqueKey = schema.getUniqueKeyField(); String v = uniqueKey.getType().toInternal( params.get(ID) ); Term t = new Term( uniqueKey.getName(), v ); docId = searcher.getFirstMatch( t ); if( docId < 0 ) { throw new SolrException( SolrException.ErrorCode.NOT_FOUND, "Can't find document: "+params.get( ID ) ); } } // Read the document from the index if( docId != null ) { if( style != null && style != ShowStyle.DOC ) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing doc param for doc style"); } Document doc = null; try { doc = reader.document( docId ); } catch( Exception ex ) {} if( doc == null ) { throw new SolrException( SolrException.ErrorCode.NOT_FOUND, "Can't find document: "+docId ); } SimpleOrderedMap<Object> info = getDocumentFieldsInfo( doc, docId, reader, schema ); SimpleOrderedMap<Object> docinfo = new SimpleOrderedMap<Object>(); docinfo.add( "docId", docId ); docinfo.add( "lucene", info ); docinfo.add( "solr", doc ); rsp.add( "doc", docinfo ); } else if ( ShowStyle.SCHEMA == style ) { rsp.add( "schema", getSchemaInfo( req.getSchema() ) ); } else { rsp.add( "fields", getIndexedFieldsInfo(req) ) ; } // Add some generally helpful information NamedList<Object> info = new SimpleOrderedMap<Object>(); info.add( "key", getFieldFlagsKey() ); info.add( "NOTE", "Document Frequency (df) is not updated when a document is marked for deletion. df values include deleted documents." ); rsp.add( "info", info ); rsp.setHttpCaching(false); }
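Note the deliberately empty catch block around reader.document(docId): the failure is converted into a null result, which is then reported uniformly as NOT_FOUND. A sketch of that swallow-then-check shape (DocReader and fetchOr404 are hypothetical):

import org.apache.solr.common.SolrException;

class DocLookup {
  interface DocReader { Object document(int docId) throws Exception; }

  static Object fetchOr404(DocReader reader, int docId) {
    Object doc = null;
    try {
      doc = reader.document(docId);
    } catch (Exception ex) { /* deliberately ignored; handled via the null check below */ }
    if (doc == null) {
      throw new SolrException(SolrException.ErrorCode.NOT_FOUND,
          "Can't find document: " + docId);
    }
    return doc;
  }
}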
// in core/src/java/org/apache/solr/handler/admin/AdminHandlers.java
public void inform(SolrCore core) { String path = null; for( Map.Entry<String, SolrRequestHandler> entry : core.getRequestHandlers().entrySet() ) { if( entry.getValue() == this ) { path = entry.getKey(); break; } } if( path == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The AdminHandler is not registered with the current core." ); } if( !path.startsWith( "/" ) ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The AdminHandler needs to be registered to a path. Typically this is '/admin'" ); } // Remove the parent handler core.registerRequestHandler(path, null); if( !path.endsWith( "/" ) ) { path += "/"; } StandardHandler[] list = new StandardHandler[] { new StandardHandler( "luke", new LukeRequestHandler() ), new StandardHandler( "system", new SystemInfoHandler() ), new StandardHandler( "mbeans", new SolrInfoMBeanHandler() ), new StandardHandler( "plugins", new PluginInfoHandler() ), new StandardHandler( "threads", new ThreadDumpHandler() ), new StandardHandler( "properties", new PropertiesRequestHandler() ), new StandardHandler( "logging", new LoggingHandler() ), new StandardHandler( "file", new ShowFileRequestHandler() ) }; for( StandardHandler handler : list ) { if( core.getRequestHandler( path+handler.name ) == null ) { handler.handler.init( initArgs ); core.registerRequestHandler( path+handler.name, handler.handler ); if( handler.handler instanceof SolrCoreAware ) { ((SolrCoreAware)handler.handler).inform(core); } } } }
// in core/src/java/org/apache/solr/handler/admin/AdminHandlers.java
public void handleRequest(SolrQueryRequest req, SolrQueryResponse rsp) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The AdminHandler should never be called directly" ); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { NamedList<NamedList<NamedList<Object>>> cats = getMBeanInfo(req); if(req.getParams().getBool("diff", false)) { ContentStream body = null; try { body = req.getContentStreams().iterator().next(); } catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); } String content = IOUtils.toString(body.getReader()); NamedList<NamedList<NamedList<Object>>> ref = fromXML(content); // Normalize the output SolrQueryResponse wrap = new SolrQueryResponse(); wrap.add("solr-mbeans", cats); cats = (NamedList<NamedList<NamedList<Object>>>) BinaryResponseWriter.getParsedResponse(req, wrap).get("solr-mbeans"); // Get rid of irrelevant things ref = normalize(ref); cats = normalize(cats); // Only the changes boolean showAll = req.getParams().getBool("all", false); rsp.add("solr-mbeans", getDiff(ref,cats, showAll)); } else { rsp.add("solr-mbeans", cats); } rsp.setHttpCaching(false); // never cache, no matter what init config looks like }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
static NamedList<NamedList<NamedList<Object>>> fromXML(String content) { int idx = content.indexOf("<response>"); if(idx<0) { throw new SolrException(ErrorCode.BAD_REQUEST, "Body does not appear to be an XML response"); } try { XMLResponseParser parser = new XMLResponseParser(); return (NamedList<NamedList<NamedList<Object>>>) parser.processResponse(new StringReader(content.substring(idx))).get("solr-mbeans"); } catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unable to read original XML", ex); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
@Override final public void init(NamedList args) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "CoreAdminHandler should not be configured in solrconf.xml\n" + "it is a special Handler configured directly by the RequestDispatcher"); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { // Make sure the cores is enabled CoreContainer cores = getCoreContainer(); if (cores == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core container instance missing"); } boolean doPersist = false; // Pick the action SolrParams params = req.getParams(); CoreAdminAction action = CoreAdminAction.STATUS; String a = params.get(CoreAdminParams.ACTION); if (a != null) { action = CoreAdminAction.get(a); if (action == null) { doPersist = this.handleCustomAction(req, rsp); } } if (action != null) { switch (action) { case CREATE: { doPersist = this.handleCreateAction(req, rsp); break; } case RENAME: { doPersist = this.handleRenameAction(req, rsp); break; } case UNLOAD: { doPersist = this.handleUnloadAction(req, rsp); break; } case STATUS: { doPersist = this.handleStatusAction(req, rsp); break; } case PERSIST: { doPersist = this.handlePersistAction(req, rsp); break; } case RELOAD: { doPersist = this.handleReloadAction(req, rsp); break; } case SWAP: { doPersist = this.handleSwapAction(req, rsp); break; } case MERGEINDEXES: { doPersist = this.handleMergeAction(req, rsp); break; } case PREPRECOVERY: { this.handleWaitForStateAction(req, rsp); break; } case REQUESTRECOVERY: { this.handleRequestRecoveryAction(req, rsp); break; } case DISTRIBURL: { this.handleDistribUrlAction(req, rsp); break; } default: { doPersist = this.handleCustomAction(req, rsp); break; } case LOAD: break; } } // Should we persist the changes? if (doPersist) { cores.persist(); rsp.add("saved", cores.getConfigFile().getAbsolutePath()); } rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleMergeAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException { SolrParams params = req.getParams(); String cname = params.required().get(CoreAdminParams.CORE); SolrCore core = coreContainer.getCore(cname); SolrQueryRequest wrappedReq = null; SolrCore[] sourceCores = null; RefCounted<SolrIndexSearcher>[] searchers = null; // stores readers created from indexDir param values DirectoryReader[] readersToBeClosed = null; Directory[] dirsToBeReleased = null; if (core != null) { try { String[] dirNames = params.getParams(CoreAdminParams.INDEX_DIR); if (dirNames == null || dirNames.length == 0) { String[] sources = params.getParams("srcCore"); if (sources == null || sources.length == 0) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "At least one indexDir or srcCore must be specified"); sourceCores = new SolrCore[sources.length]; for (int i = 0; i < sources.length; i++) { String source = sources[i]; SolrCore srcCore = coreContainer.getCore(source); if (srcCore == null) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core: " + source + " does not exist"); sourceCores[i] = srcCore; } } else { readersToBeClosed = new DirectoryReader[dirNames.length]; dirsToBeReleased = new Directory[dirNames.length]; DirectoryFactory dirFactory = core.getDirectoryFactory(); for (int i = 0; i < dirNames.length; i++) { Directory dir = dirFactory.get(dirNames[i], core.getSolrConfig().indexConfig.lockType); dirsToBeReleased[i] = dir; // TODO: why doesn't this use the IR factory? what is going on here? readersToBeClosed[i] = DirectoryReader.open(dir); } } DirectoryReader[] readers = null; if (readersToBeClosed != null) { readers = readersToBeClosed; } else { readers = new DirectoryReader[sourceCores.length]; searchers = new RefCounted[sourceCores.length]; for (int i = 0; i < sourceCores.length; i++) { SolrCore solrCore = sourceCores[i]; // record the searchers so that we can decref searchers[i] = solrCore.getSearcher(); readers[i] = searchers[i].get().getIndexReader(); } } UpdateRequestProcessorChain processorChain = core.getUpdateProcessingChain(params.get(UpdateParams.UPDATE_CHAIN)); wrappedReq = new LocalSolrQueryRequest(core, req.getParams()); UpdateRequestProcessor processor = processorChain.createProcessor(wrappedReq, rsp); processor.processMergeIndexes(new MergeIndexesCommand(readers, req)); } finally { if (searchers != null) { for (RefCounted<SolrIndexSearcher> searcher : searchers) { if (searcher != null) searcher.decref(); } } if (sourceCores != null) { for (SolrCore solrCore : sourceCores) { if (solrCore != null) solrCore.close(); } } if (readersToBeClosed != null) IOUtils.closeWhileHandlingException(readersToBeClosed); if (dirsToBeReleased != null) { for (Directory dir : dirsToBeReleased) { DirectoryFactory dirFactory = core.getDirectoryFactory(); dirFactory.release(dir); } } if (wrappedReq != null) wrappedReq.close(); core.close(); } } return coreContainer.isPersistent(); }
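The finally block above releases every acquired resource individually, each guarded by a null check, so an exception during setup cannot leak the resources that were already obtained. A condensed sketch of the shape (Resource and mergeWith are illustrative):

class CleanupShape {
  interface Resource { void close(); }

  static void mergeWith(Resource[] resources, Runnable mergeWork) {
    try {
      mergeWork.run();  // may throw at any point
    } finally {
      if (resources != null) {
        for (Resource r : resources) {
          if (r != null) r.close();  // each release is independent of the others
        }
      }
    }
  }
}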
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleCustomAction(SolrQueryRequest req, SolrQueryResponse rsp) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unsupported operation: " + req.getParams().get(CoreAdminParams.ACTION)); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleCreateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { try { SolrParams params = req.getParams(); String name = params.get(CoreAdminParams.NAME); //for now, do not allow creating new core with same name when in cloud mode //XXX perhaps it should just be unregistered from cloud before readding it?, //XXX perhaps we should also check that cores are of same type before adding new core to collection? if (coreContainer.getZkController() != null) { if (coreContainer.getCore(name) != null) { log.info("Re-creating a core with existing name is not allowed in cloud mode"); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Core with name '" + name + "' already exists."); } } String instanceDir = params.get(CoreAdminParams.INSTANCE_DIR); if (instanceDir == null) { // instanceDir = coreContainer.getSolrHome() + "/" + name; instanceDir = name; // bare name is already relative to solr home } CoreDescriptor dcore = new CoreDescriptor(coreContainer, name, instanceDir); // fillup optional parameters String opts = params.get(CoreAdminParams.CONFIG); if (opts != null) dcore.setConfigName(opts); opts = params.get(CoreAdminParams.SCHEMA); if (opts != null) dcore.setSchemaName(opts); opts = params.get(CoreAdminParams.DATA_DIR); if (opts != null) dcore.setDataDir(opts); CloudDescriptor cd = dcore.getCloudDescriptor(); if (cd != null) { cd.setParams(req.getParams()); opts = params.get(CoreAdminParams.COLLECTION); if (opts != null) cd.setCollectionName(opts); opts = params.get(CoreAdminParams.SHARD); if (opts != null) cd.setShardId(opts); opts = params.get(CoreAdminParams.ROLES); if (opts != null) cd.setRoles(opts); Integer numShards = params.getInt(ZkStateReader.NUM_SHARDS_PROP); if (numShards != null) cd.setNumShards(numShards); } // Process all property.name=value parameters and set them as name=value core properties Properties coreProperties = new Properties(); Iterator<String> parameterNamesIterator = params.getParameterNamesIterator(); while (parameterNamesIterator.hasNext()) { String parameterName = parameterNamesIterator.next(); if(parameterName.startsWith(CoreAdminParams.PROPERTY_PREFIX)) { String parameterValue = params.get(parameterName); String propertyName = parameterName.substring(CoreAdminParams.PROPERTY_PREFIX.length()); // skip prefix coreProperties.put(propertyName, parameterValue); } } dcore.setCoreProperties(coreProperties); SolrCore core = coreContainer.create(dcore); coreContainer.register(name, core, false); rsp.add("core", core.getName()); return coreContainer.isPersistent(); } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleUnloadAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); SolrCore core = coreContainer.remove(cname); if(core == null){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core exists '" + cname + "'"); } else { if (coreContainer.getZkController() != null) { log.info("Unregistering core " + cname + " from cloudstate."); try { coreContainer.getZkController().unregister(cname, core.getCoreDescriptor().getCloudDescriptor()); } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); } catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); } } } if (params.getBool(CoreAdminParams.DELETE_INDEX, false)) { core.addCloseHook(new CloseHook() { @Override public void preClose(SolrCore core) {} @Override public void postClose(SolrCore core) { File dataDir = new File(core.getIndexDir()); File[] files = dataDir.listFiles(); if (files != null) { for (File file : files) { if (!file.delete()) { log.error(file.getAbsolutePath() + " could not be deleted on core unload"); } } if (!dataDir.delete()) log.error(dataDir.getAbsolutePath() + " could not be deleted on core unload"); } else { log.error(dataDir.getAbsolutePath() + " could not be deleted on core unload"); } } }); }
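In the unregister path above, the InterruptedException branch first restores the thread's interrupt flag and only then wraps the exception as SERVER_ERROR, so callers can still observe the interrupt. A minimal sketch of that idiom (ZkCall is a hypothetical interface):

import org.apache.solr.common.SolrException;

class InterruptHandling {
  interface ZkCall { void run() throws InterruptedException; }

  static void call(ZkCall zk, String cname) {
    try {
      zk.run();
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();  // restore the flag before wrapping
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Could not unregister core " + cname + ": " + e.getMessage(), e);
    }
  }
}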
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleStatusAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); boolean doPersist = false; NamedList<Object> status = new SimpleOrderedMap<Object>(); try { if (cname == null) { rsp.add("defaultCoreName", coreContainer.getDefaultCoreName()); for (String name : coreContainer.getCoreNames()) { status.add(name, getCoreStatus(coreContainer, name)); } } else { status.add(cname, getCoreStatus(coreContainer, cname)); } rsp.add("status", status); doPersist = false; // no state change return doPersist; } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handlePersistAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException { SolrParams params = req.getParams(); boolean doPersist = false; String fileName = params.get(CoreAdminParams.FILE); if (fileName != null) { File file = new File(coreContainer.getConfigFile().getParentFile(), fileName); coreContainer.persistFile(file); rsp.add("saved", file.getAbsolutePath()); doPersist = false; } else if (!coreContainer.isPersistent()) { throw new SolrException(SolrException.ErrorCode.FORBIDDEN, "Persistence is not enabled"); } else doPersist = true; return doPersist; }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleReloadAction(SolrQueryRequest req, SolrQueryResponse rsp) { SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); try { coreContainer.reload(cname); return false; // no change on reload } catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'reload' action", ex); } }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleWaitForStateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException { final SolrParams params = req.getParams(); String cname = params.get(CoreAdminParams.CORE); if (cname == null) { cname = ""; } String nodeName = params.get("nodeName"); String coreNodeName = params.get("coreNodeName"); String waitForState = params.get("state"); Boolean checkLive = params.getBool("checkLive"); int pauseFor = params.getInt("pauseFor", 0); String state = null; boolean live = false; int retry = 0; while (true) { SolrCore core = null; try { core = coreContainer.getCore(cname); if (core == null && retry == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "core not found:" + cname); } if (core != null) { // wait until we are sure the recovering node is ready // to accept updates CloudDescriptor cloudDescriptor = core.getCoreDescriptor() .getCloudDescriptor(); CloudState cloudState = coreContainer.getZkController() .getCloudState(); String collection = cloudDescriptor.getCollectionName(); Slice slice = cloudState.getSlice(collection, cloudDescriptor.getShardId()); if (slice != null) { ZkNodeProps nodeProps = slice.getShards().get(coreNodeName); if (nodeProps != null) { state = nodeProps.get(ZkStateReader.STATE_PROP); live = cloudState.liveNodesContain(nodeName); if (nodeProps != null && state.equals(waitForState)) { if (checkLive == null) { break; } else if (checkLive && live) { break; } else if (!checkLive && !live) { break; } } } } } if (retry++ == 30) { throw new SolrException(ErrorCode.BAD_REQUEST, "I was asked to wait on state " + waitForState + " for " + nodeName + " but I still do not see the request state. I see state: " + state + " live:" + live); } } finally { if (core != null) { core.close(); } } Thread.sleep(1000); } // small safety net for any updates that started with state that // kept it from sending the update to be buffered - // pause for a while to let any outstanding updates finish // System.out.println("I saw state:" + state + " sleep for " + pauseFor + // " live:" + live); Thread.sleep(pauseFor); // solrcloud_debug // try {; // LocalSolrQueryRequest r = new LocalSolrQueryRequest(core, new // ModifiableSolrParams()); // CommitUpdateCommand commitCmd = new CommitUpdateCommand(r, false); // commitCmd.softCommit = true; // core.getUpdateHandler().commit(commitCmd); // RefCounted<SolrIndexSearcher> searchHolder = // core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() // + " to replicate " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " gen:" + // core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + // core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } }
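The wait loop above bounds its retries: it polls roughly once per second and, after 30 attempts, converts the timeout into a BAD_REQUEST instead of blocking forever. A reduced sketch (Check and waitUntilReady are illustrative):

import org.apache.solr.common.SolrException;

class BoundedRetry {
  interface Check { boolean ready(); }

  static void waitUntilReady(Check check, String description) throws InterruptedException {
    int retry = 0;
    while (!check.ready()) {
      if (retry++ == 30) {
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Gave up waiting for: " + description);
      }
      Thread.sleep(1000);  // poll interval, as in the handler above
    }
  }
}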
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { // Don't do anything if the framework is unknown if(watcher==null) { rsp.add("error", "Logging Not Initalized"); return; } rsp.add("watcher", watcher.getName()); SolrParams params = req.getParams(); if(params.get("threshold")!=null) { watcher.setThreshold(params.get("threshold")); } // Write something at each level if(params.get("test")!=null) { log.trace("trace message"); log.debug( "debug message"); log.info("info (with exception)", new RuntimeException("test") ); log.warn("warn (with exception)", new RuntimeException("test") ); log.error("error (with exception)", new RuntimeException("test") ); } String[] set = params.getParams("set"); if (set != null) { for (String pair : set) { String[] split = pair.split(":"); if (split.length != 2) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Invalid format, expected level:value, got " + pair); } String category = split[0]; String level = split[1]; watcher.setLogLevel(category, level); } } String since = req.getParams().get("since"); if(since != null) { long time = -1; try { time = Long.parseLong(since); } catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); } AtomicBoolean found = new AtomicBoolean(false); SolrDocumentList docs = watcher.getHistory(time, found); if(docs==null) { rsp.add("error", "History not enabled"); return; } else { SimpleOrderedMap<Object> info = new SimpleOrderedMap<Object>(); if(time>0) { info.add("since", time); info.add("found", found); } else { info.add("levels", watcher.getAllLevels()); // show for the first request } info.add("last", watcher.getLastEvent()); info.add("buffer", watcher.getHistorySize()); info.add("threshold", watcher.getThreshold()); rsp.add("info", info); rsp.add("history", docs); } } else { rsp.add("levels", watcher.getAllLevels()); List<LoggerInfo> loggers = new ArrayList<LoggerInfo>(watcher.getAllLoggers()); Collections.sort(loggers); List<SimpleOrderedMap<?>> info = new ArrayList<SimpleOrderedMap<?>>(); for(LoggerInfo wrap:loggers) { info.add(wrap.getInfo()); } rsp.add("loggers", info); } rsp.setHttpCaching(false); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
@Override public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); // in this case, we want to default distrib to false so // we only ping the single node Boolean distrib = params.getBool("distrib"); if (distrib == null) { ModifiableSolrParams mparams = new ModifiableSolrParams(params); mparams.set("distrib", false); req.setParams(mparams); } String actionParam = params.get("action"); ACTIONS action = null; if (actionParam == null){ action = ACTIONS.PING; } else { try { action = ACTIONS.valueOf(actionParam.toUpperCase(Locale.ENGLISH)); } catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); } } switch(action){ case PING: if( isPingDisabled() ) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "Service disabled"); } handlePing(req, rsp); break; case ENABLE: handleEnable(true); break; case DISABLE: handleEnable(false); break; case STATUS: if( healthcheck == null ){ SolrException e = new SolrException (SolrException.ErrorCode.SERVICE_UNAVAILABLE, "healthcheck not configured"); rsp.setException(e); } else { rsp.add( "status", isPingDisabled() ? "disabled" : "enabled" ); } } }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
protected void handlePing(SolrQueryRequest req, SolrQueryResponse rsp) throws Exception { SolrParams params = req.getParams(); SolrCore core = req.getCore(); // Get the RequestHandler String qt = params.get( CommonParams.QT );//optional; you get the default otherwise SolrRequestHandler handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown RequestHandler (qt): "+qt ); } if( handler instanceof PingRequestHandler ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cannot execute the PingRequestHandler recursively" ); } // Execute the ping query and catch any possible exception Throwable ex = null; try { SolrQueryResponse pingrsp = new SolrQueryResponse(); core.execute(handler, req, pingrsp ); ex = pingrsp.getException(); } catch( Throwable th ) { ex = th; } // Send an error or an 'OK' message (response code will be 200) if( ex != null ) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Ping query caused exception: "+ex.getMessage(), ex ); } rsp.add( "status", "OK" ); }
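handlePing catches Throwable, not just Exception, so any failure of the ping query, including a response-level exception, is captured and surfaced as a single SERVER_ERROR. A sketch of that capture-everything shape (PingShape is hypothetical):

import org.apache.solr.common.SolrException;

class PingShape {
  static void ping(Runnable query) {
    Throwable ex = null;
    try {
      query.run();
    } catch (Throwable th) {
      ex = th;  // record any failure, even Errors
    }
    if (ex != null) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Ping query caused exception: " + ex.getMessage(), ex);
    }
  }
}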
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
protected void handleEnable(boolean enable) throws SolrException { if (healthcheck == null) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "No healthcheck file defined."); } if ( enable ) { try { // write out when the file was created FileUtils.write(healthcheck, DateField.formatExternal(new Date()), "UTF-8"); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); } } else { if (healthcheck.exists() && !healthcheck.delete()){ throw new SolrException(SolrException.ErrorCode.NOT_FOUND, "Did not successfully delete healthcheck file: " +healthcheck.getAbsolutePath()); } } }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
@Override public void reset(Reader input) throws IOException { try { super.reset(input); input = super.input; char[] buf = new char[32]; int len = input.read(buf); this.startOfs = correctOffset(0); this.endOfs = correctOffset(len); String v = new String(buf, 0, len); try { switch (type) { case INTEGER: ts.setIntValue(Integer.parseInt(v)); break; case FLOAT: ts.setFloatValue(Float.parseFloat(v)); break; case LONG: ts.setLongValue(Long.parseLong(v)); break; case DOUBLE: ts.setDoubleValue(Double.parseDouble(v)); break; case DATE: ts.setLongValue(dateField.parseMath(null, v).getTime()); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field"); } } catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); } }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { String path = request.getPath(); if( path == null || !path.startsWith( "/" ) ) { path = "/select"; } // Check for cores action SolrCore core = coreContainer.getCore( coreName ); if( core == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName ); } SolrParams params = request.getParams(); if( params == null ) { params = new ModifiableSolrParams(); } // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } // Perhaps the path is to manage the cores if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) { handler = coreContainer.getMultiCoreHandler(); } } if( handler == null ) { core.close(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } SolrQueryRequest req = null; try { req = _parser.buildRequestFrom( core, params, request.getContentStreams() ); req.getContext().put( "path", path ); SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); core.execute( handler, req, rsp ); if( rsp.getException() != null ) { if(rsp.getException() instanceof SolrException) { throw rsp.getException(); } throw new SolrServerException( rsp.getException() ); } // Check if this should stream results if( request.getStreamingResponseCallback() != null ) { try { final StreamingResponseCallback callback = request.getStreamingResponseCallback(); BinaryResponseWriter.Resolver resolver = new BinaryResponseWriter.Resolver( req, rsp.getReturnFields()) { @Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { // write an empty list... SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); // This will transform writeResultsBody( ctx, codec ); } }; ByteArrayOutputStream out = new ByteArrayOutputStream(); new JavaBinCodec(resolver) { @Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); } @Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); } }.marshal(rsp.getValues(), out); InputStream in = new ByteArrayInputStream(out.toByteArray()); return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in); } catch (Exception ex) { throw new RuntimeException(ex); } } // Now write it out NamedList<Object> normalized = getParsedResponse(req, rsp); return normalized; } catch( IOException iox ) { throw iox; } catch( SolrException sx ) { throw sx; } catch( Exception ex ) { throw new SolrServerException( ex ); } finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); } }
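The request method above ends in a catch ladder that acts as a translation boundary: IOException and the domain SolrException propagate unchanged, while anything else is wrapped in the client-facing SolrServerException. A minimal sketch (Call and execute are illustrative):

import java.io.IOException;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.common.SolrException;

class TranslationLadder {
  interface Call<T> { T run() throws Exception; }

  static <T> T execute(Call<T> call) throws IOException, SolrServerException {
    try {
      return call.run();
    } catch (IOException iox) {
      throw iox;                          // I/O errors propagate as-is
    } catch (SolrException sx) {
      throw sx;                           // domain exceptions propagate as-is
    } catch (Exception ex) {
      throw new SolrServerException(ex);  // everything else is wrapped for the client
    }
  }
}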
// in core/src/java/org/apache/solr/response/CSVResponseWriter.java
public void writeResponse() throws IOException { SolrParams params = req.getParams(); strategy = new CSVStrategy(',', '"', CSVStrategy.COMMENTS_DISABLED, CSVStrategy.ESCAPE_DISABLED, false, false, false, true); CSVStrategy strat = strategy; String sep = params.get(CSV_SEPARATOR); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } String nl = params.get(CSV_NEWLINE); if (nl!=null) { if (nl.length()==0) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid newline:'"+nl+"'"); strat.setPrinterNewline(nl); } String encapsulator = params.get(CSV_ENCAPSULATOR); String escape = params.get(CSV_ESCAPE); if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); } if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); if (encapsulator == null) { strat.setEncapsulator( CSVStrategy.ENCAPSULATOR_DISABLED); } } if (strat.getEscape() == '\\') { // If the escape is the standard backslash, then also enable // unicode escapes (it's harmless since 'u' would not otherwise // be escaped. strat.setUnicodeEscapeInterpretation(true); } printer = new CSVPrinter(writer, strategy); CSVStrategy mvStrategy = new CSVStrategy(strategy.getDelimiter(), CSVStrategy.ENCAPSULATOR_DISABLED, CSVStrategy.COMMENTS_DISABLED, '\\', false, false, false, false); strat = mvStrategy; sep = params.get(MV_SEPARATOR); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } encapsulator = params.get(MV_ENCAPSULATOR); escape = params.get(MV_ESCAPE); if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); if (escape == null) { strat.setEscape(CSVStrategy.ESCAPE_DISABLED); } } escape = params.get(MV_ESCAPE); if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); // encapsulator will already be disabled if it wasn't specified } Collection<String> fields = returnFields.getLuceneFieldNames(); Object responseObj = rsp.getValues().get("response"); boolean returnOnlyStored = false; if (fields==null) { if (responseObj instanceof SolrDocumentList) { // get the list of fields from the SolrDocumentList fields = new LinkedHashSet<String>(); for (SolrDocument sdoc: (SolrDocumentList)responseObj) { fields.addAll(sdoc.getFieldNames()); } } else { // get the list of fields from the index fields = req.getSearcher().getFieldNames(); } if (returnFields.wantsScore()) { fields.add("score"); } else { fields.remove("score"); } returnOnlyStored = true; } CSVSharedBufPrinter csvPrinterMV = new CSVSharedBufPrinter(mvWriter, mvStrategy); for (String field : fields) { if (!returnFields.wantsField(field)) { continue; } if (field.equals("score")) { CSVField csvField = new CSVField(); csvField.name = "score"; csvFields.put("score", csvField); continue; } SchemaField sf = schema.getFieldOrNull(field); if (sf == null) { FieldType ft = new StrField(); sf = new SchemaField(field, ft); } // Return only stored fields, unless an explicit field list is specified if (returnOnlyStored && sf != null && !sf.stored()) { continue; } // check for per-field overrides sep = params.get("f." + field + '.' + CSV_SEPARATOR); encapsulator = params.get("f." + field + '.' + CSV_ENCAPSULATOR); escape = params.get("f." + field + '.' + CSV_ESCAPE); CSVSharedBufPrinter csvPrinter = csvPrinterMV; if (sep != null || encapsulator != null || escape != null) { // create a new strategy + printer if there were any per-field overrides strat = (CSVStrategy)mvStrategy.clone(); if (sep!=null) { if (sep.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv separator:'"+sep+"'"); strat.setDelimiter(sep.charAt(0)); } if (encapsulator!=null) { if (encapsulator.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv encapsulator:'"+encapsulator+"'"); strat.setEncapsulator(encapsulator.charAt(0)); if (escape == null) { strat.setEscape(CSVStrategy.ESCAPE_DISABLED); } } if (escape!=null) { if (escape.length()!=1) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid mv escape:'"+escape+"'"); strat.setEscape(escape.charAt(0)); if (encapsulator == null) { strat.setEncapsulator(CSVStrategy.ENCAPSULATOR_DISABLED); } } csvPrinter = new CSVSharedBufPrinter(mvWriter, strat); } CSVField csvField = new CSVField(); csvField.name = field; csvField.sf = sf; csvField.mvPrinter = csvPrinter; csvFields.put(field, csvField); } NullValue = params.get(CSV_NULL, ""); if (params.getBool(CSV_HEADER, true)) { for (CSVField csvField : csvFields.values()) { printer.print(csvField.name); } printer.println(); } if (responseObj instanceof ResultContext ) { writeDocuments(null, (ResultContext)responseObj, returnFields ); } else if (responseObj instanceof DocList) { ResultContext ctx = new ResultContext(); ctx.docs = (DocList)responseObj; writeDocuments(null, ctx, returnFields ); } else if (responseObj instanceof SolrDocumentList) { writeSolrDocumentList(null, (SolrDocumentList)responseObj, returnFields ); } }
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
public static Object getObjectFrom( String val, String type ) { if( type != null ) { try { if( "int".equals( type ) ) return Integer.valueOf( val ); if( "double".equals( type ) ) return Double.valueOf( val ); if( "float".equals( type ) ) return Float.valueOf( val ); if( "date".equals( type ) ) return DateUtil.parseDate(val); } catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unable to parse "+type+"="+val, ex ); } } return val; }
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
@Override public DocTransformer create(String field, SolrParams params, SolrQueryRequest req) { Object val = value; if( val == null ) { String v = params.get("v"); if( v == null ) { val = defaultValue; } else { val = getObjectFrom(v, params.get("t")); } if( val == null ) { throw new SolrException( ErrorCode.BAD_REQUEST, "ValueAugmenter is missing a value -- should be defined in solrconfig or inline" ); } } return new ValueAugmenter( field, val ); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
@Override public void setContext( TransformContext context ) { try { IndexReader reader = qparser.getReq().getSearcher().getIndexReader(); readerContexts = reader.getTopReaderContext().leaves(); docValuesArr = new FunctionValues[readerContexts.length]; searcher = qparser.getReq().getSearcher(); fcontext = ValueSource.newContext(searcher); this.valueSource.createWeight(fcontext, searcher); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
@Override public void transform(SolrDocument doc, int docid) { // This is only good for random-access functions try { // TODO: calculate this stuff just once across diff functions int idx = ReaderUtil.subIndex(docid, readerContexts); AtomicReaderContext rcontext = readerContexts[idx]; FunctionValues values = docValuesArr[idx]; if (values == null) { docValuesArr[idx] = values = valueSource.getValues(fcontext, rcontext); } int localId = docid - rcontext.docBase; Object val = values.objectVal(localId); if (val != null) { doc.setField( name, val ); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "exception at docid " + docid + " for valuesource " + valueSource, e); } }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
public static Style getStyle( String str ) { try { return Style.valueOf( str ); } catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unknown Explain Style: "+str ); } }
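getStyle is the valueOf-and-translate idiom: an invalid enum constant becomes a BAD_REQUEST. A sketch, extended, as an assumption, to list the accepted values in the message (EnumParam and its Style constants are hypothetical):

import java.util.Arrays;
import org.apache.solr.common.SolrException;
import org.apache.solr.common.SolrException.ErrorCode;

class EnumParam {
  enum Style { NL, TEXT, HTML }

  static Style parseStyle(String str) {
    try {
      return Style.valueOf(str);
    } catch (Exception ex) {
      throw new SolrException(ErrorCode.BAD_REQUEST, "Unknown Explain Style: " + str
          + " (expected one of " + Arrays.toString(Style.values()) + ")");
    }
  }
}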
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
NamedList<Integer> getFacetCounts(Executor executor) throws IOException { CompletionService<SegFacet> completionService = new ExecutorCompletionService<SegFacet>(executor); // reuse the translation logic to go from top level set to per-segment set baseSet = docs.getTopFilter(); final AtomicReaderContext[] leaves = searcher.getTopReaderContext().leaves(); // The list of pending tasks that aren't immediately submitted // TODO: Is there a completion service, or a delegating executor that can // limit the number of concurrent tasks submitted to a bigger executor? LinkedList<Callable<SegFacet>> pending = new LinkedList<Callable<SegFacet>>(); int threads = nThreads <= 0 ? Integer.MAX_VALUE : nThreads; for (int i=0; i<leaves.length; i++) { final SegFacet segFacet = new SegFacet(leaves[i]); Callable<SegFacet> task = new Callable<SegFacet>() { public SegFacet call() throws Exception { segFacet.countTerms(); return segFacet; } }; // TODO: if limiting threads, submit by largest segment first? if (--threads >= 0) { completionService.submit(task); } else { pending.add(task); } } // now merge the per-segment results PriorityQueue<SegFacet> queue = new PriorityQueue<SegFacet>(leaves.length) { @Override protected boolean lessThan(SegFacet a, SegFacet b) { return a.tempBR.compareTo(b.tempBR) < 0; } }; boolean hasMissingCount=false; int missingCount=0; for (int i=0; i<leaves.length; i++) { SegFacet seg = null; try { Future<SegFacet> future = completionService.take(); seg = future.get(); if (!pending.isEmpty()) { completionService.submit(pending.removeFirst()); } } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } } if (seg.startTermIndex < seg.endTermIndex) { if (seg.startTermIndex==0) { hasMissingCount=true; missingCount += seg.counts[0]; seg.pos = 1; } else { seg.pos = seg.startTermIndex; } if (seg.pos < seg.endTermIndex) { seg.tenum = seg.si.getTermsEnum(); seg.tenum.seekExact(seg.pos); seg.tempBR = seg.tenum.term(); queue.add(seg); } } } FacetCollector collector; if (sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY)) { collector = new CountSortedFacetCollector(offset, limit, mincount); } else { collector = new IndexSortedFacetCollector(offset, limit, mincount); } BytesRef val = new BytesRef(); while (queue.size() > 0) { SegFacet seg = queue.top(); // make a shallow copy val.bytes = seg.tempBR.bytes; val.offset = seg.tempBR.offset; val.length = seg.tempBR.length; int count = 0; do { count += seg.counts[seg.pos - seg.startTermIndex]; // TODO: OPTIMIZATION... // if mincount>0 then seg.pos++ can skip ahead to the next non-zero entry. seg.pos++; if (seg.pos >= seg.endTermIndex) { queue.pop(); seg = queue.top(); } else { seg.tempBR = seg.tenum.next(); seg = queue.updateTop(); } } while (seg != null && val.compareTo(seg.tempBR) == 0); boolean stop = collector.collect(val, count); if (stop) break; } NamedList<Integer> res = collector.getFacetCounts(); // convert labels to readable form FieldType ft = searcher.getSchema().getFieldType(fieldName); int sz = res.size(); for (int i=0; i<sz; i++) { res.setName(i, ft.indexedToReadable(res.getName(i))); } if (missing) { if (!hasMissingCount) { missingCount = SimpleFacets.getFieldMissingCount(searcher,docs,fieldName); } res.add(null, missingCount); } return res; }
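The per-segment faceting code above shows the standard Future unwrapping pattern: restore the interrupt flag on InterruptedException, and on ExecutionException rethrow a RuntimeException cause directly while wrapping checked causes as SERVER_ERROR. A condensed sketch (FutureUnwrap and join are illustrative):

import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import org.apache.solr.common.SolrException;

class FutureUnwrap {
  static <T> T join(Future<T> future, String what) {
    try {
      return future.get();
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();  // restore the flag before wrapping
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
    } catch (ExecutionException e) {
      Throwable cause = e.getCause();
      if (cause instanceof RuntimeException) {
        throw (RuntimeException) cause;    // unwrap unchecked causes
      }
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
          "Error while computing: " + what, cause);
    }
  }
}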
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Object> getFacetCounts() { // if someone called this method, benefit of the doubt: assume true if (!params.getBool(FacetParams.FACET,true)) return null; facetResponse = new SimpleOrderedMap<Object>(); try { facetResponse.add("facet_queries", getFacetQueryCounts()); facetResponse.add("facet_fields", getFacetFieldCounts()); facetResponse.add("facet_dates", getFacetDateCounts()); facetResponse.add("facet_ranges", getFacetRangeCounts()); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); } catch (ParseException e) { throw new SolrException(ErrorCode.BAD_REQUEST, e); } return facetResponse; }
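getFacetCounts maps exception types to error codes: an IOException (an infrastructure failure) becomes SERVER_ERROR, a query ParseException (bad user input) becomes BAD_REQUEST. A sketch of the same mapping, with IllegalArgumentException standing in for the query-parser exception (CodeMapping is hypothetical):

import java.io.IOException;
import org.apache.solr.common.SolrException;
import org.apache.solr.common.SolrException.ErrorCode;

class CodeMapping {
  interface FacetWork { void compute() throws IOException; }

  static void run(FacetWork work) {
    try {
      work.compute();
    } catch (IOException e) {
      throw new SolrException(ErrorCode.SERVER_ERROR, e);  // server's fault: 500
    } catch (IllegalArgumentException e) {
      throw new SolrException(ErrorCode.BAD_REQUEST, e);   // caller's fault: 400
    }
  }
}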
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public NamedList<Integer> getGroupedCounts(SolrIndexSearcher searcher, DocSet base, String field, boolean multiToken, int offset, int limit, int mincount, boolean missing, String sort, String prefix) throws IOException { GroupingSpecification groupingSpecification = rb.getGroupingSpec(); String groupField = groupingSpecification != null ? groupingSpecification.getFields()[0] : null; if (groupField == null) { throw new SolrException ( SolrException.ErrorCode.BAD_REQUEST, "Specify the group.field as parameter or local parameter" ); } BytesRef prefixBR = prefix != null ? new BytesRef(prefix) : null; TermGroupFacetCollector collector = TermGroupFacetCollector.createTermGroupFacetCollector(groupField, field, multiToken, prefixBR, 128); searcher.search(new MatchAllDocsQuery(), base.getTopFilter(), collector); boolean orderByCount = sort.equals(FacetParams.FACET_SORT_COUNT) || sort.equals(FacetParams.FACET_SORT_COUNT_LEGACY); TermGroupFacetCollector.GroupedFacetResult result = collector.mergeSegmentResults(offset + limit, mincount, orderByCount); CharsRef charsRef = new CharsRef(); FieldType facetFieldType = searcher.getSchema().getFieldType(field); NamedList<Integer> facetCounts = new NamedList<Integer>(); List<TermGroupFacetCollector.FacetEntry> scopedEntries = result.getFacetEntries(offset, limit); for (TermGroupFacetCollector.FacetEntry facetEntry : scopedEntries) { facetFieldType.indexedToReadable(facetEntry.getValue(), charsRef); facetCounts.add(charsRef.toString(), facetEntry.getCount()); } if (missing) { facetCounts.add(null, result.getTotalMissingCount()); } return facetCounts; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
@Deprecated public void getFacetDateCounts(String dateFacet, NamedList<Object> resOuter) throws IOException, ParseException { final IndexSchema schema = searcher.getSchema(); parseParams(FacetParams.FACET_DATE, dateFacet); String f = facetValue; final NamedList<Object> resInner = new SimpleOrderedMap<Object>(); resOuter.add(key, resInner); final SchemaField sf = schema.getField(f); if (! (sf.getType() instanceof DateField)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Can not date facet on a field which is not a DateField: " + f); } final DateField ft = (DateField) sf.getType(); final String startS = required.getFieldParam(f,FacetParams.FACET_DATE_START); final Date start; try { start = ft.parseMath(null, startS); } catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); } final String endS = required.getFieldParam(f,FacetParams.FACET_DATE_END); Date end; // not final, hardend may change this try { end = ft.parseMath(null, endS); } catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); } if (end.before(start)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' comes before 'start': "+endS+" < "+startS); } final String gap = required.getFieldParam(f,FacetParams.FACET_DATE_GAP); final DateMathParser dmp = new DateMathParser(); final int minCount = params.getFieldInt(f,FacetParams.FACET_MINCOUNT, 0); String[] iStrs = params.getFieldParams(f,FacetParams.FACET_DATE_INCLUDE); // Legacy support for default of [lower,upper,edge] for date faceting // this is not handled by FacetRangeInclude.parseParam because // range faceting has differnet defaults final EnumSet<FacetRangeInclude> include = (null == iStrs || 0 == iStrs.length ) ? EnumSet.of(FacetRangeInclude.LOWER, FacetRangeInclude.UPPER, FacetRangeInclude.EDGE) : FacetRangeInclude.parseParam(iStrs); try { Date low = start; while (low.before(end)) { dmp.setNow(low); String label = ft.toExternal(low); Date high = dmp.parseMath(gap); if (end.before(high)) { if (params.getFieldBool(f,FacetParams.FACET_DATE_HARD_END,false)) { high = end; } else { end = high; } } if (high.before(low)) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet infinite loop (is gap negative?)"); } final boolean includeLower = (include.contains(FacetRangeInclude.LOWER) || (include.contains(FacetRangeInclude.EDGE) && low.equals(start))); final boolean includeUpper = (include.contains(FacetRangeInclude.UPPER) || (include.contains(FacetRangeInclude.EDGE) && high.equals(end))); final int count = rangeCount(sf,low,high,includeLower,includeUpper); if (count >= minCount) { resInner.add(label, count); } low = high; } } catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); } // explicitly return the gap and end so all the counts // (including before/after/between) are meaningful - even if mincount // has removed the neighboring ranges resInner.add("gap", gap); resInner.add("start", start); resInner.add("end", end); final String[] othersP = params.getFieldParams(f,FacetParams.FACET_DATE_OTHER); if (null != othersP && 0 < othersP.length ) { final Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class); for (final String o : othersP) { others.add(FacetRangeOther.get(o)); } // no matter what other values are listed, we don't do // anything if "none" is specified. if (! others.contains(FacetRangeOther.NONE) ) { boolean all = others.contains(FacetRangeOther.ALL); if (all || others.contains(FacetRangeOther.BEFORE)) { // include upper bound if "outer" or if first gap doesn't already include it resInner.add(FacetRangeOther.BEFORE.toString(), rangeCount(sf,null,start, false, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)))))); } if (all || others.contains(FacetRangeOther.AFTER)) { // include lower bound if "outer" or if last gap doesn't already include it resInner.add(FacetRangeOther.AFTER.toString(), rangeCount(sf,end,null, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))), false)); } if (all || others.contains(FacetRangeOther.BETWEEN)) { resInner.add(FacetRangeOther.BETWEEN.toString(), rangeCount(sf,start,end, (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)), (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))); } } } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
void getFacetRangeCounts(String facetRange, NamedList<Object> resOuter) throws IOException, ParseException { final IndexSchema schema = searcher.getSchema(); parseParams(FacetParams.FACET_RANGE, facetRange); String f = facetValue; final SchemaField sf = schema.getField(f); final FieldType ft = sf.getType(); RangeEndpointCalculator<?> calc = null; if (ft instanceof TrieField) { final TrieField trie = (TrieField)ft; switch (trie.getType()) { case FLOAT: calc = new FloatRangeEndpointCalculator(sf); break; case DOUBLE: calc = new DoubleRangeEndpointCalculator(sf); break; case INTEGER: calc = new IntegerRangeEndpointCalculator(sf); break; case LONG: calc = new LongRangeEndpointCalculator(sf); break; default: throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Unable to range facet on tried field of unexpected type:" + f); } } else if (ft instanceof DateField) { calc = new DateRangeEndpointCalculator(sf, null); } else if (ft instanceof SortableIntField) { calc = new IntegerRangeEndpointCalculator(sf); } else if (ft instanceof SortableLongField) { calc = new LongRangeEndpointCalculator(sf); } else if (ft instanceof SortableFloatField) { calc = new FloatRangeEndpointCalculator(sf); } else if (ft instanceof SortableDoubleField) { calc = new DoubleRangeEndpointCalculator(sf); } else { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Unable to range facet on field:" + sf); } resOuter.add(key, getFacetRangeCounts(sf, calc)); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
private <T extends Comparable<T>> NamedList getFacetRangeCounts (final SchemaField sf, final RangeEndpointCalculator<T> calc) throws IOException { final String f = sf.getName(); final NamedList<Object> res = new SimpleOrderedMap<Object>(); final NamedList<Integer> counts = new NamedList<Integer>(); res.add("counts", counts); final T start = calc.getValue(required.getFieldParam(f,FacetParams.FACET_RANGE_START)); // not final, hardend may change this T end = calc.getValue(required.getFieldParam(f,FacetParams.FACET_RANGE_END)); if (end.compareTo(start) < 0) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "range facet 'end' comes before 'start': "+end+" < "+start); } final String gap = required.getFieldParam(f, FacetParams.FACET_RANGE_GAP); // explicitly return the gap. compute this early so we are more // likely to catch parse errors before attempting math res.add("gap", calc.getGap(gap)); final int minCount = params.getFieldInt(f,FacetParams.FACET_MINCOUNT, 0); final EnumSet<FacetRangeInclude> include = FacetRangeInclude.parseParam (params.getFieldParams(f,FacetParams.FACET_RANGE_INCLUDE)); T low = start; while (low.compareTo(end) < 0) { T high = calc.addGap(low, gap); if (end.compareTo(high) < 0) { if (params.getFieldBool(f,FacetParams.FACET_RANGE_HARD_END,false)) { high = end; } else { end = high; } } if (high.compareTo(low) < 0) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "range facet infinite loop (is gap negative? did the math overflow?)"); } final boolean includeLower = (include.contains(FacetRangeInclude.LOWER) || (include.contains(FacetRangeInclude.EDGE) && 0 == low.compareTo(start))); final boolean includeUpper = (include.contains(FacetRangeInclude.UPPER) || (include.contains(FacetRangeInclude.EDGE) && 0 == high.compareTo(end))); final String lowS = calc.formatValue(low); final String highS = calc.formatValue(high); final int count = rangeCount(sf, lowS, highS, includeLower,includeUpper); if (count >= minCount) { counts.add(lowS, count); } low = high; } // explicitly return the start and end so all the counts // (including before/after/between) are meaningful - even if mincount // has removed the neighboring ranges res.add("start", start); res.add("end", end); final String[] othersP = params.getFieldParams(f,FacetParams.FACET_RANGE_OTHER); if (null != othersP && 0 < othersP.length ) { Set<FacetRangeOther> others = EnumSet.noneOf(FacetRangeOther.class); for (final String o : othersP) { others.add(FacetRangeOther.get(o)); } // no matter what other values are listed, we don't do // anything if "none" is specified. if (! others.contains(FacetRangeOther.NONE) ) { boolean all = others.contains(FacetRangeOther.ALL); final String startS = calc.formatValue(start); final String endS = calc.formatValue(end); if (all || others.contains(FacetRangeOther.BEFORE)) { // include upper bound if "outer" or if first gap doesn't already include it res.add(FacetRangeOther.BEFORE.toString(), rangeCount(sf,null,startS, false, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)))))); } if (all || others.contains(FacetRangeOther.AFTER)) { // include lower bound if "outer" or if last gap doesn't already include it res.add(FacetRangeOther.AFTER.toString(), rangeCount(sf,endS,null, (include.contains(FacetRangeInclude.OUTER) || (! (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))), false)); } if (all || others.contains(FacetRangeOther.BETWEEN)) { res.add(FacetRangeOther.BETWEEN.toString(), rangeCount(sf,startS,endS, (include.contains(FacetRangeInclude.LOWER) || include.contains(FacetRangeInclude.EDGE)), (include.contains(FacetRangeInclude.UPPER) || include.contains(FacetRangeInclude.EDGE)))); } } } return res; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public final T getValue(final String rawval) { try { return parseVal(rawval); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public final Object getGap(final String gap) { try { return parseGap(gap); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
public final T addGap(T value, String gap) { try { return parseAndAddGap(value, gap); } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); } }
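The three helpers above share one shape: delegate to a protected parse method, catch any Exception, and rethrow it as a BAD_REQUEST SolrException whose message names the offending value and field, keeping the original exception as the cause. A minimal sketch of that shape, with parseVal as a hypothetical stand-in for the calculator's parse method:

    import org.apache.solr.common.SolrException;

    // Sketch of the catch-and-wrap helper idiom: any parse failure becomes a 400
    // whose message names the bad input; the original exception is kept as the
    // cause so server-side logs retain the root failure.
    public class WrapAsBadRequestSketch {
      static int parseVal(String raw) {                 // hypothetical stand-in
        return Integer.parseInt(raw);
      }

      static int getValue(String raw, String fieldName) {
        try {
          return parseVal(raw);
        } catch (Exception e) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Can't parse value " + raw + " for field: " + fieldName, e);
        }
      }

      public static void main(String[] args) {
        System.out.println(getValue("42", "price"));    // prints 42
        getValue("not-a-number", "price");              // throws SolrException (400)
      }
    }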
// in core/src/java/org/apache/solr/request/SolrRequestInfo.java
public TimeZone getClientTimeZone() { if (tz == null) { String tzStr = req.getParams().get(CommonParams.TZ); if (tzStr != null) { tz = TimeZoneUtils.getTimeZone(tzStr); if (null == tz) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Solr JVM does not support TZ: " + tzStr); } } } return tz; }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException { if( abortErrorMessage != null ) { ((HttpServletResponse)response).sendError( 500, abortErrorMessage ); return; } if (this.cores == null) { ((HttpServletResponse)response).sendError( 403, "Server is shutting down" ); return; } CoreContainer cores = this.cores; SolrCore core = null; SolrQueryRequest solrReq = null; if( request instanceof HttpServletRequest) { HttpServletRequest req = (HttpServletRequest)request; HttpServletResponse resp = (HttpServletResponse)response; SolrRequestHandler handler = null; String corename = ""; try { // put the core container in request attribute req.setAttribute("org.apache.solr.CoreContainer", cores); String path = req.getServletPath(); if( req.getPathInfo() != null ) { // this lets you handle /update/commit when /update is a servlet path += req.getPathInfo(); } if( pathPrefix != null && path.startsWith( pathPrefix ) ) { path = path.substring( pathPrefix.length() ); } // check for management path String alternate = cores.getManagementPath(); if (alternate != null && path.startsWith(alternate)) { path = path.substring(0, alternate.length()); } // unused feature ? int idx = path.indexOf( ':' ); if( idx > 0 ) { // save the portion after the ':' for a 'handler' path parameter path = path.substring( 0, idx ); } // Check for the core admin page if( path.equals( cores.getAdminPath() ) ) { handler = cores.getMultiCoreHandler(); solrReq = adminRequestParser.parse(null,path, req); handleAdminRequest(req, response, handler, solrReq); return; } else { //otherwise, we should find a core from the path idx = path.indexOf( "/", 1 ); if( idx > 1 ) { // try to get the corename as a request parameter first corename = path.substring( 1, idx ); core = cores.getCore(corename); if (core != null) { path = path.substring( idx ); } } if (core == null) { if (!cores.isZooKeeperAware() ) { core = cores.getCore(""); } } } if (core == null && cores.isZooKeeperAware()) { // we couldn't find the core - lets make sure a collection was not specified instead core = getCoreByCollection(cores, corename, path); if (core != null) { // we found a core, update the path path = path.substring( idx ); } else { // try the default core core = cores.getCore(""); } // TODO: if we couldn't find it locally, look on other nodes } // With a valid core... 
if( core != null ) { final SolrConfig config = core.getSolrConfig(); // get or create/cache the parser for the core SolrRequestParsers parser = null; parser = parsers.get(config); if( parser == null ) { parser = new SolrRequestParsers(config); parsers.put(config, parser ); } // Determine the handler from the url path if not set // (we might already have selected the cores handler) if( handler == null && path.length() > 1 ) { // don't match "" or "/" as valid path handler = core.getRequestHandler( path ); // no handler yet but allowed to handle select; let's check if( handler == null && parser.isHandleSelect() ) { if( "/select".equals( path ) || "/select/".equals( path ) ) { solrReq = parser.parse( core, path, req ); String qt = solrReq.getParams().get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } if( qt != null && qt.startsWith("/") && (handler instanceof ContentStreamHandlerBase)) { //For security reasons it's a bad idea to allow a leading '/', ex: /select?qt=/update see SOLR-3161 //There was no restriction from Solr 1.4 thru 3.5 and it's not supported for update handlers. throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid query type. Do not use /select to access: "+qt); } } } } // With a valid handler and a valid core... if( handler != null ) { // if not a /select, create the request if( solrReq == null ) { solrReq = parser.parse( core, path, req ); } final Method reqMethod = Method.getMethod(req.getMethod()); HttpCacheHeaderUtil.setCacheControlHeader(config, resp, reqMethod); // unless we have been explicitly told not to, do cache validation // if we fail cache validation, execute the query if (config.getHttpCachingConfig().isNever304() || !HttpCacheHeaderUtil.doCacheHeaderValidation(solrReq, req, reqMethod, resp)) { SolrQueryResponse solrRsp = new SolrQueryResponse(); /* even for HEAD requests, we need to execute the handler to * ensure we don't get an error (and to make sure the correct * QueryResponseWriter is selected and we get the correct * Content-Type) */ SolrRequestInfo.setRequestInfo(new SolrRequestInfo(solrReq, solrRsp)); this.execute( req, handler, solrReq, solrRsp ); HttpCacheHeaderUtil.checkHttpCachingVeto(solrRsp, resp, reqMethod); // add info to http headers //TODO: See SOLR-232 and SOLR-267. /*try { NamedList solrRspHeader = solrRsp.getResponseHeader(); for (int i=0; i<solrRspHeader.size(); i++) { ((javax.servlet.http.HttpServletResponse) response).addHeader(("Solr-" + solrRspHeader.getName(i)), String.valueOf(solrRspHeader.getVal(i))); } } catch (ClassCastException cce) { log.log(Level.WARNING, "exception adding response header log information", cce); }*/ QueryResponseWriter responseWriter = core.getQueryResponseWriter(solrReq); writeResponse(solrRsp, response, responseWriter, solrReq, reqMethod); } return; // we are done with a valid handler } } log.debug("no handler or core retrieved for " + path + ", follow through..."); } catch (Throwable ex) { sendError( core, solrReq, request, (HttpServletResponse)response, ex ); return; } finally { if( solrReq != null ) { solrReq.close(); } if (core != null) { core.close(); } SolrRequestInfo.clearRequestInfo(); } } // Otherwise let the webapp handle the request chain.doFilter(request, response); }
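doFilter above is the outermost catch in the request path: whatever a handler throws, checked or unchecked, funnels into a single catch (Throwable) that renders an error response, while the finally block closes the request and core and clears the thread-local SolrRequestInfo. A stripped-down sketch of that containment shape (Resource, sendError and handle are illustrative names, not the servlet API):

    // Sketch of the outermost "contain everything" shape used by the dispatch
    // filter: one catch (Throwable) turns any failure into an error response,
    // and a finally block releases resources whether or not the handler failed.
    public class TopLevelContainmentSketch {
      interface Resource { void close(); }

      static void sendError(Throwable ex) {      // stand-in for the filter's sendError
        System.err.println("HTTP 500: " + ex);
      }

      static void handle(Runnable handler, Resource request) {
        try {
          handler.run();
        } catch (Throwable ex) {
          sendError(ex);                         // never let it escape the filter
        } finally {
          request.close();                       // always release, even on failure
        }
      }

      public static void main(String[] args) {
        handle(() -> { throw new IllegalStateException("boom"); },
               () -> System.out.println("request closed"));
      }
    }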
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrQueryRequest buildRequestFrom( SolrCore core, SolrParams params, Collection<ContentStream> streams ) throws Exception { // The content type will be applied to all streaming content String contentType = params.get( CommonParams.STREAM_CONTENTTYPE ); // Handle anything with a remoteURL String[] strs = params.getParams( CommonParams.STREAM_URL ); if( strs != null ) { if( !enableRemoteStreams ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Remote Streaming is disabled." ); } for( final String url : strs ) { ContentStreamBase stream = new ContentStreamBase.URLStream( new URL(url) ); if( contentType != null ) { stream.setContentType( contentType ); } streams.add( stream ); } } // Handle streaming files strs = params.getParams( CommonParams.STREAM_FILE ); if( strs != null ) { if( !enableRemoteStreams ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Remote Streaming is disabled." ); } for( final String file : strs ) { ContentStreamBase stream = new ContentStreamBase.FileStream( new File(file) ); if( contentType != null ) { stream.setContentType( contentType ); } streams.add( stream ); } } // Check for streams in the request parameters strs = params.getParams( CommonParams.STREAM_BODY ); if( strs != null ) { for( final String body : strs ) { ContentStreamBase stream = new ContentStreamBase.StringStream( body ); if( contentType != null ) { stream.setContentType( contentType ); } streams.add( stream ); } } SolrQueryRequestBase q = new SolrQueryRequestBase( core, params ) { }; if( streams != null && streams.size() > 0 ) { q.setContentStreams( streams ); } return q; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public static MultiMapSolrParams parseQueryString(String queryString) { Map<String,String[]> map = new HashMap<String, String[]>(); if( queryString != null && queryString.length() > 0 ) { try { for( String kv : queryString.split( "&" ) ) { int idx = kv.indexOf( '=' ); if( idx > 0 ) { String name = URLDecoder.decode( kv.substring( 0, idx ), "UTF-8"); String value = URLDecoder.decode( kv.substring( idx+1 ), "UTF-8"); MultiMapSolrParams.addParam( name, value, map ); } else { String name = URLDecoder.decode( kv, "UTF-8" ); MultiMapSolrParams.addParam( name, "", map ); } } } catch( UnsupportedEncodingException uex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, uex ); } } return new MultiMapSolrParams( map ); }
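parseQueryString catches UnsupportedEncodingException only because URLDecoder.decode declares it; every JVM is required to support UTF-8, so the branch is effectively unreachable, and converting it to an unchecked SERVER_ERROR keeps the checked type out of the method signature. A condensed sketch of the same conversion (using a plain RuntimeException in place of SolrException so it runs without Solr on the classpath):

    import java.io.UnsupportedEncodingException;
    import java.net.URLDecoder;

    // Sketch: a checked exception that cannot realistically occur (UTF-8 is a
    // mandatory charset) is converted to an unchecked server error instead of
    // widening the signature with "throws UnsupportedEncodingException".
    public class ImpossibleCheckedExceptionSketch {
      static String decode(String s) {
        try {
          return URLDecoder.decode(s, "UTF-8");
        } catch (UnsupportedEncodingException uex) {
          // Solr throws new SolrException(ErrorCode.SERVER_ERROR, uex) here
          throw new RuntimeException("JVM lacks UTF-8 support?", uex);
        }
      }

      public static void main(String[] args) {
        System.out.println(decode("q%3Dsolr%26rows%3D10")); // q=solr&rows=10
      }
    }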
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams( final HttpServletRequest req, ArrayList<ContentStream> streams ) throws Exception { if( !ServletFileUpload.isMultipartContent(req) ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Not multipart content! "+req.getContentType() ); } MultiMapSolrParams params = SolrRequestParsers.parseQueryString( req.getQueryString() ); // Create a factory for disk-based file items DiskFileItemFactory factory = new DiskFileItemFactory(); // Set factory constraints // TODO - configure factory.setSizeThreshold(yourMaxMemorySize); // TODO - configure factory.setRepository(yourTempDirectory); // Create a new file upload handler ServletFileUpload upload = new ServletFileUpload(factory); upload.setSizeMax( uploadLimitKB*1024 ); // Parse the request List items = upload.parseRequest(req); Iterator iter = items.iterator(); while (iter.hasNext()) { FileItem item = (FileItem) iter.next(); // If its a form field, put it in our parameter map if (item.isFormField()) { MultiMapSolrParams.addParam( item.getFieldName(), item.getString(), params.getMap() ); } // Add the stream else { streams.add( new FileItemContentStream( item ) ); } } return params; }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
public SolrParams parseParamsAndFillStreams( final HttpServletRequest req, ArrayList<ContentStream> streams ) throws Exception { String method = req.getMethod().toUpperCase(Locale.ENGLISH); if( "GET".equals( method ) || "HEAD".equals( method )) { return new ServletSolrParams(req); } if( "POST".equals( method ) ) { String contentType = req.getContentType(); if( contentType != null ) { int idx = contentType.indexOf( ';' ); if( idx > 0 ) { // remove the charset definition "; charset=utf-8" contentType = contentType.substring( 0, idx ); } if( "application/x-www-form-urlencoded".equals( contentType.toLowerCase(Locale.ENGLISH) ) ) { return new ServletSolrParams(req); // just get the params from parameterMap } if( ServletFileUpload.isMultipartContent(req) ) { return multipart.parseParamsAndFillStreams(req, streams); } } return raw.parseParamsAndFillStreams(req, streams); } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Unsupported method: "+method ); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
public static long calcLastModified(final SolrQueryRequest solrReq) { final SolrCore core = solrReq.getCore(); final SolrIndexSearcher searcher = solrReq.getSearcher(); final LastModFrom lastModFrom = core.getSolrConfig().getHttpCachingConfig().getLastModFrom(); long lastMod; try { // assume default, change if needed (getOpenTime() should be fast) lastMod = LastModFrom.DIRLASTMOD == lastModFrom ? IndexDeletionPolicyWrapper.getCommitTimestamp(searcher.getIndexReader().getIndexCommit()) : searcher.getOpenTime(); } catch (IOException e) { // we're pretty freaking screwed if this happens throw new SolrException(ErrorCode.SERVER_ERROR, e); } // Get the time where the searcher has been opened // We get rid of the milliseconds because the HTTP header has only // second granularity return lastMod - (lastMod % 1000L); }
// in core/src/java/org/apache/solr/servlet/DirectSolrConnection.java
public String request(String path, SolrParams params, String body) throws Exception { // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { if (params == null) params = new MapSolrParams( new HashMap<String, String>() ); String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } } if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } return request(handler, params, body); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
public double getExchangeRate(String sourceCurrencyCode, String targetCurrencyCode) { if (rates == null) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "Rates not initialized."); } if (sourceCurrencyCode == null || targetCurrencyCode == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cannot get exchange rate; currency was null."); } if (rates.getTimestamp() + refreshInterval*60*1000 < System.currentTimeMillis()) { log.debug("Refresh interval has expired. Refreshing exchange rates."); reload(); } Double source = (Double) rates.getRates().get(sourceCurrencyCode); Double target = (Double) rates.getRates().get(targetCurrencyCode); if (source == null || target == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No available conversion rate from " + sourceCurrencyCode + " to " + targetCurrencyCode + ". " + "Available rates are "+listAvailableCurrencies()); } return target / source; }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override public Set<String> listAvailableCurrencies() { if (rates == null) throw new SolrException(ErrorCode.SERVER_ERROR, "Rates not initialized"); return rates.getRates().keySet(); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override public boolean reload() throws SolrException { InputStream ratesJsonStream = null; try { log.info("Reloading exchange rates from "+ratesFileLocation); try { ratesJsonStream = (new URL(ratesFileLocation)).openStream(); } catch (Exception e) { ratesJsonStream = resourceLoader.openResource(ratesFileLocation); } rates = new OpenExchangeRates(ratesJsonStream); return true; } catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); } finally { if (ratesJsonStream != null) try { ratesJsonStream.close(); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); } } }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override public void init(Map<String,String> params) throws SolrException { try { ratesFileLocation = getParam(params.get(PARAM_RATES_FILE_LOCATION), DEFAULT_RATES_FILE_LOCATION); refreshInterval = Integer.parseInt(getParam(params.get(PARAM_REFRESH_INTERVAL), DEFAULT_REFRESH_INTERVAL)); // Force a refresh interval of minimum one hour, since the API does not offer better resolution if (refreshInterval < 60) { refreshInterval = 60; log.warn("Specified refreshInterval was too small. Setting to 60 minutes which is the update rate of openexchangerates.org"); } log.info("Initialized with rates="+ratesFileLocation+", refreshInterval="+refreshInterval+"."); } catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); } finally { // Removing config params custom to us params.remove(PARAM_RATES_FILE_LOCATION); params.remove(PARAM_REFRESH_INTERVAL); } }
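reload above combines two idioms: an exception-driven fallback (try the configured location as a URL first, and on any failure reopen it as a loader resource) and a blanket catch that rewraps every remaining failure as SERVER_ERROR. A sketch of the fallback half, with openResource as a hypothetical stand-in for the resource loader:

    import java.io.ByteArrayInputStream;
    import java.io.InputStream;
    import java.net.URL;

    // Sketch of the open-URL-or-fall-back idiom: the first attempt's exception
    // is deliberately swallowed and used as the signal to try the second source.
    public class FallbackOpenSketch {
      static InputStream openResource(String location) {   // hypothetical loader
        return new ByteArrayInputStream("{}".getBytes());
      }

      static InputStream open(String location) {
        try {
          return new URL(location).openStream();           // works for http:, file:, ...
        } catch (Exception e) {
          return openResource(location);                   // not a URL -> conf/classpath resource
        }
      }

      public static void main(String[] args) throws Exception {
        try (InputStream in = open("rates.json")) {        // no scheme -> falls back
          System.out.println("opened, " + in.available() + " bytes buffered");
        }
      }
    }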
// in core/src/java/org/apache/solr/schema/IndexSchema.java
private void readSchema(InputSource is) { log.info("Reading Solr Schema"); try { // pass the config resource loader to avoid building an empty one for no reason: // in the current case though, the stream is valid so we wont load the resource by name Config schemaConf = new Config(loader, "schema", is, "/schema/"); Document document = schemaConf.getDocument(); final XPath xpath = schemaConf.getXPath(); final List<SchemaAware> schemaAware = new ArrayList<SchemaAware>(); Node nd = (Node) xpath.evaluate("/schema/@name", document, XPathConstants.NODE); if (nd==null) { log.warn("schema has no name!"); } else { name = nd.getNodeValue(); log.info("Schema name=" + name); } version = schemaConf.getFloat("/schema/@version", 1.0f); // load the Field Types final FieldTypePluginLoader typeLoader = new FieldTypePluginLoader(this, fieldTypes, schemaAware); String expression = "/schema/types/fieldtype | /schema/types/fieldType"; NodeList nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET); typeLoader.load( loader, nodes ); // load the Fields // Hang on to the fields that say if they are required -- this lets us set a reasonable default for the unique key Map<String,Boolean> explicitRequiredProp = new HashMap<String, Boolean>(); ArrayList<DynamicField> dFields = new ArrayList<DynamicField>(); expression = "/schema/fields/field | /schema/fields/dynamicField"; nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); NamedNodeMap attrs = node.getAttributes(); String name = DOMUtil.getAttr(attrs,"name","field definition"); log.trace("reading field def "+name); String type = DOMUtil.getAttr(attrs,"type","field " + name); FieldType ft = fieldTypes.get(type); if (ft==null) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Unknown fieldtype '" + type + "' specified on field " + name); } Map<String,String> args = DOMUtil.toMapExcept(attrs, "name", "type"); if( args.get( "required" ) != null ) { explicitRequiredProp.put( name, Boolean.valueOf( args.get( "required" ) ) ); } SchemaField f = SchemaField.create(name,ft,args); if (node.getNodeName().equals("field")) { SchemaField old = fields.put(f.getName(),f); if( old != null ) { String msg = "[schema.xml] Duplicate field definition for '" + f.getName() + "' [[["+old.toString()+"]]] and [[["+f.toString()+"]]]"; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, msg ); } log.debug("field defined: " + f); if( f.getDefaultValue() != null ) { log.debug(name+" contains default value: " + f.getDefaultValue()); fieldsWithDefaultValue.add( f ); } if (f.isRequired()) { log.debug(name+" is required in this schema"); requiredFields.add(f); } } else if (node.getNodeName().equals("dynamicField")) { // make sure nothing else has the same path addDynamicField(dFields, f); } else { // we should never get here throw new RuntimeException("Unknown field type"); } } //fields with default values are by definition required //add them to required fields, and we only have to loop once // in DocumentBuilder.getDoc() requiredFields.addAll(getFieldsWithDefaultValue()); // OK, now sort the dynamic fields largest to smallest size so we don't get // any false matches. We want to act like a compiler tool and try and match // the largest string possible. 
Collections.sort(dFields); log.trace("Dynamic Field Ordering:" + dFields); // stuff it in a normal array for faster access dynamicFields = dFields.toArray(new DynamicField[dFields.size()]); Node node = (Node) xpath.evaluate("/schema/similarity", document, XPathConstants.NODE); SimilarityFactory simFactory = readSimilarity(loader, node); if (simFactory == null) { simFactory = new DefaultSimilarityFactory(); } if (simFactory instanceof SchemaAware) { ((SchemaAware)simFactory).inform(this); } similarity = simFactory.getSimilarity(); node = (Node) xpath.evaluate("/schema/defaultSearchField/text()", document, XPathConstants.NODE); if (node==null) { log.warn("no default search field specified in schema."); } else { defaultSearchFieldName=node.getNodeValue().trim(); // throw exception if specified, but not found or not indexed if (defaultSearchFieldName!=null) { SchemaField defaultSearchField = getFields().get(defaultSearchFieldName); if ((defaultSearchField == null) || !defaultSearchField.indexed()) { String msg = "default search field '" + defaultSearchFieldName + "' not defined or not indexed" ; throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, msg ); } } log.info("default search field is "+defaultSearchFieldName); } node = (Node) xpath.evaluate("/schema/solrQueryParser/@defaultOperator", document, XPathConstants.NODE); if (node==null) { log.debug("using default query parser operator (OR)"); } else { queryParserDefaultOperator=node.getNodeValue().trim(); log.info("query parser default operator is "+queryParserDefaultOperator); } node = (Node) xpath.evaluate("/schema/uniqueKey/text()", document, XPathConstants.NODE); if (node==null) { log.warn("no uniqueKey specified in schema."); } else { uniqueKeyField=getIndexedField(node.getNodeValue().trim()); if (!uniqueKeyField.stored()) { log.error("uniqueKey is not stored - distributed search will not work"); } if (uniqueKeyField.multiValued()) { log.error("uniqueKey should not be multivalued"); } uniqueKeyFieldName=uniqueKeyField.getName(); uniqueKeyFieldType=uniqueKeyField.getType(); log.info("unique key field: "+uniqueKeyFieldName); // Unless the uniqueKeyField is marked 'required=false' then make sure it exists if( Boolean.FALSE != explicitRequiredProp.get( uniqueKeyFieldName ) ) { uniqueKeyField.required = true; requiredFields.add(uniqueKeyField); } } /////////////// parse out copyField commands /////////////// // Map<String,ArrayList<SchemaField>> cfields = new HashMap<String,ArrayList<SchemaField>>(); // expression = "/schema/copyField"; dynamicCopyFields = new DynamicCopy[] {}; expression = "//copyField"; nodes = (NodeList) xpath.evaluate(expression, document, XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { node = nodes.item(i); NamedNodeMap attrs = node.getAttributes(); String source = DOMUtil.getAttr(attrs,"source","copyField definition"); String dest = DOMUtil.getAttr(attrs,"dest", "copyField definition"); String maxChars = DOMUtil.getAttr(attrs, "maxChars"); int maxCharsInt = CopyField.UNLIMITED; if (maxChars != null) { try { maxCharsInt = Integer.parseInt(maxChars); } catch (NumberFormatException e) { log.warn("Couldn't parse maxChars attribute for copyField from " + source + " to " + dest + " as integer. The whole field will be copied."); } } registerCopyField(source, dest, maxCharsInt); } for (Map.Entry<SchemaField, Integer> entry : copyFieldTargetCounts.entrySet()) { if (entry.getValue() > 1 && !entry.getKey().multiValued()) { log.warn("Field " + entry.getKey().name + " is not multivalued "+ "and destination for multiple copyFields ("+ entry.getValue()+")"); } } //Run the callbacks on SchemaAware now that everything else is done for (SchemaAware aware : schemaAware) { aware.inform(this); } } catch (SolrException e) { throw e; } catch(Exception e) { // unexpected exception... throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Schema Parsing Failed: " + e.getMessage(), e); } // create the field analyzers refreshAnalyzers(); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
private void addDynamicField(List<DynamicField> dFields, SchemaField f) { boolean dup = isDuplicateDynField(dFields, f); if( !dup ) { addDynamicFieldNoDupCheck(dFields, f); } else { String msg = "[schema.xml] Duplicate DynamicField definition for '" + f.getName() + "'"; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, msg); } }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
public void registerCopyField( String source, String dest, int maxChars ) { boolean sourceIsPattern = isWildCard(source); boolean destIsPattern = isWildCard(dest); log.debug("copyField source='"+source+"' dest='"+dest+"' maxChars='"+maxChars); SchemaField d = getFieldOrNull(dest); if(d == null){ throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "copyField destination :'"+dest+"' does not exist" ); } if(sourceIsPattern) { if( destIsPattern ) { DynamicField df = null; for( DynamicField dd : dynamicFields ) { if( dd.regex.equals( dest ) ) { df = dd; break; } } if( df == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "copyField dynamic destination must match a dynamicField." ); } registerDynamicCopyField(new DynamicDestCopy(source, df, maxChars )); } else { registerDynamicCopyField(new DynamicCopy(source, d, maxChars)); } } else if( destIsPattern ) { String msg = "copyField only supports a dynamic destination if the source is also dynamic" ; throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, msg ); } else { // retrieve the field to force an exception if it doesn't exist SchemaField f = getField(source); List<CopyField> copyFieldList = copyFieldsMap.get(source); if (copyFieldList == null) { copyFieldList = new ArrayList<CopyField>(); copyFieldsMap.put(source, copyFieldList); } copyFieldList.add(new CopyField(f, d, maxChars)); copyFieldTargetCounts.put(d, (copyFieldTargetCounts.containsKey(d) ? copyFieldTargetCounts.get(d) + 1 : 1)); } }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
public SchemaField getField(String fieldName) { SchemaField f = getFieldOrNull(fieldName); if (f != null) return f; // Hmmm, default field could also be implemented with a dynamic field of "*". // It would have to be special-cased and only used if nothing else matched. /*** REMOVED -YCS if (defaultFieldType != null) return new SchemaField(fieldName,defaultFieldType); ***/ throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"undefined field: \""+fieldName+"\""); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
public FieldType getDynamicFieldType(String fieldName) { for (DynamicField df : dynamicFields) { if (df.matches(fieldName)) return df.prototype.getType(); } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"undefined field "+fieldName); }
// in core/src/java/org/apache/solr/schema/DateField.java
public Date parseMath(Date now, String val) { String math = null; final DateMathParser p = new DateMathParser(); if (null != now) p.setNow(now); if (val.startsWith(NOW)) { math = val.substring(NOW.length()); } else { final int zz = val.indexOf(Z); if (0 < zz) { math = val.substring(zz+1); try { // p.setNow(toObject(val.substring(0,zz))); p.setNow(parseDate(val.substring(0,zz+1))); } catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); } } else { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date String:'" +val+'\''); } } if (null == math || math.equals("")) { return p.getNow(); } try { return p.parseMath(math); } catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); } }
// in core/src/java/org/apache/solr/schema/DateField.java
public Date parseMathLenient(Date now, String val, SolrQueryRequest req) { String math = null; final DateMathParser p = new DateMathParser(); if (null != now) p.setNow(now); if (val.startsWith(NOW)) { math = val.substring(NOW.length()); } else { final int zz = val.indexOf(Z); if (0 < zz) { math = val.substring(zz+1); try { // p.setNow(toObject(val.substring(0,zz))); p.setNow(parseDateLenient(val.substring(0,zz+1), req)); } catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); } } else { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date String:'" +val+'\''); } } if (null == math || math.equals("")) { return p.getNow(); } try { return p.parseMath(math); } catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); } }
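Both parseMath variants above accept either NOW-relative input ("NOW+1DAY") or an absolute date followed by math ("2011-05-12T00:00:00Z/DAY"), and each failure point (bad date, missing Z, bad math) is reported as BAD_REQUEST echoing the full input. A sketch of just the splitting step, with the math evaluation left out and a plain IllegalArgumentException standing in for the 400:

    // Sketch of parseMath's input splitting: the math part is whatever follows
    // "NOW" or the first 'Z' of an absolute date; anything else is rejected.
    public class DateMathSplitSketch {
      static String mathPart(String val) {
        if (val.startsWith("NOW")) return val.substring(3);
        int zz = val.indexOf('Z');
        if (zz > 0) return val.substring(zz + 1);
        // BAD_REQUEST SolrException in Solr
        throw new IllegalArgumentException("Invalid Date String:'" + val + "'");
      }

      public static void main(String[] args) {
        System.out.println(mathPart("NOW+1DAY"));                 // +1DAY
        System.out.println(mathPart("2011-05-12T00:00:00Z/DAY")); // /DAY
      }
    }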
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override protected void init(IndexSchema schema, Map<String, String> args) { super.init(schema, args); this.schema = schema; this.exchangeRateProviderClass = args.get(PARAM_RATE_PROVIDER_CLASS); this.defaultCurrency = args.get(PARAM_DEFAULT_CURRENCY); if (this.defaultCurrency == null) { this.defaultCurrency = DEFAULT_DEFAULT_CURRENCY; } if (this.exchangeRateProviderClass == null) { this.exchangeRateProviderClass = DEFAULT_RATE_PROVIDER_CLASS; } if (java.util.Currency.getInstance(this.defaultCurrency) == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid currency code " + this.defaultCurrency); } String precisionStepString = args.get(PARAM_PRECISION_STEP); if (precisionStepString == null) { precisionStepString = DEFAULT_PRECISION_STEP; } // Initialize field type for amount fieldTypeAmountRaw = new TrieLongField(); fieldTypeAmountRaw.setTypeName("amount_raw_type_tlong"); Map<String,String> map = new HashMap<String,String>(1); map.put("precisionStep", precisionStepString); fieldTypeAmountRaw.init(schema, map); // Initialize field type for currency string fieldTypeCurrency = new StrField(); fieldTypeCurrency.setTypeName("currency_type_string"); fieldTypeCurrency.init(schema, new HashMap<String,String>()); args.remove(PARAM_RATE_PROVIDER_CLASS); args.remove(PARAM_DEFAULT_CURRENCY); args.remove(PARAM_PRECISION_STEP); try { Class<? extends ExchangeRateProvider> c = schema.getResourceLoader().findClass(exchangeRateProviderClass, ExchangeRateProvider.class); provider = c.newInstance(); provider.init(args); } catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error instantiating exchange rate provider "+exchangeRateProviderClass+". Please check your FieldType configuration", e); } }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public Query getRangeQuery(QParser parser, SchemaField field, String part1, String part2, final boolean minInclusive, final boolean maxInclusive) { final CurrencyValue p1 = CurrencyValue.parse(part1, defaultCurrency); final CurrencyValue p2 = CurrencyValue.parse(part2, defaultCurrency); if (!p1.getCurrencyCode().equals(p2.getCurrencyCode())) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cannot parse range query " + part1 + " to " + part2 + ": range queries only supported when upper and lower bound have same currency."); } return getRangeQuery(parser, field, p1, p2, minInclusive, maxInclusive); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public SortField getSortField(SchemaField field, boolean reverse) { try { // Convert all values to default currency for sorting. return (new CurrencyValueSource(field, defaultCurrency, null)).getSortField(reverse); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
public double getExchangeRate(String sourceCurrencyCode, String targetCurrencyCode) { if (sourceCurrencyCode == null || targetCurrencyCode == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cannot get exchange rate; currency was null."); } if (sourceCurrencyCode.equals(targetCurrencyCode)) { return 1.0; } Double directRate = lookupRate(sourceCurrencyCode, targetCurrencyCode); if (directRate != null) { return directRate; } Double symmetricRate = lookupRate(targetCurrencyCode, sourceCurrencyCode); if (symmetricRate != null) { return 1.0 / symmetricRate; } throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No available conversion rate between " + sourceCurrencyCode + " to " + targetCurrencyCode); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
Override public boolean reload() throws SolrException { InputStream is = null; Map<String, Map<String, Double>> tmpRates = new HashMap<String, Map<String, Double>>(); try { log.info("Reloading exchange rates from file "+this.currencyConfigFile); is = loader.openResource(currencyConfigFile); javax.xml.parsers.DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance(); try { dbf.setXIncludeAware(true); dbf.setNamespaceAware(true); } catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); } try { Document doc = dbf.newDocumentBuilder().parse(is); XPathFactory xpathFactory = XPathFactory.newInstance(); XPath xpath = xpathFactory.newXPath(); // Parse exchange rates. NodeList nodes = (NodeList) xpath.evaluate("/currencyConfig/rates/rate", doc, XPathConstants.NODESET); for (int i = 0; i < nodes.getLength(); i++) { Node rateNode = nodes.item(i); NamedNodeMap attributes = rateNode.getAttributes(); Node from = attributes.getNamedItem("from"); Node to = attributes.getNamedItem("to"); Node rate = attributes.getNamedItem("rate"); if (from == null || to == null || rate == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Exchange rate missing attributes (required: from, to, rate) " + rateNode); } String fromCurrency = from.getNodeValue(); String toCurrency = to.getNodeValue(); Double exchangeRate; if (java.util.Currency.getInstance(fromCurrency) == null || java.util.Currency.getInstance(toCurrency) == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not find from currency specified in exchange rate: " + rateNode); } try { exchangeRate = Double.parseDouble(rate.getNodeValue()); } catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); } addRate(tmpRates, fromCurrency, toCurrency, exchangeRate); } } catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); } } catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); } finally { try { if (is != null) { is.close(); } } catch (IOException e) { e.printStackTrace(); } } // Atomically swap in the new rates map, if it loaded successfully this.rates = tmpRates; return true; }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public void init(Map<String,String> params) throws SolrException { this.currencyConfigFile = params.get(PARAM_CURRENCY_CONFIG); if(currencyConfigFile == null) { throw new SolrException(ErrorCode.NOT_FOUND, "Missing required configuration "+PARAM_CURRENCY_CONFIG); } // Removing config params custom to us params.remove(PARAM_CURRENCY_CONFIG); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override public void inform(ResourceLoader loader) throws SolrException { if(loader == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Needs ResourceLoader in order to load config file"); } this.loader = loader; reload(); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
public static CurrencyValue parse(String externalVal, String defaultCurrency) { String amount = externalVal; String code = defaultCurrency; if (externalVal.contains(",")) { String[] amountAndCode = externalVal.split(","); amount = amountAndCode[0]; code = amountAndCode[1]; } Currency currency = java.util.Currency.getInstance(code); if (currency == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid currency code " + code); } try { double value = Double.parseDouble(amount); long currencyValue = Math.round(value * Math.pow(10.0, currency.getDefaultFractionDigits())); return new CurrencyValue(currencyValue, code); } catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } }
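CurrencyValue.parse turns "amount,CODE" (or a bare amount plus the default currency) into a long count of the currency's smallest unit; an unknown code or unparsable amount surfaces as BAD_REQUEST. A sketch of the scaling arithmetic, using only java.util.Currency:

    import java.util.Currency;

    // Sketch of the amount scaling done by CurrencyValue.parse: store money as
    // a long in the currency's smallest unit (cents for USD, yen for JPY, ...).
    public class CurrencyScaleSketch {
      static long toSmallestUnit(String amount, String code) {
        Currency currency = Currency.getInstance(code);  // throws for unknown codes
        double value = Double.parseDouble(amount);       // NumberFormatException -> 400 in Solr
        return Math.round(value * Math.pow(10.0, currency.getDefaultFractionDigits()));
      }

      public static void main(String[] args) {
        System.out.println(toSmallestUnit("10.50", "USD")); // 1050 (2 fraction digits)
        System.out.println(toSmallestUnit("10.50", "JPY")); // 11   (0 fraction digits, rounded)
      }
    }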
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override protected void init(IndexSchema schema, Map<String, String> args) { super.init(schema, args); String p = args.remove("precisionStep"); if (p != null) { precisionStepArg = Integer.parseInt(p); } // normalize the precisionStep precisionStep = precisionStepArg; if (precisionStep<=0 || precisionStep>=64) precisionStep=Integer.MAX_VALUE; String t = args.remove("type"); if (t != null) { try { type = TrieTypes.valueOf(t.toUpperCase(Locale.ENGLISH)); } catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); } } CharFilterFactory[] filterFactories = new CharFilterFactory[0]; TokenFilterFactory[] tokenFilterFactories = new TokenFilterFactory[0]; analyzer = new TokenizerChain(filterFactories, new TrieTokenizerFactory(type, precisionStep), tokenFilterFactories); // for query time we only need one token, so we use the biggest possible precisionStep: queryAnalyzer = new TokenizerChain(filterFactories, new TrieTokenizerFactory(type, Integer.MAX_VALUE), tokenFilterFactories); }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public Object toObject(IndexableField f) { final Number val = f.numericValue(); if (val != null) { return (type == TrieTypes.DATE) ? new Date(val.longValue()) : val; } else { // the following code is "deprecated" and only to support pre-3.2 indexes using the old BinaryField encoding: final BytesRef bytes = f.binaryValue(); if (bytes==null) return badFieldString(f); switch (type) { case INTEGER: return toInt(bytes.bytes, bytes.offset); case FLOAT: return Float.intBitsToFloat(toInt(bytes.bytes, bytes.offset)); case LONG: return toLong(bytes.bytes, bytes.offset); case DOUBLE: return Double.longBitsToDouble(toLong(bytes.bytes, bytes.offset)); case DATE: return new Date(toLong(bytes.bytes, bytes.offset)); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + f.name()); } } }
// in core/src/java/org/apache/solr/schema/TrieField.java
Override public SortField getSortField(SchemaField field, boolean top) { field.checkSortability(); Object missingValue = null; boolean sortMissingLast = field.sortMissingLast(); boolean sortMissingFirst = field.sortMissingFirst(); switch (type) { case INTEGER: if( sortMissingLast ) { missingValue = top ? Integer.MIN_VALUE : Integer.MAX_VALUE; } else if( sortMissingFirst ) { missingValue = top ? Integer.MAX_VALUE : Integer.MIN_VALUE; } return new SortField( field.getName(), FieldCache.NUMERIC_UTILS_INT_PARSER, top).setMissingValue(missingValue); case FLOAT: if( sortMissingLast ) { missingValue = top ? Float.NEGATIVE_INFINITY : Float.POSITIVE_INFINITY; } else if( sortMissingFirst ) { missingValue = top ? Float.POSITIVE_INFINITY : Float.NEGATIVE_INFINITY; } return new SortField( field.getName(), FieldCache.NUMERIC_UTILS_FLOAT_PARSER, top).setMissingValue(missingValue); case DATE: // fallthrough case LONG: if( sortMissingLast ) { missingValue = top ? Long.MIN_VALUE : Long.MAX_VALUE; } else if( sortMissingFirst ) { missingValue = top ? Long.MAX_VALUE : Long.MIN_VALUE; } return new SortField( field.getName(), FieldCache.NUMERIC_UTILS_LONG_PARSER, top).setMissingValue(missingValue); case DOUBLE: if( sortMissingLast ) { missingValue = top ? Double.NEGATIVE_INFINITY : Double.POSITIVE_INFINITY; } else if( sortMissingFirst ) { missingValue = top ? Double.POSITIVE_INFINITY : Double.NEGATIVE_INFINITY; } return new SortField( field.getName(), FieldCache.NUMERIC_UTILS_DOUBLE_PARSER, top).setMissingValue(missingValue); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + field.name); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public ValueSource getValueSource(SchemaField field, QParser qparser) { field.checkFieldCacheSource(qparser); switch (type) { case INTEGER: return new IntFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_INT_PARSER ); case FLOAT: return new FloatFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_FLOAT_PARSER ); case DATE: return new TrieDateFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_LONG_PARSER ); case LONG: return new LongFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_LONG_PARSER ); case DOUBLE: return new DoubleFieldSource( field.getName(), FieldCache.NUMERIC_UTILS_DOUBLE_PARSER ); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + field.name); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public Query getRangeQuery(QParser parser, SchemaField field, String min, String max, boolean minInclusive, boolean maxInclusive) { int ps = precisionStep; Query query = null; switch (type) { case INTEGER: query = NumericRangeQuery.newIntRange(field.getName(), ps, min == null ? null : Integer.parseInt(min), max == null ? null : Integer.parseInt(max), minInclusive, maxInclusive); break; case FLOAT: query = NumericRangeQuery.newFloatRange(field.getName(), ps, min == null ? null : Float.parseFloat(min), max == null ? null : Float.parseFloat(max), minInclusive, maxInclusive); break; case LONG: query = NumericRangeQuery.newLongRange(field.getName(), ps, min == null ? null : Long.parseLong(min), max == null ? null : Long.parseLong(max), minInclusive, maxInclusive); break; case DOUBLE: query = NumericRangeQuery.newDoubleRange(field.getName(), ps, min == null ? null : Double.parseDouble(min), max == null ? null : Double.parseDouble(max), minInclusive, maxInclusive); break; case DATE: query = NumericRangeQuery.newLongRange(field.getName(), ps, min == null ? null : dateField.parseMath(null, min).getTime(), max == null ? null : dateField.parseMath(null, max).getTime(), minInclusive, maxInclusive); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field"); } return query; }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public void readableToIndexed(CharSequence val, BytesRef result) { String s = val.toString(); switch (type) { case INTEGER: NumericUtils.intToPrefixCoded(Integer.parseInt(s), 0, result); break; case FLOAT: NumericUtils.intToPrefixCoded(NumericUtils.floatToSortableInt(Float.parseFloat(s)), 0, result); break; case LONG: NumericUtils.longToPrefixCoded(Long.parseLong(s), 0, result); break; case DOUBLE: NumericUtils.longToPrefixCoded(NumericUtils.doubleToSortableLong(Double.parseDouble(s)), 0, result); break; case DATE: NumericUtils.longToPrefixCoded(dateField.parseMath(null, s).getTime(), 0, result); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public String indexedToReadable(String _indexedForm) { final BytesRef indexedForm = new BytesRef(_indexedForm); switch (type) { case INTEGER: return Integer.toString( NumericUtils.prefixCodedToInt(indexedForm) ); case FLOAT: return Float.toString( NumericUtils.sortableIntToFloat(NumericUtils.prefixCodedToInt(indexedForm)) ); case LONG: return Long.toString( NumericUtils.prefixCodedToLong(indexedForm) ); case DOUBLE: return Double.toString( NumericUtils.sortableLongToDouble(NumericUtils.prefixCodedToLong(indexedForm)) ); case DATE: return dateField.toExternal( new Date(NumericUtils.prefixCodedToLong(indexedForm)) ); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public CharsRef indexedToReadable(BytesRef indexedForm, CharsRef charsRef) { final String value; switch (type) { case INTEGER: value = Integer.toString( NumericUtils.prefixCodedToInt(indexedForm) ); break; case FLOAT: value = Float.toString( NumericUtils.sortableIntToFloat(NumericUtils.prefixCodedToInt(indexedForm)) ); break; case LONG: value = Long.toString( NumericUtils.prefixCodedToLong(indexedForm) ); break; case DOUBLE: value = Double.toString( NumericUtils.sortableLongToDouble(NumericUtils.prefixCodedToLong(indexedForm)) ); break; case DATE: value = dateField.toExternal( new Date(NumericUtils.prefixCodedToLong(indexedForm)) ); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } charsRef.grow(value.length()); charsRef.length = value.length(); value.getChars(0, charsRef.length, charsRef.chars, 0); return charsRef; }
// in core/src/java/org/apache/solr/schema/TrieField.java
@Override public Object toObject(SchemaField sf, BytesRef term) { switch (type) { case INTEGER: return NumericUtils.prefixCodedToInt(term); case FLOAT: return NumericUtils.sortableIntToFloat(NumericUtils.prefixCodedToInt(term)); case LONG: return NumericUtils.prefixCodedToLong(term); case DOUBLE: return NumericUtils.sortableLongToDouble(NumericUtils.prefixCodedToLong(term)); case DATE: return new Date(NumericUtils.prefixCodedToLong(term)); default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } }
// in core/src/java/org/apache/solr/schema/TrieField.java
Override public String storedToIndexed(IndexableField f) { final BytesRef bytes = new BytesRef(NumericUtils.BUF_SIZE_LONG); final Number val = f.numericValue(); if (val != null) { switch (type) { case INTEGER: NumericUtils.intToPrefixCoded(val.intValue(), 0, bytes); break; case FLOAT: NumericUtils.intToPrefixCoded(NumericUtils.floatToSortableInt(val.floatValue()), 0, bytes); break; case LONG: //fallthrough! case DATE: NumericUtils.longToPrefixCoded(val.longValue(), 0, bytes); break; case DOUBLE: NumericUtils.longToPrefixCoded(NumericUtils.doubleToSortableLong(val.doubleValue()), 0, bytes); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + f.name()); } } else { // the following code is "deprecated" and only to support pre-3.2 indexes using the old BinaryField encoding: final BytesRef bytesRef = f.binaryValue(); if (bytesRef==null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid field contents: "+f.name()); switch (type) { case INTEGER: NumericUtils.intToPrefixCoded(toInt(bytesRef.bytes, bytesRef.offset), 0, bytes); break; case FLOAT: { // WARNING: Code Duplication! Keep in sync with o.a.l.util.NumericUtils! // copied from NumericUtils to not convert to/from float two times // code in next 2 lines is identical to: int v = NumericUtils.floatToSortableInt(Float.intBitsToFloat(toInt(arr))); int v = toInt(bytesRef.bytes, bytesRef.offset); if (v<0) v ^= 0x7fffffff; NumericUtils.intToPrefixCoded(v, 0, bytes); break; } case LONG: //fallthrough! case DATE: NumericUtils.longToPrefixCoded(toLong(bytesRef.bytes, bytesRef.offset), 0, bytes); break; case DOUBLE: { // WARNING: Code Duplication! Keep in sync with o.a.l.util.NumericUtils! // copied from NumericUtils to not convert to/from double two times // code in next 2 lines is identical to: long v = NumericUtils.doubleToSortableLong(Double.longBitsToDouble(toLong(arr))); long v = toLong(bytesRef.bytes, bytesRef.offset); if (v<0) v ^= 0x7fffffffffffffffL; NumericUtils.longToPrefixCoded(v, 0, bytes); break; } default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + f.name()); } } return bytes.utf8ToString(); }
// in core/src/java/org/apache/solr/schema/TrieField.java
Override public IndexableField createField(SchemaField field, Object value, float boost) { boolean indexed = field.indexed(); boolean stored = field.stored(); if (!indexed && !stored) { if (log.isTraceEnabled()) log.trace("Ignoring unindexed/unstored field: " + field); return null; } FieldType ft = new FieldType(); ft.setStored(stored); ft.setTokenized(true); ft.setIndexed(indexed); ft.setOmitNorms(field.omitNorms()); ft.setIndexOptions(getIndexOptions(field, value.toString())); switch (type) { case INTEGER: ft.setNumericType(NumericType.INT); break; case FLOAT: ft.setNumericType(NumericType.FLOAT); break; case LONG: ft.setNumericType(NumericType.LONG); break; case DOUBLE: ft.setNumericType(NumericType.DOUBLE); break; case DATE: ft.setNumericType(NumericType.LONG); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } ft.setNumericPrecisionStep(precisionStep); final org.apache.lucene.document.Field f; switch (type) { case INTEGER: int i = (value instanceof Number) ? ((Number)value).intValue() : Integer.parseInt(value.toString()); f = new org.apache.lucene.document.IntField(field.getName(), i, ft); break; case FLOAT: float fl = (value instanceof Number) ? ((Number)value).floatValue() : Float.parseFloat(value.toString()); f = new org.apache.lucene.document.FloatField(field.getName(), fl, ft); break; case LONG: long l = (value instanceof Number) ? ((Number)value).longValue() : Long.parseLong(value.toString()); f = new org.apache.lucene.document.LongField(field.getName(), l, ft); break; case DOUBLE: double d = (value instanceof Number) ? ((Number)value).doubleValue() : Double.parseDouble(value.toString()); f = new org.apache.lucene.document.DoubleField(field.getName(), d, ft); break; case DATE: Date date = (value instanceof Date) ? ((Date)value) : dateField.parseMath(null, value.toString()); f = new org.apache.lucene.document.LongField(field.getName(), date.getTime(), ft); break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + type); } f.setBoost(boost); return f; }
// in core/src/java/org/apache/solr/schema/TrieField.java
public static String getMainValuePrefix(org.apache.solr.schema.FieldType ft) { if (ft instanceof TrieDateField) ft = ((TrieDateField) ft).wrappedField; if (ft instanceof TrieField) { final TrieField trie = (TrieField)ft; if (trie.precisionStep == Integer.MAX_VALUE) return null; switch (trie.type) { case INTEGER: case FLOAT: return INT_PREFIX; case LONG: case DOUBLE: case DATE: return LONG_PREFIX; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown type for trie field: " + trie.type); } } return null; }
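Every TrieField method above ends its switch with the same defensive default: it throws SERVER_ERROR rather than BAD_REQUEST, because reaching it would mean the schema code itself is inconsistent, not that the client sent bad input. A compact sketch of the idiom, with IllegalStateException standing in for the 500:

    // Sketch of the defensive switch default used throughout TrieField: an
    // "unreachable" branch is still coded, and it signals a server-side bug
    // (500) rather than a client error (400).
    public class DefensiveDefaultSketch {
      enum TrieType { INTEGER, FLOAT, LONG, DOUBLE, DATE }

      static Class<?> javaType(TrieType type) {
        switch (type) {
          case INTEGER: return Integer.class;
          case FLOAT:   return Float.class;
          case LONG:    // DATE is stored as a long timestamp
          case DATE:    return Long.class;
          case DOUBLE:  return Double.class;
          default:
            // SERVER_ERROR SolrException in Solr
            throw new IllegalStateException("Unknown type for trie field: " + type);
        }
      }

      public static void main(String[] args) {
        System.out.println(javaType(TrieType.DATE)); // class java.lang.Long
      }
    }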
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public IndexableField[] createFields(SchemaField field, Object value, float boost) { String externalVal = value.toString(); //we could have tileDiff + 3 fields (two for the lat/lon, one for storage) IndexableField[] f = new IndexableField[(field.indexed() ? 2 : 0) + (field.stored() ? 1 : 0)]; if (field.indexed()) { int i = 0; double[] latLon; try { latLon = ParseUtils.parseLatitudeLongitude(null, externalVal); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } //latitude SchemaField lat = subField(field, i); f[i] = lat.createField(String.valueOf(latLon[LAT]), lat.omitNorms() ? 1F : boost); i++; //longitude SchemaField lon = subField(field, i); f[i] = lon.createField(String.valueOf(latLon[LON]), lon.omitNorms() ? 1F : boost); } if (field.stored()) { FieldType customType = new FieldType(); customType.setStored(true); f[f.length - 1] = createField(field.getName(), externalVal, customType, boost); } return f; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public Query getRangeQuery(QParser parser, SchemaField field, String part1, String part2, boolean minInclusive, boolean maxInclusive) { int dimension = 2; String[] p1; String[] p2; try { p1 = ParseUtils.parsePoint(null, part1, dimension); p2 = ParseUtils.parsePoint(null, part2, dimension); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } BooleanQuery result = new BooleanQuery(true); for (int i = 0; i < dimension; i++) { SchemaField subSF = subField(field, i); // points must currently be ordered... should we support specifying any two opposite corner points? result.add(subSF.getType().getRangeQuery(parser, subSF, p1[i], p2[i], minInclusive, maxInclusive), BooleanClause.Occur.MUST); } return result; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public Query getFieldQuery(QParser parser, SchemaField field, String externalVal) { int dimension = 2; String[] p1 = new String[0]; try { p1 = ParseUtils.parsePoint(null, externalVal, dimension); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } BooleanQuery bq = new BooleanQuery(true); for (int i = 0; i < dimension; i++) { SchemaField sf = subField(field, i); Query tq = sf.getType().getFieldQuery(parser, sf, p1[i]); bq.add(tq, BooleanClause.Occur.MUST); } return bq; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public Query createSpatialQuery(QParser parser, SpatialOptions options) { double[] point = null; try { point = ParseUtils.parseLatitudeLongitude(options.pointStr); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } // lat & lon in degrees double latCenter = point[LAT]; double lonCenter = point[LON]; DistanceCalculator distCalc = new GeodesicSphereDistCalc.Haversine(options.units.earthRadius()); SpatialContext ctx = new SimpleSpatialContext(options.units,distCalc,null); Rectangle bbox = DistanceUtils.calcBoxByDistFromPtDEG(latCenter, lonCenter, options.distance, ctx); double latMin = bbox.getMinY(); double latMax = bbox.getMaxY(); double lonMin, lonMax, lon2Min, lon2Max; if (bbox.getCrossesDateLine()) { lonMin = -180; lonMax = bbox.getMaxX(); lon2Min = bbox.getMinX(); lon2Max = 180; } else { lonMin = bbox.getMinX(); lonMax = bbox.getMaxX(); lon2Min = -180; lon2Max = 180; } // Now that we've figured out the ranges, build them! SchemaField latField = subField(options.field, LAT); SchemaField lonField = subField(options.field, LON); SpatialDistanceQuery spatial = new SpatialDistanceQuery(); if (options.bbox) { BooleanQuery result = new BooleanQuery(); Query latRange = latField.getType().getRangeQuery(parser, latField, String.valueOf(latMin), String.valueOf(latMax), true, true); result.add(latRange, BooleanClause.Occur.MUST); if (lonMin != -180 || lonMax != 180) { Query lonRange = lonField.getType().getRangeQuery(parser, lonField, String.valueOf(lonMin), String.valueOf(lonMax), true, true); if (lon2Min != -180 || lon2Max != 180) { // another valid longitude range BooleanQuery bothLons = new BooleanQuery(); bothLons.add(lonRange, BooleanClause.Occur.SHOULD); lonRange = lonField.getType().getRangeQuery(parser, lonField, String.valueOf(lon2Min), String.valueOf(lon2Max), true, true); bothLons.add(lonRange, BooleanClause.Occur.SHOULD); lonRange = bothLons; } result.add(lonRange, BooleanClause.Occur.MUST); } spatial.bboxQuery = result; } spatial.origField = options.field.getName(); spatial.latSource = latField.getType().getValueSource(latField, parser); spatial.lonSource = lonField.getType().getValueSource(lonField, parser); spatial.latMin = latMin; spatial.latMax = latMax; spatial.lonMin = lonMin; spatial.lonMax = lonMax; spatial.lon2Min = lon2Min; spatial.lon2Max = lon2Max; spatial.lon2 = lon2Min != -180 || lon2Max != 180; spatial.latCenter = latCenter; spatial.lonCenter = lonCenter; spatial.dist = options.distance; spatial.planetRadius = options.radius; spatial.calcDist = !options.bbox; return spatial; }
// in core/src/java/org/apache/solr/schema/LatLonType.java
Override public SortField getSortField(SchemaField field, boolean top) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Sorting not supported on LatLonType " + field.getName()); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public DelegatingCollector getFilterCollector(IndexSearcher searcher) { try { return new SpatialCollector(new SpatialWeight(searcher)); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } }
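Every LatLonType entry point above uses the same wrap-and-rethrow idiom: a checked exception from parsing or I/O (InvalidShapeException, or IOException in getFilterCollector) is converted into an unchecked SolrException that carries the HTTP-style ErrorCode and the original cause. A hedged, self-contained sketch of that idiom; parseLatLon and its NumberFormatException stand in for ParseUtils and InvalidShapeException:

  import org.apache.solr.common.SolrException;

  public class WrapAndRethrowSketch {
    // stand-in for ParseUtils.parseLatitudeLongitude: throws on malformed input
    static double[] parseLatLon(String s) {
      String[] parts = s.split(",");
      if (parts.length != 2) throw new NumberFormatException("expected 'lat,lon': " + s);
      return new double[] { Double.parseDouble(parts[0].trim()), Double.parseDouble(parts[1].trim()) };
    }

    static double[] parseOrBadRequest(String externalVal) {
      try {
        return parseLatLon(externalVal);
      } catch (NumberFormatException e) {
        // bad input is the client's fault: BAD_REQUEST (400), cause preserved
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
      }
    }

    public static void main(String[] args) {
      double[] p = parseOrBadRequest("48.85, 2.35");
      System.out.println(p[0] + " / " + p[1]);
    }
  }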
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
public void add(Object current) { if (!(current instanceof MultiTermAwareComponent)) return; AbstractAnalysisFactory newComponent = ((MultiTermAwareComponent)current).getMultiTermComponent(); if (newComponent instanceof TokenFilterFactory) { if (filters == null) { filters = new ArrayList<TokenFilterFactory>(2); } filters.add((TokenFilterFactory)newComponent); } else if (newComponent instanceof TokenizerFactory) { tokenizer = (TokenizerFactory)newComponent; } else if (newComponent instanceof CharFilterFactory) { if (charFilters == null) { charFilters = new ArrayList<CharFilterFactory>(1); } charFilters.add( (CharFilterFactory)newComponent); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown analysis component from MultiTermAwareComponent: " + newComponent); } }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
private Analyzer readAnalyzer(Node node) throws XPathExpressionException { final SolrResourceLoader loader = schema.getResourceLoader(); // parent node used to be passed in as "fieldtype" // if (!fieldtype.hasChildNodes()) return null; // Node node = DOMUtil.getChild(fieldtype,"analyzer"); if (node == null) return null; NamedNodeMap attrs = node.getAttributes(); String analyzerName = DOMUtil.getAttr(attrs,"class"); if (analyzerName != null) { try { // No need to be core-aware as Analyzers are not in the core-aware list final Class<? extends Analyzer> clazz = loader.findClass(analyzerName, Analyzer.class); try { // first try to use a ctor with version parameter // (needed for many new Analyzers that have no default one anymore) Constructor<? extends Analyzer> cnstr = clazz.getConstructor(Version.class); final String matchVersionStr = DOMUtil.getAttr(attrs, LUCENE_MATCH_VERSION_PARAM); final Version luceneMatchVersion = (matchVersionStr == null) ? schema.getDefaultLuceneMatchVersion() : Config.parseLuceneVersionString(matchVersionStr); if (luceneMatchVersion == null) { throw new SolrException ( SolrException.ErrorCode.SERVER_ERROR, "Configuration Error: Analyzer '" + clazz.getName() + "' needs a 'luceneMatchVersion' parameter"); } return cnstr.newInstance(luceneMatchVersion); } catch (NoSuchMethodException nsme) { // otherwise use default ctor return clazz.newInstance(); } } catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); } } // Load the CharFilters final ArrayList<CharFilterFactory> charFilters = new ArrayList<CharFilterFactory>(); AbstractPluginLoader<CharFilterFactory> charFilterLoader = new AbstractPluginLoader<CharFilterFactory> ("[schema.xml] analyzer/charFilter", CharFilterFactory.class, false, false) { @Override protected void init(CharFilterFactory plugin, Node node) throws Exception { if( plugin != null ) { final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); charFilters.add( plugin ); } } @Override protected CharFilterFactory register(String name, CharFilterFactory plugin) { return null; // used for map registration } }; charFilterLoader.load( loader, (NodeList)xpath.evaluate("./charFilter", node, XPathConstants.NODESET) ); // Load the Tokenizer // Although an analyzer only allows a single Tokenizer, we load a list to make sure // the configuration is ok final ArrayList<TokenizerFactory> tokenizers = new ArrayList<TokenizerFactory>(1); AbstractPluginLoader<TokenizerFactory> tokenizerLoader = new AbstractPluginLoader<TokenizerFactory> ("[schema.xml] analyzer/tokenizer", TokenizerFactory.class, false, false) { @Override protected void init(TokenizerFactory plugin, Node node) throws Exception { if( !tokenizers.isEmpty() ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The schema defines multiple tokenizers for: "+node ); } final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); tokenizers.add( plugin ); } @Override protected TokenizerFactory register(String name, TokenizerFactory plugin) { return null; // used for map registration } }; tokenizerLoader.load( loader, (NodeList)xpath.evaluate("./tokenizer", node, XPathConstants.NODESET) ); // Make sure something was loaded if( tokenizers.isEmpty() ) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"analyzer without class or tokenizer & filter list"); } // Load the Filters final ArrayList<TokenFilterFactory> filters = new ArrayList<TokenFilterFactory>(); AbstractPluginLoader<TokenFilterFactory> filterLoader = new AbstractPluginLoader<TokenFilterFactory>("[schema.xml] analyzer/filter", TokenFilterFactory.class, false, false) { @Override protected void init(TokenFilterFactory plugin, Node node) throws Exception { if( plugin != null ) { final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); filters.add( plugin ); } } @Override protected TokenFilterFactory register(String name, TokenFilterFactory plugin) throws Exception { return null; // used for map registration } }; filterLoader.load( loader, (NodeList)xpath.evaluate("./filter", node, XPathConstants.NODESET) ); return new TokenizerChain(charFilters.toArray(new CharFilterFactory[charFilters.size()]), tokenizers.get(0), filters.toArray(new TokenFilterFactory[filters.size()])); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
@Override protected void init(TokenizerFactory plugin, Node node) throws Exception { if( !tokenizers.isEmpty() ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The schema defines multiple tokenizers for: "+node ); } final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); tokenizers.add( plugin ); }
// in core/src/java/org/apache/solr/schema/PointType.java
@Override protected void init(IndexSchema schema, Map<String, String> args) { SolrParams p = new MapSolrParams(args); dimension = p.getInt(DIMENSION, DEFAULT_DIMENSION); if (dimension < 1) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "The dimension must be > 0: " + dimension); } args.remove(DIMENSION); this.schema = schema; super.init(schema, args); // cache suffixes createSuffixCache(dimension); }
// in core/src/java/org/apache/solr/schema/PointType.java
@Override public IndexableField[] createFields(SchemaField field, Object value, float boost) { String externalVal = value.toString(); String[] point = new String[0]; try { point = ParseUtils.parsePoint(null, externalVal, dimension); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } // TODO: this doesn't currently support polyFields as sub-field types IndexableField[] f = new IndexableField[ (field.indexed() ? dimension : 0) + (field.stored() ? 1 : 0) ]; if (field.indexed()) { for (int i=0; i<dimension; i++) { f[i] = subField(field, i).createField(point[i], boost); } } if (field.stored()) { String storedVal = externalVal; // normalize or not? FieldType customType = new FieldType(); customType.setStored(true); f[f.length - 1] = createField(field.getName(), storedVal, customType, boost); } return f; }
// in core/src/java/org/apache/solr/schema/PointType.java
Override public SortField getSortField(SchemaField field, boolean top) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Sorting not supported on PointType " + field.getName()); }
// in core/src/java/org/apache/solr/schema/PointType.java
@Override public Query getFieldQuery(QParser parser, SchemaField field, String externalVal) { String[] p1 = new String[0]; try { p1 = ParseUtils.parsePoint(null, externalVal, dimension); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } //TODO: should we assert that p1.length == dimension? BooleanQuery bq = new BooleanQuery(true); for (int i = 0; i < dimension; i++) { SchemaField sf = subField(field, i); Query tq = sf.getType().getFieldQuery(parser, sf, p1[i]); bq.add(tq, BooleanClause.Occur.MUST); } return bq; }
// in core/src/java/org/apache/solr/schema/PointType.java
public Query createSpatialQuery(QParser parser, SpatialOptions options) { Query result = null; double [] point = new double[0]; try { point = ParseUtils.parsePointDouble(null, options.pointStr, dimension); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } if (dimension == 1){ //TODO: Handle distance measures String lower = String.valueOf(point[0] - options.distance); String upper = String.valueOf(point[0] + options.distance); SchemaField subSF = subField(options.field, 0); // points must currently be ordered... should we support specifying any two opposite corner points? result = subSF.getType().getRangeQuery(parser, subSF, lower, upper, true, true); } else { BooleanQuery tmp = new BooleanQuery(); //TODO: Handle distance measures, as this assumes Euclidean double [] ur = DistanceUtils.vectorBoxCorner(point, null, options.distance, true); double [] ll = DistanceUtils.vectorBoxCorner(point, null, options.distance, false); for (int i = 0; i < ur.length; i++) { SchemaField subSF = subField(options.field, i); Query range = subSF.getType().getRangeQuery(parser, subSF, String.valueOf(ll[i]), String.valueOf(ur[i]), true, true); tmp.add(range, BooleanClause.Occur.MUST); } result = tmp; } return result; }
// in core/src/java/org/apache/solr/schema/UUIDField.java
@Override public String toInternal(String val) { if (val == null || 0==val.length() || NEW.equals(val)) { return UUID.randomUUID().toString().toLowerCase(Locale.ENGLISH); } else { // we do some basic validation if 'val' looks like a UUID if (val.length() != 36 || val.charAt(8) != DASH || val.charAt(13) != DASH || val.charAt(18) != DASH || val.charAt(23) != DASH) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid UUID String: '" + val + "'"); } return val.toLowerCase(Locale.ENGLISH); } }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
public Query createSpatialQuery(QParser parser, SpatialOptions options) { double [] point = new double[0]; try { point = ParseUtils.parsePointDouble(null, options.pointStr, 2); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } String geohash = GeohashUtils.encodeLatLon(point[0], point[1]); //TODO: optimize this return new SolrConstantScoreQuery(new ValueSourceRangeFilter(new GeohashHaversineFunction(getValueSource(options.field, parser), new LiteralValueSource(geohash), options.radius), "0", String.valueOf(options.distance), true, true)); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
@Override public String toInternal(String val) { // validate that the string is of the form // latitude, longitude double[] latLon = new double[0]; try { latLon = ParseUtils.parseLatitudeLongitude(null, val); } catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } return GeohashUtils.encodeLatLon(latLon[0], latLon[1]); }
// in core/src/java/org/apache/solr/schema/ExternalFileField.java
@Override protected void init(IndexSchema schema, Map<String, String> args) { restrictProps(SORT_MISSING_FIRST | SORT_MISSING_LAST); // valType has never been used for anything except to throw an error, so make it optional since the // code (see getValueSource) gives you a FileFloatSource. String ftypeS = args.remove("valType"); if (ftypeS != null) { ftype = schema.getFieldTypes().get(ftypeS); if (ftype != null && !(ftype instanceof FloatField) && !(ftype instanceof TrieFloatField)) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Only float and pfloat (Trie|Float)Field are currently supported as external field type. Got " + ftypeS); } } keyFieldName = args.remove("keyField"); String defValS = args.remove("defVal"); defVal = defValS == null ? 0 : Float.parseFloat(defValS); this.schema = schema; }
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void checkSortability() throws SolrException { if (! indexed() ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not sort on unindexed field: " + getName()); } if ( multiValued() ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not sort on multivalued field: " + getName()); } }
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void checkFieldCacheSource(QParser parser) throws SolrException { if (! indexed() ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not use FieldCache on unindexed field: " + getName()); } if ( multiValued() ) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not use FieldCache on multivalued field: " + getName()); } }
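checkSortability and checkFieldCacheSource are guard methods: each either returns normally or throws a BAD_REQUEST SolrException, and each declares `throws SolrException` purely as documentation, since SolrException extends RuntimeException and needs no declaration. A small sketch of the same guard-method style; the Field class here is a hypothetical stand-in for SchemaField:

  import org.apache.solr.common.SolrException;

  public class GuardMethodSketch {
    static class Field {
      final String name; final boolean indexed;
      Field(String name, boolean indexed) { this.name = name; this.indexed = indexed; }

      // declaring the unchecked SolrException documents the contract to callers
      void checkSortability() throws SolrException {
        if (!indexed) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "can not sort on unindexed field: " + name);
        }
      }
    }

    public static void main(String[] args) {
      new Field("title", true).checkSortability();   // passes silently
      // new Field("body", false).checkSortability(); // would throw 400
      System.out.println("title is sortable");
    }
  }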
// in core/src/java/org/apache/solr/schema/TextField.java
public static BytesRef analyzeMultiTerm(String field, String part, Analyzer analyzerIn) { if (part == null) return null; TokenStream source; try { source = analyzerIn.tokenStream(field, new StringReader(part)); source.reset(); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); } TermToBytesRefAttribute termAtt = source.getAttribute(TermToBytesRefAttribute.class); BytesRef bytes = termAtt.getBytesRef(); try { if (!source.incrementToken()) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"analyzer returned no terms for multiTerm term: " + part); termAtt.fillBytesRef(); if (source.incrementToken()) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"analyzer returned too many terms for multiTerm term: " + part); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); } try { source.end(); source.close(); } catch (IOException e) { throw new RuntimeException("Unable to end & close TokenStream after analyzing multiTerm term: " + part, e); } return BytesRef.deepCopyOf(bytes); }
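analyzeMultiTerm is worth reading as exception design: all input-related failures (stream initialization, zero tokens, multiple tokens) map to BAD_REQUEST, while a failure to end and close the stream, which cannot be the client's fault, escapes as a plain RuntimeException. A Lucene-free sketch of the exactly-one-token contract, where tokenize is an illustrative stand-in for the analyzer chain:

  import java.util.Iterator;
  import java.util.List;
  import org.apache.solr.common.SolrException;

  public class OneTokenContractSketch {
    static List<String> tokenize(String part) {
      return List.of(part.trim().split("\\s+")); // stand-in for the analyzer chain
    }

    static String analyzeSingleTerm(String part) {
      Iterator<String> tokens = tokenize(part).iterator();
      if (!tokens.hasNext())
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "analyzer returned no terms for multiTerm term: " + part);
      String term = tokens.next();
      if (tokens.hasNext())
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "analyzer returned too many terms for multiTerm term: " + part);
      return term;
    }

    public static void main(String[] args) {
      System.out.println(analyzeSingleTerm("Foo"));     // ok: exactly one token
      // analyzeSingleTerm("two tokens") would throw BAD_REQUEST
    }
  }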
// in core/src/java/org/apache/solr/schema/CollationField.java
private void setup(ResourceLoader loader, Map<String,String> args) { String custom = args.remove("custom"); String language = args.remove("language"); String country = args.remove("country"); String variant = args.remove("variant"); String strength = args.remove("strength"); String decomposition = args.remove("decomposition"); final Collator collator; if (custom == null && language == null) throw new SolrException(ErrorCode.SERVER_ERROR, "Either custom or language is required."); if (custom != null && (language != null || country != null || variant != null)) throw new SolrException(ErrorCode.SERVER_ERROR, "Cannot specify both language and custom. " + "To tailor rules for a built-in language, see the javadocs for RuleBasedCollator. " + "Then save the entire customized ruleset to a file, and use with the custom parameter"); if (language != null) { // create from a system collator, based on Locale. collator = createFromLocale(language, country, variant); } else { // create from a custom ruleset collator = createFromRules(custom, loader); } // set the strength flag, otherwise it will be the default. if (strength != null) { if (strength.equalsIgnoreCase("primary")) collator.setStrength(Collator.PRIMARY); else if (strength.equalsIgnoreCase("secondary")) collator.setStrength(Collator.SECONDARY); else if (strength.equalsIgnoreCase("tertiary")) collator.setStrength(Collator.TERTIARY); else if (strength.equalsIgnoreCase("identical")) collator.setStrength(Collator.IDENTICAL); else throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid strength: " + strength); } // set the decomposition flag, otherwise it will be the default. if (decomposition != null) { if (decomposition.equalsIgnoreCase("no")) collator.setDecomposition(Collator.NO_DECOMPOSITION); else if (decomposition.equalsIgnoreCase("canonical")) collator.setDecomposition(Collator.CANONICAL_DECOMPOSITION); else if (decomposition.equalsIgnoreCase("full")) collator.setDecomposition(Collator.FULL_DECOMPOSITION); else throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid decomposition: " + decomposition); } // we use 4.0 because it ensures we just encode the pure byte[] keys. analyzer = new CollationKeyAnalyzer(Version.LUCENE_40, collator); }
// in core/src/java/org/apache/solr/schema/CollationField.java
private Collator createFromLocale(String language, String country, String variant) { Locale locale; if (language != null && country == null && variant != null) throw new SolrException(ErrorCode.SERVER_ERROR, "To specify variant, country is required"); else if (language != null && country != null && variant != null) locale = new Locale(language, country, variant); else if (language != null && country != null) locale = new Locale(language, country); else locale = new Locale(language); return Collator.getInstance(locale); }
// in core/src/java/org/apache/solr/schema/AbstractSubTypeFieldType.java
@Override protected void init(IndexSchema schema, Map<String, String> args) { this.schema = schema; //it's not a first class citizen for the IndexSchema SolrParams p = new MapSolrParams(args); String subFT = p.get(SUB_FIELD_TYPE); String subSuffix = p.get(SUB_FIELD_SUFFIX); if (subFT != null) { args.remove(SUB_FIELD_TYPE); subType = schema.getFieldTypeByName(subFT.trim()); suffix = POLY_FIELD_SEPARATOR + subType.typeName; } else if (subSuffix != null) { args.remove(SUB_FIELD_SUFFIX); suffix = subSuffix; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "The field type: " + typeName + " must specify the " + SUB_FIELD_TYPE + " attribute or the " + SUB_FIELD_SUFFIX + " attribute."); } }
// in core/src/java/org/apache/solr/schema/FieldType.java
protected String getArg(String n, Map<String,String> args) { String s = args.remove(n); if (s == null) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Missing parameter '"+n+"' for FieldType=" + typeName +args); } return s; }
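getArg shows the required-parameter idiom used throughout the schema code: the value is removed from the args map (so leftover keys can later be reported as unknown), and a missing key is treated as a configuration problem, hence SERVER_ERROR rather than BAD_REQUEST. A brief sketch under those assumptions, using only the SolrException API shown above:

  import java.util.HashMap;
  import java.util.Map;
  import org.apache.solr.common.SolrException;

  public class RequiredArgSketch {
    static String getRequired(String name, Map<String, String> args, String typeName) {
      String s = args.remove(name); // consume the key so leftovers can be flagged later
      if (s == null) {
        // missing schema configuration is a server-side problem: 500, not 400
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "Missing parameter '" + name + "' for FieldType=" + typeName + args);
      }
      return s;
    }

    public static void main(String[] args) {
      Map<String, String> config = new HashMap<>();
      config.put("dimension", "2");
      System.out.println(getRequired("dimension", config, "point")); // "2"
      System.out.println(config); // now empty: the key was consumed
    }
  }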
// in core/src/java/org/apache/solr/schema/FieldType.java
public IndexableField createField(SchemaField field, Object value, float boost) { if (!field.indexed() && !field.stored()) { if (log.isTraceEnabled()) log.trace("Ignoring unindexed/unstored field: " + field); return null; } String val; try { val = toInternal(value.toString()); } catch (RuntimeException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error while creating field '" + field + "' from value '" + value + "'", e); } if (val==null) return null; org.apache.lucene.document.FieldType newType = new org.apache.lucene.document.FieldType(); newType.setIndexed(field.indexed()); newType.setTokenized(field.isTokenized()); newType.setStored(field.stored()); newType.setOmitNorms(field.omitNorms()); newType.setIndexOptions(getIndexOptions(field, val)); newType.setStoreTermVectors(field.storeTermVector()); newType.setStoreTermVectorOffsets(field.storeTermOffsets()); newType.setStoreTermVectorPositions(field.storeTermPositions()); return createField(field.getName(), val, newType, boost); }
// in core/src/java/org/apache/solr/schema/FieldType.java
public void setAnalyzer(Analyzer analyzer) { throw new SolrException (ErrorCode.SERVER_ERROR, "FieldType: " + this.getClass().getSimpleName() + " (" + typeName + ") does not support specifying an analyzer"); }
// in core/src/java/org/apache/solr/schema/FieldType.java
public void setQueryAnalyzer(Analyzer analyzer) { throw new SolrException (ErrorCode.SERVER_ERROR, "FieldType: " + this.getClass().getSimpleName() + " (" + typeName + ") does not support specifying an analyzer"); }
// in core/src/java/org/apache/solr/search/SpatialFilterQParser.java
@Override public Query parse() throws ParseException { //if more than one, we need to treat them as a point... //TODO: Should we accept multiple fields String[] fields = localParams.getParams("f"); if (fields == null || fields.length == 0) { String field = getParam(SpatialParams.FIELD); if (field == null) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, " missing sfield for spatial request"); fields = new String[] {field}; } String pointStr = getParam(SpatialParams.POINT); if (pointStr == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, SpatialParams.POINT + " missing."); } double dist = -1; String distS = getParam(SpatialParams.DISTANCE); if (distS != null) dist = Double.parseDouble(distS); if (dist < 0) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, SpatialParams.DISTANCE + " must be >= 0"); } String measStr = localParams.get(SpatialParams.MEASURE); //TODO: Need to do something with Measures Query result = null; //fields is valid at this point if (fields.length == 1) { SchemaField sf = req.getSchema().getField(fields[0]); FieldType type = sf.getType(); if (type instanceof SpatialQueryable) { double radius = localParams.getDouble(SpatialParams.SPHERE_RADIUS, DistanceUtils.EARTH_MEAN_RADIUS_KM); SpatialOptions opts = new SpatialOptions(pointStr, dist, sf, measStr, radius, DistanceUnits.KILOMETERS); opts.bbox = bbox; result = ((SpatialQueryable)type).createSpatialQuery(this, opts); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "The field " + fields[0] + " does not support spatial filtering"); } } else {// fields.length > 1 //TODO: Not sure about this just yet, is there a way to delegate, or do we just have a helper class? //Seems like we could just use FunctionQuery, but then what about scoring /*List<ValueSource> sources = new ArrayList<ValueSource>(fields.length); for (String field : fields) { SchemaField sf = schema.getField(field); sources.add(sf.getType().getValueSource(sf, this)); } MultiValueSource vs = new VectorValueSource(sources); ValueSourceRangeFilter rf = new ValueSourceRangeFilter(vs, "0", String.valueOf(dist), true, true); result = new SolrConstantScoreQuery(rf);*/ } return result; }
// in core/src/java/org/apache/solr/search/Grouping.java
public void execute() throws IOException { if (commands.isEmpty()) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Specify at least one field, function or query to group by."); } DocListAndSet out = new DocListAndSet(); qr.setDocListAndSet(out); SolrIndexSearcher.ProcessedFilter pf = searcher.getProcessedFilter(cmd.getFilter(), cmd.getFilterList()); final Filter luceneFilter = pf.filter; maxDoc = searcher.maxDoc(); needScores = (cmd.getFlags() & SolrIndexSearcher.GET_SCORES) != 0; boolean cacheScores = false; // NOTE: Change this when groupSort can be specified per group if (!needScores && !commands.isEmpty()) { if (commands.get(0).groupSort == null) { cacheScores = true; } else { for (SortField field : commands.get(0).groupSort.getSort()) { if (field.getType() == SortField.Type.SCORE) { cacheScores = true; break; } } } } else if (needScores) { cacheScores = needScores; } getDocSet = (cmd.getFlags() & SolrIndexSearcher.GET_DOCSET) != 0; getDocList = (cmd.getFlags() & SolrIndexSearcher.GET_DOCLIST) != 0; query = QueryUtils.makeQueryable(cmd.getQuery()); for (Command cmd : commands) { cmd.prepare(); } AbstractAllGroupHeadsCollector<?> allGroupHeadsCollector = null; List<Collector> collectors = new ArrayList<Collector>(commands.size()); for (Command cmd : commands) { Collector collector = cmd.createFirstPassCollector(); if (collector != null) { collectors.add(collector); } if (getGroupedDocSet && allGroupHeadsCollector == null) { collectors.add(allGroupHeadsCollector = cmd.createAllGroupCollector()); } } Collector allCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()])); DocSetCollector setCollector = null; if (getDocSet && allGroupHeadsCollector == null) { setCollector = new DocSetDelegateCollector(maxDoc >> 6, maxDoc, allCollectors); allCollectors = setCollector; } CachingCollector cachedCollector = null; if (cacheSecondPassSearch && allCollectors != null) { int maxDocsToCache = (int) Math.round(maxDoc * (maxDocsPercentageToCache / 100.0d)); // Only makes sense to cache if we cache more than zero. // Maybe we should have a minimum and a maximum, that defines the window we would like caching for. if (maxDocsToCache > 0) { allCollectors = cachedCollector = CachingCollector.create(allCollectors, cacheScores, maxDocsToCache); } } if (pf.postFilter != null) { pf.postFilter.setLastDelegate(allCollectors); allCollectors = pf.postFilter; } if (allCollectors != null) { searchWithTimeLimiter(luceneFilter, allCollectors); } if (getGroupedDocSet && allGroupHeadsCollector != null) { FixedBitSet fixedBitSet = allGroupHeadsCollector.retrieveGroupHeads(maxDoc); long[] bits = fixedBitSet.getBits(); OpenBitSet openBitSet = new OpenBitSet(bits, bits.length); qr.setDocSet(new BitDocSet(openBitSet)); } else if (getDocSet) { qr.setDocSet(setCollector.getDocSet()); } collectors.clear(); for (Command cmd : commands) { Collector collector = cmd.createSecondPassCollector(); if (collector != null) collectors.add(collector); } if (!collectors.isEmpty()) { Collector secondPhaseCollectors = MultiCollector.wrap(collectors.toArray(new Collector[collectors.size()])); if (collectors.size() > 0) { if (cachedCollector != null) { if (cachedCollector.isCached()) { cachedCollector.replay(secondPhaseCollectors); } else { signalCacheWarning = true; logger.warn(String.format("The grouping cache is active, but not used because it exceeded the max cache limit of %d percent", maxDocsPercentageToCache)); logger.warn("Please increase cache size or disable group caching."); searchWithTimeLimiter(luceneFilter, secondPhaseCollectors); } } else { if (pf.postFilter != null) { pf.postFilter.setLastDelegate(secondPhaseCollectors); secondPhaseCollectors = pf.postFilter; } searchWithTimeLimiter(luceneFilter, secondPhaseCollectors); } } } for (Command cmd : commands) { cmd.finish(); } qr.groupedResults = grouped; if (getDocList) { int sz = idSet.size(); int[] ids = new int[sz]; int idx = 0; for (int val : idSet) { ids[idx++] = val; } qr.setDocList(new DocSlice(0, sz, ids, null, maxMatches, maxScore)); } }
// in core/src/java/org/apache/solr/search/ReturnFields.java
private void add(String fl, NamedList<String> rename, DocTransformers augmenters, SolrQueryRequest req) { if( fl == null ) { return; } try { QueryParsing.StrParser sp = new QueryParsing.StrParser(fl); for(;;) { sp.opt(','); sp.eatws(); if (sp.pos >= sp.end) break; int start = sp.pos; // short circuit test for a really simple field name String key = null; String field = getFieldName(sp); char ch = sp.ch(); if (field != null) { if (sp.opt(':')) { // this was a key, not a field name key = field; field = null; sp.eatws(); start = sp.pos; } else { if (ch==' ' || ch == ',' || ch==0) { addField( field, key, augmenters, req ); continue; } // an invalid field name... reset the position pointer to retry sp.pos = start; field = null; } } if (key != null) { // we read "key : " field = sp.getId(null); ch = sp.ch(); if (field != null && (ch==' ' || ch == ',' || ch==0)) { rename.add(field, key); addField( field, key, augmenters, req ); continue; } // an invalid field name... reset the position pointer to retry sp.pos = start; field = null; } if (field == null) { // We didn't find a simple name, so let's see if it's a globbed field name. // Globbing only works with field names of the recommended form (roughly like java identifiers) field = sp.getGlobbedId(null); ch = sp.ch(); if (field != null && (ch==' ' || ch == ',' || ch==0)) { // "*" looks and acts like a glob, but we give it special treatment if ("*".equals(field)) { _wantsAllFields = true; } else { globs.add(field); } continue; } // an invalid glob sp.pos = start; } String funcStr = sp.val.substring(start); // Is it an augmenter of the form [augmenter_name foo=1 bar=myfield]? // This is identical to localParams syntax except it uses [] instead of {!} if (funcStr.startsWith("[")) { Map<String,String> augmenterArgs = new HashMap<String,String>(); int end = QueryParsing.parseLocalParams(funcStr, 0, augmenterArgs, req.getParams(), "[", ']'); sp.pos += end; // [foo] is short for [type=foo] in localParams syntax String augmenterName = augmenterArgs.remove("type"); String disp = key; if( disp == null ) { disp = '['+augmenterName+']'; } TransformerFactory factory = req.getCore().getTransformerFactory( augmenterName ); if( factory != null ) { MapSolrParams augmenterParams = new MapSolrParams( augmenterArgs ); augmenters.addTransformer( factory.create(disp, augmenterParams, req) ); } else { // unknown transformer? } addField(field, disp, augmenters, req); continue; } // let's try it as a function instead QParser parser = QParser.getParser(funcStr, FunctionQParserPlugin.NAME, req); Query q = null; ValueSource vs = null; try { if (parser instanceof FunctionQParser) { FunctionQParser fparser = (FunctionQParser)parser; fparser.setParseMultipleSources(false); fparser.setParseToEnd(false); q = fparser.getQuery(); if (fparser.localParams != null) { if (fparser.valFollowedParams) { // need to find the end of the function query via the string parser int leftOver = fparser.sp.end - fparser.sp.pos; sp.pos = sp.end - leftOver; // reset our parser to the same amount of leftover } else { // the value was via the "v" param in localParams, so we need to find // the end of the local params themselves to pick up where we left off sp.pos = start + fparser.localParamsEnd; } } else { // need to find the end of the function query via the string parser int leftOver = fparser.sp.end - fparser.sp.pos; sp.pos = sp.end - leftOver; // reset our parser to the same amount of leftover } } else { // A QParser that's not for function queries. // It must have been specified via local params. q = parser.getQuery(); assert parser.getLocalParams() != null; sp.pos = start + parser.localParamsEnd; } if (q instanceof FunctionQuery) { vs = ((FunctionQuery)q).getValueSource(); } else { vs = new QueryValueSource(q, 0.0f); } if (key==null) { SolrParams localParams = parser.getLocalParams(); if (localParams != null) { key = localParams.get("key"); } if (key == null) { // use the function name itself as the field name key = sp.val.substring(start, sp.pos); } } if (key==null) { key = funcStr; } okFieldNames.add( key ); okFieldNames.add( funcStr ); augmenters.addTransformer( new ValueSourceAugmenter( key, parser, vs ) ); } catch (ParseException e) { // try again, simple rules for a field name with no whitespace sp.pos = start; field = sp.getSimpleString(); if (req.getSchema().getFieldOrNull(field) != null) { // OK, it was an oddly named field fields.add(field); if( key != null ) { rename.add(field, key); } } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname: " + e.getMessage(), e); } } // end try as function } // end for(;;) } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname", e); } }
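The ReturnFields parser above uses ParseException as control flow for fallbacks: a token is first tried as a function query, and only if that throws is it retried as a plain field name; when the fallback also fails, the original exception is rethrown wrapped in a BAD_REQUEST SolrException so the client sees the real cause. A self-contained sketch of that try-then-fallback shape; the two parse methods are illustrative stand-ins:

  import org.apache.solr.common.SolrException;

  public class FallbackParseSketch {
    static int parseAsFunction(String s) throws Exception {
      if (!s.startsWith("sum(")) throw new Exception("not a function: " + s);
      return 1;
    }

    static int parseAsFieldName(String s) throws Exception {
      if (!s.matches("[A-Za-z_][A-Za-z0-9_]*")) throw new Exception("not a field: " + s);
      return 2;
    }

    static int parse(String s) {
      try {
        return parseAsFunction(s);
      } catch (Exception functionFailure) {
        try {
          return parseAsFieldName(s); // fallback: maybe it was a plain field after all
        } catch (Exception fieldFailure) {
          // neither parse worked: report the first failure as the cause, like ReturnFields does
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Error parsing fieldname: " + functionFailure.getMessage(), functionFailure);
        }
      }
    }

    public static void main(String[] args) {
      System.out.println(parse("sum(a,b)")); // 1: parsed as a function
      System.out.println(parse("title"));    // 2: fell back to a field name
    }
  }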
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
@Override public Weight createWeight(IndexSearcher searcher) { try { return new SolrConstantScoreQuery.ConstantWeight(searcher); } catch (IOException e) { // TODO: remove this if ConstantScoreQuery.createWeight adds IOException throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/search/QueryParsing.java
public static Sort parseSort(String sortSpec, SolrQueryRequest req) { if (sortSpec == null || sortSpec.length() == 0) return null; List<SortField> lst = new ArrayList<SortField>(4); try { StrParser sp = new StrParser(sortSpec); while (sp.pos < sp.end) { sp.eatws(); final int start = sp.pos; // short circuit test for a really simple field name String field = sp.getId(null); Exception qParserException = null; if (field == null || !Character.isWhitespace(sp.peekChar())) { // let's try it as a function instead field = null; String funcStr = sp.val.substring(start); QParser parser = QParser.getParser(funcStr, FunctionQParserPlugin.NAME, req); Query q = null; try { if (parser instanceof FunctionQParser) { FunctionQParser fparser = (FunctionQParser)parser; fparser.setParseMultipleSources(false); fparser.setParseToEnd(false); q = fparser.getQuery(); if (fparser.localParams != null) { if (fparser.valFollowedParams) { // need to find the end of the function query via the string parser int leftOver = fparser.sp.end - fparser.sp.pos; sp.pos = sp.end - leftOver; // reset our parser to the same amount of leftover } else { // the value was via the "v" param in localParams, so we need to find // the end of the local params themselves to pick up where we left off sp.pos = start + fparser.localParamsEnd; } } else { // need to find the end of the function query via the string parser int leftOver = fparser.sp.end - fparser.sp.pos; sp.pos = sp.end - leftOver; // reset our parser to the same amount of leftover } } else { // A QParser that's not for function queries. // It must have been specified via local params. q = parser.getQuery(); assert parser.getLocalParams() != null; sp.pos = start + parser.localParamsEnd; } Boolean top = sp.getSortDirection(); if (null != top) { // we have a Query and a valid direction if (q instanceof FunctionQuery) { lst.add(((FunctionQuery)q).getValueSource().getSortField(top)); } else { lst.add((new QueryValueSource(q, 0.0f)).getSortField(top)); } continue; } } catch (IOException ioe) { throw ioe; } catch (Exception e) { // hang onto this in case the string isn't a full field name either qParserException = e; } } // if we made it here, we either have a "simple" field name, // or there was a problem parsing the string as a complex func/query if (field == null) { // try again, simple rules for a field name with no whitespace sp.pos = start; field = sp.getSimpleString(); } Boolean top = sp.getSortDirection(); if (null == top) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't determine a Sort Order (asc or desc) in sort spec " + sp); } if (SCORE.equals(field)) { if (top) { lst.add(SortField.FIELD_SCORE); } else { lst.add(new SortField(null, SortField.Type.SCORE, true)); } } else if (DOCID.equals(field)) { lst.add(new SortField(null, SortField.Type.DOC, top)); } else { // try to find the field SchemaField sf = req.getSchema().getFieldOrNull(field); if (null == sf) { if (null != qParserException) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "sort param could not be parsed as a query, and is not a "+ "field that exists in the index: " + field, qParserException); } throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "sort param field can't be found: " + field); } lst.add(sf.getSortField(top)); } } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); } // normalize a sort on score desc to null if (lst.size()==1 && lst.get(0) == SortField.FIELD_SCORE) { return null; } return new Sort(lst.toArray(new SortField[lst.size()])); }
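parseSort refines the same fallback idea: the exception from the function-parse attempt is not rethrown immediately but held in qParserException, and it only becomes the cause of a BAD_REQUEST if the simple-field interpretation also fails; a successful fallback discards it. A sketch of that deferred-cause pattern; the "func:" prefix and SCHEMA_FIELDS set are hypothetical:

  import java.util.Set;
  import org.apache.solr.common.SolrException;

  public class DeferredCauseSketch {
    static final Set<String> SCHEMA_FIELDS = Set.of("price", "title");

    static String resolveSortField(String spec) {
      Exception parserException = null;
      try {
        if (!spec.startsWith("func:")) throw new Exception("not a function: " + spec);
        return spec.substring(5); // parsed as a function
      } catch (Exception e) {
        parserException = e; // hang onto this in case the field lookup fails too
      }
      if (SCHEMA_FIELDS.contains(spec)) return spec; // fallback succeeded, exception discarded
      // both interpretations failed: surface the remembered parse failure as the cause
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "sort param could not be parsed as a query, and is not a field that exists in the index: " + spec,
          parserException);
    }

    public static void main(String[] args) {
      System.out.println(resolveSortField("price"));      // plain field
      System.out.println(resolveSortField("func:score")); // function form
    }
  }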
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/SearchGroupShardResponseProcessor.java
public void process(ResponseBuilder rb, ShardRequest shardRequest) { SortSpec ss = rb.getSortSpec(); Sort groupSort = rb.getGroupingSpec().getGroupSort(); String[] fields = rb.getGroupingSpec().getFields(); Map<String, List<Collection<SearchGroup<BytesRef>>>> commandSearchGroups = new HashMap<String, List<Collection<SearchGroup<BytesRef>>>>(); Map<String, Map<SearchGroup<BytesRef>, Set<String>>> tempSearchGroupToShards = new HashMap<String, Map<SearchGroup<BytesRef>, Set<String>>>(); for (String field : fields) { commandSearchGroups.put(field, new ArrayList<Collection<SearchGroup<BytesRef>>>(shardRequest.responses.size())); tempSearchGroupToShards.put(field, new HashMap<SearchGroup<BytesRef>, Set<String>>()); if (!rb.searchGroupToShards.containsKey(field)) { rb.searchGroupToShards.put(field, new HashMap<SearchGroup<BytesRef>, Set<String>>()); } } SearchGroupsResultTransformer serializer = new SearchGroupsResultTransformer(rb.req.getSearcher()); try { int maxElapsedTime = 0; int hitCountDuringFirstPhase = 0; for (ShardResponse srsp : shardRequest.responses) { maxElapsedTime = (int) Math.max(maxElapsedTime, srsp.getSolrResponse().getElapsedTime()); @SuppressWarnings("unchecked") NamedList<NamedList> firstPhaseResult = (NamedList<NamedList>) srsp.getSolrResponse().getResponse().get("firstPhase"); Map<String, Pair<Integer, Collection<SearchGroup<BytesRef>>>> result = serializer.transformToNative(firstPhaseResult, groupSort, null, srsp.getShard()); for (String field : commandSearchGroups.keySet()) { Pair<Integer, Collection<SearchGroup<BytesRef>>> firstPhaseCommandResult = result.get(field); Integer groupCount = firstPhaseCommandResult.getA(); if (groupCount != null) { Integer existingGroupCount = rb.mergedGroupCounts.get(field); // Assuming groups don't cross shard boundary... rb.mergedGroupCounts.put(field, existingGroupCount != null ? existingGroupCount + groupCount : groupCount); } Collection<SearchGroup<BytesRef>> searchGroups = firstPhaseCommandResult.getB(); if (searchGroups == null) { continue; } commandSearchGroups.get(field).add(searchGroups); for (SearchGroup<BytesRef> searchGroup : searchGroups) { Map<SearchGroup<BytesRef>, java.util.Set<String>> map = tempSearchGroupToShards.get(field); Set<String> shards = map.get(searchGroup); if (shards == null) { shards = new HashSet<String>(); map.put(searchGroup, shards); } shards.add(srsp.getShard()); } } hitCountDuringFirstPhase += (Integer) srsp.getSolrResponse().getResponse().get("totalHitCount"); } rb.totalHitCount = hitCountDuringFirstPhase; rb.firstPhaseElapsedTime = maxElapsedTime; for (String groupField : commandSearchGroups.keySet()) { List<Collection<SearchGroup<BytesRef>>> topGroups = commandSearchGroups.get(groupField); Collection<SearchGroup<BytesRef>> mergedTopGroups = SearchGroup.merge(topGroups, ss.getOffset(), ss.getCount(), groupSort); if (mergedTopGroups == null) { continue; } rb.mergedSearchGroups.put(groupField, mergedTopGroups); for (SearchGroup<BytesRef> mergedTopGroup : mergedTopGroups) { rb.searchGroupToShards.get(groupField).put(mergedTopGroup, tempSearchGroupToShards.get(groupField).get(mergedTopGroup)); } } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/search/JoinQParserPlugin.java
public Query parse() throws ParseException { String fromField = getParam("from"); String fromIndex = getParam("fromIndex"); String toField = getParam("to"); String v = localParams.get("v"); Query fromQuery; long fromCoreOpenTime = 0; if (fromIndex != null && !fromIndex.equals(req.getCore().getCoreDescriptor().getName()) ) { CoreContainer container = req.getCore().getCoreDescriptor().getCoreContainer(); final SolrCore fromCore = container.getCore(fromIndex); RefCounted<SolrIndexSearcher> fromHolder = null; if (fromCore == null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Cross-core join: no such core " + fromIndex); } LocalSolrQueryRequest otherReq = new LocalSolrQueryRequest(fromCore, params); try { QParser parser = QParser.getParser(v, "lucene", otherReq); fromQuery = parser.getQuery(); fromHolder = fromCore.getRegisteredSearcher(); if (fromHolder != null) fromCoreOpenTime = fromHolder.get().getOpenTime(); } finally { otherReq.close(); fromCore.close(); if (fromHolder != null) fromHolder.decref(); } } else { QParser fromQueryParser = subQuery(v, null); fromQuery = fromQueryParser.getQuery(); } JoinQuery jq = new JoinQuery(fromField, toField, fromIndex, fromQuery); jq.fromCoreOpenTime = fromCoreOpenTime; return jq; }
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
private void checkNullField(String field) throws SolrException { if (field == null && defaultField == null) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "no field name specified in query and no defaultSearchField defined in schema.xml"); } }
// in core/src/java/org/apache/solr/search/DocSetBase.java
public void add(int doc) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Unsupported Operation"); }
// in core/src/java/org/apache/solr/search/DocSetBase.java
public void addUnique(int doc) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Unsupported Operation"); }
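DocSetBase reports unsupported mutators with a SERVER_ERROR SolrException rather than the JDK's UnsupportedOperationException, which keeps the failure in the same ErrorCode vocabulary as the rest of Solr. A short sketch of the same choice (the class is illustrative, not Solr's):

  import org.apache.solr.common.SolrException;

  public class UnsupportedOpSketch {
    // read-only view: mutation is a programming error, reported in Solr's own vocabulary
    public void add(int doc) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unsupported Operation");
    }

    public static void main(String[] args) {
      try {
        new UnsupportedOpSketch().add(42);
      } catch (SolrException e) {
        System.out.println("caught: " + e.getMessage());
      }
    }
  }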
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public ValueSource parse(FunctionQParser fp) throws ParseException { double radius = fp.parseDouble(); //SOLR-2114, make the convert flag required, since the parser doesn't support much in the way of lookahead or the ability to convert a String into a ValueSource boolean convert = Boolean.parseBoolean(fp.parseArg()); MultiValueSource pv1; MultiValueSource pv2; ValueSource one = fp.parseValueSource(); ValueSource two = fp.parseValueSource(); if (fp.hasMoreArguments()) { List<ValueSource> s1 = new ArrayList<ValueSource>(); s1.add(one); s1.add(two); pv1 = new VectorValueSource(s1); ValueSource x2 = fp.parseValueSource(); ValueSource y2 = fp.parseValueSource(); List<ValueSource> s2 = new ArrayList<ValueSource>(); s2.add(x2); s2.add(y2); pv2 = new VectorValueSource(s2); } else { //check to see if we have multiValue source if (one instanceof MultiValueSource && two instanceof MultiValueSource){ pv1 = (MultiValueSource) one; pv2 = (MultiValueSource) two; } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Input must either be 2 MultiValueSources, or there must be 4 ValueSources"); } } return new HaversineFunction(pv1, pv2, radius, convert); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
private static MVResult getMultiValueSources(List<ValueSource> sources) { MVResult mvr = new MVResult(); if (sources.size() % 2 != 0) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Illegal number of sources. There must be an even number of sources"); } if (sources.size() == 2) { //check to see if these are MultiValueSource boolean s1MV = sources.get(0) instanceof MultiValueSource; boolean s2MV = sources.get(1) instanceof MultiValueSource; if (s1MV && s2MV) { mvr.mv1 = (MultiValueSource) sources.get(0); mvr.mv2 = (MultiValueSource) sources.get(1); } else if (s1MV || s2MV) { //if one is a MultiValueSource, than the other one needs to be too. throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Illegal number of sources. There must be an even number of sources"); } else { mvr.mv1 = new VectorValueSource(Collections.singletonList(sources.get(0))); mvr.mv2 = new VectorValueSource(Collections.singletonList(sources.get(1))); } } else { int dim = sources.size() / 2; List<ValueSource> sources1 = new ArrayList<ValueSource>(dim); List<ValueSource> sources2 = new ArrayList<ValueSource>(dim); //Get dim value sources for the first vector splitSources(dim, sources, sources1, sources2); mvr.mv1 = new VectorValueSource(sources1); mvr.mv2 = new VectorValueSource(sources2); } return mvr; }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
public ValueSource getValueSource(FunctionQParser fp, String arg) { if (arg == null) return null; SchemaField f = fp.req.getSchema().getField(arg); if (f.getType().getClass() == DateField.class) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't use ms() function on non-numeric legacy date field " + arg); } return f.getType().getValueSource(f, fp); }
// in core/src/java/org/apache/solr/search/ValueSourceParser.java
@Override public FunctionValues getValues(Map context, AtomicReaderContext readerContext) throws IOException { if (context.get(this) == null) { SolrRequestInfo requestInfo = SolrRequestInfo.getRequestInfo(); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "testfunc: unweighted value source detected. delegate="+source + " request=" + (requestInfo==null ? "null" : requestInfo.getReq())); } return source.getValues(context, readerContext); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected Formatter getFormatter(String fieldName, SolrParams params ) { String str = params.getFieldParam( fieldName, HighlightParams.FORMATTER ); SolrFormatter formatter = formatters.get( str ); if( formatter == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Unknown formatter: "+str ); } return formatter.getFormatter( fieldName, params ); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected Encoder getEncoder(String fieldName, SolrParams params){ String str = params.getFieldParam( fieldName, HighlightParams.ENCODER ); SolrEncoder encoder = encoders.get( str ); if( encoder == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Unknown encoder: "+str ); } return encoder.getEncoder( fieldName, params ); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected Fragmenter getFragmenter(String fieldName, SolrParams params) { String fmt = params.getFieldParam( fieldName, HighlightParams.FRAGMENTER ); SolrFragmenter frag = fragmenters.get( fmt ); if( frag == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Unknown fragmenter: "+fmt ); } return frag.getFragmenter( fieldName, params ); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
protected FragListBuilder getFragListBuilder( String fieldName, SolrParams params ){ String flb = params.getFieldParam( fieldName, HighlightParams.FRAG_LIST_BUILDER ); SolrFragListBuilder solrFlb = fragListBuilders.get( flb ); if( solrFlb == null ){ throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Unknown fragListBuilder: " + flb ); } return solrFlb.getFragListBuilder( params ); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private SolrFragmentsBuilder getSolrFragmentsBuilder( String fieldName, SolrParams params ){ String fb = params.getFieldParam( fieldName, HighlightParams.FRAGMENTS_BUILDER ); SolrFragmentsBuilder solrFb = fragmentsBuilders.get( fb ); if( solrFb == null ){ throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Unknown fragmentsBuilder: " + fb ); } return solrFb; }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private BoundaryScanner getBoundaryScanner(String fieldName, SolrParams params){ String bs = params.getFieldParam(fieldName, HighlightParams.BOUNDARY_SCANNER); SolrBoundaryScanner solrBs = boundaryScanners.get(bs); if(solrBs == null){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown boundaryScanner: " + bs); } return solrBs.getBoundaryScanner(fieldName, params); }
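The six getters above are the same lookup-or-throw pattern stamped out once per plugin registry: resolve a name in a Map and fail with BAD_REQUEST naming the plugin kind if it is absent. A generic helper in the same spirit could collapse them; a minimal sketch, not Solr code:

  import java.util.Map;
  import org.apache.solr.common.SolrException;

  public class RegistryLookupSketch {
    static <T> T lookupOrThrow(Map<String, T> registry, String name, String kind) {
      T plugin = registry.get(name);
      if (plugin == null) {
        // an unknown plugin name in a request parameter is the client's mistake: 400
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Unknown " + kind + ": " + name);
      }
      return plugin;
    }

    public static void main(String[] args) {
      Map<String, String> formatters = Map.of("simple", "SimpleFormatterStandIn");
      System.out.println(lookupOrThrow(formatters, "simple", "formatter"));
      // lookupOrThrow(formatters, "fancy", "formatter") would throw BAD_REQUEST
    }
  }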
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
private void doHighlightingByHighlighter( Query query, SolrQueryRequest req, NamedList docSummaries, int docId, Document doc, String fieldName ) throws IOException { final SolrIndexSearcher searcher = req.getSearcher(); final IndexSchema schema = searcher.getSchema(); // TODO: Currently in trunk highlighting numeric fields is broken (Lucene) - // so we disable them until fixed (see LUCENE-3080)! // BEGIN: Hack final SchemaField schemaField = schema.getFieldOrNull(fieldName); if (schemaField != null && ( (schemaField.getType() instanceof org.apache.solr.schema.TrieField) || (schemaField.getType() instanceof org.apache.solr.schema.TrieDateField) )) return; // END: Hack SolrParams params = req.getParams(); IndexableField[] docFields = doc.getFields(fieldName); List<String> listFields = new ArrayList<String>(); for (IndexableField field : docFields) { listFields.add(field.stringValue()); } String[] docTexts = (String[]) listFields.toArray(new String[listFields.size()]); // according to Document javadoc, doc.getValues() never returns null. check empty instead of null if (docTexts.length == 0) return; TokenStream tstream = null; int numFragments = getMaxSnippets(fieldName, params); boolean mergeContiguousFragments = isMergeContiguousFragments(fieldName, params); String[] summaries = null; List<TextFragment> frags = new ArrayList<TextFragment>(); TermOffsetsTokenStream tots = null; // to be non-null iff we're using TermOffsets optimization try { TokenStream tvStream = TokenSources.getTokenStream(searcher.getIndexReader(), docId, fieldName); if (tvStream != null) { tots = new TermOffsetsTokenStream(tvStream); } } catch (IllegalArgumentException e) { // No problem. But we can't use TermOffsets optimization. } for (int j = 0; j < docTexts.length; j++) { if( tots != null ) { // if we're using TermOffsets optimization, then get the next // field value's TokenStream (i.e. get field j's TokenStream) from tots: tstream = tots.getMultiValuedTokenStream( docTexts[j].length() ); } else { // fall back to analyzer tstream = createAnalyzerTStream(schema, fieldName, docTexts[j]); } int maxCharsToAnalyze = params.getFieldInt(fieldName, HighlightParams.MAX_CHARS, Highlighter.DEFAULT_MAX_CHARS_TO_ANALYZE); Highlighter highlighter; if (Boolean.valueOf(req.getParams().get(HighlightParams.USE_PHRASE_HIGHLIGHTER, "true"))) { if (maxCharsToAnalyze < 0) { tstream = new CachingTokenFilter(tstream); } else { tstream = new CachingTokenFilter(new OffsetLimitTokenFilter(tstream, maxCharsToAnalyze)); } // get highlighter highlighter = getPhraseHighlighter(query, fieldName, req, (CachingTokenFilter) tstream); // after highlighter initialization, reset tstream since construction of highlighter already used it tstream.reset(); } else { // use "the old way" highlighter = getHighlighter(query, fieldName, req); } if (maxCharsToAnalyze < 0) { highlighter.setMaxDocCharsToAnalyze(docTexts[j].length()); } else { highlighter.setMaxDocCharsToAnalyze(maxCharsToAnalyze); } try { TextFragment[] bestTextFragments = highlighter.getBestTextFragments(tstream, docTexts[j], mergeContiguousFragments, numFragments); for (int k = 0; k < bestTextFragments.length; k++) { if ((bestTextFragments[k] != null) && (bestTextFragments[k].getScore() > 0)) { frags.add(bestTextFragments[k]); } } } catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } // sort such that the fragments with the highest score come first Collections.sort(frags, new Comparator<TextFragment>() { public int compare(TextFragment arg0, TextFragment arg1) { return Math.round(arg1.getScore() - arg0.getScore()); } }); // convert fragments back into text // TODO: we can include score and position information in output as snippet attributes if (frags.size() > 0) { ArrayList<String> fragTexts = new ArrayList<String>(); for (TextFragment fragment: frags) { if ((fragment != null) && (fragment.getScore() > 0)) { fragTexts.add(fragment.toString()); } if (fragTexts.size() >= numFragments) break; } summaries = fragTexts.toArray(new String[0]); if (summaries.length > 0) docSummaries.add(fieldName, summaries); } // no summaries made, copy text from alternate field if (summaries == null || summaries.length == 0) { alternateField( docSummaries, params, doc, fieldName ); } }
// in core/src/java/org/apache/solr/highlight/BreakIteratorBoundaryScanner.java
@Override protected BoundaryScanner get(String fieldName, SolrParams params) { // construct Locale String language = params.getFieldParam(fieldName, HighlightParams.BS_LANGUAGE); String country = params.getFieldParam(fieldName, HighlightParams.BS_COUNTRY); if(country != null && language == null){ throw new SolrException(ErrorCode.BAD_REQUEST, HighlightParams.BS_LANGUAGE + " parameter cannot be null when you specify " + HighlightParams.BS_COUNTRY); } Locale locale = null; if(language != null){ locale = country == null ? new Locale(language) : new Locale(language, country); } // construct BreakIterator String type = params.getFieldParam(fieldName, HighlightParams.BS_TYPE, "WORD").toLowerCase(); BreakIterator bi = null; if(type.equals("character")){ bi = locale == null ? BreakIterator.getCharacterInstance() : BreakIterator.getCharacterInstance(locale); } else if(type.equals("word")){ bi = locale == null ? BreakIterator.getWordInstance() : BreakIterator.getWordInstance(locale); } else if(type.equals("line")){ bi = locale == null ? BreakIterator.getLineInstance() : BreakIterator.getLineInstance(locale); } else if(type.equals("sentence")){ bi = locale == null ? BreakIterator.getSentenceInstance() : BreakIterator.getSentenceInstance(locale); } else throw new SolrException(ErrorCode.BAD_REQUEST, type + " is invalid for parameter " + HighlightParams.BS_TYPE); return new org.apache.lucene.search.vectorhighlight.BreakIteratorBoundaryScanner(bi); }
// in core/src/java/org/apache/solr/highlight/SolrFragmentsBuilder.java
protected char getMultiValuedSeparatorChar( SolrParams params ){ String separator = params.get( HighlightParams.MULTI_VALUED_SEPARATOR, " " ); if( separator.length() > 1 ){ throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, HighlightParams.MULTI_VALUED_SEPARATOR + " parameter must be a char, but is \"" + separator + "\"" ); } return separator.charAt( 0 ); }
// in core/src/java/org/apache/solr/cloud/ElectionContext.java
@Override void runLeaderProcess(boolean weAreReplacement) throws KeeperException, InterruptedException, IOException { if (cc != null) { String coreName = leaderProps.get(ZkStateReader.CORE_NAME_PROP); SolrCore core = null; try { // the first time we are run, we will get a startupCore - after // we will get null and must use cc.getCore core = cc.getCore(coreName); if (core == null) { cancelElection(); throw new SolrException(ErrorCode.SERVER_ERROR, "Fatal Error, SolrCore not found:" + coreName + " in " + cc.getCoreNames()); } // should I be leader? if (weAreReplacement && !shouldIBeLeader(leaderProps)) { // System.out.println("there is a better leader candidate it appears"); rejoinLeaderElection(leaderSeqPath, core); return; } if (weAreReplacement) { if (zkClient.exists(leaderPath, true)) { zkClient.delete(leaderPath, -1, true); } // System.out.println("I may be the new Leader:" + leaderPath // + " - I need to try and sync"); boolean success = syncStrategy.sync(zkController, core, leaderProps); if (!success && anyoneElseActive()) { rejoinLeaderElection(leaderSeqPath, core); return; } } // If I am going to be the leader I have to be active // System.out.println("I am leader go active"); core.getUpdateHandler().getSolrCoreState().cancelRecovery(); zkController.publish(core.getCoreDescriptor(), ZkStateReader.ACTIVE); } finally { if (core != null ) { core.close(); } } } super.runLeaderProcess(weAreReplacement); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
public void parseConfig() { if (zkProps == null) { zkProps = new SolrZkServerProps(); // set default data dir // TODO: use something based on IP+port??? support ensemble all from same solr home? zkProps.setDataDir(dataHome); zkProps.zkRun = zkRun; zkProps.solrPort = solrPort; } try { props = SolrZkServerProps.getProperties(confHome + '/' + "zoo.cfg"); SolrZkServerProps.injectServers(props, zkRun, zkHost); zkProps.parseProperties(props); if (zkProps.getClientPortAddress() == null) { zkProps.setClientPort(Integer.parseInt(solrPort)+1000); } } catch (QuorumPeerConfig.ConfigException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } catch (IOException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
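parseConfig deliberately swallows configuration errors unless zkRun is set, i.e. unless this node is actually supposed to run an embedded ZooKeeper. A sketch of that conditional-rethrow shape, with hypothetical names (readConfigFile stands in for SolrZkServerProps.getProperties):

    import java.io.IOException;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class OptionalConfig {
      // Hypothetical stand-in for the config-reading call that may fail.
      static void readConfigFile() throws IOException { /* ... */ }

      static void loadOptionalConfig(boolean featureEnabled) {
        try {
          readConfigFile();
        } catch (IOException e) {
          // Only fatal when the feature that needs the config is actually on;
          // otherwise the failure is deliberately ignored (one source of the
          // conditional and empty catch blocks counted in this report).
          if (featureEnabled) {
            throw new SolrException(ErrorCode.SERVER_ERROR, e);
          }
        }
      }
    }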
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
@Override public void run() { try { if (zkProps.getServers().size() > 1) { QuorumPeerMain zkServer = new QuorumPeerMain(); zkServer.runFromConfig(zkProps); } else { ServerConfig sc = new ServerConfig(); sc.readFrom(zkProps); ZooKeeperServerMain zkServer = new ZooKeeperServerMain(); zkServer.runFromConfig(sc); } log.info("ZooKeeper Server exited."); } catch (Throwable e) { log.error("ZooKeeper Server ERROR", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void replicate(String nodeName, SolrCore core, ZkNodeProps leaderprops, String baseUrl) throws SolrServerException, IOException { String leaderBaseUrl = leaderprops.get(ZkStateReader.BASE_URL_PROP); ZkCoreNodeProps leaderCNodeProps = new ZkCoreNodeProps(leaderprops); String leaderUrl = leaderCNodeProps.getCoreUrl(); log.info("Attempting to replicate from " + leaderUrl); // if we are the leader, either we are trying to recover faster // than our ephemeral timed out or we are the only node if (!leaderBaseUrl.equals(baseUrl)) { // send commit commitOnLeader(leaderUrl); // use rep handler directly, so we can do this sync rather than async SolrRequestHandler handler = core.getRequestHandler(REPLICATION_HANDLER); if (handler instanceof LazyRequestHandlerWrapper) { handler = ((LazyRequestHandlerWrapper)handler).getWrappedHandler(); } ReplicationHandler replicationHandler = (ReplicationHandler) handler; if (replicationHandler == null) { throw new SolrException(ErrorCode.SERVICE_UNAVAILABLE, "Skipping recovery, no " + REPLICATION_HANDLER + " handler found"); } ModifiableSolrParams solrParams = new ModifiableSolrParams(); solrParams.set(ReplicationHandler.MASTER_URL, leaderUrl + "replication"); if (isClosed()) retries = INTERRUPTED; boolean success = replicationHandler.doFetch(solrParams, true); // TODO: look into making sure force=true does not download files we already have if (!success) { throw new SolrException(ErrorCode.SERVER_ERROR, "Replication for recovery failed."); } // solrcloud_debug // try { // RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replicated " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " from " + leaderUrl + " gen:" + core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void init() { try { // makes nodes zkNode cmdExecutor.ensureExists(ZkStateReader.LIVE_NODES_ZKNODE, zkClient); Overseer.createClientNodes(zkClient, getNodeName()); createEphemeralLiveNode(); cmdExecutor.ensureExists(ZkStateReader.COLLECTIONS_ZKNODE, zkClient); syncNodeState(); overseerElector = new LeaderElector(zkClient); ElectionContext context = new OverseerElectionContext(getNodeName(), zkClient, zkStateReader); overseerElector.setup(context); overseerElector.joinElection(context); zkStateReader.createClusterStateWatchersAndUpdate(); } catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } }
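Note how init restores the interrupt flag before translating InterruptedException into a domain exception, so callers further up can still observe the interruption. The shape in isolation, as a sketch (Thread.sleep stands in for any interruptible ZooKeeper call):

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class InterruptAware {
      static void interruptibleStep() {
        try {
          Thread.sleep(1000); // stands in for any interruptible ZooKeeper call
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt(); // restore the flag before wrapping
          throw new SolrException(ErrorCode.SERVER_ERROR, "interrupted during init", e);
        }
      }
    }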
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String register(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores) throws Exception { final String baseUrl = getBaseUrl(); final CloudDescriptor cloudDesc = desc.getCloudDescriptor(); final String collection = cloudDesc.getCollectionName(); final String coreZkNodeName = getNodeName() + "_" + coreName; String shardId = cloudDesc.getShardId(); Map<String,String> props = new HashMap<String,String>(); // we only put a subset of props into the leader node props.put(ZkStateReader.BASE_URL_PROP, baseUrl); props.put(ZkStateReader.CORE_NAME_PROP, coreName); props.put(ZkStateReader.NODE_NAME_PROP, getNodeName()); if (log.isInfoEnabled()) { log.info("Register shard - core:" + coreName + " address:" + baseUrl + " shardId:" + shardId); } ZkNodeProps leaderProps = new ZkNodeProps(props); try { joinElection(desc); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } // rather than look in the cluster state file, we go straight to the zknodes // here, because on cluster restart there could be stale leader info in the // cluster state node that won't be updated for a moment String leaderUrl = getLeaderProps(collection, cloudDesc.getShardId()).getCoreUrl(); // now wait until our current cloud state contains the latest leader String cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000); int tries = 0; while (!leaderUrl.equals(cloudStateLeader)) { if (tries == 60) { throw new SolrException(ErrorCode.SERVER_ERROR, "There is conflicting information about the leader of shard: " + cloudDesc.getShardId()); } Thread.sleep(1000); tries++; cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000); } String ourUrl = ZkCoreNodeProps.getCoreUrl(baseUrl, coreName); log.info("We are " + ourUrl + " and leader is " + leaderUrl); boolean isLeader = leaderUrl.equals(ourUrl); SolrCore core = null; if (cc != null) { // CoreContainer only null in tests try { core = cc.getCore(desc.getName()); // recover from local transaction log and wait for it to complete before // going active // TODO: should this be moved to another thread? To recoveryStrat? // TODO: should this actually be done earlier, before (or as part of) // leader election perhaps? // TODO: if I'm the leader, ensure that a replica that is trying to recover waits until I'm // active (or don't make me the // leader until my local replay is done. UpdateLog ulog = core.getUpdateHandler().getUpdateLog(); if (!core.isReloaded() && ulog != null) { Future<UpdateLog.RecoveryInfo> recoveryFuture = core.getUpdateHandler() .getUpdateLog().recoverFromLog(); if (recoveryFuture != null) { recoveryFuture.get(); // NOTE: this could potentially block for // minutes or more! // TODO: publish as recovering in the meantime? // TODO: in the future we could do peersync in parallel with recoverFromLog } else { log.info("No LogReplay needed for core="+core.getName() + " baseURL=" + baseUrl); } } boolean didRecovery = checkRecovery(coreName, desc, recoverReloadedCores, isLeader, cloudDesc, collection, coreZkNodeName, shardId, leaderProps, core, cc); if (!didRecovery) { publishAsActive(baseUrl, desc, coreZkNodeName, coreName); } } finally { if (core != null) { core.close(); } } } else { publishAsActive(baseUrl, desc, coreZkNodeName, coreName); } // make sure we have an updated cluster state right away zkStateReader.updateCloudState(true); return shardId; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void publishState(CoreDescriptor cd, String shardZkNodeName, String coreName, Map<String,String> props) { CloudDescriptor cloudDesc = cd.getCloudDescriptor(); if (cloudDesc.getRoles() != null) { props.put(ZkStateReader.ROLES_PROP, cloudDesc.getRoles()); } if (cloudDesc.getShardId() == null && needsToBeAssignedShardId(cd, zkStateReader.getCloudState(), shardZkNodeName)) { // publish with no shard id so we are assigned one, and then look for it doPublish(shardZkNodeName, coreName, props, cloudDesc); String shardId; try { shardId = doGetShardIdProcess(coreName, cloudDesc); } catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Interrupted"); } cloudDesc.setShardId(shardId); } if (!props.containsKey(ZkStateReader.SHARD_ID_PROP) && cloudDesc.getShardId() != null) { props.put(ZkStateReader.SHARD_ID_PROP, cloudDesc.getShardId()); } doPublish(shardZkNodeName, coreName, props, cloudDesc); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private String doGetShardIdProcess(String coreName, CloudDescriptor descriptor) throws InterruptedException { final String shardZkNodeName = getNodeName() + "_" + coreName; int retryCount = 120; while (retryCount-- > 0) { final String shardId = zkStateReader.getCloudState().getShardId( shardZkNodeName); if (shardId != null) { return shardId; } try { Thread.sleep(500); } catch (InterruptedException e) { Thread.currentThread().interrupt(); } } throw new SolrException(ErrorCode.SERVER_ERROR, "Could not get shard_id for core: " + coreName); }
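doGetShardIdProcess is a bounded polling loop: retry with a fixed sleep, and only after the retry budget is exhausted surface a SolrException. The same skeleton generalized, as a sketch with hypothetical names; like the excerpt, it keeps polling after an interrupt and only restores the flag:

    import java.util.function.Supplier;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class BoundedPolling {
      static <T> T pollUntilAvailable(Supplier<T> probe, int attempts, long sleepMs) {
        while (attempts-- > 0) {
          T value = probe.get();
          if (value != null) return value;
          try {
            Thread.sleep(sleepMs);
          } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // note: the loop still runs out its budget
          }
        }
        throw new SolrException(ErrorCode.SERVER_ERROR, "value never became available");
      }
    }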
// in core/src/java/org/apache/solr/cloud/ZkController.java
private ZkCoreNodeProps waitForLeaderToSeeDownState( CoreDescriptor descriptor, final String coreZkNodeName) { CloudDescriptor cloudDesc = descriptor.getCloudDescriptor(); String collection = cloudDesc.getCollectionName(); String shard = cloudDesc.getShardId(); ZkCoreNodeProps leaderProps = null; int retries = 6; for (int i = 0; i < retries; i++) { try { // go straight to zk, not the cloud state - we must have current info leaderProps = getLeaderProps(collection, shard); break; } catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } } } String leaderBaseUrl = leaderProps.getBaseUrl(); String leaderCoreName = leaderProps.getCoreName(); String ourUrl = ZkCoreNodeProps.getCoreUrl(getBaseUrl(), descriptor.getName()); boolean isLeader = leaderProps.getCoreUrl().equals(ourUrl); if (!isLeader && !SKIP_AUTO_RECOVERY) { HttpSolrServer server = null; server = new HttpSolrServer(leaderBaseUrl); server.setConnectionTimeout(45000); server.setSoTimeout(45000); WaitForState prepCmd = new WaitForState(); prepCmd.setCoreName(leaderCoreName); prepCmd.setNodeName(getNodeName()); prepCmd.setCoreNodeName(coreZkNodeName); prepCmd.setState(ZkStateReader.DOWN); prepCmd.setPauseFor(0); // let's retry a couple times - perhaps the leader just went down, // or perhaps he is just not quite ready for us yet retries = 6; for (int i = 0; i < retries; i++) { try { server.request(prepCmd); break; } catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } } } server.shutdown(); } return leaderProps; }
// in core/src/java/org/apache/solr/update/UpdateLog.java
private void ensureLog() { if (tlog == null) { String newLogName = String.format(Locale.ENGLISH, LOG_FILENAME_PATTERN, TLOG_NAME, id); try { tlog = new TransactionLog(new File(tlogDir, newLogName), globalStrings); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e); } } }
// in core/src/java/org/apache/solr/update/UpdateLog.java
private void update() { int numUpdates = 0; updateList = new ArrayList<List<Update>>(logList.size()); deleteByQueryList = new ArrayList<Update>(); deleteList = new ArrayList<DeleteUpdate>(); updates = new HashMap<Long,Update>(numRecordsToKeep); for (TransactionLog oldLog : logList) { List<Update> updatesForLog = new ArrayList<Update>(); TransactionLog.ReverseReader reader = null; try { reader = oldLog.getReverseReader(); while (numUpdates < numRecordsToKeep) { Object o = reader.next(); if (o==null) break; try { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; // TODO: refactor this out so we get common error handling int opAndFlags = (Integer)entry.get(0); if (latestOperation == 0) { latestOperation = opAndFlags; } int oper = opAndFlags & UpdateLog.OPERATION_MASK; long version = (Long) entry.get(1); switch (oper) { case UpdateLog.ADD: case UpdateLog.DELETE: case UpdateLog.DELETE_BY_QUERY: Update update = new Update(); update.log = oldLog; update.pointer = reader.position(); update.version = version; updatesForLog.add(update); updates.put(version, update); if (oper == UpdateLog.DELETE_BY_QUERY) { deleteByQueryList.add(update); } else if (oper == UpdateLog.DELETE) { deleteList.add(new DeleteUpdate(version, (byte[])entry.get(2))); } break; case UpdateLog.COMMIT: break; default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } } catch (ClassCastException cl) { log.warn("Unexpected log entry or corrupt log. Entry=" + o, cl); // would be caused by a corrupt transaction log } catch (Exception ex) { log.warn("Exception reverse reading log", ex); break; } } } catch (IOException e) { // failure to read a log record isn't fatal log.error("Exception reading versions from log",e); } finally { if (reader != null) reader.close(); } updateList.add(updatesForLog); } }
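update() treats per-record failures as survivable: a ClassCastException (corrupt entry) is logged and skipped, any other exception aborts the scan of that log, and an IOException on the log as a whole is logged but never rethrown. A reduced sketch of that three-tier tolerance, with hypothetical reader types:

    import java.io.IOException;
    import java.util.List;

    final class TolerantLogScan {
      interface RecordReader { Object next() throws IOException; } // hypothetical stand-in

      static void scanLog(RecordReader reader) {
        try {
          Object o;
          while ((o = reader.next()) != null) {
            try {
              handleEntry((List<?>) o);            // may throw ClassCastException
            } catch (ClassCastException corrupt) {
              System.err.println("skipping corrupt entry: " + o); // log and continue
            } catch (Exception other) {
              System.err.println("unexpected failure, aborting this log: " + other);
              break;                               // give up on this log only
            }
          }
        } catch (IOException e) {
          // failure to read a log record isn't fatal for recovery as a whole
          System.err.println("error reading log: " + e);
        }
      }

      static void handleEntry(List<?> entry) { /* decode and apply the record */ }
    }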
// in core/src/java/org/apache/solr/update/UpdateLog.java
public void doReplay(TransactionLog translog) { try { loglog.warn("Starting log replay " + translog + " active="+activeLog + " starting pos=" + recoveryInfo.positionOfStart); tlogReader = translog.getReader(recoveryInfo.positionOfStart); // NOTE: we don't currently handle a core reload during recovery. This would cause the core // to change underneath us. // TODO: use the standard request factory? We won't get any custom configuration instantiating this way. RunUpdateProcessorFactory runFac = new RunUpdateProcessorFactory(); DistributedUpdateProcessorFactory magicFac = new DistributedUpdateProcessorFactory(); runFac.init(new NamedList()); magicFac.init(new NamedList()); UpdateRequestProcessor proc = magicFac.getInstance(req, rsp, runFac.getInstance(req, rsp, null)); long commitVersion = 0; int operationAndFlags = 0; for(;;) { Object o = null; if (cancelApplyBufferUpdate) break; try { if (testing_logReplayHook != null) testing_logReplayHook.run(); o = null; o = tlogReader.next(); if (o == null && activeLog) { if (!finishing) { // block to prevent new adds, but don't immediately unlock since // we could be starved from ever completing recovery. Only unlock // after we've finished this recovery. // NOTE: our own updates won't be blocked since the thread holding a write lock can // lock a read lock. versionInfo.blockUpdates(); finishing = true; o = tlogReader.next(); } else { // we had previously blocked updates, so this "null" from the log is final. // Wait until our final commit to change the state and unlock. // This is only so no new updates are written to the current log file, and is // only an issue if we crash before the commit (and we are paying attention // to incomplete log files). // // versionInfo.unblockUpdates(); } } } catch (InterruptedException e) { SolrException.log(log,e); } catch (IOException e) { SolrException.log(log,e); } catch (Throwable e) { SolrException.log(log,e); } if (o == null) break; try { // should currently be a List<Oper,Ver,Doc/Id> List entry = (List)o; operationAndFlags = (Integer)entry.get(0); int oper = operationAndFlags & OPERATION_MASK; long version = (Long) entry.get(1); switch (oper) { case UpdateLog.ADD: { recoveryInfo.adds++; // byte[] idBytes = (byte[]) entry.get(2); SolrInputDocument sdoc = (SolrInputDocument)entry.get(entry.size()-1); AddUpdateCommand cmd = new AddUpdateCommand(req); // cmd.setIndexedId(new BytesRef(idBytes)); cmd.solrDoc = sdoc; cmd.setVersion(version); cmd.setFlags(UpdateCommand.REPLAY | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) log.debug("add " + cmd); proc.processAdd(cmd); break; } case UpdateLog.DELETE: { recoveryInfo.deletes++; byte[] idBytes = (byte[]) entry.get(2); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.setIndexedId(new BytesRef(idBytes)); cmd.setVersion(version); cmd.setFlags(UpdateCommand.REPLAY | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) log.debug("delete " + cmd); proc.processDelete(cmd); break; } case UpdateLog.DELETE_BY_QUERY: { recoveryInfo.deleteByQuery++; String query = (String)entry.get(2); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.query = query; cmd.setVersion(version); cmd.setFlags(UpdateCommand.REPLAY | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) log.debug("deleteByQuery " + cmd); proc.processDelete(cmd); break; } case UpdateLog.COMMIT: { commitVersion = version; break; } default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } if (rsp.getException() != null) { loglog.error("REPLAY_ERR: Exception replaying log", rsp.getException()); throw rsp.getException(); } } catch (IOException ex) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: IOException reading log", ex); // could be caused by an incomplete flush if recovering from log } catch (ClassCastException cl) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: Unexpected log entry or corrupt log. Entry=" + o, cl); // would be caused by a corrupt transaction log } catch (Throwable ex) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: Exception replaying log", ex); // something wrong with the request? } } CommitUpdateCommand cmd = new CommitUpdateCommand(req, false); cmd.setVersion(commitVersion); cmd.softCommit = false; cmd.waitSearcher = true; cmd.setFlags(UpdateCommand.REPLAY); try { if (debug) log.debug("commit " + cmd); uhandler.commit(cmd); // this should cause a commit to be added to the incomplete log and avoid it being replayed again after a restart. } catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: final commit.", ex); } if (!activeLog) { // if we are replaying an old tlog file, we need to add a commit to the end // so we don't replay it again if we restart right after. // if the last operation we replayed had FLAG_GAP set, we want to use that again so we don't lose it // as the flag on the last operation. translog.writeCommit(cmd, operationFlags | (operationAndFlags & ~OPERATION_MASK)); } try { proc.finish(); } catch (IOException ex) { recoveryInfo.errors++; loglog.error("Replay exception: finish()", ex); } } finally { if (tlogReader != null) tlogReader.close(); translog.decref(); } }
// in core/src/java/org/apache/solr/update/AddUpdateCommand.java
public BytesRef getIndexedId() { if (indexedId == null) { IndexSchema schema = req.getSchema(); SchemaField sf = schema.getUniqueKeyField(); if (sf != null) { if (solrDoc != null) { SolrInputField field = solrDoc.getField(sf.getName()); int count = field==null ? 0 : field.getValueCount(); if (count == 0) { if (overwrite) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Document is missing mandatory uniqueKey field: " + sf.getName()); } } else if (count > 1) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Document contains multiple values for uniqueKey field: " + field); } else { indexedId = new BytesRef(); sf.getType().readableToIndexed(field.getFirstValue().toString(), indexedId); } } } } return indexedId; }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void deleteByQuery(DeleteUpdateCommand cmd) throws IOException { deleteByQueryCommands.incrementAndGet(); deleteByQueryCommandsCumulative.incrementAndGet(); boolean madeIt=false; try { Query q; try { // TODO: move this higher in the stack? QParser parser = QParser.getParser(cmd.query, "lucene", cmd.req); q = parser.getQuery(); q = QueryUtils.makeQueryable(q); // peer-sync can cause older deleteByQueries to be executed and could // delete newer documents. We prevent this by adding a clause restricting // version. if ((cmd.getFlags() & UpdateCommand.PEER_SYNC) != 0) { BooleanQuery bq = new BooleanQuery(); bq.add(q, Occur.MUST); SchemaField sf = core.getSchema().getField(VersionInfo.VERSION_FIELD); ValueSource vs = sf.getType().getValueSource(sf, null); ValueSourceRangeFilter filt = new ValueSourceRangeFilter(vs, null, Long.toString(Math.abs(cmd.version)), true, true); FunctionRangeQuery range = new FunctionRangeQuery(filt); bq.add(range, Occur.MUST); q = bq; } } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); } boolean delAll = MatchAllDocsQuery.class == q.getClass(); // // synchronized to prevent deleteByQuery from running during the "open new searcher" // part of a commit. DBQ needs to signal that a fresh reader will be needed for // a realtime view of the index. When a new searcher is opened after a DBQ, that // flag can be cleared. If those things happen concurrently, it's not thread safe. // synchronized (this) { if (delAll) { deleteAll(); } else { solrCoreState.getIndexWriter(core).deleteDocuments(q); } if (ulog != null) ulog.deleteByQuery(cmd); } madeIt = true; updateDeleteTrackers(cmd); } finally { if (!madeIt) { numErrors.incrementAndGet(); numErrorsCumulative.incrementAndGet(); } } }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
@Override public void decref() { try { solrCoreState.decref(this); } catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
public static DistribPhase parseParam(final String param) { if (param == null || param.trim().isEmpty()) { return NONE; } try { return valueOf(param); } catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Illegal value for " + DISTRIB_UPDATE_PARAM + ": " + param, e); } }
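parseParam is a textbook reuse of a JDK exception type: Enum.valueOf throws IllegalArgumentException, which is caught and re-labelled as a client error with the offending value echoed in the message. The same pattern in isolation; the Phase enum is a stand-in for DistribPhase:

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class PhaseParsing {
      enum Phase { NONE, TOLEADER, FROMLEADER } // stand-in for DistribPhase

      static Phase parsePhase(String param) {
        if (param == null || param.trim().isEmpty()) return Phase.NONE;
        try {
          return Phase.valueOf(param);
        } catch (IllegalArgumentException e) {
          // keep the JDK exception as the cause and echo the bad value to the client
          throw new SolrException(ErrorCode.BAD_REQUEST,
              "Illegal value for phase: " + param, e);
        }
      }
    }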
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private boolean versionAdd(AddUpdateCommand cmd) throws IOException { BytesRef idBytes = cmd.getIndexedId(); if (vinfo == null || idBytes == null) { super.processAdd(cmd); return false; } // This is only the hash for the bucket, and must be based only on the uniqueKey (i.e. do not use a pluggable hash here) int bucketHash = Hash.murmurhash3_x86_32(idBytes.bytes, idBytes.offset, idBytes.length, 0); // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find any existing version in the document // TODO: don't reuse update commands any more! long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { SolrInputField versionField = cmd.getSolrInputDocument().getField(VersionInfo.VERSION_FIELD); if (versionField != null) { Object o = versionField.getValue(); versionOnUpdate = o instanceof Number ? ((Number) o).longValue() : Long.parseLong(o.toString()); } else { // Find the version String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } } boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; VersionBucket bucket = vinfo.bucket(bucketHash); vinfo.lockForUpdate(); try { synchronized (bucket) { // we obtain the version when synchronized and then do the add so we can ensure that // if version1 < version2 then version1 is actually added before version2. // even if we don't store the version field, synchronizing on the bucket // will enable us to know what version happened first, and thus enable // realtime-get to work reliably. // TODO: if versions aren't stored, do we need to set on the cmd anyway for some reason? // there may be other reasons in the future for a version on the commands if (versionsStored) { long bucketVersion = bucket.highest; if (leaderLogic) { boolean updated = getUpdatedDocument(cmd); if (updated && versionOnUpdate == -1) { versionOnUpdate = 1; // implied "doc must exist" for now... } if (versionOnUpdate != 0) { Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); long foundVersion = lastVersion == null ? -1 : lastVersion; if ( versionOnUpdate == foundVersion || (versionOnUpdate < 0 && foundVersion < 0) || (versionOnUpdate==1 && foundVersion > 0) ) { // we're ok if versions match, or if both are negative (all missing docs are equal), or if cmd // specified it must exist (versionOnUpdate==1) and it does. } else { throw new SolrException(ErrorCode.CONFLICT, "version conflict for " + cmd.getPrintableId() + " expected=" + versionOnUpdate + " actual=" + foundVersion); } } long version = vinfo.getNewClock(); cmd.setVersion(version); cmd.getSolrInputDocument().setField(VersionInfo.VERSION_FIELD, version); bucket.updateHighest(version); } else { // The leader forwarded us this update. cmd.setVersion(versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.add(cmd); return true; } // if we aren't the leader, then we need to check that updates were not re-ordered if (bucketVersion != 0 && bucketVersion < versionOnUpdate) { // we're OK... this update has a version higher than anything we've seen // in this bucket so far, so we know that no reordering has yet occurred. bucket.updateHighest(versionOnUpdate); } else { // there have been updates higher than the current update. we need to check // the specific version for this id. Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); if (lastVersion != null && Math.abs(lastVersion) >= versionOnUpdate) { // This update is a repeat, or was reordered. We need to drop this update. return true; } } } } doLocalAdd(cmd); } // end synchronized (bucket) } finally { vinfo.unlockForUpdate(); } return false; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
boolean getUpdatedDocument(AddUpdateCommand cmd) throws IOException { SolrInputDocument sdoc = cmd.getSolrInputDocument(); boolean update = false; for (SolrInputField sif : sdoc.values()) { if (sif.getValue() instanceof Map) { update = true; break; } } if (!update) return false; BytesRef id = cmd.getIndexedId(); SolrInputDocument oldDoc = RealTimeGetComponent.getInputDocument(cmd.getReq().getCore(), id); if (oldDoc == null) { // not found... allow this in the future (depending on the details of the update, or if the user explicitly sets it). // could also just not change anything here and let the optimistic locking throw the error throw new SolrException(ErrorCode.CONFLICT, "Document not found for update. id=" + cmd.getPrintableId()); } oldDoc.remove(VERSION_FIELD); for (SolrInputField sif : sdoc.values()) { Object val = sif.getValue(); if (val instanceof Map) { for (Entry<String,Object> entry : ((Map<String,Object>) val).entrySet()) { String key = entry.getKey(); Object fieldVal = entry.getValue(); if ("add".equals(key)) { oldDoc.addField( sif.getName(), fieldVal, sif.getBoost()); } else if ("set".equals(key)) { oldDoc.setField(sif.getName(), fieldVal, sif.getBoost()); } else if ("inc".equals(key)) { SolrInputField numericField = oldDoc.get(sif.getName()); if (numericField == null) { oldDoc.setField(sif.getName(), fieldVal, sif.getBoost()); } else { // TODO: fieldtype needs externalToObject? String oldValS = numericField.getFirstValue().toString(); SchemaField sf = cmd.getReq().getSchema().getField(sif.getName()); BytesRef term = new BytesRef(); sf.getType().readableToIndexed(oldValS, term); Object oldVal = sf.getType().toObject(sf, term); String fieldValS = fieldVal.toString(); Number result; if (oldVal instanceof Long) { result = ((Long) oldVal).longValue() + Long.parseLong(fieldValS); } else if (oldVal instanceof Float) { result = ((Float) oldVal).floatValue() + Float.parseFloat(fieldValS); } else if (oldVal instanceof Double) { result = ((Double) oldVal).doubleValue() + Double.parseDouble(fieldValS); } else { // int, short, byte result = ((Integer) oldVal).intValue() + Integer.parseInt(fieldValS); } oldDoc.setField(sif.getName(), result, sif.getBoost()); } } } } else { // normal fields are treated as a "set" oldDoc.put(sif.getName(), sif); } } cmd.solrDoc = oldDoc; return true; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
public void doDeleteByQuery(DeleteUpdateCommand cmd) throws IOException { // even in non zk mode, tests simulate updates from a leader if(!zkEnabled) { isLeader = getNonZkLeaderAssumption(req); } else { zkCheck(); } // NONE: we are the first to receive this deleteByQuery // - it must be forwarded to the leader of every shard // TO: we are a leader receiving a forwarded deleteByQuery... we must: // - block all updates (use VersionInfo) // - flush *all* updates going to our replicas // - forward the DBQ to our replicas and wait for the response // - log + execute the local DBQ // FROM: we are a replica receiving a DBQ from our leader // - log + execute the local DBQ DistribPhase phase = DistribPhase.parseParam(req.getParams().get(DISTRIB_UPDATE_PARAM)); if (zkEnabled && DistribPhase.NONE == phase) { boolean leaderForAnyShard = false; // start off by assuming we are not a leader for any shard Map<String,Slice> slices = zkController.getCloudState().getSlices(collection); if (slices == null) { throw new SolrException(ErrorCode.BAD_REQUEST, "Cannot find collection:" + collection + " in " + zkController.getCloudState().getCollections()); } ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(DISTRIB_UPDATE_PARAM, DistribPhase.TOLEADER.toString()); List<Node> leaders = new ArrayList<Node>(slices.size()); for (Map.Entry<String,Slice> sliceEntry : slices.entrySet()) { String sliceName = sliceEntry.getKey(); ZkNodeProps leaderProps; try { leaderProps = zkController.getZkStateReader().getLeaderProps(collection, sliceName); } catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); } // TODO: What if leaders changed in the meantime? // should we send out slice-at-a-time and if a node returns "hey, I'm not a leader" (or we get an error because it went down) then look up the new leader? // Am I the leader for this slice? ZkCoreNodeProps coreLeaderProps = new ZkCoreNodeProps(leaderProps); String leaderNodeName = coreLeaderProps.getCoreNodeName(); String coreName = req.getCore().getName(); String coreNodeName = zkController.getNodeName() + "_" + coreName; isLeader = coreNodeName.equals(leaderNodeName); if (isLeader) { // don't forward to ourself leaderForAnyShard = true; } else { leaders.add(new StdNode(coreLeaderProps)); } } params.remove("commit"); // this will be distributed from the local commit cmdDistrib.distribDelete(cmd, leaders, params); if (!leaderForAnyShard) { return; } // change the phase to TOLEADER so we look up and forward to our own replicas (if any) phase = DistribPhase.TOLEADER; } List<Node> replicas = null; if (zkEnabled && DistribPhase.TOLEADER == phase) { // This core should be a leader replicas = setupRequest(); } if (vinfo == null) { super.processDelete(cmd); return; } // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find the version long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } versionOnUpdate = Math.abs(versionOnUpdate); // normalize to positive version boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; if (!leaderLogic && versionOnUpdate==0) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing _version_ on update from leader"); } vinfo.blockUpdates(); try { if (versionsStored) { if (leaderLogic) { long version = vinfo.getNewClock(); cmd.setVersion(-version); // TODO update versions in all buckets doLocalDelete(cmd); } else { cmd.setVersion(-versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.deleteByQuery(cmd); return; } doLocalDelete(cmd); } } // since we don't know which documents were deleted, the easiest thing to do is to invalidate // all real-time caches (i.e. UpdateLog) which involves also getting a new version of the IndexReader // (so cache misses will see up-to-date data) } finally { vinfo.unblockUpdates(); } // TODO: need to handle reorders to replicas somehow // forward to all replicas if (leaderLogic && replicas != null) { ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(VERSION_FIELD, Long.toString(cmd.getVersion())); params.set(DISTRIB_UPDATE_PARAM, DistribPhase.FROMLEADER.toString()); cmdDistrib.distribDelete(cmd, replicas, params); cmdDistrib.finish(); } if (returnVersions && rsp != null) { if (deleteByQueryResponse == null) { deleteByQueryResponse = new NamedList<String>(); rsp.add("deleteByQuery",deleteByQueryResponse); } deleteByQueryResponse.add(cmd.getQuery(), cmd.getVersion()); } }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private void zkCheck() { int retries = 10; while (!zkController.isConnected()) { if (retries-- == 0) { throw new SolrException(ErrorCode.SERVICE_UNAVAILABLE, "Cannot talk to ZooKeeper - Updates are disabled."); } try { Thread.sleep(100); } catch (InterruptedException e) { Thread.currentThread().interrupt(); break; } } }
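zkCheck is in the same bounded-wait family as doGetShardIdProcess above, with two differences worth noting for the exception design: exhaustion maps to SERVICE_UNAVAILABLE (503, retryable) rather than SERVER_ERROR, and an interrupt breaks out of the wait instead of being ignored. As a sketch with a hypothetical connectivity probe:

    import java.util.function.BooleanSupplier;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class ConnectivityGate {
      static void requireConnected(BooleanSupplier isConnected) {
        int retries = 10;
        while (!isConnected.getAsBoolean()) {
          if (retries-- == 0) {
            // 503: the condition may clear on its own; clients can retry later
            throw new SolrException(ErrorCode.SERVICE_UNAVAILABLE,
                "Cannot talk to ZooKeeper - Updates are disabled.");
          }
          try {
            Thread.sleep(100);
          } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            break; // stop waiting and let the caller observe the interrupt
          }
        }
      }
    }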
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private boolean versionDelete(DeleteUpdateCommand cmd) throws IOException { BytesRef idBytes = cmd.getIndexedId(); if (vinfo == null || idBytes == null) { super.processDelete(cmd); return false; } // This is only the hash for the bucket, and must be based only on the uniqueKey (i.e. do not use a pluggable hash here) int bucketHash = Hash.murmurhash3_x86_32(idBytes.bytes, idBytes.offset, idBytes.length, 0); // at this point, there is an update we need to try and apply. // we may or may not be the leader. // Find the version long versionOnUpdate = cmd.getVersion(); if (versionOnUpdate == 0) { String versionOnUpdateS = req.getParams().get(VERSION_FIELD); versionOnUpdate = versionOnUpdateS == null ? 0 : Long.parseLong(versionOnUpdateS); } long signedVersionOnUpdate = versionOnUpdate; versionOnUpdate = Math.abs(versionOnUpdate); // normalize to positive version boolean isReplay = (cmd.getFlags() & UpdateCommand.REPLAY) != 0; boolean leaderLogic = isLeader && !isReplay; if (!leaderLogic && versionOnUpdate==0) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing _version_ on update from leader"); } VersionBucket bucket = vinfo.bucket(bucketHash); vinfo.lockForUpdate(); try { synchronized (bucket) { if (versionsStored) { long bucketVersion = bucket.highest; if (leaderLogic) { if (signedVersionOnUpdate != 0) { Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); long foundVersion = lastVersion == null ? -1 : lastVersion; if ( (signedVersionOnUpdate == foundVersion) || (signedVersionOnUpdate < 0 && foundVersion < 0) || (signedVersionOnUpdate == 1 && foundVersion > 0) ) { // we're ok if versions match, or if both are negative (all missing docs are equal), or if cmd // specified it must exist (versionOnUpdate==1) and it does. } else { throw new SolrException(ErrorCode.CONFLICT, "version conflict for " + cmd.getId() + " expected=" + signedVersionOnUpdate + " actual=" + foundVersion); } } long version = vinfo.getNewClock(); cmd.setVersion(-version); bucket.updateHighest(version); } else { cmd.setVersion(-versionOnUpdate); if (ulog.getState() != UpdateLog.State.ACTIVE && (cmd.getFlags() & UpdateCommand.REPLAY) == 0) { // we're not in an active state, and this update isn't from a replay, so buffer it. cmd.setFlags(cmd.getFlags() | UpdateCommand.BUFFERING); ulog.delete(cmd); return true; } // if we aren't the leader, then we need to check that updates were not re-ordered if (bucketVersion != 0 && bucketVersion < versionOnUpdate) { // we're OK... this update has a version higher than anything we've seen // in this bucket so far, so we know that no reordering has yet occurred. bucket.updateHighest(versionOnUpdate); } else { // there have been updates higher than the current update. we need to check // the specific version for this id. Long lastVersion = vinfo.lookupVersion(cmd.getIndexedId()); if (lastVersion != null && Math.abs(lastVersion) >= versionOnUpdate) { // This update is a repeat, or was reordered. We need to drop this update. return true; } } } } doLocalDelete(cmd); return false; } // end synchronized (bucket) } finally { vinfo.unlockForUpdate(); } }
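versionDelete (like versionAdd earlier) enforces optimistic concurrency: when the client supplies an expected version that cannot be reconciled with what the index holds, the update is rejected with ErrorCode.CONFLICT (409). The decision rule extracted into a sketch, with hypothetical names:

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class OptimisticLocking {
      static void checkVersion(String id, long expected, long found) {
        boolean ok = expected == found            // exact match
            || (expected < 0 && found < 0)        // "must not exist" and it doesn't
            || (expected == 1 && found > 0);      // "must exist" and it does
        if (!ok) {
          throw new SolrException(ErrorCode.CONFLICT, "version conflict for " + id
              + " expected=" + expected + " actual=" + found);
        }
      }
    }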
// in core/src/java/org/apache/solr/update/processor/UpdateRequestProcessorChain.java
public void init(PluginInfo info) { final String infomsg = "updateRequestProcessorChain \"" + (null != info.name ? info.name : "") + "\"" + (info.isDefault() ? " (default)" : ""); // wrap in an ArrayList so we know we can do fast index lookups // and that add(int,Object) is supported List<UpdateRequestProcessorFactory> list = new ArrayList (solrCore.initPlugins(info.getChildren("processor"),UpdateRequestProcessorFactory.class,null)); if(list.isEmpty()){ throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, infomsg + " requires at least one processor"); } int numDistrib = 0; int runIndex = -1; // hi->lo in case multiple run instances, add before first one // (no idea why someone might use multiple run instances, but just in case) for (int i = list.size()-1; 0 <= i; i--) { UpdateRequestProcessorFactory factory = list.get(i); if (factory instanceof DistributingUpdateProcessorFactory) { numDistrib++; } if (factory instanceof RunUpdateProcessorFactory) { runIndex = i; } } if (1 < numDistrib) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, infomsg + " may not contain more than one " + "instance of DistributingUpdateProcessorFactory"); } if (0 <= runIndex && 0 == numDistrib) { // by default, add distrib processor immediately before run DistributedUpdateProcessorFactory distrib = new DistributedUpdateProcessorFactory(); distrib.init(new NamedList()); list.add(runIndex, distrib); log.info("inserting DistributedUpdateProcessorFactory into " + infomsg); } chain = list.toArray(new UpdateRequestProcessorFactory[list.size()]); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
@Override public void processAdd(AddUpdateCommand cmd) throws IOException { final SolrInputDocument doc = cmd.getSolrInputDocument(); // make a copy we can iterate over while mutating the doc final Collection<String> fieldNames = new ArrayList<String>(doc.getFieldNames()); for (final String fname : fieldNames) { if (! selector.shouldMutate(fname)) continue; final SolrInputField src = doc.get(fname); SolrInputField dest = null; try { dest = mutate(src); } catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); } if (null == dest) { doc.remove(fname); } else { // semantics of what happens if dest has diff name are hard // we could treat it as a copy, or a rename // for now, don't allow it. if (! fname.equals(dest.getName()) ) { throw new SolrException(SERVER_ERROR, "mutate returned field with different name: " + fname + " => " + dest.getName()); } doc.put(dest.getName(), dest); } } super.processAdd(cmd); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
public static FieldNameSelector createFieldNameSelector (final SolrResourceLoader loader, final IndexSchema schema, final Set<String> fields, final Set<String> typeNames, final Collection<String> typeClasses, final Collection<Pattern> regexes, final FieldNameSelector defSelector) { final Collection<Class> classes = new ArrayList<Class>(typeClasses.size()); for (String t : typeClasses) { try { classes.add(loader.findClass(t, Object.class)); } catch (Exception e) { throw new SolrException(SERVER_ERROR, "Can't resolve typeClass: " + t, e); } } if (classes.isEmpty() && typeNames.isEmpty() && regexes.isEmpty() && fields.isEmpty()) { return defSelector; } return new ConfigurableFieldNameSelector (schema, fields, typeNames, classes, regexes); }
// in core/src/java/org/apache/solr/update/processor/SignatureUpdateProcessorFactory.java
public void inform(SolrCore core) { final SchemaField field = core.getSchema().getFieldOrNull(getSignatureField()); if (null == field) { throw new SolrException (ErrorCode.SERVER_ERROR, "Can't use signatureField which does not exist in schema: " + getSignatureField()); } if (getOverwriteDupes() && ( ! field.indexed() ) ) { throw new SolrException (ErrorCode.SERVER_ERROR, "Can't set overwriteDupes when signatureField is not indexed: " + getSignatureField()); } }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
protected final FieldMutatingUpdateProcessor.FieldNameSelector getSelector() { if (null != selector) return selector; throw new SolrException(SERVER_ERROR, "selector was never initialized, "+ " inform(SolrCore) never called???"); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
private static Collection<String> oneOrMany(final NamedList args, final String key) { List<String> result = new ArrayList<String>(args.size() / 2); final String err = "init arg '" + key + "' must be a string " + "(ie: 'str'), or an array (ie: 'arr') containing strings; found: "; for (Object o = args.remove(key); null != o; o = args.remove(key)) { if (o instanceof String) { result.add((String)o); continue; } if (o instanceof Object[]) { o = Arrays.asList((Object[]) o); } if (o instanceof Collection) { for (Object item : (Collection)o) { if (! (item instanceof String)) { throw new SolrException(SERVER_ERROR, err + item.getClass()); } result.add((String)item); } continue; } // who knows what the hell we have throw new SolrException(SERVER_ERROR, err + o.getClass()); } return result; }
// in core/src/java/org/apache/solr/update/VersionInfo.java
public Long getVersionFromIndex(BytesRef idBytes) { // TODO: we could cache much of this and invalidate during a commit. // TODO: most DocValues classes are threadsafe - expose which. RefCounted<SolrIndexSearcher> newestSearcher = core.getRealtimeSearcher(); try { SolrIndexSearcher searcher = newestSearcher.get(); long lookup = searcher.lookupId(idBytes); if (lookup < 0) return null; ValueSource vs = versionField.getType().getValueSource(versionField, null); Map context = ValueSource.newContext(searcher); vs.createWeight(context, searcher); FunctionValues fv = vs.getValues(context, searcher.getTopReaderContext().leaves()[(int)(lookup>>32)]); long ver = fv.longVal((int)lookup); return ver; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error reading version from index", e); } finally { if (newestSearcher != null) { newestSearcher.decref(); } } }
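getVersionFromIndex pairs the IOException-to-SolrException wrap with a finally block that decrements the searcher's reference count, so the resource is released on both the success and the failure path. A sketch of that pairing; both interfaces are hypothetical stand-ins for Solr's RefCounted and SolrIndexSearcher:

    import java.io.IOException;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class RefCountedUse {
      interface Searcher { long lookupVersion() throws IOException; }  // hypothetical
      interface RefCounted<T> { T get(); void decref(); }              // hypothetical

      static long readVersion(RefCounted<Searcher> ref) {
        try {
          return ref.get().lookupVersion();
        } catch (IOException e) {
          throw new SolrException(ErrorCode.SERVER_ERROR,
              "Error reading version from index", e);
        } finally {
          ref.decref(); // release the reference on success and failure alike
        }
      }
    }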
// in core/src/java/org/apache/solr/update/SolrIndexConfig.java
private void assertWarnOrFail(String reason, boolean assertCondition, boolean failCondition) { if(assertCondition) { return; } else if(failCondition) { throw new SolrException(ErrorCode.FORBIDDEN, reason); } else { log.warn(reason); } }
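assertWarnOrFail centralizes the lenient-versus-strict decision: the same failed condition either logs a warning or becomes a FORBIDDEN SolrException, depending on configuration. A self-contained sketch of the tri-state with hypothetical call sites:

    import java.util.logging.Logger;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class WarnOrFail {
      private static final Logger log = Logger.getLogger("config");

      // Same tri-state as assertWarnOrFail above: ok -> silent, strict -> throw, else warn.
      static void warnOrFail(String reason, boolean conditionHolds, boolean strict) {
        if (conditionHolds) return;
        if (strict) throw new SolrException(ErrorCode.FORBIDDEN, reason);
        log.warning(reason);
      }

      public static void main(String[] args) {
        warnOrFail("deprecated setting in use", false, false); // logs a warning
        warnOrFail("deprecated setting in use", false, true);  // throws FORBIDDEN
      }
    }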
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
protected void addSingleField(SchemaField sfield, String val, float boost) { //System.out.println("###################ADDING FIELD "+sfield+"="+val); // we don't check for a null val ourselves because a solr.FieldType // might actually want to map it to something. If createField() // returns null, then we don't store the field. if (sfield.isPolyField()) { IndexableField[] fields = sfield.createFields(val, boost); if (fields.length > 0) { if (!sfield.multiValued()) { String oldValue = map.put(sfield.getName(), val); if (oldValue != null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "ERROR: multiple values encountered for non multiValued field " + sfield.getName() + ": first='" + oldValue + "' second='" + val + "'"); } } // Add each field for (IndexableField field : fields) { doc.add(field); } } } else { IndexableField field = sfield.createField(val, boost); if (field != null) { if (!sfield.multiValued()) { String oldValue = map.put(sfield.getName(), val); if (oldValue != null) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"ERROR: multiple values encountered for non multiValued field " + sfield.getName() + ": first='" + oldValue + "' second='" + val + "'"); } } doc.add(field); } } }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
public void addField(String name, String val, float boost) { SchemaField sfield = schema.getFieldOrNull(name); if (sfield != null) { addField(sfield,val,boost); } // Check if we should copy this field to any other fields. // This could happen whether it is explicit or not. final List<CopyField> copyFields = schema.getCopyFieldsList(name); if (copyFields != null) { for(CopyField cf : copyFields) { addSingleField(cf.getDestination(), cf.getLimitedValue( val ), boost); } } // error if this field name doesn't match anything if (sfield==null && (copyFields==null || copyFields.size()==0)) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"ERROR:unknown field '" + name + "'"); } }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
public Document getDoc() throws IllegalArgumentException { // Check for all required fields -- Note, all fields with a // default value are defacto 'required' fields. List<String> missingFields = null; for (SchemaField field : schema.getRequiredFields()) { if (doc.getField(field.getName() ) == null) { if (field.getDefaultValue() != null) { addField(doc, field, field.getDefaultValue(), 1.0f); } else { if (missingFields==null) { missingFields = new ArrayList<String>(1); } missingFields.add(field.getName()); } } } if (missingFields != null) { StringBuilder builder = new StringBuilder(); // add the uniqueKey if possible if( schema.getUniqueKeyField() != null ) { String n = schema.getUniqueKeyField().getName(); String v = doc.getField( n ).stringValue(); builder.append( "Document ["+n+"="+v+"] " ); } builder.append("missing required fields: " ); for (String field : missingFields) { builder.append(field); builder.append(" "); } throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, builder.toString()); } Document ret = doc; doc=null; return ret; }
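getDoc collects every missing required field before throwing, so a single BAD_REQUEST names the document and all of its problems at once instead of failing on the first. The aggregation pattern in isolation, as a sketch with hypothetical names:

    import java.util.ArrayList;
    import java.util.List;
    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    final class RequiredFields {
      static void requireAll(String docId, List<String> required, List<String> present) {
        List<String> missing = new ArrayList<String>();
        for (String f : required) {
          if (!present.contains(f)) missing.add(f); // keep collecting; don't throw yet
        }
        if (!missing.isEmpty()) {
          // one exception that names every problem at once
          throw new SolrException(ErrorCode.BAD_REQUEST,
              "Document [" + docId + "] missing required fields: " + String.join(" ", missing));
        }
      }
    }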
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
public static Document toDocument( SolrInputDocument doc, IndexSchema schema ) { Document out = new Document(); final float docBoost = doc.getDocumentBoost(); // Load fields from SolrDocument to Document for( SolrInputField field : doc ) { String name = field.getName(); SchemaField sfield = schema.getFieldOrNull(name); boolean used = false; float boost = field.getBoost(); boolean omitNorms = sfield != null && sfield.omitNorms(); // Make sure it has the correct number if( sfield!=null && !sfield.multiValued() && field.getValueCount() > 1 ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"multiple values encountered for non multiValued field " + sfield.getName() + ": " +field.getValue() ); } if (omitNorms && boost != 1.0F) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"cannot set an index-time boost, norms are omitted for field " + sfield.getName() + ": " +field.getValue() ); } // load each field value boolean hasField = false; try { for( Object v : field ) { if( v == null ) { continue; } hasField = true; if (sfield != null) { used = true; addField(out, sfield, v, omitNorms ? 1F : docBoost*boost); } // Check if we should copy this field to any other fields. // This could happen whether it is explicit or not. List<CopyField> copyFields = schema.getCopyFieldsList(name); for (CopyField cf : copyFields) { SchemaField destinationField = cf.getDestination(); // check if the copy field is a multivalued or not if (!destinationField.multiValued() && out.getField(destinationField.getName()) != null) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"multiple values encountered for non multiValued copy field " + destinationField.getName() + ": " + v); } used = true; // Perhaps trim the length of a copy field Object val = v; if( val instanceof String && cf.getMaxChars() > 0 ) { val = cf.getLimitedValue((String)val); } addField(out, destinationField, val, destinationField.omitNorms() ? 1F : docBoost*boost); } // In lucene, the boost for a given field is the product of the // document boost and *all* boosts on values of that field. // For multi-valued fields, we only want to set the boost on the // first field. boost = docBoost; } } catch( SolrException ex ) { throw ex; } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); } // make sure the field was used somehow... if( !used && hasField ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"unknown field '" +name + "'"); } } // Now validate required fields or add default values // fields with default values are defacto 'required' for (SchemaField field : schema.getRequiredFields()) { if (out.getField(field.getName() ) == null) { if (field.getDefaultValue() != null) { addField(out, field, field.getDefaultValue(), 1.0f); } else { String msg = getID(doc, schema) + "missing required field: " + field.getName(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, msg ); } } } return out; }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
void checkResponses(boolean block) { while (pending != null && pending.size() > 0) { try { Future<Request> future = block ? completionService.take() : completionService.poll(); if (future == null) return; pending.remove(future); try { Request sreq = future.get(); if (sreq.rspCode != 0) { // error during request // if there is a retry url, we want to retry... // TODO: but we really should only retry on connection errors... if (sreq.retries < 5 && sreq.node.checkRetry()) { sreq.retries++; sreq.rspCode = 0; sreq.exception = null; Thread.sleep(500); submit(sreq); checkResponses(block); } else { Exception e = sreq.exception; Error error = new Error(); error.e = e; error.node = sreq.node; response.errors.add(error); response.sreq = sreq; SolrException.log(SolrCore.log, "shard update error " + sreq.node, sreq.exception); } } } catch (ExecutionException e) { // shouldn't happen since we catch exceptions ourselves SolrException.log(SolrCore.log, "error sending update request to shard", e); } } catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "interrupted waiting for shard update response", e); } } }
// in core/src/java/org/apache/solr/update/PeerSync.java
private boolean handleUpdates(ShardResponse srsp) { // we retrieved the last N updates from the replica List<Object> updates = (List<Object>)srsp.getSolrResponse().getResponse().get("updates"); SyncShardRequest sreq = (SyncShardRequest) srsp.getShardRequest(); if (updates.size() < sreq.requestedUpdates.size()) { log.error(msg() + " Requested " + sreq.requestedUpdates.size() + " updates from " + sreq.shards[0] + " but retrieved " + updates.size()); return false; } ModifiableSolrParams params = new ModifiableSolrParams(); params.set(DISTRIB_UPDATE_PARAM, FROMLEADER.toString()); // params.set("peersync",true); // debugging SolrQueryRequest req = new LocalSolrQueryRequest(uhandler.core, params); SolrQueryResponse rsp = new SolrQueryResponse(); RunUpdateProcessorFactory runFac = new RunUpdateProcessorFactory(); DistributedUpdateProcessorFactory magicFac = new DistributedUpdateProcessorFactory(); runFac.init(new NamedList()); magicFac.init(new NamedList()); UpdateRequestProcessor proc = magicFac.getInstance(req, rsp, runFac.getInstance(req, rsp, null)); Collections.sort(updates, updateRecordComparator); Object o = null; long lastVersion = 0; try { // Apply oldest updates first for (Object obj : updates) { // should currently be a List<Oper,Ver,Doc/Id> o = obj; List<Object> entry = (List<Object>)o; if (debug) { log.debug(msg() + "raw update record " + o); } int oper = (Integer)entry.get(0) & UpdateLog.OPERATION_MASK; long version = (Long) entry.get(1); if (version == lastVersion && version != 0) continue; lastVersion = version; switch (oper) { case UpdateLog.ADD: { // byte[] idBytes = (byte[]) entry.get(2); SolrInputDocument sdoc = (SolrInputDocument)entry.get(entry.size()-1); AddUpdateCommand cmd = new AddUpdateCommand(req); // cmd.setIndexedId(new BytesRef(idBytes)); cmd.solrDoc = sdoc; cmd.setVersion(version); cmd.setFlags(UpdateCommand.PEER_SYNC | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) { log.debug(msg() + "add " + cmd); } proc.processAdd(cmd); break; } case UpdateLog.DELETE: { byte[] idBytes = (byte[]) entry.get(2); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.setIndexedId(new BytesRef(idBytes)); cmd.setVersion(version); cmd.setFlags(UpdateCommand.PEER_SYNC | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) { log.debug(msg() + "delete " + cmd); } proc.processDelete(cmd); break; } case UpdateLog.DELETE_BY_QUERY: { String query = (String)entry.get(2); DeleteUpdateCommand cmd = new DeleteUpdateCommand(req); cmd.query = query; cmd.setVersion(version); cmd.setFlags(UpdateCommand.PEER_SYNC | UpdateCommand.IGNORE_AUTOCOMMIT); if (debug) { log.debug(msg() + "deleteByQuery " + cmd); } proc.processDelete(cmd); break; } default: throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unknown Operation! " + oper); } } } catch (IOException e) { // TODO: should this be handled separately as a problem with us? // I guess it probably already will by causing replication to be kicked off. sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; } catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,update=" + o, e); return false; } finally { try { proc.finish(); } catch (Exception e) { sreq.updateException = e; log.error(msg() + "Error applying updates from " + sreq.shards + " ,finish()", e); return false; } } return true; }
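handleUpdates shows a third disposal strategy besides throwing and swallowing: the exception is recorded on the request object (sreq.updateException), logged, and the method returns false so the caller can fall back to full replication. A sketch of that record-and-return shape, with hypothetical types:

    import java.util.List;

    final class RecordAndReturn {
      static final class SyncRequest { Exception updateException; } // hypothetical stand-in

      static boolean applyAll(SyncRequest sreq, List<Runnable> updates) {
        for (Runnable u : updates) {
          try {
            u.run();
          } catch (Exception e) {
            sreq.updateException = e;  // record the failure for the caller to inspect
            System.err.println("Error applying update: " + e);
            return false;              // signal "fall back" instead of throwing
          }
        }
        return true;
      }
    }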
// in core/src/java/org/apache/solr/update/TransactionLog.java
@Override public String readExternString(FastInputStream fis) throws IOException { int idx = readSize(fis); if (idx != 0) {// idx != 0 is the index of the extern string // no need to synchronize globalStringList - it's only updated before the first record is written to the log return globalStringList.get(idx - 1); } else {// idx == 0 means it has a string value // this shouldn't happen with this codec subclass. throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Corrupt transaction log"); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long writeData(Object o) { LogCodec codec = new LogCodec(); try { long pos = fos.size(); // if we had flushed, this should be equal to channel.position() codec.init(fos); codec.writeVal(o); return pos; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long write(AddUpdateCommand cmd, int flags) { LogCodec codec = new LogCodec(); long pos = 0; synchronized (this) { try { pos = fos.size(); // if we had flushed, this should be equal to channel.position() SolrInputDocument sdoc = cmd.getSolrInputDocument(); if (pos == 0) { // TODO: needs to be changed if we start writing a header first addGlobalStrings(sdoc.getFieldNames()); writeLogHeader(codec); pos = fos.size(); } /*** System.out.println("###writing at " + pos + " fos.size()=" + fos.size() + " raf.length()=" + raf.length()); if (pos != fos.size()) { throw new RuntimeException("ERROR" + "###writing at " + pos + " fos.size()=" + fos.size() + " raf.length()=" + raf.length()); } ***/ codec.init(fos); codec.writeTag(JavaBinCodec.ARR, 3); codec.writeInt(UpdateLog.ADD | flags); // should just take one byte codec.writeLong(cmd.getVersion()); codec.writeSolrInputDocument(cmd.getSolrInputDocument()); endRecord(pos); // fos.flushBuffer(); // flush later return pos; } catch (IOException e) { // TODO: reset our file pointer back to "pos", the start of this record. throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error logging add", e); } } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long writeDelete(DeleteUpdateCommand cmd, int flags) { LogCodec codec = new LogCodec(); synchronized (this) { try { long pos = fos.size(); // if we had flushed, this should be equal to channel.position() if (pos == 0) { writeLogHeader(codec); pos = fos.size(); } codec.init(fos); codec.writeTag(JavaBinCodec.ARR, 3); codec.writeInt(UpdateLog.DELETE | flags); // should just take one byte codec.writeLong(cmd.getVersion()); BytesRef br = cmd.getIndexedId(); codec.writeByteArray(br.bytes, br.offset, br.length); endRecord(pos); // fos.flushBuffer(); // flush later return pos; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long writeDeleteByQuery(DeleteUpdateCommand cmd, int flags) { LogCodec codec = new LogCodec(); synchronized (this) { try { long pos = fos.size(); // if we had flushed, this should be equal to channel.position() if (pos == 0) { writeLogHeader(codec); pos = fos.size(); } codec.init(fos); codec.writeTag(JavaBinCodec.ARR, 3); codec.writeInt(UpdateLog.DELETE_BY_QUERY | flags); // should just take one byte codec.writeLong(cmd.getVersion()); codec.writeStr(cmd.query); endRecord(pos); // fos.flushBuffer(); // flush later return pos; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public long writeCommit(CommitUpdateCommand cmd, int flags) { LogCodec codec = new LogCodec(); synchronized (this) { try { long pos = fos.size(); // if we had flushed, this should be equal to channel.position() if (pos == 0) { writeLogHeader(codec); pos = fos.size(); } codec.init(fos); codec.writeTag(JavaBinCodec.ARR, 3); codec.writeInt(UpdateLog.COMMIT | flags); // should just take one byte codec.writeLong(cmd.getVersion()); codec.writeStr(END_MESSAGE); // ensure these bytes are (almost) last in the file endRecord(pos); fos.flush(); // flush since this will be the last record in a log fill assert fos.size() == channel.size(); return pos; } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } }
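All five TransactionLog write paths above share one catch shape: synchronize on the log, perform buffered I/O, and rethrow any IOException as an unchecked SolrException with SERVER_ERROR, so that no caller has to declare checked I/O exceptions. A minimal sketch of that shape, with java.io.OutputStream standing in for Solr's FastOutputStream and RuntimeException for SolrException:

    import java.io.IOException;
    import java.io.OutputStream;

    // Sketch of the TransactionLog catch shape: checked I/O failures
    // become one unchecked "server error" type.
    class TlogWriteSketch {
      private final OutputStream fos;

      TlogWriteSketch(OutputStream fos) { this.fos = fos; }

      synchronized long write(byte[] record, long pos) {
        try {
          fos.write(record);          // codec.writeVal(...) in the real code
          return pos + record.length;
        } catch (IOException e) {
          // unchecked wrapper: callers see a server error, not a checked IOException
          throw new RuntimeException("Error logging add", e);
        }
      }
    }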
// in core/src/java/org/apache/solr/update/TransactionLog.java
public Object lookup(long pos) { // A negative position can result from a log replay (which does not re-log, but does // update the version map. This is OK since the node won't be ACTIVE when this happens. if (pos < 0) return null; try { // make sure any unflushed buffer has been flushed synchronized (this) { // TODO: optimize this by keeping track of what we have flushed up to fos.flushBuffer(); /*** System.out.println("###flushBuffer to " + fos.size() + " raf.length()=" + raf.length() + " pos="+pos); if (fos.size() != raf.length() || pos >= fos.size() ) { throw new RuntimeException("ERROR" + "###flushBuffer to " + fos.size() + " raf.length()=" + raf.length() + " pos="+pos); } ***/ } ChannelFastInputStream fis = new ChannelFastInputStream(channel, pos); LogCodec codec = new LogCodec(); return codec.readVal(fis); } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
public void incref() { int result = refcount.incrementAndGet(); if (result <= 1) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "incref on a closed log: " + this); } }
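incref is a guard against resurrecting a closed log: if incrementAndGet returns 1 or less, the count must already have been zero before the call, so the caller gets an exception instead of a reference to a closed file. A self-contained sketch of the same guard (the decref side is an assumption added for symmetry):

    import java.util.concurrent.atomic.AtomicInteger;

    // Sketch of the "no resurrection" reference-count guard above.
    class RefGuardSketch {
      private final AtomicInteger refcount = new AtomicInteger(1); // 1 == open

      void incref() {
        if (refcount.incrementAndGet() <= 1) {
          // the count had already hit zero: the log is closed
          throw new IllegalStateException("incref on a closed log: " + this);
        }
      }

      void decref() {
        if (refcount.decrementAndGet() == 0) {
          // last reference released: close the underlying file here
        }
      }
    }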
// in core/src/java/org/apache/solr/update/TransactionLog.java
public void finish(UpdateLog.SyncLevel syncLevel) { if (syncLevel == UpdateLog.SyncLevel.NONE) return; try { synchronized (this) { fos.flushBuffer(); } if (syncLevel == UpdateLog.SyncLevel.FSYNC) { // Since fsync is outside of synchronized block, we can end up with a partial // last record on power failure (which is OK, and does not represent an error... // we just need to be aware of it when reading). raf.getFD().sync(); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/update/TransactionLog.java
private void close() { try { if (debug) { log.debug("Closing tlog" + this); } synchronized (this) { fos.flush(); fos.close(); } if (deleteOnClose) { tlogFile.delete(); } } catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } }
// in core/src/java/org/apache/solr/core/Config.java
public Object evaluate(String path, QName type) { XPath xpath = xpathFactory.newXPath(); try { String xstr=normalize(path); // TODO: instead of prepending /prefix/, we could do the search rooted at /prefix... Object o = xpath.evaluate(xstr, doc, type); return o; } catch (XPathExpressionException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + path +" for " + name,e); } }
// in core/src/java/org/apache/solr/core/Config.java
public Node getNode(String path, boolean errIfMissing) { XPath xpath = xpathFactory.newXPath(); Node nd = null; String xstr = normalize(path); try { nd = (Node)xpath.evaluate(xstr, doc, XPathConstants.NODE); if (nd==null) { if (errIfMissing) { throw new RuntimeException(name + " missing "+path); } else { log.debug(name + " missing optional " + path); return null; } } log.trace(name + ":" + path + "=" + nd); return nd; } catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); } catch (SolrException e) { throw(e); } catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); } }
// in core/src/java/org/apache/solr/core/Config.java
public static final Version parseLuceneVersionString(final String matchVersion) { final Version version; try { version = Version.parseLeniently(matchVersion); } catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); } if (version == Version.LUCENE_CURRENT && !versionWarningAlreadyLogged.getAndSet(true)) { log.warn( "You should not use LUCENE_CURRENT as luceneMatchVersion property: "+ "if you use this setting, and then Solr upgrades to a newer release of Lucene, "+ "sizable changes may happen. If precise back compatibility is important "+ "then you should instead explicitly specify an actual Lucene version." ); } return version; }
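parseLuceneVersionString illustrates a deliberate enrichment step: the library's IllegalArgumentException is rethrown as a SolrException whose message enumerates every valid value, so the configuration error explains itself. A sketch of that enrichment, with a small enum standing in for Lucene's Version and RuntimeException for SolrException:

    import java.util.Arrays;

    // Sketch of the "rethrow with the valid values" enrichment above.
    enum Ver { LUCENE_40, LUCENE_41 }

    class VersionParseSketch {
      static Ver parse(String matchVersion) {
        try {
          return Ver.valueOf(matchVersion);
        } catch (IllegalArgumentException iae) {
          // keep the cause, but tell the user what would have worked
          throw new RuntimeException("Invalid luceneMatchVersion '" + matchVersion
              + "', valid values are: " + Arrays.toString(Ver.values()), iae);
        }
      }
    }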
// in core/src/java/org/apache/solr/core/RequestHandlers.java
void initHandlersFromConfig(SolrConfig config ){ // use link map so we iterate in the same order Map<PluginInfo,SolrRequestHandler> handlers = new LinkedHashMap<PluginInfo,SolrRequestHandler>(); for (PluginInfo info : config.getPluginInfos(SolrRequestHandler.class.getName())) { try { SolrRequestHandler requestHandler; String startup = info.attributes.get("startup") ; if( startup != null ) { if( "lazy".equals(startup) ) { log.info("adding lazy requestHandler: " + info.className); requestHandler = new LazyRequestHandlerWrapper( core, info.className, info.initArgs ); } else { throw new Exception( "Unknown startup value: '"+startup+"' for: "+info.className ); } } else { requestHandler = core.createRequestHandler(info.className); } handlers.put(info,requestHandler); SolrRequestHandler old = register(info.name, requestHandler); if(old != null) { log.warn("Multiple requestHandler registered to the same name: " + info.name + " ignoring: " + old.getClass().getName()); } if(info.isDefault()){ old = register("",requestHandler); if(old != null) log.warn("Multiple default requestHandler registered" + " ignoring: " + old.getClass().getName()); } log.info("created "+info.name+": " + info.className); } catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); } } // we've now registered all handlers, time to init them in the same order for (Map.Entry<PluginInfo,SolrRequestHandler> entry : handlers.entrySet()) { PluginInfo info = entry.getKey(); SolrRequestHandler requestHandler = entry.getValue(); if (requestHandler instanceof PluginInfoInitialized) { ((PluginInfoInitialized) requestHandler).init(info); } else{ requestHandler.init(info.initArgs); } } if(get("") == null) register("", get("/select"));//defacto default handler if(get("") == null) register("", get("standard"));//old default handler name; TODO remove? if(get("") == null) log.warn("no default request handler is registered (either '/select' or 'standard')"); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
public synchronized SolrRequestHandler getWrappedHandler() { if( _handler == null ) { try { SolrRequestHandler handler = core.createRequestHandler(_className); handler.init( _args ); if( handler instanceof SolrCoreAware ) { ((SolrCoreAware)handler).inform( core ); } _handler = handler; } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); } } return _handler; }
// in core/src/java/org/apache/solr/core/CachingDirectoryFactory.java
private static Directory injectLockFactory(Directory dir, String lockPath, String rawLockType) throws IOException { if (null == rawLockType) { // we default to "simple" for backwards compatibility log.warn("No lockType configured for " + dir + " assuming 'simple'"); rawLockType = "simple"; } final String lockType = rawLockType.toLowerCase(Locale.ENGLISH).trim(); if ("simple".equals(lockType)) { // multiple SimpleFSLockFactory instances should be OK dir.setLockFactory(new SimpleFSLockFactory(lockPath)); } else if ("native".equals(lockType)) { dir.setLockFactory(new NativeFSLockFactory(lockPath)); } else if ("single".equals(lockType)) { if (!(dir.getLockFactory() instanceof SingleInstanceLockFactory)) dir .setLockFactory(new SingleInstanceLockFactory()); } else if ("none".equals(lockType)) { // Recipe for disaster log.error("CONFIGURATION WARNING: locks are disabled on " + dir); dir.setLockFactory(NoLockFactory.getNoLockFactory()); } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unrecognized lockType: " + rawLockType); } return dir; }
// in core/src/java/org/apache/solr/core/SolrCore.java
private <T> T createInstance(String className, Class<T> cast, String msg) { Class<? extends T> clazz = null; if (msg == null) msg = "SolrCore Object"; try { clazz = getResourceLoader().findClass(className, cast); //most of the classes do not have constructors which takes SolrCore argument. It is recommended to obtain SolrCore by implementing SolrCoreAware. // So invariably always it will cause a NoSuchMethodException. So iterate through the list of available constructors Constructor[] cons = clazz.getConstructors(); for (Constructor con : cons) { Class[] types = con.getParameterTypes(); if(types.length == 1 && types[0] == SolrCore.class){ return (T)con.newInstance(this); } } return getResourceLoader().newInstance(className, cast);//use the empty constructor } catch (SolrException e) { throw e; } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " +cast.getName(), e); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
private UpdateHandler createReloadedUpdateHandler(String className, String msg, UpdateHandler updateHandler) { Class<? extends UpdateHandler> clazz = null; if (msg == null) msg = "SolrCore Object"; try { clazz = getResourceLoader().findClass(className, UpdateHandler.class); //most of the classes do not have constructors which takes SolrCore argument. It is recommended to obtain SolrCore by implementing SolrCoreAware. // So invariably always it will cause a NoSuchMethodException. So iterate through the list of available constructors Constructor justSolrCoreCon = null; Constructor[] cons = clazz.getConstructors(); for (Constructor con : cons) { Class[] types = con.getParameterTypes(); if(types.length == 2 && types[0] == SolrCore.class && types[1] == UpdateHandler.class){ return (UpdateHandler) con.newInstance(this, updateHandler); } } throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " could not find proper constructor for " + UpdateHandler.class.getName()); } catch (SolrException e) { throw e; } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " + UpdateHandler.class.getName(), e); } }
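Both factory methods above use the same two-tier catch: a SolrException is rethrown untouched, since it already carries an error code and message, while every other failure is wrapped into a new SERVER_ERROR. A minimal sketch of the pass-through-or-wrap idiom, with RuntimeException standing in for SolrException:

    // Sketch of the pass-through-or-wrap idiom used by both factories above.
    class InstantiateSketch {
      static Object create(String className) {
        try {
          return Class.forName(className).getDeclaredConstructor().newInstance();
        } catch (RuntimeException e) {
          throw e; // domain exceptions already carry an error code: pass through
        } catch (Exception e) {
          // reflective failures collapse into one uniform unchecked type
          throw new RuntimeException("Error Instantiating " + className, e);
        }
      }
    }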
// in core/src/java/org/apache/solr/core/SolrCore.java
public UpdateRequestProcessorChain getUpdateProcessingChain( final String name ) { UpdateRequestProcessorChain chain = updateProcessorChains.get( name ); if( chain == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown UpdateRequestProcessorChain: "+name ); } return chain; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public SearchComponent getSearchComponent( String name ) { SearchComponent component = searchComponents.get( name ); if( component == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Unknown Search Component: "+name ); } return component; }
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> openNewSearcher(boolean updateHandlerReopens, boolean realtime) { SolrIndexSearcher tmp; RefCounted<SolrIndexSearcher> newestSearcher = null; boolean nrt = solrConfig.reopenReaders && updateHandlerReopens; openSearcherLock.lock(); try { String newIndexDir = getNewIndexDir(); File indexDirFile = null; File newIndexDirFile = null; // if it's not a normal near-realtime update, check that paths haven't changed. if (!nrt) { indexDirFile = new File(getIndexDir()).getCanonicalFile(); newIndexDirFile = new File(newIndexDir).getCanonicalFile(); } synchronized (searcherLock) { newestSearcher = realtimeSearcher; if (newestSearcher != null) { newestSearcher.incref(); // the matching decref is in the finally block } } if (newestSearcher != null && solrConfig.reopenReaders && (nrt || indexDirFile.equals(newIndexDirFile))) { DirectoryReader newReader; DirectoryReader currentReader = newestSearcher.get().getIndexReader(); if (updateHandlerReopens) { // SolrCore.verbose("start reopen from",previousSearcher,"writer=",writer); IndexWriter writer = getUpdateHandler().getSolrCoreState().getIndexWriter(this); newReader = DirectoryReader.openIfChanged(currentReader, writer, true); } else { // verbose("start reopen without writer, reader=", currentReader); newReader = DirectoryReader.openIfChanged(currentReader); // verbose("reopen result", newReader); } if (newReader == null) { // if this is a request for a realtime searcher, just return the same searcher if there haven't been any changes. if (realtime) { newestSearcher.incref(); return newestSearcher; } currentReader.incRef(); newReader = currentReader; } // for now, turn off caches if this is for a realtime reader (caches take a little while to instantiate) tmp = new SolrIndexSearcher(this, schema, (realtime ? "realtime":"main"), newReader, true, !realtime, true, directoryFactory); } else { // verbose("non-reopen START:"); tmp = new SolrIndexSearcher(this, newIndexDir, schema, getSolrConfig().indexConfig, "main", true, directoryFactory); // verbose("non-reopen DONE: searcher=",tmp); } List<RefCounted<SolrIndexSearcher>> searcherList = realtime ? _realtimeSearchers : _searchers; RefCounted<SolrIndexSearcher> newSearcher = newHolder(tmp, searcherList); // refcount now at 1 // Increment reference again for "realtimeSearcher" variable. It should be at 2 after. // When it's decremented by both the caller of this method, and by realtimeSearcher being replaced, // it will be closed. newSearcher.incref(); synchronized (searcherLock) { if (realtimeSearcher != null) { realtimeSearcher.decref(); } realtimeSearcher = newSearcher; searcherList.add(realtimeSearcher); } return newSearcher; } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error opening new searcher", e); } finally { openSearcherLock.unlock(); if (newestSearcher != null) { newestSearcher.decref(); } } }
// in core/src/java/org/apache/solr/core/SolrCore.java
public RefCounted<SolrIndexSearcher> getSearcher(boolean forceNew, boolean returnSearcher, final Future[] waitSearcher, boolean updateHandlerReopens) throws IOException { // it may take some time to open an index.... we may need to make // sure that two threads aren't trying to open one at the same time // if it isn't necessary. synchronized (searcherLock) { // see if we can return the current searcher if (_searcher!=null && !forceNew) { if (returnSearcher) { _searcher.incref(); return _searcher; } else { return null; } } // check to see if we can wait for someone else's searcher to be set if (onDeckSearchers>0 && !forceNew && _searcher==null) { try { searcherLock.wait(); } catch (InterruptedException e) { log.info(SolrException.toStr(e)); } } // check again: see if we can return right now if (_searcher!=null && !forceNew) { if (returnSearcher) { _searcher.incref(); return _searcher; } else { return null; } } // At this point, we know we need to open a new searcher... // first: increment count to signal other threads that we are // opening a new searcher. onDeckSearchers++; if (onDeckSearchers < 1) { // should never happen... just a sanity check log.error(logid+"ERROR!!! onDeckSearchers is " + onDeckSearchers); onDeckSearchers=1; // reset } else if (onDeckSearchers > maxWarmingSearchers) { onDeckSearchers--; String msg="Error opening new searcher. exceeded limit of maxWarmingSearchers="+maxWarmingSearchers + ", try again later."; log.warn(logid+""+ msg); // HTTP 503==service unavailable, or 409==Conflict throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE,msg); } else if (onDeckSearchers > 1) { log.warn(logid+"PERFORMANCE WARNING: Overlapping onDeckSearchers=" + onDeckSearchers); } } // a signal to decrement onDeckSearchers if something goes wrong. final boolean[] decrementOnDeckCount=new boolean[]{true}; RefCounted<SolrIndexSearcher> currSearcherHolder = null; // searcher we are autowarming from RefCounted<SolrIndexSearcher> searchHolder = null; boolean success = false; openSearcherLock.lock(); try { searchHolder = openNewSearcher(updateHandlerReopens, false); // the searchHolder will be incremented once already (and it will eventually be assigned to _searcher when registered) // increment it again if we are going to return it to the caller. if (returnSearcher) { searchHolder.incref(); } final RefCounted<SolrIndexSearcher> newSearchHolder = searchHolder; final SolrIndexSearcher newSearcher = newSearchHolder.get(); boolean alreadyRegistered = false; synchronized (searcherLock) { if (_searcher == null) { // if there isn't a current searcher then we may // want to register this one before warming is complete instead of waiting. if (solrConfig.useColdSearcher) { registerSearcher(newSearchHolder); decrementOnDeckCount[0]=false; alreadyRegistered=true; } } else { // get a reference to the current searcher for purposes of autowarming. currSearcherHolder=_searcher; currSearcherHolder.incref(); } } final SolrIndexSearcher currSearcher = currSearcherHolder==null ? null : currSearcherHolder.get(); Future future=null; // warm the new searcher based on the current searcher. // should this go before the other event handlers or after? if (currSearcher != null) { future = searcherExecutor.submit( new Callable() { public Object call() throws Exception { try { newSearcher.warm(currSearcher); } catch (Throwable e) { SolrException.log(log,e); } return null; } } ); } if (currSearcher==null && firstSearcherListeners.size() > 0) { future = searcherExecutor.submit( new Callable() { public Object call() throws Exception { try { for (SolrEventListener listener : firstSearcherListeners) { listener.newSearcher(newSearcher,null); } } catch (Throwable e) { SolrException.log(log,null,e); } return null; } } ); }
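The warming Callables above catch Throwable and only log it: a failure while warming a searcher must stay inside the background task, because callers treat warming as best-effort and use the Future only to wait, not to collect errors. A runnable sketch of that log-and-swallow task shape; the executor setup and task body below are illustrative, not the Solr code:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Sketch of the "log, never rethrow" background-warming task above.
    class WarmTaskSketch {
      public static void main(String[] args) throws Exception {
        ExecutorService searcherExecutor = Executors.newSingleThreadExecutor();
        Future<?> future = searcherExecutor.submit(() -> {
          try {
            throw new IllegalStateException("simulated warming failure");
          } catch (Throwable t) {
            // logged in the real code via SolrException.log(log, t)
            System.err.println("warming error (swallowed): " + t);
          }
          return null; // the Future always completes normally
        });
        future.get();  // never throws ExecutionException for warming errors
        searcherExecutor.shutdown();
      }
    }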
// in core/src/java/org/apache/solr/core/SolrCore.java
public void execute(SolrRequestHandler handler, SolrQueryRequest req, SolrQueryResponse rsp) { if (handler==null) { String msg = "Null Request Handler '" + req.getParams().get(CommonParams.QT) + "'"; if (log.isWarnEnabled()) log.warn(logid + msg + ":" + req); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, msg); } // setup response header and handle request final NamedList<Object> responseHeader = new SimpleOrderedMap<Object>(); rsp.add("responseHeader", responseHeader); // toLog is a local ref to the same NamedList used by the request NamedList<Object> toLog = rsp.getToLog(); // for back compat, we set these now just in case other code // are expecting them during handleRequest toLog.add("webapp", req.getContext().get("webapp")); toLog.add("path", req.getContext().get("path")); toLog.add("params", "{" + req.getParamString() + "}"); // TODO: this doesn't seem to be working correctly and causes problems with the example server and distrib (for example /spell) // if (req.getParams().getBool(ShardParams.IS_SHARD,false) && !(handler instanceof SearchHandler)) // throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"isShard is only acceptable with search handlers"); handler.handleRequest(req,rsp); setResponseHeaderValues(handler,req,rsp); if (log.isInfoEnabled() && toLog.size() > 0) { StringBuilder sb = new StringBuilder(logid); for (int i=0; i<toLog.size(); i++) { String name = toLog.getName(i); Object val = toLog.getVal(i); if (name != null) { sb.append(name).append('='); } sb.append(val).append(' '); } log.info(sb.toString()); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
public static void setResponseHeaderValues(SolrRequestHandler handler, SolrQueryRequest req, SolrQueryResponse rsp) { // TODO should check that responseHeader has not been replaced by handler NamedList<Object> responseHeader = rsp.getResponseHeader(); final int qtime=(int)(rsp.getEndTime() - req.getStartTime()); int status = 0; Exception exception = rsp.getException(); if( exception != null ){ if( exception instanceof SolrException ) status = ((SolrException)exception).code(); else status = 500; } responseHeader.add("status",status); responseHeader.add("QTime",qtime); if (rsp.getToLog().size() > 0) { rsp.getToLog().add("status",status); rsp.getToLog().add("QTime",qtime); } SolrParams params = req.getParams(); if( params.getBool(CommonParams.HEADER_ECHO_HANDLER, false) ) { responseHeader.add("handler", handler.getName() ); } // Values for echoParams... false/true/all or false/explicit/all ??? String ep = params.get( CommonParams.HEADER_ECHO_PARAMS, null ); if( ep != null ) { EchoParamStyle echoParams = EchoParamStyle.get( ep ); if( echoParams == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,"Invalid value '" + ep + "' for " + CommonParams.HEADER_ECHO_PARAMS + " parameter, use '" + EchoParamStyle.EXPLICIT + "' or '" + EchoParamStyle.ALL + "'" ); } if( echoParams == EchoParamStyle.EXPLICIT ) { responseHeader.add("params", req.getOriginalParams().toNamedList()); } else if( echoParams == EchoParamStyle.ALL ) { responseHeader.add("params", req.getParams().toNamedList()); } } }
// in core/src/java/org/apache/solr/core/SolrCore.java
private void initQParsers() { initPlugins(qParserPlugins,QParserPlugin.class); // default parsers for (int i=0; i<QParserPlugin.standardPlugins.length; i+=2) { try { String name = (String)QParserPlugin.standardPlugins[i]; if (null == qParserPlugins.get(name)) { Class<QParserPlugin> clazz = (Class<QParserPlugin>)QParserPlugin.standardPlugins[i+1]; QParserPlugin plugin = clazz.newInstance(); qParserPlugins.put(name, plugin); plugin.init(null); } } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } }
// in core/src/java/org/apache/solr/core/SolrCore.java
public QParserPlugin getQueryPlugin(String parserName) { QParserPlugin plugin = qParserPlugins.get(parserName); if (plugin != null) return plugin; throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown query type '"+parserName+"'"); }
// in core/src/java/org/apache/solr/core/SolrCore.java
private void initValueSourceParsers() { initPlugins(valueSourceParsers,ValueSourceParser.class); // default value source parsers for (Map.Entry<String, ValueSourceParser> entry : ValueSourceParser.standardValueSourceParsers.entrySet()) { try { String name = entry.getKey(); if (null == valueSourceParsers.get(name)) { ValueSourceParser valueSourceParser = entry.getValue(); valueSourceParsers.put(name, valueSourceParser); valueSourceParser.init(null); } } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } }
// in core/src/java/org/apache/solr/core/SolrCore.java
private void initTransformerFactories() { // Load any transformer factories initPlugins(transformerFactories,TransformerFactory.class); // Tell each transformer what its name is for( Map.Entry<String, TransformerFactory> entry : TransformerFactory.defaultFactories.entrySet() ) { try { String name = entry.getKey(); if (null == transformerFactories.get(name)) { TransformerFactory f = entry.getValue(); transformerFactories.put(name, f); // f.init(null); default ones don't need init } } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); } } }
// in core/src/java/org/apache/solr/core/SolrCore.java
public synchronized QueryResponseWriter getWrappedWriter() { if( _writer == null ) { try { QueryResponseWriter writer = createQueryResponseWriter(_className); writer.init( _args ); _writer = writer; } catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); } } return _writer; }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public <T> Class<? extends T> findClass(String cname, Class<T> expectedType, String... subpackages) { if (subpackages == null || subpackages.length == 0 || subpackages == packages) { subpackages = packages; String c = classNameCache.get(cname); if(c != null) { try { return Class.forName(c, true, classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e) { //this is unlikely log.error("Unable to load cached class-name : "+ c +" for shortname : "+cname + e); } } } Class<? extends T> clazz = null; // first try cname == full name try { return Class.forName(cname, true, classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e) { String newName=cname; if (newName.startsWith(project)) { newName = cname.substring(project.length()+1); } for (String subpackage : subpackages) { try { String name = base + '.' + subpackage + newName; log.trace("Trying class name " + name); return clazz = Class.forName(name,true,classLoader).asSubclass(expectedType); } catch (ClassNotFoundException e1) { // ignore... assume first exception is best. } } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e); }finally{ //cache the shortname vs FQN if it is loaded by the webapp classloader and it is loaded // using a shortname if ( clazz != null && clazz.getClassLoader() == SolrResourceLoader.class.getClassLoader() && !cname.equals(clazz.getName()) && (subpackages.length == 0 || subpackages == packages)) { //store in the cache classNameCache.put(cname, clazz.getName()); } } }
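findClass is a fallback lookup with a deliberate policy on which failure to report: it tries the literal class name first, then each configured subpackage, ignores the later ClassNotFoundExceptions, and wraps the first one into the SERVER_ERROR it finally throws ("assume first exception is best"). A compact sketch of that policy, with RuntimeException standing in for SolrException:

    // Sketch of the first-exception-wins fallback lookup in findClass.
    class FindClassSketch {
      static Class<?> find(String cname, String... packagePrefixes) {
        try {
          return Class.forName(cname); // first try the literal name
        } catch (ClassNotFoundException first) {
          for (String prefix : packagePrefixes) {
            try {
              return Class.forName(prefix + "." + cname);
            } catch (ClassNotFoundException ignored) {
              // later misses carry less information than the literal-name miss
            }
          }
          // report the original failure, not the last fallback miss
          throw new RuntimeException("Error loading class '" + cname + "'", first);
        }
      }
    }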
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public <T> T newInstance(String cname, Class<T> expectedType, String ... subpackages) { Class<? extends T> clazz = findClass(cname, expectedType, subpackages); if( clazz == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Can not find class: "+cname + " in " + classLoader); } T obj = null; try { obj = clazz.newInstance(); } catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); } if (!live) { if( obj instanceof SolrCoreAware ) { assertAwareCompatibility( SolrCoreAware.class, obj ); waitingForCore.add( (SolrCoreAware)obj ); } if (org.apache.solr.util.plugin.ResourceLoaderAware.class.isInstance(obj)) { log.warn("Class [{}] uses org.apache.solr.util.plugin.ResourceLoaderAware " + "which is deprecated. Change to org.apache.lucene.analysis.util.ResourceLoaderAware.", cname); } if( obj instanceof ResourceLoaderAware ) { assertAwareCompatibility( ResourceLoaderAware.class, obj ); waitingForResources.add( (ResourceLoaderAware)obj ); } if (obj instanceof SolrInfoMBean){ //TODO: Assert here? infoMBeans.add((SolrInfoMBean) obj); } } return obj; }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public CoreAdminHandler newAdminHandlerInstance(final CoreContainer coreContainer, String cname, String ... subpackages) { Class<? extends CoreAdminHandler> clazz = findClass(cname, CoreAdminHandler.class, subpackages); if( clazz == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Can not find class: "+cname + " in " + classLoader); } CoreAdminHandler obj = null; try { Constructor<? extends CoreAdminHandler> ctor = clazz.getConstructor(CoreContainer.class); obj = ctor.newInstance(coreContainer); } catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); } if (!live) { //TODO: Does SolrCoreAware make sense here since in a multi-core context // which core are we talking about ? if (org.apache.solr.util.plugin.ResourceLoaderAware.class.isInstance(obj)) { log.warn("Class [{}] uses org.apache.solr.util.plugin.ResourceLoaderAware " + "which is deprecated. Change to org.apache.lucene.analysis.util.ResourceLoaderAware.", cname); } if( obj instanceof ResourceLoaderAware ) { assertAwareCompatibility( ResourceLoaderAware.class, obj ); waitingForResources.add( (ResourceLoaderAware)obj ); } } return obj; }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
public <T> T newInstance(String cName, Class<T> expectedType, String [] subPackages, Class[] params, Object[] args){ Class<? extends T> clazz = findClass(cName, expectedType, subPackages); if( clazz == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Can not find class: "+cName + " in " + classLoader); } T obj = null; try { Constructor<? extends T> constructor = clazz.getConstructor(params); obj = constructor.newInstance(args); } catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); } if (!live) { if( obj instanceof SolrCoreAware ) { assertAwareCompatibility( SolrCoreAware.class, obj ); waitingForCore.add( (SolrCoreAware)obj ); } if (org.apache.solr.util.plugin.ResourceLoaderAware.class.isInstance(obj)) { log.warn("Class [{}] uses org.apache.solr.util.plugin.ResourceLoaderAware " + "which is deprecated. Change to org.apache.lucene.analysis.util.ResourceLoaderAware.", cName); } if( obj instanceof ResourceLoaderAware ) { assertAwareCompatibility( ResourceLoaderAware.class, obj ); waitingForResources.add( (ResourceLoaderAware)obj ); } if (obj instanceof SolrInfoMBean){ //TODO: Assert here? infoMBeans.add((SolrInfoMBean) obj); } } return obj; }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
void assertAwareCompatibility( Class aware, Object obj ) { Class[] valid = awareCompatibility.get( aware ); if( valid == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Unknown Aware interface: "+aware ); } for( Class v : valid ) { if( v.isInstance( obj ) ) { return; } } StringBuilder builder = new StringBuilder(); builder.append( "Invalid 'Aware' object: " ).append( obj ); builder.append( " -- ").append( aware.getName() ); builder.append( " must be an instance of: " ); for( Class v : valid ) { builder.append( "[" ).append( v.getName() ).append( "] ") ; } throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, builder.toString() ); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void load(String dir, InputSource cfgis) throws ParserConfigurationException, IOException, SAXException { if (null == dir) { // don't rely on SolrResourceLoader(), determine explicitly first dir = SolrResourceLoader.locateSolrHome(); } log.info("Loading CoreContainer using Solr Home: '{}'", dir); this.loader = new SolrResourceLoader(dir); solrHome = loader.getInstanceDir(); Config cfg = new Config(loader, null, cfgis, null, false); // keep orig config for persist to consult try { this.cfg = new Config(loader, null, copyDoc(cfg.getDocument())); } catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); } cfg.substituteProperties(); // Initialize Logging if(cfg.getBool("solr/logging/@enabled",true)) { String slf4jImpl = null; String fname = cfg.get("solr/logging/watcher/@class", null); try { slf4jImpl = StaticLoggerBinder.getSingleton().getLoggerFactoryClassStr(); if(fname==null) { if( slf4jImpl.indexOf("Log4j") > 0) { log.warn("Log watching is not yet implemented for log4j" ); } else if( slf4jImpl.indexOf("JDK") > 0) { fname = "JUL"; } } } catch(Throwable ex) { log.warn("Unable to read SLF4J version. LogWatcher will be disabled: "+ex); } // Now load the framework if(fname!=null) { if("JUL".equalsIgnoreCase(fname)) { logging = new JulWatcher(slf4jImpl); } // else if( "Log4j".equals(fname) ) { // logging = new Log4jWatcher(slf4jImpl); // } else { try { logging = loader.newInstance(fname, LogWatcher.class); } catch (Throwable e) { log.warn("Unable to load LogWatcher", e); } } if( logging != null ) { ListenerConfig v = new ListenerConfig(); v.size = cfg.getInt("solr/logging/watcher/@size",50); v.threshold = cfg.get("solr/logging/watcher/@threshold",null); if(v.size>0) { log.info("Registering Log Listener"); logging.registerListener(v, this); } } } } String dcoreName = cfg.get("solr/cores/@defaultCoreName", null); if(dcoreName != null && !dcoreName.isEmpty()) { defaultCoreName = dcoreName; } persistent = cfg.getBool("solr/@persistent", false); libDir = cfg.get("solr/@sharedLib", null); zkHost = cfg.get("solr/@zkHost" , null); adminPath = cfg.get("solr/cores/@adminPath", null); shareSchema = cfg.getBool("solr/cores/@shareSchema", DEFAULT_SHARE_SCHEMA); zkClientTimeout = cfg.getInt("solr/cores/@zkClientTimeout", DEFAULT_ZK_CLIENT_TIMEOUT); hostPort = cfg.get("solr/cores/@hostPort", DEFAULT_HOST_PORT); hostContext = cfg.get("solr/cores/@hostContext", DEFAULT_HOST_CONTEXT); host = cfg.get("solr/cores/@host", null); if(shareSchema){ indexSchemaCache = new ConcurrentHashMap<String ,IndexSchema>(); } adminHandler = cfg.get("solr/cores/@adminHandler", null ); managementPath = cfg.get("solr/cores/@managementPath", null ); zkClientTimeout = Integer.parseInt(System.getProperty("zkClientTimeout", Integer.toString(zkClientTimeout))); initZooKeeper(zkHost, zkClientTimeout); if (libDir != null) { File f = FileUtils.resolvePath(new File(dir), libDir); log.info( "loading shared library: "+f.getAbsolutePath() ); libLoader = SolrResourceLoader.createClassLoader(f, null); } if (adminPath != null) { if (adminHandler == null) { coreAdminHandler = new CoreAdminHandler(this); } else { coreAdminHandler = this.createMultiCoreHandler(adminHandler); } } try { containerProperties = readProperties(cfg, ((NodeList) cfg.evaluate(DEFAULT_HOST_CONTEXT, XPathConstants.NODESET)).item(0)); } catch (Throwable e) { SolrException.log(log,null,e); } NodeList nodes = (NodeList)cfg.evaluate("solr/cores/core", XPathConstants.NODESET); for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); try { String rawName = DOMUtil.getAttr(node, "name", null); if (null == rawName) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Each core in solr.xml must have a 'name'"); } String name = rawName; CoreDescriptor p = new CoreDescriptor(this, name, DOMUtil.getAttr(node, "instanceDir", null)); // deal with optional settings String opt = DOMUtil.getAttr(node, "config", null); if (opt != null) { p.setConfigName(opt); } opt = DOMUtil.getAttr(node, "schema", null); if (opt != null) { p.setSchemaName(opt); } if (zkController != null) { opt = DOMUtil.getAttr(node, "shard", null); if (opt != null && opt.length() > 0) { p.getCloudDescriptor().setShardId(opt); } opt = DOMUtil.getAttr(node, "collection", null); if (opt != null) { p.getCloudDescriptor().setCollectionName(opt); } opt = DOMUtil.getAttr(node, "roles", null); if(opt != null){ p.getCloudDescriptor().setRoles(opt); } } opt = DOMUtil.getAttr(node, "properties", null); if (opt != null) { p.setPropertiesName(opt); } opt = DOMUtil.getAttr(node, CoreAdminParams.DATA_DIR, null); if (opt != null) { p.setDataDir(opt); } p.setCoreProperties(readProperties(cfg, node)); SolrCore core = create(p); register(name, core, false); // track original names coreToOrigName.put(core, rawName); } catch (Throwable ex) { SolrException.log(log,null,ex); } } }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void reload(String name) throws ParserConfigurationException, IOException, SAXException { name= checkDefault(name); SolrCore core; synchronized(cores) { core = cores.get(name); } if (core == null) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name ); CoreDescriptor cd = core.getCoreDescriptor(); File instanceDir = new File(cd.getInstanceDir()); if (!instanceDir.isAbsolute()) { instanceDir = new File(getSolrHome(), cd.getInstanceDir()); } log.info("Reloading SolrCore '{}' using instanceDir: {}", cd.getName(), instanceDir.getAbsolutePath()); SolrResourceLoader solrLoader; if(zkController == null) { solrLoader = new SolrResourceLoader(instanceDir.getAbsolutePath(), libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties())); } else { try { String collection = cd.getCloudDescriptor().getCollectionName(); zkController.createCollectionZkNode(cd.getCloudDescriptor()); String zkConfigName = zkController.readConfigName(collection); if (zkConfigName == null) { log.error("Could not find config name for collection:" + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection); } solrLoader = new ZkSolrResourceLoader(instanceDir.getAbsolutePath(), zkConfigName, libLoader, getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(),cd.getCoreProperties()), zkController); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } SolrCore newCore = core.reload(solrLoader); // keep core to orig name link String origName = coreToOrigName.remove(core); if (origName != null) { coreToOrigName.put(newCore, origName); } register(name, newCore, false); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void swap(String n0, String n1) { if( n0 == null || n1 == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Can not swap unnamed cores." ); } n0 = checkDefault(n0); n1 = checkDefault(n1); synchronized( cores ) { SolrCore c0 = cores.get(n0); SolrCore c1 = cores.get(n1); if (c0 == null) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "No such core: " + n0 ); if (c1 == null) throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "No such core: " + n1 ); cores.put(n0, c1); cores.put(n1, c0); c0.setName(n1); c0.getCoreDescriptor().name = n1; c1.setName(n0); c1.getCoreDescriptor().name = n0; } log.info("swapped: "+n0 + " with " + n1); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
void persistFile(File file, SolrXMLDef solrXMLDef) { log.info("Persisting cores config to " + file); File tmpFile = null; try { // write in temp first tmpFile = File.createTempFile("solr", ".xml", file.getParentFile()); java.io.FileOutputStream out = new java.io.FileOutputStream(tmpFile); Writer writer = new BufferedWriter(new OutputStreamWriter(out, "UTF-8")); try { persist(writer, solrXMLDef); } finally { writer.close(); out.close(); } // rename over origin or copy if this fails if (tmpFile != null) { if (tmpFile.renameTo(file)) tmpFile = null; else fileCopy(tmpFile, file); } } catch (java.io.FileNotFoundException xnf) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xnf); } catch (java.io.IOException xio) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xio); } finally { if (tmpFile != null) { if (!tmpFile.delete()) tmpFile.deleteOnExit(); } } }
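persistFile is the one place in this section where the catch blocks protect an atomic-update protocol: write to a temp file, rename it over the original (falling back to a copy when rename fails), and use finally to delete any temp file still left behind by a failure. A self-contained sketch of that write-temp-then-rename shape, with RuntimeException standing in for SolrException:

    import java.io.BufferedWriter;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.OutputStreamWriter;
    import java.io.Writer;

    // Sketch of the write-temp-then-rename protocol in persistFile.
    class PersistSketch {
      static void persist(File file, String content) {
        File tmp = null;
        try {
          tmp = File.createTempFile("solr", ".xml", file.getParentFile());
          try (Writer w = new BufferedWriter(
              new OutputStreamWriter(new FileOutputStream(tmp), "UTF-8"))) {
            w.write(content);
          }
          if (tmp.renameTo(file)) tmp = null; // success: nothing to clean up
          // the real code falls back to a byte-for-byte copy when rename fails
        } catch (IOException e) {
          throw new RuntimeException(e);      // SERVER_ERROR in the real code
        } finally {
          if (tmp != null && !tmp.delete()) tmp.deleteOnExit();
        }
      }
    }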
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
private void unregister(String key, SolrInfoMBean infoBean) { if (server == null) return; try { ObjectName name = getObjectName(key, infoBean); if (server.isRegistered(name) && coreHashCode.equals(server.getAttribute(name, "coreHashCode"))) { server.unregisterMBean(name); } } catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to unregister info bean: " + key, e); } }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
public static DocList doSimpleQuery(String sreq, SolrQueryRequest req, int start, int limit) throws IOException { List<String> commands = StrUtils.splitSmart(sreq,';'); String qs = commands.size() >= 1 ? commands.get(0) : ""; try { Query query = QParser.getParser(qs, null, req).getQuery(); // If the first non-query, non-filter command is a simple sort on an indexed field, then // we can use the Lucene sort ability. Sort sort = null; if (commands.size() >= 2) { sort = QueryParsing.parseSort(commands.get(1), req); } DocList results = req.getSearcher().getDocList(query,(DocSet)null, sort, start, limit); return results; } catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); } }
// in core/src/java/org/apache/solr/util/DOMUtil.java
public static String substituteProperty(String value, Properties coreProperties) { if (value == null || value.indexOf('$') == -1) { return value; } List<String> fragments = new ArrayList<String>(); List<String> propertyRefs = new ArrayList<String>(); parsePropertyString(value, fragments, propertyRefs); StringBuilder sb = new StringBuilder(); Iterator<String> i = fragments.iterator(); Iterator<String> j = propertyRefs.iterator(); while (i.hasNext()) { String fragment = i.next(); if (fragment == null) { String propertyName = j.next(); String defaultValue = null; int colon_index = propertyName.indexOf(':'); if (colon_index > -1) { defaultValue = propertyName.substring(colon_index + 1); propertyName = propertyName.substring(0,colon_index); } if (coreProperties != null) { fragment = coreProperties.getProperty(propertyName); } if (fragment == null) { fragment = System.getProperty(propertyName, defaultValue); } if (fragment == null) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No system property or default value specified for " + propertyName + " value:" + value); } } sb.append(fragment); } return sb.toString(); }
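substituteProperty resolves each ${name} or ${name:default} reference against core properties first, then system properties, then the inline default, and throws SERVER_ERROR only when all three are missing, so the message can name the unresolved property. A sketch of that resolution order for a single reference, with RuntimeException standing in for SolrException:

    import java.util.Properties;

    // Sketch of the ${name:default} resolution order in substituteProperty.
    class SubstSketch {
      static String resolve(String ref, Properties coreProperties) {
        String name = ref, defaultValue = null;
        int colon = ref.indexOf(':');
        if (colon > -1) {
          defaultValue = ref.substring(colon + 1);
          name = ref.substring(0, colon);
        }
        String v = (coreProperties != null) ? coreProperties.getProperty(name) : null;
        if (v == null) v = System.getProperty(name, defaultValue);
        if (v == null) {
          // only a reference that resolves nowhere is an error
          throw new RuntimeException(
              "No system property or default value specified for " + name);
        }
        return v;
      }
    }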
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
public T load( ResourceLoader loader, NodeList nodes ) { List<PluginInitInfo> info = new ArrayList<PluginInitInfo>(); T defaultPlugin = null; if (nodes !=null ) { for (int i=0; i<nodes.getLength(); i++) { Node node = nodes.item(i); String name = null; try { name = DOMUtil.getAttr(node,"name", requireName?type:null); String className = DOMUtil.getAttr(node,"class", type); String defaultStr = DOMUtil.getAttr(node,"default", null ); T plugin = create(loader, name, className, node ); log.debug("created " + ((name != null) ? name : "") + ": " + plugin.getClass().getName()); // Either initialize now or wait till everything has been registered if( preRegister ) { info.add( new PluginInitInfo( plugin, node ) ); } else { init( plugin, node ); } T old = register( name, plugin ); if( old != null && !( name == null && !requireName ) ) { throw new SolrException( ErrorCode.SERVER_ERROR, "Multiple "+type+" registered to the same name: "+name+" ignoring: "+old ); } if( defaultStr != null && Boolean.parseBoolean( defaultStr ) ) { if( defaultPlugin != null ) { throw new SolrException( ErrorCode.SERVER_ERROR, "Multiple default "+type+" plugins: "+defaultPlugin + " AND " + name ); } defaultPlugin = plugin; } } catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type + (null != name ? (" \"" + name + "\"") : "") + ": " + ex.getMessage(), ex); throw e; } } } // If everything needs to be registered *first*, this will initialize later for( PluginInitInfo pinfo : info ) { try { init( pinfo.plugin, pinfo.node ); } catch( Exception ex ) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin Initializing failure for " + type, ex); throw e; } } return defaultPlugin; }
// in core/src/java/org/apache/solr/util/plugin/AbstractPluginLoader.java
public T loadSingle(ResourceLoader loader, Node node) { List<PluginInitInfo> info = new ArrayList<PluginInitInfo>(); T plugin = null; try { String name = DOMUtil.getAttr(node, "name", requireName ? type : null); String className = DOMUtil.getAttr(node, "class", type); plugin = create(loader, name, className, node); log.debug("created " + name + ": " + plugin.getClass().getName()); // Either initialize now or wait till everything has been registered if (preRegister) { info.add(new PluginInitInfo(plugin, node)); } else { init(plugin, node); } T old = register(name, plugin); if (old != null && !(name == null && !requireName)) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Multiple " + type + " registered to the same name: " + name + " ignoring: " + old); } } catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; } // If everything needs to be registered *first*, this will initialize later for (PluginInitInfo pinfo : info) { try { init(pinfo.plugin, pinfo.node); } catch (Exception ex) { SolrException e = new SolrException (ErrorCode.SERVER_ERROR, "Plugin init failure for " + type, ex); throw e; } } return plugin; }
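Both plugin loaders rebuild the message before rethrowing so that a failure names the plugin type, the plugin name when one exists, and the original message; a configuration error therefore points straight at the offending element in the config file. A sketch of that contextual wrap, with RuntimeException standing in for SolrException:

    // Sketch of the contextual wrap-and-rethrow in AbstractPluginLoader.
    class PluginLoadSketch {
      static Object load(String type, String name, String className) {
        try {
          return Class.forName(className).getDeclaredConstructor().newInstance();
        } catch (Exception ex) {
          // carry the plugin type and name so the error is locatable in config
          throw new RuntimeException("Plugin init failure for " + type
              + (name != null ? " \"" + name + "\"" : "")
              + ": " + ex.getMessage(), ex);
        }
      }
    }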
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of 'other' range facet information",e); }
// in solrj/src/java/org/apache/solr/common/params/FacetParams.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, label+" is not a valid type of for range 'include' information",e); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
// in solrj/src/java/org/apache/solr/common/params/SolrParams.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, ex.getMessage(), ex ); }
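The twelve identical SolrParams catches above encode a single convention: any failure while interpreting a user-supplied parameter is the client's fault, so it is rethrown as BAD_REQUEST with the original message and cause, never as a server error. A sketch of that convention for one typed accessor, with IllegalArgumentException standing in for the BAD_REQUEST SolrException:

    // Sketch of the SolrParams convention: parameter-parsing failures
    // become client errors (BAD_REQUEST), never server errors.
    class ParamSketch {
      static int getInt(String name, String rawValue) {
        try {
          return Integer.parseInt(rawValue);
        } catch (Exception ex) {
          // 400-style error: the caller sent a bad value for this parameter
          throw new IllegalArgumentException(
              "bad value for parameter '" + name + "': " + ex.getMessage(), ex);
        }
      }
    }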
// in solrj/src/java/org/apache/solr/client/solrj/impl/BinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/StreamingBinaryResponseParser.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", ex ); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (MimeTypeException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingRequestHandler.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/uima/src/java/org/apache/solr/uima/processor/UIMAUpdateRequestProcessor.java
catch (Exception e) { String logField = solrUIMAConfiguration.getLogField(); if(logField == null){ SchemaField uniqueKeyField = solrCore.getSchema().getUniqueKeyField(); if(uniqueKeyField != null){ logField = uniqueKeyField.getName(); } } String optionalFieldInfo = logField == null ? "." : new StringBuilder(". ").append(logField).append("=") .append((String)cmd.getSolrInputDocument().getField(logField).getValue()) .append(", ").toString(); int len = Math.min(text.length(), 100); if (solrUIMAConfiguration.isIgnoreErrors()) { log.warn(new StringBuilder("skip the text processing due to ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString()); } else { throw new SolrException(ErrorCode.SERVER_ERROR, new StringBuilder("processing error: ") .append(e.getLocalizedMessage()).append(optionalFieldInfo) .append(" text=\"").append(text.substring(0, len)).append("...\"").toString(), e); } }
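The UIMA processor shows the configurable variant of the same decision: with ignoreErrors set, the failure is logged together with enough of the document to find it again; otherwise it is rethrown as SERVER_ERROR with that same context. A sketch of the log-or-throw toggle, with illustrative names and RuntimeException standing in for SolrException:

    // Sketch of the log-or-throw toggle in UIMAUpdateRequestProcessor.
    class ToggleSketch {
      private final boolean ignoreErrors;

      ToggleSketch(boolean ignoreErrors) { this.ignoreErrors = ignoreErrors; }

      void process(String docId, Runnable analysis) {
        try {
          analysis.run();
        } catch (Exception e) {
          String msg = "processing error: " + e.getLocalizedMessage()
              + " doc=" + docId;
          if (ignoreErrors) {
            System.err.println("skip: " + msg); // degrade gracefully
          } else {
            throw new RuntimeException(msg, e); // fail the whole update
          }
        }
      }
    }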
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
catch (Exception e) { log.error("Carrot2 clustering failed", e); throw new SolrException(ErrorCode.SERVER_ERROR, "Carrot2 clustering failed", e); }
// in contrib/analysis-extras/src/java/org/apache/solr/analysis/StempelPolishStemFilterFactory.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Could not load stem table: " + STEMTABLE); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (Throwable e) { LOG.error( DataImporter.MSG.LOAD_EXP, e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, DataImporter.MSG.INVALID_CONFIG, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/AnalysisRequestHandlerBase.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Index fetch failed : ", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write index.properties", e); }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (Exception e) { LOG.warn("Error in fetching packets ", e); //for any failure , increment the error count errorCount++; //if it fails for the same pacaket for MAX_RETRIES fail and come out if (errorCount > MAX_RETRIES) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Fetch failed for file:" + fileName, e); } return ERR; }
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, INTERVAL_ERR_MSG); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR handling commit/rollback"); }
// in core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "ERROR adding document " + document); }
// in core/src/java/org/apache/solr/handler/loader/CSVLoaderBase.java
catch (IOException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST,e); }
// in core/src/java/org/apache/solr/handler/MoreLikeThisHandler.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/TermsComponent.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown terms regex flag '" + flagParam + "'"); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (ExecutionException e) {
  // should be impossible... the problem with catching the exception
  // at this level is we don't know what ShardRequest it applied to
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Impossible Exception", e);
}
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error initializing QueryElevationComponent.", ex); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "query requires '<doc .../>' child"); }
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading elevation", ex); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, String.format("Illegal %s parameter", GroupParams.GROUP_FORMAT)); }
// in core/src/java/org/apache/solr/handler/component/QueryComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/FacetComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HighlightComponent.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (MalformedURLException e) {
  // should be impossible since we're not passing any URLs here
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
}
// in core/src/java/org/apache/solr/handler/component/TermVectorComponent.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "missing content-stream for diff"); }
// in core/src/java/org/apache/solr/handler/admin/SolrInfoMBeanHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "Unable to read original XML", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error executing default implementation of CREATE", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (KeeperException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
catch (Exception ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'reload' action", ex); }
// in core/src/java/org/apache/solr/handler/admin/LoggingHandler.java
catch(Exception ex) { throw new SolrException(ErrorCode.BAD_REQUEST, "invalid timestamp: "+since); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IllegalArgumentException iae){ throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unknown action: " + actionParam); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (NumberFormatException nfe) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Invalid Number: " + v); }
// in core/src/java/org/apache/solr/analysis/TrieTokenizerFactory.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to create TrieIndexTokenizer", e); }
// in core/src/java/org/apache/solr/response/transform/ValueAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unable to parse "+type+"="+val, ex ); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "exception at docid " + docid + " for valuesource " + valueSource, e); }
// in core/src/java/org/apache/solr/response/transform/ExplainAugmenterFactory.java
catch( Exception ex ) { throw new SolrException( ErrorCode.BAD_REQUEST, "Unknown Explain Style: "+str ); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/request/PerSegmentSingleValuedFaceting.java
catch (ExecutionException e) { Throwable cause = e.getCause(); if (cause instanceof RuntimeException) { throw (RuntimeException)cause; } else { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error in per-segment faceting on field: " + fieldName, cause); } }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (ParseException e) { throw new SolrException(ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (java.text.ParseException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'gap' is not a valid Date Math string: " + gap, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse value "+rawval+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't parse gap "+gap+" for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Can't add gap "+gap+" to value " + value + " for field: " + field.getName(), e); }
// in core/src/java/org/apache/solr/request/UnInvertedField.java
catch (IllegalStateException ise) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, ise.getMessage()); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Exception e) {
  // unlikely
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
}
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
catch( UnsupportedEncodingException uex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, uex ); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (IOException e) {
  // we're pretty freaking screwed if this happens
  throw new SolrException(ErrorCode.SERVER_ERROR, e);
}
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e); }
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (Exception e) {
  // unexpected exception...
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Schema Parsing Failed: " + e.getMessage(), e);
}
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date in Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/DateField.java
catch (ParseException e) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "Invalid Date Math String:'" +val+'\'',e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (Exception e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error instansiating exhange rate provider "+exchangeRateProviderClass+". Please check your FieldType configuration", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (SAXException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (ParserConfigurationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (IOException e) { throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file "+currencyConfigFile, e); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (NumberFormatException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/TrieField.java
catch (IllegalArgumentException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid type specified in schema.xml for field: " + args.get("name"), e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/PointType.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/GeoHashField.java
catch (InvalidShapeException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unable to initialize TokenStream to analyze multiTerm term: " + part, e); }
// in core/src/java/org/apache/solr/schema/TextField.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,"error analyzing range part: " + part, e); }
// in core/src/java/org/apache/solr/schema/FieldType.java
catch (RuntimeException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error while creating field '" + field + "' from value '" + value + "'", e); }
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) {
  // try again, simple rules for a field name with no whitespace
  sp.pos = start;
  field = sp.getSimpleString();
  if (req.getSchema().getFieldOrNull(field) != null) {
    // OK, it was an oddly named field
    fields.add(field);
    if (key != null) {
      rename.add(field, key);
    }
  } else {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Error parsing fieldname: " + e.getMessage(), e);
  }
}
// in core/src/java/org/apache/solr/search/ReturnFields.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing fieldname", e); }
// in core/src/java/org/apache/solr/search/SolrConstantScoreQuery.java
catch (IOException e) {
  // TODO: remove this if ConstantScoreQuery.createWeight adds IOException
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
}
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/search/QueryParsing.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "error in sort: " + sortSpec, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/TopGroupsShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/search/grouping/distributed/responseprocessor/SearchGroupShardResponseProcessor.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
catch (InvalidTokenOffsetsException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (QuorumPeerConfig.ConfigException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (IOException e) { if (zkRun != null) throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Throwable e) { log.error("ZooKeeper Server ERROR", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Interrupted"); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem finding the leader in zk", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem finding the leader in zk"); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) { SolrException.log(log, "There was a problem making a request to the leader", e); try { Thread.sleep(2000); } catch (InterruptedException e1) { Thread.currentThread().interrupt(); } if (i == retries - 1) { throw new SolrException(ErrorCode.SERVER_ERROR, "There was a problem making a request to the leader"); } }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't open new tlog!", e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (IOException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (IllegalArgumentException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "Illegal value for " + DISTRIB_UPDATE_PARAM + ": " + param, e); }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Exception finding leader for shard " + sliceName, e); }
// in core/src/java/org/apache/solr/update/processor/MinFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/RegexReplaceProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "Invalid regex: " + patternParam, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (Exception e) { throw new SolrException(SERVER_ERROR, "Can't resolve typeClass: " + t, e); }
// in core/src/java/org/apache/solr/update/processor/MaxFieldValueUpdateProcessorFactory.java
catch (ClassCastException e) { throw new SolrException (BAD_REQUEST, "Field values are not mutually comparable: " + e.getMessage(), e); }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessorFactory.java
catch (PatternSyntaxException e) { throw new SolrException (SERVER_ERROR, "Invalid 'fieldRegex' pattern: " + s, e); }
// in core/src/java/org/apache/solr/update/VersionInfo.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error reading version from index", e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "ERROR: "+getID(doc, schema)+"Error adding field '" + field.getName() + "'='" +field.getValue()+"' msg=" + ex.getMessage(), ex ); }
// in core/src/java/org/apache/solr/update/SolrCmdDistributor.java
catch (InterruptedException e) { throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "interrupted waiting for shard update response", e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) {
  // TODO: reset our file pointer back to "pos", the start of this record.
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error logging add", e);
}
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/update/TransactionLog.java
catch (IOException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + path +" for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (IllegalArgumentException iae) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Invalid luceneMatchVersion '" + matchVersion + "', valid values are: " + Arrays.toString(Version.values()) + " or a string in format 'V.V'", iae); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch (Exception ex) { throw new SolrException (ErrorCode.SERVER_ERROR, "RequestHandler init failure", ex); }
// in core/src/java/org/apache/solr/core/RequestHandlers.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " +cast.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"Error Instantiating "+msg+", "+className+ " failed to instantiate " + UpdateHandler.class.getName(), e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) {
  // release the latch, otherwise we block trying to do the close. This should be fine,
  // since counting down on a latch of 0 is still fine
  latch.countDown();
  // close down the searcher and any other resources, if it exists, as this is not recoverable
  close();
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, null, e);
}
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error opening new searcher", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { if (e instanceof SolrException) throw (SolrException)e; throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch( Exception ex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "lazy loading error", ex ); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (CharacterCodingException ex) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading resource (wrong encoding?): " + resource, ex); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (ClassNotFoundException e) {
  String newName = cname;
  if (newName.startsWith(project)) {
    newName = cname.substring(project.length() + 1);
  }
  for (String subpackage : subpackages) {
    try {
      String name = base + '.' + subpackage + newName;
      log.trace("Trying class name " + name);
      return clazz = Class.forName(name, true, classLoader).asSubclass(expectedType);
    } catch (ClassNotFoundException e1) {
      // ignore... assume first exception is best.
    }
  }
  throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error loading class '" + cname + "'", e);
}
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/SolrResourceLoader.java
catch (Exception e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Error instantiating class: '" + clazz.getName()+"'", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.FileNotFoundException xnf) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xnf); }
// in core/src/java/org/apache/solr/core/SolrXMLSerializer.java
catch (java.io.IOException xio) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, xio); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
catch (Exception e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Failed to unregister info bean: " + key, e); }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (ParseException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing query: " + qs); }
// in core/src/java/org/apache/solr/util/DOMUtil.java
catch (NumberFormatException nfe) { throw new SolrException (SolrException.ErrorCode.SERVER_ERROR, "Value " + (null != name ? ("of '" +name+ "' ") : "") + "can not be parsed as '" +type+ "': \"" + textValue + "\"", nfe); }
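Taken together, these catch sites show one dominant idiom: a low-level or library exception is trapped where request context is still available and rethrown as a SolrException whose ErrorCode maps to an HTTP status (BAD_REQUEST for bad client input, SERVER_ERROR for internal failures), with the original exception preserved as the cause. A minimal sketch of the idiom; ConfigLoader, parse and read are hypothetical names, not Solr code:

    import java.io.IOException;

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    // Hypothetical illustration of the wrap-and-rethrow idiom seen above.
    public class ConfigLoader {
      public void load(String path, String userSuppliedExpr) {
        try {
          parse(userSuppliedExpr);           // hypothetical parse of client input
        } catch (IllegalArgumentException e) {
          // client sent something unparsable -> 400
          throw new SolrException(ErrorCode.BAD_REQUEST, "Cannot parse: " + userSuppliedExpr, e);
        }
        try {
          read(path);                        // hypothetical server-side I/O
        } catch (IOException e) {
          // internal failure -> 500; keep the IOException as the cause
          throw new SolrException(ErrorCode.SERVER_ERROR, "Error reading " + path, e);
        }
      }
      private void parse(String expr) { /* ... */ }
      private void read(String path) throws IOException { /* ... */ }
    }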
15 methods declare SolrException in their throws clause:
            
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleCreateAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException {
  try {
    SolrParams params = req.getParams();
    String name = params.get(CoreAdminParams.NAME);
    // for now, do not allow creating new core with same name when in cloud mode
    // XXX perhaps it should just be unregistered from cloud before readding it?,
    // XXX perhaps we should also check that cores are of same type before adding new core to collection?
    if (coreContainer.getZkController() != null) {
      if (coreContainer.getCore(name) != null) {
        log.info("Re-creating a core with existing name is not allowed in cloud mode");
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Core with name '" + name + "' already exists.");
      }
    }
    String instanceDir = params.get(CoreAdminParams.INSTANCE_DIR);
    if (instanceDir == null) {
      // instanceDir = coreContainer.getSolrHome() + "/" + name;
      instanceDir = name; // bare name is already relative to solr home
    }
    CoreDescriptor dcore = new CoreDescriptor(coreContainer, name, instanceDir);
    // fillup optional parameters
    String opts = params.get(CoreAdminParams.CONFIG);
    if (opts != null) dcore.setConfigName(opts);
    opts = params.get(CoreAdminParams.SCHEMA);
    if (opts != null) dcore.setSchemaName(opts);
    opts = params.get(CoreAdminParams.DATA_DIR);
    if (opts != null) dcore.setDataDir(opts);
    CloudDescriptor cd = dcore.getCloudDescriptor();
    if (cd != null) {
      cd.setParams(req.getParams());
      opts = params.get(CoreAdminParams.COLLECTION);
      if (opts != null) cd.setCollectionName(opts);
      opts = params.get(CoreAdminParams.SHARD);
      if (opts != null) cd.setShardId(opts);
      opts = params.get(CoreAdminParams.ROLES);
      if (opts != null) cd.setRoles(opts);
      Integer numShards = params.getInt(ZkStateReader.NUM_SHARDS_PROP);
      if (numShards != null) cd.setNumShards(numShards);
    }
    // Process all property.name=value parameters and set them as name=value core properties
    Properties coreProperties = new Properties();
    Iterator<String> parameterNamesIterator = params.getParameterNamesIterator();
    while (parameterNamesIterator.hasNext()) {
      String parameterName = parameterNamesIterator.next();
      if (parameterName.startsWith(CoreAdminParams.PROPERTY_PREFIX)) {
        String parameterValue = params.get(parameterName);
        String propertyName = parameterName.substring(CoreAdminParams.PROPERTY_PREFIX.length()); // skip prefix
        coreProperties.put(propertyName, parameterValue);
      }
    }
    dcore.setCoreProperties(coreProperties);
    SolrCore core = coreContainer.create(dcore);
    coreContainer.register(name, core, false);
    rsp.add("core", core.getName());
    return coreContainer.isPersistent();
  } catch (Exception ex) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "Error executing default implementation of CREATE", ex);
  }
}
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleRenameAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException {
  SolrParams params = req.getParams();
  String name = params.get(CoreAdminParams.OTHER);
  String cname = params.get(CoreAdminParams.CORE);
  boolean doPersist = false;
  if (cname.equals(name)) return doPersist;
  doPersist = coreContainer.isPersistent();
  coreContainer.rename(cname, name);
  return doPersist;
}
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleUnloadAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException {
  SolrParams params = req.getParams();
  String cname = params.get(CoreAdminParams.CORE);
  SolrCore core = coreContainer.remove(cname);
  if (core == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core exists '" + cname + "'");
  } else {
    if (coreContainer.getZkController() != null) {
      log.info("Unregistering core " + cname + " from cloudstate.");
      try {
        coreContainer.getZkController().unregister(cname, core.getCoreDescriptor().getCloudDescriptor());
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e);
      } catch (KeeperException e) {
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
            "Could not unregister core " + cname + " from cloudstate: " + e.getMessage(), e);
      }
    }
  }
  if (params.getBool(CoreAdminParams.DELETE_INDEX, false)) {
    core.addCloseHook(new CloseHook() {
      @Override
      public void preClose(SolrCore core) {}
      @Override
      public void postClose(SolrCore core) {
        File dataDir = new File(core.getIndexDir());
        File[] files = dataDir.listFiles();
        if (files != null) {
          for (File file : files) {
            if (!file.delete()) {
              log.error(file.getAbsolutePath() + " could not be deleted on core unload");
            }
          }
          if (!dataDir.delete()) log.error(dataDir.getAbsolutePath() + " could not be deleted on core unload");
        } else {
          log.error(dataDir.getAbsolutePath() + " could not be deleted on core unload");
        }
      }
    });
  }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handleStatusAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException {
  SolrParams params = req.getParams();
  String cname = params.get(CoreAdminParams.CORE);
  boolean doPersist = false;
  NamedList<Object> status = new SimpleOrderedMap<Object>();
  try {
    if (cname == null) {
      rsp.add("defaultCoreName", coreContainer.getDefaultCoreName());
      for (String name : coreContainer.getCoreNames()) {
        status.add(name, getCoreStatus(coreContainer, name));
      }
    } else {
      status.add(cname, getCoreStatus(coreContainer, cname));
    }
    rsp.add("status", status);
    doPersist = false; // no state change
    return doPersist;
  } catch (Exception ex) {
    throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Error handling 'status' action ", ex);
  }
}
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected boolean handlePersistAction(SolrQueryRequest req, SolrQueryResponse rsp) throws SolrException {
  SolrParams params = req.getParams();
  boolean doPersist = false;
  String fileName = params.get(CoreAdminParams.FILE);
  if (fileName != null) {
    File file = new File(coreContainer.getConfigFile().getParentFile(), fileName);
    coreContainer.persistFile(file);
    rsp.add("saved", file.getAbsolutePath());
    doPersist = false;
  } else if (!coreContainer.isPersistent()) {
    throw new SolrException(SolrException.ErrorCode.FORBIDDEN, "Persistence is not enabled");
  } else doPersist = true;
  return doPersist;
}
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
protected void handleEnable(boolean enable) throws SolrException {
  if (healthcheck == null) {
    throw new SolrException(SolrException.ErrorCode.SERVICE_UNAVAILABLE, "No healthcheck file defined.");
  }
  if (enable) {
    try {
      // write out when the file was created
      FileUtils.write(healthcheck, DateField.formatExternal(new Date()), "UTF-8");
    } catch (IOException e) {
      throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Unable to write healthcheck flag file", e);
    }
  } else {
    if (healthcheck.exists() && !healthcheck.delete()) {
      throw new SolrException(SolrException.ErrorCode.NOT_FOUND,
          "Did not successfully delete healthcheck file: " + healthcheck.getAbsolutePath());
    }
  }
}
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override
public boolean reload() throws SolrException {
  InputStream ratesJsonStream = null;
  try {
    log.info("Reloading exchange rates from " + ratesFileLocation);
    try {
      ratesJsonStream = (new URL(ratesFileLocation)).openStream();
    } catch (Exception e) {
      ratesJsonStream = resourceLoader.openResource(ratesFileLocation);
    }
    rates = new OpenExchangeRates(ratesJsonStream);
    return true;
  } catch (Exception e) {
    throw new SolrException(ErrorCode.SERVER_ERROR, "Error reloading exchange rates", e);
  } finally {
    if (ratesJsonStream != null) try {
      ratesJsonStream.close();
    } catch (IOException e) {
      throw new SolrException(ErrorCode.SERVER_ERROR, "Error closing stream", e);
    }
  }
}
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override
public void init(Map<String,String> params) throws SolrException {
  try {
    ratesFileLocation = getParam(params.get(PARAM_RATES_FILE_LOCATION), DEFAULT_RATES_FILE_LOCATION);
    refreshInterval = Integer.parseInt(getParam(params.get(PARAM_REFRESH_INTERVAL), DEFAULT_REFRESH_INTERVAL));
    // Force a refresh interval of minimum one hour, since the API does not offer better resolution
    if (refreshInterval < 60) {
      refreshInterval = 60;
      log.warn("Specified refreshInterval was too small. Setting to 60 minutes which is the update rate of openexchangerates.org");
    }
    log.info("Initialized with rates=" + ratesFileLocation + ", refreshInterval=" + refreshInterval + ".");
  } catch (Exception e) {
    throw new SolrException(ErrorCode.BAD_REQUEST, "Error initializing", e);
  } finally {
    // Removing config params custom to us
    params.remove(PARAM_RATES_FILE_LOCATION);
    params.remove(PARAM_REFRESH_INTERVAL);
  }
}
// in core/src/java/org/apache/solr/schema/OpenExchangeRatesOrgProvider.java
@Override
public void inform(ResourceLoader loader) throws SolrException {
  resourceLoader = loader;
  reload();
}
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override
public boolean reload() throws SolrException {
  InputStream is = null;
  Map<String, Map<String, Double>> tmpRates = new HashMap<String, Map<String, Double>>();
  try {
    log.info("Reloading exchange rates from file " + this.currencyConfigFile);
    is = loader.openResource(currencyConfigFile);
    javax.xml.parsers.DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
    try {
      dbf.setXIncludeAware(true);
      dbf.setNamespaceAware(true);
    } catch (UnsupportedOperationException e) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e);
    }
    try {
      Document doc = dbf.newDocumentBuilder().parse(is);
      XPathFactory xpathFactory = XPathFactory.newInstance();
      XPath xpath = xpathFactory.newXPath();
      // Parse exchange rates.
      NodeList nodes = (NodeList) xpath.evaluate("/currencyConfig/rates/rate", doc, XPathConstants.NODESET);
      for (int i = 0; i < nodes.getLength(); i++) {
        Node rateNode = nodes.item(i);
        NamedNodeMap attributes = rateNode.getAttributes();
        Node from = attributes.getNamedItem("from");
        Node to = attributes.getNamedItem("to");
        Node rate = attributes.getNamedItem("rate");
        if (from == null || to == null || rate == null) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Exchange rate missing attributes (required: from, to, rate) " + rateNode);
        }
        String fromCurrency = from.getNodeValue();
        String toCurrency = to.getNodeValue();
        Double exchangeRate;
        if (java.util.Currency.getInstance(fromCurrency) == null ||
            java.util.Currency.getInstance(toCurrency) == null) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
              "Could not find from currency specified in exchange rate: " + rateNode);
        }
        try {
          exchangeRate = Double.parseDouble(rate.getNodeValue());
        } catch (NumberFormatException e) {
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not parse exchange rate: " + rateNode, e);
        }
        addRate(tmpRates, fromCurrency, toCurrency, exchangeRate);
      }
    } catch (SAXException e) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e);
    } catch (IOException e) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e);
    } catch (ParserConfigurationException e) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e);
    } catch (XPathExpressionException e) {
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e);
    }
  } catch (IOException e) {
    throw new SolrException(ErrorCode.BAD_REQUEST, "Error while opening Currency configuration file " + currencyConfigFile, e);
  } finally {
    try {
      if (is != null) {
        is.close();
      }
    } catch (IOException e) {
      e.printStackTrace();
    }
  }
  // Atomically swap in the new rates map, if it loaded successfully
  this.rates = tmpRates;
  return true;
}
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override
public void init(Map<String,String> params) throws SolrException {
  this.currencyConfigFile = params.get(PARAM_CURRENCY_CONFIG);
  if (currencyConfigFile == null) {
    throw new SolrException(ErrorCode.NOT_FOUND, "Missing required configuration " + PARAM_CURRENCY_CONFIG);
  }
  // Removing config params custom to us
  params.remove(PARAM_CURRENCY_CONFIG);
}
// in core/src/java/org/apache/solr/schema/CurrencyField.java
@Override
public void inform(ResourceLoader loader) throws SolrException {
  if (loader == null) {
    throw new SolrException(ErrorCode.BAD_REQUEST, "Needs ResourceLoader in order to load config file");
  }
  this.loader = loader;
  reload();
}
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void checkSortability() throws SolrException {
  if (!indexed()) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not sort on unindexed field: " + getName());
  }
  if (multiValued()) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not sort on multivalued field: " + getName());
  }
}
// in core/src/java/org/apache/solr/schema/SchemaField.java
public void checkFieldCacheSource(QParser parser) throws SolrException {
  if (!indexed()) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not use FieldCache on unindexed field: " + getName());
  }
  if (multiValued()) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "can not use FieldCache on multivalued field: " + getName());
  }
}
// in core/src/java/org/apache/solr/search/SolrQueryParser.java
private void checkNullField(String field) throws SolrException {
  if (field == null && defaultField == null) {
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
        "no field name specified in query and no defaultSearchField defined in schema.xml");
  }
}
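Since SolrException is unchecked, the throws SolrException clauses on the methods above are documentary: the compiler does not force callers to handle them, but they record which entry points can fail with a request-scoped error. A hypothetical sketch of an optional recovery at a call site (AdminCaller and CoreAdminHandlerLike are illustrative names, not Solr types):

    import org.apache.solr.common.SolrException;

    // Hypothetical caller: the throws clause on the handler is documentary,
    // so catching SolrException here is optional, not required by the compiler.
    public class AdminCaller {
      public String safeStatus(CoreAdminHandlerLike handler) {
        try {
          return handler.status();           // declared "throws SolrException"
        } catch (SolrException e) {
          // recover by reporting the HTTP-style code instead of propagating
          return "failed with code " + e.code();
        }
      }
      // minimal stand-in interface so the sketch is self-contained
      interface CoreAdminHandlerLike {
        String status() throws SolrException;
      }
    }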
18 catch blocks catch SolrException:
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrException s){ throw s; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // we retry on 404 or 403 or 503 - you can see this on solr shutdown
  if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
    ex = addZombie(server, e);
  } else {
    // Server is alive but the request was likely malformed or invalid
    throw e;
  }
  // TODO: consider using below above - currently does cause a problem with distrib updates:
  // seems to match up against a failed forward to leader exception as well...
  // || e.getMessage().contains("java.net.SocketException")
  // || e.getMessage().contains("java.net.ConnectException")
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // we retry on 404 or 403 or 503 - you can see this on solr shutdown
  if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
    ex = e; // already a zombie, no need to re-add
  } else {
    // Server is alive but the request was malformed or invalid
    zombieServers.remove(wrapper.getKey());
    throw e;
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // Server is alive but the request was malformed or invalid
  throw e;
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // Server is alive but the request was malformed or invalid
  throw e;
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/handler/component/RealTimeGetComponent.java
catch (SolrException e) { log.warn("Exception reading log for updates", e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( SolrException sx ) { throw sx; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( SolrException ex ) { throw ex; }
// in core/src/java/org/apache/solr/core/Config.java
catch( SolrException e ){ SolrException.log(log,"Error in "+name,e); throw e; }
// in core/src/java/org/apache/solr/core/Config.java
catch (SolrException e) { throw(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/util/SolrPluginUtils.java
catch (SolrException e) { sortE = e; }
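The recurring identity rethrow, catch (SolrException e) { throw e; }, is about clause ordering: placed ahead of a broader catch (Exception ...), it lets an exception that already carries an ErrorCode pass through untouched instead of being wrapped a second time, as in QueryRequest.process further below. A minimal sketch of that ordering (PassThrough is a hypothetical name):

    import org.apache.solr.common.SolrException;
    import org.apache.solr.common.SolrException.ErrorCode;

    // Sketch of the catch-clause ordering behind the identity rethrows above.
    public class PassThrough {
      public void run(Runnable work) {
        try {
          work.run();
        } catch (SolrException e) {
          throw e; // already carries an ErrorCode; do not re-wrap
        } catch (Exception e) {
          // everything else gets normalized into the domain exception
          throw new SolrException(ErrorCode.SERVER_ERROR, e);
        }
      }
    }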
16 catch blocks catch SolrException and rethrow it:
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrException s){ throw s; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // we retry on 404 or 403 or 503 - you can see this on solr shutdown
  if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
    ex = addZombie(server, e);
  } else {
    // Server is alive but the request was likely malformed or invalid
    throw e;
  }
  // TODO: consider using below above - currently does cause a problem with distrib updates:
  // seems to match up against a failed forward to leader exception as well...
  // || e.getMessage().contains("java.net.SocketException")
  // || e.getMessage().contains("java.net.ConnectException")
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // we retry on 404 or 403 or 503 - you can see this on solr shutdown
  if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) {
    ex = e; // already a zombie, no need to re-add
  } else {
    // Server is alive but the request was malformed or invalid
    zombieServers.remove(wrapper.getKey());
    throw e;
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // Server is alive but the request was malformed or invalid
  throw e;
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrException e) {
  // Server is alive but the request was malformed or invalid
  throw e;
}
// in core/src/java/org/apache/solr/handler/SnapPuller.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( SolrException sx ) { throw sx; }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'start' is not a valid Date string: " + startS, e); }
// in core/src/java/org/apache/solr/request/SimpleFacets.java
catch (SolrException e) { throw new SolrException (SolrException.ErrorCode.BAD_REQUEST, "date facet 'end' is not a valid Date string: " + endS, e); }
// in core/src/java/org/apache/solr/schema/IndexSchema.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/update/processor/FieldMutatingUpdateProcessor.java
catch (SolrException e) { String msg = "Unable to mutate field '"+fname+"': "+e.getMessage(); SolrException.log(log, msg, e); throw new SolrException(BAD_REQUEST, msg, e); }
// in core/src/java/org/apache/solr/update/DocumentBuilder.java
catch( SolrException ex ) { throw ex; }
// in core/src/java/org/apache/solr/core/Config.java
catch( SolrException e ){ SolrException.log(log,"Error in "+name,e); throw e; }
// in core/src/java/org/apache/solr/core/Config.java
catch (SolrException e) { throw(e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (SolrException e) { throw e; }
3
checked (Domain) SolrServerException
public class SolrServerException extends Exception {

  private static final long serialVersionUID = -3371703521752000294L;
  
  public SolrServerException(String message, Throwable cause) {
    super(message, cause);
  }

  public SolrServerException(String message) {
    super(message);
  }

  public SolrServerException(Throwable cause) {
    super(cause);
  }
  
  public Throwable getRootCause() {
    Throwable t = this;
    while (true) {
      Throwable cause = t.getCause();
      if (cause!=null) {
        t = cause;
      } else {
        break;
      }
    }
    return t;
  }

}
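getRootCause walks the cause chain to its last link; this is what LBHttpSolrServer relies on below to decide whether a failure was ultimately an IOException (a dead server, worth retrying elsewhere) or something else (rethrow). A small usage sketch under that reading:

    import java.io.IOException;
    import java.net.ConnectException;

    import org.apache.solr.client.solrj.SolrServerException;

    public class RootCauseDemo {
      public static void main(String[] args) {
        // build a three-deep chain: SolrServerException -> RuntimeException -> ConnectException
        SolrServerException e = new SolrServerException(
            new RuntimeException(new ConnectException("refused")));
        Throwable root = e.getRootCause();
        // prints "true": the walk skips the intermediate wrapper
        System.out.println(root instanceof IOException);
      }
    }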
19 methods declare SolrServerException in their throws clause:
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
@Override
public QueryResponse process(SolrServer server) throws SolrServerException {
  try {
    long startTime = System.currentTimeMillis();
    QueryResponse res = new QueryResponse(server.request(this), server);
    res.setElapsedTime(System.currentTimeMillis() - startTime);
    return res;
  } catch (SolrServerException e) {
    throw e;
  } catch (SolrException s) {
    throw s;
  } catch (Exception e) {
    throw new SolrServerException("Error executing query", e);
  }
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
@Override
public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException {
  connect();
  // TODO: if you can hash here, you could favor the shard leader
  CloudState cloudState = zkStateReader.getCloudState();
  SolrParams reqParams = request.getParams();
  if (reqParams == null) {
    reqParams = new ModifiableSolrParams();
  }
  String collection = reqParams.get("collection", defaultCollection);
  if (collection == null) {
    throw new SolrServerException("No collection param specified on request and no default collection has been set.");
  }
  // Extract each comma separated collection name and store in a List.
  List<String> collectionList = StrUtils.splitSmart(collection, ",", true);
  // Retrieve slices from the cloud state and, for each collection specified,
  // add it to the Map of slices.
  Map<String,Slice> slices = new HashMap<String,Slice>();
  for (int i = 0; i < collectionList.size(); i++) {
    String coll = collectionList.get(i);
    ClientUtils.appendMap(coll, slices, cloudState.getSlices(coll));
  }
  Set<String> liveNodes = cloudState.getLiveNodes();
  // IDEA: have versions on various things... like a global cloudState version
  // or shardAddressVersion (which only changes when the shards change)
  // to allow caching.
  // build a map of unique nodes
  // TODO: allow filtering by group, role, etc
  Map<String,ZkNodeProps> nodes = new HashMap<String,ZkNodeProps>();
  List<String> urlList = new ArrayList<String>();
  for (Slice slice : slices.values()) {
    for (ZkNodeProps nodeProps : slice.getShards().values()) {
      ZkCoreNodeProps coreNodeProps = new ZkCoreNodeProps(nodeProps);
      String node = coreNodeProps.getNodeName();
      if (!liveNodes.contains(coreNodeProps.getNodeName())
          || !coreNodeProps.getState().equals(ZkStateReader.ACTIVE)) continue;
      if (nodes.put(node, nodeProps) == null) {
        String url = coreNodeProps.getCoreUrl();
        urlList.add(url);
      }
    }
  }
  Collections.shuffle(urlList, rand);
  //System.out.println("########################## MAKING REQUEST TO " + urlList);
  LBHttpSolrServer.Req req = new LBHttpSolrServer.Req(request, urlList);
  LBHttpSolrServer.Rsp rsp = lbServer.request(req);
  return rsp.getResponse();
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public NamedList<Object> request(final SolrRequest request, final ResponseParser processor) throws SolrServerException, IOException { HttpRequestBase method = null; InputStream is = null; SolrParams params = request.getParams(); Collection<ContentStream> streams = requestWriter.getContentStreams(request); String path = requestWriter.getPath(request); if (path == null || !path.startsWith("/")) { path = DEFAULT_PATH; } ResponseParser parser = request.getResponseParser(); if (parser == null) { parser = this.parser; } // The parser 'wt=' and 'version=' params are used instead of the original params ModifiableSolrParams wparams = new ModifiableSolrParams(params); wparams.set(CommonParams.WT, parser.getWriterType()); wparams.set(CommonParams.VERSION, parser.getVersion()); if (invariantParams != null) { wparams.add(invariantParams); } params = wparams; int tries = maxRetries + 1; try { while( tries-- > 0 ) { // Note: since we aren't doing intermittent time keeping ourselves, the potential non-timeout latency could be as much as tries-times (plus scheduling effects) the given timeAllowed. try { if( SolrRequest.METHOD.GET == request.getMethod() ) { if( streams != null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "GET can't send streams!" ); } method = new HttpGet( baseUrl + path + ClientUtils.toQueryString( params, false ) ); } else if( SolrRequest.METHOD.POST == request.getMethod() ) { String url = baseUrl + path; boolean isMultipart = ( streams != null && streams.size() > 1 ); LinkedList<NameValuePair> postParams = new LinkedList<NameValuePair>(); if (streams == null || isMultipart) { HttpPost post = new HttpPost(url); post.setHeader("Content-Charset", "UTF-8"); if (!this.useMultiPartPost && !isMultipart) { post.addHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8"); } List<FormBodyPart> parts = new LinkedList<FormBodyPart>(); Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String p = iter.next(); String[] vals = params.getParams(p); if (vals != null) { for (String v : vals) { if (this.useMultiPartPost || isMultipart) { parts.add(new FormBodyPart(p, new StringBody(v, Charset.forName("UTF-8")))); } else { postParams.add(new BasicNameValuePair(p, v)); } } } } if (isMultipart) { for (ContentStream content : streams) { String contentType = content.getContentType(); if(contentType==null) { contentType = "application/octet-stream"; // default } parts.add(new FormBodyPart(content.getName(), new InputStreamBody( content.getStream(), contentType, content.getName()))); } } if (parts.size() > 0) { MultipartEntity entity = new MultipartEntity(HttpMultipartMode.STRICT); for(FormBodyPart p: parts) { entity.addPart(p); } post.setEntity(entity); } else { //not using multipart post.setEntity(new UrlEncodedFormEntity(postParams, "UTF-8")); } method = post; } // If it has one stream, it is the post body; put the params in the URL else { String pstr = ClientUtils.toQueryString(params, false); HttpPost post = new HttpPost(url + pstr); // Single stream as body // Using a loop just to get the first one final ContentStream[] contentStream = new ContentStream[1]; for (ContentStream content : streams) { contentStream[0] = content; break; } if (contentStream[0] instanceof RequestWriter.LazyContentStream) { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } else { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } method = post; } } else { throw new SolrServerException("Unsupported method: "+request.getMethod() ); } } catch( NoHttpResponseException r ) { method = null; if(is != null) { is.close(); } // If out of tries then just rethrow (as normal error). if (tries < 1) { throw r; } } }
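The excerpt above is HttpSolrServer's bounded retry loop: tries starts at maxRetries + 1, NoHttpResponseException is swallowed while budget remains, and the last failure is rethrown once the budget is spent. A minimal standalone sketch of that idiom (the withRetries helper and its names are illustrative, not SolrJ API):

    import java.io.IOException;
    import java.util.concurrent.Callable;

    // A minimal sketch of the bounded retry-then-rethrow idiom above.
    // withRetries and Callable are stand-ins; the real code retries only
    // on NoHttpResponseException and rebuilds the HTTP method each pass.
    public class RetrySketch {
      static <T> T withRetries(int maxRetries, Callable<T> call) throws Exception {
        int tries = maxRetries + 1;
        while (tries-- > 0) {
          try {
            return call.call();
          } catch (IOException e) {
            // out of tries: rethrow as the normal error, exactly like the
            // "if (tries < 1) throw r;" branch in the excerpt
            if (tries < 1) throw e;
          }
        }
        throw new IllegalStateException("only reached when maxRetries < 0");
      }

      public static void main(String[] args) throws Exception {
        int[] failures = {2}; // fail twice, then succeed
        System.out.println(withRetries(3, () -> {
          if (failures[0]-- > 0) throw new IOException("connection dropped");
          return "ok";
        })); // prints "ok"
      }
    }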
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public Rsp request(Req req) throws SolrServerException, IOException { Rsp rsp = new Rsp(); Exception ex = null; List<ServerWrapper> skipped = new ArrayList<ServerWrapper>(req.getNumDeadServersToTry()); for (String serverStr : req.getServers()) { serverStr = normalize(serverStr); // if the server is currently a zombie, just skip to the next one ServerWrapper wrapper = zombieServers.get(serverStr); if (wrapper != null) { // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr); if (skipped.size() < req.getNumDeadServersToTry()) skipped.add(wrapper); continue; } rsp.server = serverStr; HttpSolrServer server = makeServer(serverStr); try { rsp.rsp = server.request(req.getRequest()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = addZombie(server, e); } else { // Server is alive but the request was likely malformed or invalid throw e; } // TODO: consider using below above - currently does cause a problem with distrib updates: // seems to match up against a failed forward to leader exception as well... // || e.getMessage().contains("java.net.SocketException") // || e.getMessage().contains("java.net.ConnectException") } catch (SocketException e) { ex = addZombie(server, e); } catch (SocketTimeoutException e) { ex = addZombie(server, e); } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } // try the servers we previously skipped for (ServerWrapper wrapper : skipped) { try { rsp.rsp = wrapper.solrServer.request(req.getRequest()); zombieServers.remove(wrapper.getKey()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = e; // already a zombie, no need to re-add } else { // Server is alive but the request was malformed or invalid zombieServers.remove(wrapper.getKey()); throw e; } } catch (SocketException e) { ex = e; } catch (SocketTimeoutException e) { ex = e; } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; // already a zombie, no need to re-add } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } if (ex == null) { throw new SolrServerException("No live SolrServers available to handle this request"); } else { throw new SolrServerException("No live SolrServers available to handle this request:" + zombieServers.keySet(), ex); } }
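This request(Req) loop shows the zombie failover policy: SolrExceptions with codes 404/403/503/500 and IO-rooted SolrServerExceptions mark the server as a zombie and let the loop try the next one; anything else is treated as a caller error and rethrown. A compact sketch of the same classification, with an invented Node type standing in for HttpSolrServer:

    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    // Sketch of the failover shape above. Node and query() are invented
    // stand-ins; retryable failures are parked on a zombie list and the
    // loop moves on, while other exception types would propagate at once.
    public class FailoverSketch {
      interface Node { String query() throws IOException; }

      static String firstAlive(List<Node> nodes) throws IOException {
        IOException last = null;
        List<Node> zombies = new ArrayList<Node>();
        for (Node n : nodes) {
          try {
            return n.query();   // SUCCESS, mirrors "return rsp;"
          } catch (IOException e) {
            zombies.add(n);     // comparable to addZombie(server, e)
            last = e;
          }
        }
        if (last == null) throw new IOException("No live servers to try");
        throw new IOException("No live servers available", last);
      }
    }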
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
@Override public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { Exception ex = null; ServerWrapper[] serverList = aliveServerList; int maxTries = serverList.length; Map<String,ServerWrapper> justFailed = null; for (int attempts=0; attempts<maxTries; attempts++) { int count = counter.incrementAndGet(); ServerWrapper wrapper = serverList[count % serverList.length]; wrapper.lastUsed = System.currentTimeMillis(); try { return wrapper.solrServer.request(request); } catch (SolrException e) { // Server is alive but the request was malformed or invalid throw e; } catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } // try other standard servers that we didn't try just now for (ServerWrapper wrapper : zombieServers.values()) { if (wrapper.standard==false || justFailed!=null && justFailed.containsKey(wrapper.getKey())) continue; try { NamedList<Object> rsp = wrapper.solrServer.request(request); // remove from zombie list *before* adding to alive to avoid a race that could lose a server zombieServers.remove(wrapper.getKey()); addToAlive(wrapper); return rsp; } catch (SolrException e) { // Server is alive but the request was malformed or invalid throw e; } catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; // still dead } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } if (ex == null) { throw new SolrServerException("No live SolrServers available to handle this request"); } else { throw new SolrServerException("No live SolrServers available to handle this request", ex); } }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { String path = request.getPath(); if( path == null || !path.startsWith( "/" ) ) { path = "/select"; } // Check for cores action SolrCore core = coreContainer.getCore( coreName ); if( core == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName ); } SolrParams params = request.getParams(); if( params == null ) { params = new ModifiableSolrParams(); } // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } // Perhaps the path is to manage the cores if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) { handler = coreContainer.getMultiCoreHandler(); } } if( handler == null ) { core.close(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } SolrQueryRequest req = null; try { req = _parser.buildRequestFrom( core, params, request.getContentStreams() ); req.getContext().put( "path", path ); SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); core.execute( handler, req, rsp ); if( rsp.getException() != null ) { if(rsp.getException() instanceof SolrException) { throw rsp.getException(); } throw new SolrServerException( rsp.getException() ); } // Check if this should stream results if( request.getStreamingResponseCallback() != null ) { try { final StreamingResponseCallback callback = request.getStreamingResponseCallback(); BinaryResponseWriter.Resolver resolver = new BinaryResponseWriter.Resolver( req, rsp.getReturnFields()) { @Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { // write an empty list... SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); // This will transform writeResultsBody( ctx, codec ); } }; ByteArrayOutputStream out = new ByteArrayOutputStream(); new JavaBinCodec(resolver) { @Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); } @Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); } }.marshal(rsp.getValues(), out); InputStream in = new ByteArrayInputStream(out.toByteArray()); return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in); } catch (Exception ex) { throw new RuntimeException(ex); } } // Now write it out NamedList<Object> normalized = getParsedResponse(req, rsp); return normalized; } catch( IOException iox ) { throw iox; } catch( SolrException sx ) { throw sx; } catch( Exception ex ) { throw new SolrServerException( ex ); } finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); } }
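EmbeddedSolrServer's request method ends in a characteristic funnel: IOException and SolrException pass through unchanged, every other Exception is wrapped in the checked domain type SolrServerException, and cleanup always runs in finally. A self-contained sketch of that funnel (doExecute and cleanup are hypothetical stand-ins for the body):

    import java.io.IOException;
    import org.apache.solr.client.solrj.SolrServerException;

    // Sketch of the rethrow-or-wrap funnel above: declared and unchecked
    // domain exceptions pass through untouched, everything else is
    // wrapped in the checked SolrServerException, and cleanup always runs.
    public class FunnelSketch {
      static Object doExecute() throws Exception { return "response"; }
      static void cleanup() { /* mirrors req.close(); core.close(); */ }

      static Object execute() throws SolrServerException, IOException {
        try {
          return doExecute();
        } catch (IOException iox) {
          throw iox;                         // declared type: pass through
        } catch (RuntimeException rx) {
          throw rx;                          // SolrException is unchecked
        } catch (Exception ex) {
          throw new SolrServerException(ex); // wrap the rest in the domain type
        } finally {
          cleanup();
        }
      }
    }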
10
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (Exception e) { throw new SolrServerException("Error executing query", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException ex) { throw new SolrServerException("error reading streams", ex); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (ConnectException e) { throw new SolrServerException("Server refused connection at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (SocketTimeoutException e) { throw new SolrServerException( "Timeout occurred while waiting response from server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (IOException e) { throw new SolrServerException( "IOException occurred when talking to server at: " + getBaseURL(), e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (Exception e) { throw new SolrServerException(e); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
catch( Exception ex ) { throw new SolrServerException( ex ); }
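All ten catch-and-wrap sites above pass the original exception as the cause, so the transport-level stack trace survives the translation into SolrServerException and stays reachable through getRootCause(). For example:

    import java.io.IOException;
    import org.apache.solr.client.solrj.SolrServerException;

    // Wrap-with-cause keeps the original stack trace reachable.
    public class WrapDemo {
      public static void main(String[] args) {
        try {
          try {
            throw new IOException("error reading streams");
          } catch (IOException ex) {
            throw new SolrServerException("error reading streams", ex);
          }
        } catch (SolrServerException e) {
          // prints: java.io.IOException: error reading streams
          System.out.println(e.getRootCause());
        }
      }
    }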
57
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
@Override public QueryResponse process( SolrServer server ) throws SolrServerException { try { long startTime = System.currentTimeMillis(); QueryResponse res = new QueryResponse( server.request( this ), server ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; } catch (SolrServerException e){ throw e; } catch (SolrException s){ throw s; } catch (Exception e) { throw new SolrServerException("Error executing query", e); } }
// in solrj/src/java/org/apache/solr/client/solrj/request/DirectXmlRequest.java
@Override public UpdateResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); UpdateResponse res = new UpdateResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/SolrPing.java
@Override public SolrPingResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); SolrPingResponse res = new SolrPingResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/AbstractUpdateRequest.java
@Override public UpdateResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); UpdateResponse res = new UpdateResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/FieldAnalysisRequest.java
@Override public FieldAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { if (fieldTypes == null && fieldNames == null) { throw new IllegalStateException("At least one field type or field name need to be specified"); } if (fieldValue == null) { throw new IllegalStateException("The field value must be set"); } long startTime = System.currentTimeMillis(); FieldAnalysisResponse res = new FieldAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/LukeRequest.java
@Override public LukeResponse process( SolrServer server ) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); LukeResponse res = new LukeResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/DocumentAnalysisRequest.java
@Override public DocumentAnalysisResponse process(SolrServer server) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); DocumentAnalysisResponse res = new DocumentAnalysisResponse(); res.setResponse(server.request(this)); res.setElapsedTime(System.currentTimeMillis() - startTime); return res; }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
@Override public CoreAdminResponse process(SolrServer server) throws SolrServerException, IOException { long startTime = System.currentTimeMillis(); CoreAdminResponse res = new CoreAdminResponse(); res.setResponse( server.request( this ) ); res.setElapsedTime( System.currentTimeMillis()-startTime ); return res; }
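Every process(...) above repeats the same three steps: record a start time, call server.request(this), stamp the elapsed time on the response. SolrJ does not ship such a helper, but the boilerplate could be factored into one, for instance:

    import java.io.IOException;
    import org.apache.solr.client.solrj.SolrRequest;
    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.client.solrj.response.SolrResponseBase;

    // Illustrative only: SolrJ has no such helper, but each process(...)
    // above could delegate its timing boilerplate to something like this.
    public class TimedProcess {
      static <T extends SolrResponseBase> T timedProcess(SolrServer server,
          SolrRequest req, T res) throws SolrServerException, IOException {
        long startTime = System.currentTimeMillis();
        res.setResponse(server.request(req));   // same call every subclass makes
        res.setElapsedTime(System.currentTimeMillis() - startTime);
        return res;
      }
    }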
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse reloadCore( String name, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName( name ); req.setAction( CoreAdminAction.RELOAD ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse unloadCore( String name, SolrServer server ) throws SolrServerException, IOException { return unloadCore(name, false, server); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse unloadCore( String name, boolean deleteIndex, SolrServer server ) throws SolrServerException, IOException { Unload req = new Unload(deleteIndex); req.setCoreName( name ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse renameCore(String coreName, String newName, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName(coreName); req.setOtherCoreName(newName); req.setAction( CoreAdminAction.RENAME ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse getStatus( String name, SolrServer server ) throws SolrServerException, IOException { CoreAdminRequest req = new CoreAdminRequest(); req.setCoreName( name ); req.setAction( CoreAdminAction.STATUS ); return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse createCore( String name, String instanceDir, SolrServer server ) throws SolrServerException, IOException { return CoreAdminRequest.createCore(name, instanceDir, server, null, null); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse createCore( String name, String instanceDir, SolrServer server, String configFile, String schemaFile ) throws SolrServerException, IOException { CoreAdminRequest.Create req = new CoreAdminRequest.Create(); req.setCoreName( name ); req.setInstanceDir(instanceDir); if(configFile != null){ req.setConfigName(configFile); } if(schemaFile != null){ req.setSchemaName(schemaFile); } return req.process( server ); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse persist(String fileName, SolrServer server) throws SolrServerException, IOException { CoreAdminRequest.Persist req = new CoreAdminRequest.Persist(); req.setFileName(fileName); return req.process(server); }
// in solrj/src/java/org/apache/solr/client/solrj/request/CoreAdminRequest.java
public static CoreAdminResponse mergeIndexes(String name, String[] indexDirs, String[] srcCores, SolrServer server) throws SolrServerException, IOException { CoreAdminRequest.MergeIndexes req = new CoreAdminRequest.MergeIndexes(); req.setCoreName(name); req.setIndexDirs(Arrays.asList(indexDirs)); req.setSrcCores(Arrays.asList(srcCores)); return req.process(server); }
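The static CoreAdminRequest helpers above neither catch nor throw anything themselves; they only declare throws and transmit, so handling lands on the caller. A typical call site (URL and core name are placeholders):

    import java.io.IOException;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.client.solrj.request.CoreAdminRequest;

    // Typical call site for the transmitting helpers above; the URL and
    // core name are placeholders.
    public class ReloadExample {
      public static void main(String[] args) {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr");
        try {
          CoreAdminRequest.reloadCore("collection1", server);
        } catch (SolrServerException | IOException e) {
          // transport or remote failure; nothing was handled along the way
          e.printStackTrace();
        }
      }
    }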
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { if (!(request instanceof UpdateRequest)) { return server.request(request); } UpdateRequest req = (UpdateRequest) request; // this happens for commit... if (req.getDocuments() == null || req.getDocuments().isEmpty()) { blockUntilFinished(); return server.request(request); } SolrParams params = req.getParams(); if (params != null) { // check if it is waiting for the searcher if (params.getBool(UpdateParams.WAIT_SEARCHER, false)) { log.info("blocking for commit/optimize"); blockUntilFinished(); // empty the queue return server.request(request); } } try { CountDownLatch tmpLock = lock; if (tmpLock != null) { tmpLock.await(); } boolean success = queue.offer(req); for (;;) { synchronized (runners) { if (runners.isEmpty() || (queue.remainingCapacity() < queue.size() // queue is half full and we can add more runners && runners.size() < threadCount)) { // We need more runners, so start a new one. Runner r = new Runner(); runners.add(r); scheduler.execute(r); } else { // break out of the retry loop if we added the element to the queue successfully, *and* while we are still holding the runners lock to prevent race conditions. if (success) break; } } // Retry to add to the queue w/o the runners lock held (else we risk temporary deadlock) // This retry could also fail because // 1) existing runners were not able to take off any new elements in the queue // 2) the queue was filled back up since our last try // If we succeed, the queue may have been completely emptied, and all runners stopped. // In all cases, we should loop back to the top to see if we need to start more runners. if (!success) { success = queue.offer(req, 100, TimeUnit.MILLISECONDS); } } } catch (InterruptedException e) { log.error("interrupted", e); throw new IOException(e.getLocalizedMessage()); } // RETURN A DUMMY result NamedList<Object> dummy = new NamedList<Object>(); dummy.add("NOTE", "the request is processed in a background stream"); return dummy; }
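Note how the excerpt converts InterruptedException into IOException without restoring the thread's interrupted status. A common hardening, not what this code does, is to re-interrupt before translating:

    import java.io.IOException;
    import java.util.concurrent.CountDownLatch;

    // Hardening sketch, not the shipped behavior: restore the interrupt
    // flag before translating to the declared IOException.
    public class InterruptSketch {
      static void awaitOrFail(CountDownLatch latch) throws IOException {
        try {
          latch.await();
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt(); // preserve interrupted status
          throw new IOException(e.getLocalizedMessage());
        }
      }
    }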
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
@Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { connect(); // TODO: if you can hash here, you could favor the shard leader CloudState cloudState = zkStateReader.getCloudState(); SolrParams reqParams = request.getParams(); if (reqParams == null) { reqParams = new ModifiableSolrParams(); } String collection = reqParams.get("collection", defaultCollection); if (collection == null) { throw new SolrServerException("No collection param specified on request and no default collection has been set."); } // Extract each comma separated collection name and store in a List. List<String> collectionList = StrUtils.splitSmart(collection, ",", true); // Retrieve slices from the cloud state and, for each collection specified, add it to the Map of slices. Map<String,Slice> slices = new HashMap<String,Slice>(); for (int i = 0; i < collectionList.size(); i++) { String coll= collectionList.get(i); ClientUtils.appendMap(coll, slices, cloudState.getSlices(coll)); } Set<String> liveNodes = cloudState.getLiveNodes(); // IDEA: have versions on various things... like a global cloudState version or shardAddressVersion (which only changes when the shards change) to allow caching. // build a map of unique nodes // TODO: allow filtering by group, role, etc Map<String,ZkNodeProps> nodes = new HashMap<String,ZkNodeProps>(); List<String> urlList = new ArrayList<String>(); for (Slice slice : slices.values()) { for (ZkNodeProps nodeProps : slice.getShards().values()) { ZkCoreNodeProps coreNodeProps = new ZkCoreNodeProps(nodeProps); String node = coreNodeProps.getNodeName(); if (!liveNodes.contains(coreNodeProps.getNodeName()) || !coreNodeProps.getState().equals( ZkStateReader.ACTIVE)) continue; if (nodes.put(node, nodeProps) == null) { String url = coreNodeProps.getCoreUrl(); urlList.add(url); } } } Collections.shuffle(urlList, rand); //System.out.println("########################## MAKING REQUEST TO " + urlList); LBHttpSolrServer.Req req = new LBHttpSolrServer.Req(request, urlList); LBHttpSolrServer.Rsp rsp = lbServer.request(req); return rsp.getResponse(); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
@Override public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { ResponseParser responseParser = request.getResponseParser(); if (responseParser == null) { responseParser = parser; } return request(request, responseParser); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public NamedList<Object> request(final SolrRequest request, final ResponseParser processor) throws SolrServerException, IOException { HttpRequestBase method = null; InputStream is = null; SolrParams params = request.getParams(); Collection<ContentStream> streams = requestWriter.getContentStreams(request); String path = requestWriter.getPath(request); if (path == null || !path.startsWith("/")) { path = DEFAULT_PATH; } ResponseParser parser = request.getResponseParser(); if (parser == null) { parser = this.parser; } // The parser 'wt=' and 'version=' params are used instead of the original params ModifiableSolrParams wparams = new ModifiableSolrParams(params); wparams.set(CommonParams.WT, parser.getWriterType()); wparams.set(CommonParams.VERSION, parser.getVersion()); if (invariantParams != null) { wparams.add(invariantParams); } params = wparams; int tries = maxRetries + 1; try { while( tries-- > 0 ) { // Note: since we aren't doing intermittent time keeping ourselves, the potential non-timeout latency could be as much as tries-times (plus scheduling effects) the given timeAllowed. try { if( SolrRequest.METHOD.GET == request.getMethod() ) { if( streams != null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "GET can't send streams!" ); } method = new HttpGet( baseUrl + path + ClientUtils.toQueryString( params, false ) ); } else if( SolrRequest.METHOD.POST == request.getMethod() ) { String url = baseUrl + path; boolean isMultipart = ( streams != null && streams.size() > 1 ); LinkedList<NameValuePair> postParams = new LinkedList<NameValuePair>(); if (streams == null || isMultipart) { HttpPost post = new HttpPost(url); post.setHeader("Content-Charset", "UTF-8"); if (!this.useMultiPartPost && !isMultipart) { post.addHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8"); } List<FormBodyPart> parts = new LinkedList<FormBodyPart>(); Iterator<String> iter = params.getParameterNamesIterator(); while (iter.hasNext()) { String p = iter.next(); String[] vals = params.getParams(p); if (vals != null) { for (String v : vals) { if (this.useMultiPartPost || isMultipart) { parts.add(new FormBodyPart(p, new StringBody(v, Charset.forName("UTF-8")))); } else { postParams.add(new BasicNameValuePair(p, v)); } } } } if (isMultipart) { for (ContentStream content : streams) { String contentType = content.getContentType(); if(contentType==null) { contentType = "application/octet-stream"; // default } parts.add(new FormBodyPart(content.getName(), new InputStreamBody( content.getStream(), contentType, content.getName()))); } } if (parts.size() > 0) { MultipartEntity entity = new MultipartEntity(HttpMultipartMode.STRICT); for(FormBodyPart p: parts) { entity.addPart(p); } post.setEntity(entity); } else { //not using multipart post.setEntity(new UrlEncodedFormEntity(postParams, "UTF-8")); } method = post; } // If it has one stream, it is the post body; put the params in the URL else { String pstr = ClientUtils.toQueryString(params, false); HttpPost post = new HttpPost(url + pstr); // Single stream as body // Using a loop just to get the first one final ContentStream[] contentStream = new ContentStream[1]; for (ContentStream content : streams) { contentStream[0] = content; break; } if (contentStream[0] instanceof RequestWriter.LazyContentStream) { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } else { post.setEntity(new InputStreamEntity(contentStream[0].getStream(), -1) { @Override public Header getContentType() { return new BasicHeader("Content-Type", contentStream[0].getContentType()); } @Override public boolean isRepeatable() { return false; } }); } method = post; } } else { throw new SolrServerException("Unsupported method: "+request.getMethod() ); } } catch( NoHttpResponseException r ) { method = null; if(is != null) { is.close(); } // If out of tries then just rethrow (as normal error). if (tries < 1) { throw r; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public UpdateResponse add(Iterator<SolrInputDocument> docIterator) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.setDocIterator(docIterator); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public UpdateResponse addBeans(final Iterator<?> beanIterator) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.setDocIterator(new Iterator<SolrInputDocument>() { public boolean hasNext() { return beanIterator.hasNext(); } public SolrInputDocument next() { Object o = beanIterator.next(); if (o == null) return null; return getBinder().toSolrInputDocument(o); } public void remove() { beanIterator.remove(); } }); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
public Rsp request(Req req) throws SolrServerException, IOException { Rsp rsp = new Rsp(); Exception ex = null; List<ServerWrapper> skipped = new ArrayList<ServerWrapper>(req.getNumDeadServersToTry()); for (String serverStr : req.getServers()) { serverStr = normalize(serverStr); // if the server is currently a zombie, just skip to the next one ServerWrapper wrapper = zombieServers.get(serverStr); if (wrapper != null) { // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr); if (skipped.size() < req.getNumDeadServersToTry()) skipped.add(wrapper); continue; } rsp.server = serverStr; HttpSolrServer server = makeServer(serverStr); try { rsp.rsp = server.request(req.getRequest()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = addZombie(server, e); } else { // Server is alive but the request was likely malformed or invalid throw e; } // TODO: consider using below above - currently does cause a problem with distrib updates: // seems to match up against a failed forward to leader exception as well... // || e.getMessage().contains("java.net.SocketException") // || e.getMessage().contains("java.net.ConnectException") } catch (SocketException e) { ex = addZombie(server, e); } catch (SocketTimeoutException e) { ex = addZombie(server, e); } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } // try the servers we previously skipped for (ServerWrapper wrapper : skipped) { try { rsp.rsp = wrapper.solrServer.request(req.getRequest()); zombieServers.remove(wrapper.getKey()); return rsp; // SUCCESS } catch (SolrException e) { // we retry on 404 or 403 or 503 - you can see this on solr shutdown if (e.code() == 404 || e.code() == 403 || e.code() == 503 || e.code() == 500) { ex = e; // already a zombie, no need to re-add } else { // Server is alive but the request was malformed or invalid zombieServers.remove(wrapper.getKey()); throw e; } } catch (SocketException e) { ex = e; } catch (SocketTimeoutException e) { ex = e; } catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; // already a zombie, no need to re-add } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } if (ex == null) { throw new SolrServerException("No live SolrServers available to handle this request"); } else { throw new SolrServerException("No live SolrServers available to handle this request:" + zombieServers.keySet(), ex); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
@Override public NamedList<Object> request(final SolrRequest request) throws SolrServerException, IOException { Exception ex = null; ServerWrapper[] serverList = aliveServerList; int maxTries = serverList.length; Map<String,ServerWrapper> justFailed = null; for (int attempts=0; attempts<maxTries; attempts++) { int count = counter.incrementAndGet(); ServerWrapper wrapper = serverList[count % serverList.length]; wrapper.lastUsed = System.currentTimeMillis(); try { return wrapper.solrServer.request(request); } catch (SolrException e) { // Server is alive but the request was malformed or invalid throw e; } catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } // try other standard servers that we didn't try just now for (ServerWrapper wrapper : zombieServers.values()) { if (wrapper.standard==false || justFailed!=null && justFailed.containsKey(wrapper.getKey())) continue; try { NamedList<Object> rsp = wrapper.solrServer.request(request); // remove from zombie list *before* adding to alive to avoid a race that could lose a server zombieServers.remove(wrapper.getKey()); addToAlive(wrapper); return rsp; } catch (SolrException e) { // Server is alive but the request was malformed or invalid throw e; } catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; // still dead } else { throw e; } } catch (Exception e) { throw new SolrServerException(e); } } if (ex == null) { throw new SolrServerException("No live SolrServers available to handle this request"); } else { throw new SolrServerException("No live SolrServers available to handle this request", ex); } }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(Collection<SolrInputDocument> docs) throws SolrServerException, IOException { return add(docs, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(Collection<SolrInputDocument> docs, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.add(docs); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBeans(Collection<?> beans ) throws SolrServerException, IOException { return addBeans(beans, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBeans(Collection<?> beans, int commitWithinMs) throws SolrServerException, IOException { DocumentObjectBinder binder = this.getBinder(); ArrayList<SolrInputDocument> docs = new ArrayList<SolrInputDocument>(beans.size()); for (Object bean : beans) { docs.add(binder.toSolrInputDocument(bean)); } return add(docs, commitWithinMs); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(SolrInputDocument doc ) throws SolrServerException, IOException { return add(doc, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse add(SolrInputDocument doc, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.add(doc); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBean(Object obj) throws IOException, SolrServerException { return addBean(obj, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse addBean(Object obj, int commitWithinMs) throws IOException, SolrServerException { return add(getBinder().toSolrInputDocument(obj),commitWithinMs); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( ) throws SolrServerException, IOException { return commit(true, true); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize( ) throws SolrServerException, IOException { return optimize(true, true, 1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( boolean waitFlush, boolean waitSearcher ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.COMMIT, waitFlush, waitSearcher ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse commit( boolean waitFlush, boolean waitSearcher, boolean softCommit ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.COMMIT, waitFlush, waitSearcher, softCommit ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize( boolean waitFlush, boolean waitSearcher ) throws SolrServerException, IOException { return optimize(waitFlush, waitSearcher, 1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse optimize(boolean waitFlush, boolean waitSearcher, int maxSegments ) throws SolrServerException, IOException { return new UpdateRequest().setAction( UpdateRequest.ACTION.OPTIMIZE, waitFlush, waitSearcher, maxSegments ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse rollback() throws SolrServerException, IOException { return new UpdateRequest().rollback().process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(String id) throws SolrServerException, IOException { return deleteById(id, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(String id, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteById(id); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(List<String> ids) throws SolrServerException, IOException { return deleteById(ids, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteById(List<String> ids, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteById(ids); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteByQuery(String query) throws SolrServerException, IOException { return deleteByQuery(query, -1); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public UpdateResponse deleteByQuery(String query, int commitWithinMs) throws SolrServerException, IOException { UpdateRequest req = new UpdateRequest(); req.deleteByQuery(query); req.setCommitWithin(commitWithinMs); return req.process(this); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public SolrPingResponse ping() throws SolrServerException, IOException { return new SolrPing().process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public QueryResponse query(SolrParams params) throws SolrServerException { return new QueryRequest( params ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public QueryResponse query(SolrParams params, METHOD method) throws SolrServerException { return new QueryRequest( params, method ).process( this ); }
// in solrj/src/java/org/apache/solr/client/solrj/SolrServer.java
public QueryResponse queryAndStreamResponse( SolrParams params, StreamingResponseCallback callback ) throws SolrServerException, IOException { ResponseParser parser = new StreamingBinaryResponseParser( callback ); QueryRequest req = new QueryRequest( params ); req.setStreamingResponseCallback( callback ); req.setResponseParser( parser ); return req.process(this); }
// in core/src/java/org/apache/solr/handler/admin/CoreAdminHandler.java
protected void handleDistribUrlAction(SolrQueryRequest req, SolrQueryResponse rsp) throws IOException, InterruptedException, SolrServerException { // TODO: finish this and tests SolrParams params = req.getParams(); final ModifiableSolrParams newParams = new ModifiableSolrParams(params); newParams.remove("action"); SolrParams required = params.required(); final String subAction = required.get("subAction"); String collection = required.get("collection"); newParams.set(CoreAdminParams.ACTION, subAction); SolrCore core = req.getCore(); ZkController zkController = core.getCoreDescriptor().getCoreContainer() .getZkController(); CloudState cloudState = zkController.getCloudState(); Map<String,Slice> slices = cloudState.getCollectionStates().get(collection); for (Map.Entry<String,Slice> entry : slices.entrySet()) { Slice slice = entry.getValue(); Map<String,ZkNodeProps> shards = slice.getShards(); Set<Map.Entry<String,ZkNodeProps>> shardEntries = shards.entrySet(); for (Map.Entry<String,ZkNodeProps> shardEntry : shardEntries) { final ZkNodeProps node = shardEntry.getValue(); if (cloudState.liveNodesContain(node.get(ZkStateReader.NODE_NAME_PROP))) { newParams.set(CoreAdminParams.CORE, node.get(ZkStateReader.CORE_NAME_PROP)); String replica = node.get(ZkStateReader.BASE_URL_PROP); ShardRequest sreq = new ShardRequest(); newParams.set("qt", "/admin/cores"); sreq.purpose = 1; // TODO: this sucks if (replica.startsWith("http://")) replica = replica.substring(7); sreq.shards = new String[]{replica}; sreq.actualShards = sreq.shards; sreq.params = newParams; shardHandler.submit(sreq, replica, sreq.params); } } } ShardResponse srsp; do { srsp = shardHandler.takeCompletedOrError(); if (srsp != null) { Throwable e = srsp.getException(); if (e != null) { log.error("Error talking to shard: " + srsp.getShard(), e); } } } while(srsp != null); }
// in core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@Override public NamedList<Object> request(SolrRequest request) throws SolrServerException, IOException { String path = request.getPath(); if( path == null || !path.startsWith( "/" ) ) { path = "/select"; } // Check for cores action SolrCore core = coreContainer.getCore( coreName ); if( core == null ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "No such core: " + coreName ); } SolrParams params = request.getParams(); if( params == null ) { params = new ModifiableSolrParams(); } // Extract the handler from the path or params SolrRequestHandler handler = core.getRequestHandler( path ); if( handler == null ) { if( "/select".equals( path ) || "/select/".equalsIgnoreCase( path) ) { String qt = params.get( CommonParams.QT ); handler = core.getRequestHandler( qt ); if( handler == null ) { throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+qt); } } // Perhaps the path is to manage the cores if( handler == null && coreContainer != null && path.equals( coreContainer.getAdminPath() ) ) { handler = coreContainer.getMultiCoreHandler(); } } if( handler == null ) { core.close(); throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "unknown handler: "+path ); } SolrQueryRequest req = null; try { req = _parser.buildRequestFrom( core, params, request.getContentStreams() ); req.getContext().put( "path", path ); SolrQueryResponse rsp = new SolrQueryResponse(); SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp)); core.execute( handler, req, rsp ); if( rsp.getException() != null ) { if(rsp.getException() instanceof SolrException) { throw rsp.getException(); } throw new SolrServerException( rsp.getException() ); } // Check if this should stream results if( request.getStreamingResponseCallback() != null ) { try { final StreamingResponseCallback callback = request.getStreamingResponseCallback(); BinaryResponseWriter.Resolver resolver = new BinaryResponseWriter.Resolver( req, rsp.getReturnFields()) { @Override public void writeResults(ResultContext ctx, JavaBinCodec codec) throws IOException { // write an empty list... SolrDocumentList docs = new SolrDocumentList(); docs.setNumFound( ctx.docs.matches() ); docs.setStart( ctx.docs.offset() ); docs.setMaxScore( ctx.docs.maxScore() ); codec.writeSolrDocumentList( docs ); // This will transform writeResultsBody( ctx, codec ); } }; ByteArrayOutputStream out = new ByteArrayOutputStream(); new JavaBinCodec(resolver) { @Override public void writeSolrDocument(SolrDocument doc) throws IOException { callback.streamSolrDocument( doc ); //super.writeSolrDocument( doc, fields ); } @Override public void writeSolrDocumentList(SolrDocumentList docs) throws IOException { if( docs.size() > 0 ) { SolrDocumentList tmp = new SolrDocumentList(); tmp.setMaxScore( docs.getMaxScore() ); tmp.setNumFound( docs.getNumFound() ); tmp.setStart( docs.getStart() ); docs = tmp; } callback.streamDocListInfo( docs.getNumFound(), docs.getStart(), docs.getMaxScore() ); super.writeSolrDocumentList(docs); } }.marshal(rsp.getValues(), out); InputStream in = new ByteArrayInputStream(out.toByteArray()); return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in); } catch (Exception ex) { throw new RuntimeException(ex); } } // Now write it out NamedList<Object> normalized = getParsedResponse(req, rsp); return normalized; } catch( IOException iox ) { throw iox; } catch( SolrException sx ) { throw sx; } catch( Exception ex ) { throw new SolrServerException( ex ); } finally { if (req != null) req.close(); core.close(); SolrRequestInfo.clearRequestInfo(); } }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private boolean syncWithReplicas(ZkController zkController, SolrCore core, ZkNodeProps props, String collection, String shardId) throws MalformedURLException, SolrServerException, IOException { List<ZkCoreNodeProps> nodes = zkController.getZkStateReader() .getReplicaProps(collection, shardId, props.get(ZkStateReader.NODE_NAME_PROP), props.get(ZkStateReader.CORE_NAME_PROP), ZkStateReader.ACTIVE); // TODO: should there be a state filter? if (nodes == null) { // I have no replicas return true; } List<String> syncWith = new ArrayList<String>(); for (ZkCoreNodeProps node : nodes) { // if we see a leader, must be stale state, and this is the guy that went down if (!node.getNodeProps().keySet().contains(ZkStateReader.LEADER_PROP)) { syncWith.add(node.getCoreUrl()); } } PeerSync peerSync = new PeerSync(core, syncWith, core.getUpdateHandler().getUpdateLog().numRecordsToKeep); return peerSync.sync(); }
// in core/src/java/org/apache/solr/cloud/SyncStrategy.java
private void syncToMe(ZkController zkController, String collection, String shardId, ZkNodeProps leaderProps) throws MalformedURLException, SolrServerException, IOException { // sync everyone else // TODO: we should do this in parallel at least List<ZkCoreNodeProps> nodes = zkController .getZkStateReader() .getReplicaProps(collection, shardId, leaderProps.get(ZkStateReader.NODE_NAME_PROP), leaderProps.get(ZkStateReader.CORE_NAME_PROP), ZkStateReader.ACTIVE); if (nodes == null) { // System.out.println("I have no replicas"); // I have no replicas return; } //System.out.println("tell my replicas to sync"); ZkCoreNodeProps zkLeader = new ZkCoreNodeProps(leaderProps); for (ZkCoreNodeProps node : nodes) { try { // System.out.println("try and ask " + node.getCoreUrl() + " to sync"); log.info("try and ask " + node.getCoreUrl() + " to sync"); requestSync(zkLeader.getCoreUrl(), node.getCoreName()); } catch (Exception e) { SolrException.log(log, "Error syncing replica to leader", e); } } for(;;) { ShardResponse srsp = shardHandler.takeCompletedOrError(); if (srsp == null) break; boolean success = handleResponse(srsp); //System.out.println("got response:" + success); if (!success) { try { log.info("Sync failed - asking replica to recover."); //System.out.println("Sync failed - asking replica to recover."); RequestRecovery recoverRequestCmd = new RequestRecovery(); recoverRequestCmd.setAction(CoreAdminAction.REQUESTRECOVERY); recoverRequestCmd.setCoreName(((SyncShardRequest)srsp.getShardRequest()).coreName); HttpSolrServer server = new HttpSolrServer(zkLeader.getBaseUrl()); server.request(recoverRequestCmd); } catch (Exception e) { log.info("Could not tell a replica to recover", e); } shardHandler.cancelAll(); break; } } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void replicate(String nodeName, SolrCore core, ZkNodeProps leaderprops, String baseUrl) throws SolrServerException, IOException { String leaderBaseUrl = leaderprops.get(ZkStateReader.BASE_URL_PROP); ZkCoreNodeProps leaderCNodeProps = new ZkCoreNodeProps(leaderprops); String leaderUrl = leaderCNodeProps.getCoreUrl(); log.info("Attempting to replicate from " + leaderUrl); // if we are the leader, either we are trying to recover faster than our ephemeral timed out or we are the only node if (!leaderBaseUrl.equals(baseUrl)) { // send commit commitOnLeader(leaderUrl); // use rep handler directly, so we can do this sync rather than async SolrRequestHandler handler = core.getRequestHandler(REPLICATION_HANDLER); if (handler instanceof LazyRequestHandlerWrapper) { handler = ((LazyRequestHandlerWrapper)handler).getWrappedHandler(); } ReplicationHandler replicationHandler = (ReplicationHandler) handler; if (replicationHandler == null) { throw new SolrException(ErrorCode.SERVICE_UNAVAILABLE, "Skipping recovery, no " + REPLICATION_HANDLER + " handler found"); } ModifiableSolrParams solrParams = new ModifiableSolrParams(); solrParams.set(ReplicationHandler.MASTER_URL, leaderUrl + "replication"); if (isClosed()) retries = INTERRUPTED; boolean success = replicationHandler.doFetch(solrParams, true); // TODO: look into making sure force=true does not download files we already have if (!success) { throw new SolrException(ErrorCode.SERVER_ERROR, "Replication for recovery failed."); } // solrcloud_debug // try { // RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); // SolrIndexSearcher searcher = searchHolder.get(); // try { // System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replicated " // + searcher.search(new MatchAllDocsQuery(), 1).totalHits + " from " + leaderUrl + " gen:" + core.getDeletionPolicy().getLatestCommit().getGeneration() + " data:" + core.getDataDir()); // } finally { // searchHolder.decref(); // } // } catch (Exception e) { // // } } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void commitOnLeader(String leaderUrl) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderUrl); server.setConnectionTimeout(30000); server.setSoTimeout(30000); UpdateRequest ureq = new UpdateRequest(); ureq.setParams(new ModifiableSolrParams()); ureq.getParams().set(DistributedUpdateProcessor.COMMIT_END_POINT, true); ureq.setAction(AbstractUpdateRequest.ACTION.COMMIT, false, true).process( server); server.shutdown(); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private void sendPrepRecoveryCmd(String leaderBaseUrl, String leaderCoreName) throws MalformedURLException, SolrServerException, IOException { HttpSolrServer server = new HttpSolrServer(leaderBaseUrl); server.setConnectionTimeout(45000); server.setSoTimeout(45000); WaitForState prepCmd = new WaitForState(); prepCmd.setCoreName(leaderCoreName); prepCmd.setNodeName(zkController.getNodeName()); prepCmd.setCoreNodeName(coreZkNodeName); prepCmd.setState(ZkStateReader.RECOVERING); prepCmd.setCheckLive(true); prepCmd.setPauseFor(6000); server.request(prepCmd); server.shutdown(); }
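commitOnLeader and sendPrepRecoveryCmd both build a throwaway HttpSolrServer, issue one request, and call shutdown(); if request(...) throws, shutdown() is skipped. An illustrative try/finally variant (not the shipped code) that guarantees the cleanup:

    import java.io.IOException;
    import org.apache.solr.client.solrj.SolrRequest;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;

    // Illustrative variant, not the shipped code: shutdown() runs even
    // when request(...) throws.
    public class OneShotRequest {
      static void send(String baseUrl, SolrRequest cmd)
          throws SolrServerException, IOException {
        HttpSolrServer server = new HttpSolrServer(baseUrl);
        server.setConnectionTimeout(45000);
        server.setSoTimeout(45000);
        try {
          server.request(cmd);
        } finally {
          server.shutdown(); // release the underlying HttpClient either way
        }
      }
    }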
6
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrServerException e){ throw e; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; // already a zombie, no need to re-add } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; // still dead } else { throw e; } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
catch (SolrServerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP_ROW, e); } }
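Several of the catch blocks above branch on SolrServerException.getRootCause(): an IO-rooted failure means the server may be dead and is retried elsewhere, anything else is rethrown as a caller error. getRootCause() walks the whole cause chain, as this small demonstration shows:

    import java.io.IOException;
    import java.net.SocketTimeoutException;
    import org.apache.solr.client.solrj.SolrServerException;

    // getRootCause() returns the deepest cause in the chain, so an
    // IOException stays visible even through intermediate wrappers.
    public class RootCauseDemo {
      public static void main(String[] args) throws SolrServerException {
        SolrServerException e = new SolrServerException(
            new RuntimeException(new SocketTimeoutException("read timed out")));
        Throwable rootCause = e.getRootCause();
        if (rootCause instanceof IOException) {
          System.out.println("treat server as dead: " + rootCause);
        } else {
          throw e; // mirrors the "caller error" branch above
        }
      }
    }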
5
            
// in solrj/src/java/org/apache/solr/client/solrj/request/QueryRequest.java
catch (SolrServerException e){ throw e; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = addZombie(server, e); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { Throwable rootCause = e.getRootCause(); if (rootCause instanceof IOException) { ex = e; // already a zombie, no need to re-add } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; moveAliveToDead(wrapper); if (justFailed == null) justFailed = new HashMap<String,ServerWrapper>(); justFailed.put(wrapper.getKey(), wrapper); } else { throw e; } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
catch (SolrServerException e) { if (e.getRootCause() instanceof IOException) { ex = e; // still dead } else { throw e; } }
0
checked (Lib) Throwable 0 0 7
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/LBHttpSolrServer.java
@Override protected void finalize() throws Throwable { try { if(this.aliveCheckExecutor!=null) this.aliveCheckExecutor.shutdownNow(); } finally { super.finalize(); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
@Override protected void finalize() throws Throwable { try { if(!isClosed){ LOG.error("JdbcDataSource was not closed prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!!"); close(); } } finally { super.finalize(); } }
// in core/src/java/org/apache/solr/update/SolrIndexWriter.java
@Override protected void finalize() throws Throwable { try { if(!isClosed){ assert false : "SolrIndexWriter was not closed prior to finalize()"; log.error("SolrIndexWriter was not closed prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!!"); close(); } } finally { super.finalize(); } }
// in core/src/java/org/apache/solr/core/SolrCore.java
@Override protected void finalize() throws Throwable { try { if (getOpenCount() != 0) { log.error("REFCOUNT ERROR: unreferenced " + this + " (" + getName() + ") has a reference count of " + getOpenCount()); } } finally { super.finalize(); } }
// in core/src/java/org/apache/solr/core/CoreContainer.java
@Override protected void finalize() throws Throwable { try { if(!isShutDown){ log.error("CoreContainer was not shutdown prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!! instance=" + System.identityHashCode(this)); } } finally { super.finalize(); } }
// in core/src/java/org/apache/solr/util/ConcurrentLRUCache.java
@Override protected void finalize() throws Throwable { try { if(!isDestroyed){ log.error("ConcurrentLRUCache was not destroyed prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!!"); destroy(); } } finally { super.finalize(); } }
// in core/src/java/org/apache/solr/util/ConcurrentLFUCache.java
@Override protected void finalize() throws Throwable { try { if (!isDestroyed) { log.error("ConcurrentLFUCache was not destroyed prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!!"); destroy(); } } finally { super.finalize(); } }
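All seven finalizers above share one shape: check a closed/destroyed flag, log a loud leak warning if it is still unset, and always delegate to super.finalize() in a finally block. A generic sketch of the pattern (a last-resort leak detector, not a cleanup mechanism):

    import java.util.concurrent.atomic.AtomicBoolean;

    // Generic sketch of the leak-detecting finalizer shape above: detect
    // a missed close(), log loudly, and always call super.finalize().
    public class LeakDetectingResource {
      private final AtomicBoolean closed = new AtomicBoolean(false);

      public void close() {
        closed.set(true); // the real classes release their resources here
      }

      @Override
      protected void finalize() throws Throwable {
        try {
          if (!closed.get()) {
            System.err.println(getClass().getSimpleName()
                + " was not closed prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!!");
            close();
          }
        } finally {
          super.finalize();
        }
      }
    }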
58
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrServer.java
catch (Throwable e) { handleError(e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
catch (Throwable t) {}
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Throwable e) { logger .warn("Could not instantiate Smart Chinese Analyzer, clustering quality " + "of Chinese content may be degraded. For best quality clusters, " + "make sure Lucene's Smart Chinese Analyzer JAR is in the classpath"); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2TokenizerFactory.java
catch (Throwable e) { return new ExtendedWhitespaceTokenizer(); }
// in contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/LuceneCarrot2StemmerFactory.java
catch (Throwable e) { return IdentityStemmer.INSTANCE; }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Throwable t) { SolrException.log(LOG, "Full Import failed", t); docBuilder.rollback(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch (Throwable t) { LOG.error("Delta Import Failed", t); docBuilder.rollback(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Throwable t) { log.error("Exception while solr commit.", t); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrWriter.java
catch (Throwable t) { log.error("Exception while solr rollback.", t); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ZKPropertiesWriter.java
catch (Throwable e) { log.warn( "Could not read DIH properties from " + path + " :" + e.getClass(), e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (Throwable e) { LOG.error( DataImporter.MSG.LOAD_EXP, e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, DataImporter.MSG.INVALID_CONFIG, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandler.java
catch (Throwable th) { srsp.setException(th); if (th instanceof SolrException) { srsp.setResponseCode(((SolrException)th).code()); } else { srsp.setResponseCode(-1); } }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/handler/component/HttpShardHandlerFactory.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch( Throwable ex ) { ex.printStackTrace(); }
// in core/src/java/org/apache/solr/handler/PingRequestHandler.java
catch( Throwable th ) { ex = th; }
// in core/src/java/org/apache/solr/SolrLogFormatter.java
catch (Throwable th) { /* logging swallows exceptions, so if we hit an exception we need to convert it to a string to see it */ return "ERROR IN SolrLogFormatter! original message:" + record.getMessage() + "\n\tException: " + SolrException.toStr(th); }
// in core/src/java/org/apache/solr/SolrLogFormatter.java
catch (Throwable e) { e.printStackTrace(); }
// in core/src/java/org/apache/solr/request/SolrRequestInfo.java
catch (Throwable throwable) { SolrException.log(SolrCore.log, "Exception during close hook", throwable); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) { /* catch this so our filter still works */ log.error( "Could not start Solr. Check solr/home property and the logs"); SolrCore.log( t ); }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch (Throwable ex) { sendError( core, solrReq, request, (HttpServletResponse)response, ex ); return; }
// in core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
catch( Throwable t ) { /* This error really does not matter */ SimpleOrderedMap info = new SimpleOrderedMap(); int code=getErrorInfo(ex, info); response.sendError( code, info.toString() ); }
// in core/src/java/org/apache/solr/search/LFUCache.java
catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); }
// in core/src/java/org/apache/solr/search/LRUCache.java
catch (Throwable e) { SolrException.log(log,"Error during auto-warming of key:" + keys[i], e); }
// in core/src/java/org/apache/solr/search/FastLRUCache.java
catch (Throwable e) { SolrException.log(log, "Error during auto-warming of key:" + itemsArr[i].getKey(), e); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Throwable e) { log.error("ZooKeeper Server ERROR", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Throwable t) { log.error("Error while trying to recover", t); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Throwable t) { SolrException.log(log, "", t); }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
catch (Throwable t) { log.error("Error while trying to recover.", t); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable e) { SolrException.log(log, "Error opening realtime searcher for deleteByQuery", e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable e) { recoveryInfo.errors++; SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/update/UpdateLog.java
catch (Throwable ex) { recoveryInfo.errors++; loglog.warn("REPLAY_ERR: Exception replaying log", ex); /* something wrong with the request? */ }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (Throwable th) { log.error("Error in final commit", th); }
// in core/src/java/org/apache/solr/update/DirectUpdateHandler2.java
catch (Throwable th) { log.error("Error closing log files", th); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (Throwable t) { log.error("Error during shutdown of writer.", t); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (Throwable t) { log.error("Error during shutdown of directory factory.", t); }
// in core/src/java/org/apache/solr/update/DefaultSolrCoreState.java
catch (Throwable t) { log.error("Error cancelling recovery", t); }
// in core/src/java/org/apache/solr/core/Config.java
catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { latch.countDown(); /* release the latch, otherwise we block trying to do the close. This should be fine, since counting down on a latch of 0 is still fine */ /* close down the searcher and any other resources, if it exists, as this is not recoverable */ close(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, null, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,null,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log,null,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { SolrException.log(log, e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { /* do not allow decref() operations to fail since they are typically called in finally blocks and throwing another exception would be very unexpected. */ SolrException.log(log, "Error closing searcher:", e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { /* an exception in register() shouldn't be fatal. */ log(e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch(Throwable ex) { log.warn("Unable to read SLF4J version. LogWatcher will be disabled: "+ex); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable e) { log.warn("Unable to load LogWatcher", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable e) { SolrException.log(log,null,e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable ex) { SolrException.log(log,null,ex); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable t) { SolrException.log(log, "Error shutting down core", t); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Throwable t) { SolrException.log(log, "Error canceling recovery for core", t); }
5
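
The bulk of the catch (Throwable) blocks above sit on cleanup and background paths (shutdown of writers, caches, recovery), where a secondary failure must not mask the primary one or abort the remaining cleanup, so they log and continue. A hedged sketch of that guarded-cleanup shape (the method and resource list are assumptions):

    import java.util.logging.Level;
    import java.util.logging.Logger;

    // Hypothetical: log-and-continue so one failing close() cannot abort the others.
    static void closeAll(Iterable<AutoCloseable> resources, Logger log) {
      for (AutoCloseable r : resources) {
        try {
          r.close();
        } catch (Throwable t) {
          // Swallowed on purpose: shutdown must still visit every remaining resource.
          log.log(Level.SEVERE, "Error during shutdown of " + r, t);
        }
      }
    }
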
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
catch (Throwable e) { LOG.error( DataImporter.MSG.LOAD_EXP, e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, DataImporter.MSG.INVALID_CONFIG, e); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
catch (Throwable t) { if (verboseDebug) { getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, epw.getEntity().getName(), t); } throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t); }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (Throwable e) { log.error("ZooKeeper Server ERROR", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (Throwable e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr+ " for " + name,e); }
// in core/src/java/org/apache/solr/core/SolrCore.java
catch (Throwable e) { latch.countDown(); /* release the latch, otherwise we block trying to do the close. This should be fine, since counting down on a latch of 0 is still fine */ /* close down the searcher and any other resources, if it exists, as this is not recoverable */ close(); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, null, e); }
5
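
The five catch-with-throw blocks directly above apply the complementary policy: on a non-recoverable path (startup, config parsing) the Throwable is logged and then wrapped in a SolrException with ErrorCode.SERVER_ERROR, so the HTTP layer can map it to a 500 while the original cause stays attached. A sketch of that translation (the helper and its Callable step are assumptions, not Solr API):

    import java.util.concurrent.Callable;
    import org.apache.solr.common.SolrException;
    import org.slf4j.Logger;

    // Hypothetical: run a fallible initialization step, surfacing any failure as SERVER_ERROR.
    static <T> T initOrServerError(Callable<T> step, Logger log) {
      try {
        return step.call();
      } catch (Throwable t) {
        log.error("Initialization failed", t);
        // Wrap rather than rethrow raw: callers see one domain exception type with the cause chained.
        throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Initialization failed", t);
      }
    }
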
unknown (Lib) TikaException 0 0 0 1
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
1
            
// in contrib/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java
catch (TikaException e) { if(ignoreTikaException) log.warn(new StringBuilder("skip extracting text due to ").append(e.getLocalizedMessage()) .append(". metadata=").append(metadata.toString()).toString()); else throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e); }
1
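
The single TikaException handler toggles between swallow-and-warn and escalate based on the ignoreTikaException flag, so one unparsable document need not fail a whole batch. A reduced sketch of the toggle (the Extraction interface and wiring are assumptions; only the catch shape mirrors the snippet):

    import org.apache.solr.common.SolrException;
    import org.apache.tika.exception.TikaException;
    import org.slf4j.Logger;

    // Hypothetical: one unparsable document is skipped or escalated depending on configuration.
    interface Extraction { void run() throws TikaException; }

    static void extractOrSkip(Extraction step, boolean ignoreTikaException, Logger log) {
      try {
        step.run();
      } catch (TikaException e) {
        if (ignoreTikaException) {
          log.warn("skip extracting text due to " + e.getLocalizedMessage());
        } else {
          throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
        }
      }
    }
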
unknown (Lib) TimeExceededException 0 0 0 6
            
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/SolrIndexSearcher.java
catch( TimeLimitingCollector.TimeExceededException x ) { log.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/Grouping.java
catch (TimeLimitingCollector.TimeExceededException x) { logger.warn( "Query: " + query + "; " + x.getMessage() ); qr.setPartialResults(true); }
// in core/src/java/org/apache/solr/search/grouping/CommandHandler.java
catch (TimeLimitingCollector.TimeExceededException x) { partialResults = true; logger.warn( "Query: " + query + "; " + x.getMessage() ); }
0 0
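
All six TimeExceededException handlers implement the same degrade-gracefully policy: a query that overruns its time allotment is not treated as an error; it is logged and the response is marked partial. A sketch under stated assumptions (PartialAware stands in for Solr's query-result object; the search wiring is assumed):

    import org.apache.lucene.search.TimeLimitingCollector;
    import org.slf4j.Logger;

    // Stand-in for the result object that carries the partial-results flag.
    interface PartialAware { void setPartialResults(boolean partial); }

    // Hypothetical: a search that exceeds timeAllowed is degraded, never failed.
    static void searchWithTimeAllowed(Runnable search, PartialAware qr, Logger log) {
      try {
        search.run(); // assumption: a collector wrapped in TimeLimitingCollector
      } catch (TimeLimitingCollector.TimeExceededException x) {
        log.warn("Query exceeded timeAllowed: " + x.getMessage());
        qr.setPartialResults(true); // same flag every snippet above sets
      }
    }
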
unknown (Lib) TimeoutException 2
            
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForConnected(long waitForConnection) throws InterruptedException, TimeoutException, IOException { long expire = System.currentTimeMillis() + waitForConnection; long left = waitForConnection; while (!connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (!connected) { throw new TimeoutException("Could not connect to ZooKeeper " + zkServerAddress + " within " + waitForConnection + " ms"); } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForDisconnected(long timeout) throws InterruptedException, TimeoutException { long expire = System.currentTimeMillis() + timeout; long left = timeout; while (connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (connected) { throw new TimeoutException("Did not disconnect"); } }
0 11
            
// in solrj/src/java/org/apache/solr/common/cloud/DefaultConnectionStrategy.java
@Override public void connect(String serverAddress, int timeout, Watcher watcher, ZkUpdate updater) throws IOException, InterruptedException, TimeoutException { updater.update(new SolrZooKeeper(serverAddress, timeout, watcher)); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void process(WatchedEvent event) { if (log.isInfoEnabled()) { log.info("Watcher " + this + " name:" + name + " got event " + event + " path:" + event.getPath() + " type:" + event.getType()); } state = event.getState(); if (state == KeeperState.SyncConnected) { connected = true; clientConnected.countDown(); } else if (state == KeeperState.Expired) { connected = false; log.info("Attempting to reconnect to recover relationship with ZooKeeper..."); try { connectionStrategy.reconnect(zkServerAddress, zkClientTimeout, this, new ZkClientConnectionStrategy.ZkUpdate() { @Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } } }); } catch (Exception e) { SolrException.log(log, "", e); } log.info("Connected:" + connected); }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
@Override public void update(SolrZooKeeper keeper) throws InterruptedException, TimeoutException, IOException { synchronized (connectionStrategy) { waitForConnected(SolrZkClient.DEFAULT_CLIENT_CONNECT_TIMEOUT); client.updateKeeper(keeper); if (onReconnect != null) { onReconnect.command(); } synchronized (ConnectionManager.this) { ConnectionManager.this.connected = true; } } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForConnected(long waitForConnection) throws InterruptedException, TimeoutException, IOException { long expire = System.currentTimeMillis() + waitForConnection; long left = waitForConnection; while (!connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (!connected) { throw new TimeoutException("Could not connect to ZooKeeper " + zkServerAddress + " within " + waitForConnection + " ms"); } }
// in solrj/src/java/org/apache/solr/common/cloud/ConnectionManager.java
public synchronized void waitForDisconnected(long timeout) throws InterruptedException, TimeoutException { long expire = System.currentTimeMillis() + timeout; long left = timeout; while (connected && left > 0) { wait(left); left = expire - System.currentTimeMillis(); } if (connected) { throw new TimeoutException("Did not disconnect"); } }
// in core/src/java/org/apache/solr/cloud/RecoveryStrategy.java
private Future<RecoveryInfo> replay(UpdateLog ulog) throws InterruptedException, ExecutionException, TimeoutException { Future<RecoveryInfo> future = ulog.applyBufferedUpdates(); if (future == null) { /* no replay needed */ log.info("No replay needed"); } else { log.info("Replaying buffered documents"); /* wait for replay */ future.get(); } /* solrcloud_debug: try { RefCounted<SolrIndexSearcher> searchHolder = core.getNewestSearcher(false); SolrIndexSearcher searcher = searchHolder.get(); try { System.out.println(core.getCoreDescriptor().getCoreContainer().getZkController().getNodeName() + " replayed " + searcher.search(new MatchAllDocsQuery(), 1).totalHits); } finally { searchHolder.decref(); } } catch (Exception e) { } */ return future; }
3
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (TimeoutException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/servlet/ZookeeperInfoServlet.java
catch (TimeoutException e) { writeError(503, "Could not connect to zookeeper at '" + addr + "'\""); zkClient = null; return; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TimeoutException e) { log.error("Could not connect to ZooKeeper", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
2
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (TimeoutException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TimeoutException e) { log.error("Could not connect to ZooKeeper", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); }
2
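
waitForConnected and waitForDisconnected above are the classic deadline-based guarded wait: compute an absolute expiry, loop on the condition with wait(left) so spurious wakeups are absorbed without extending the deadline, and convert expiry into a TimeoutException. A generic sketch of that loop (the monitor/condition parameters are an assumption; ConnectionManager waits on this):

    import java.util.concurrent.TimeoutException;
    import java.util.function.BooleanSupplier;

    // Hypothetical generic form of ConnectionManager's deadline-based waits.
    static void awaitTrue(Object monitor, BooleanSupplier condition, long timeoutMs)
        throws InterruptedException, TimeoutException {
      synchronized (monitor) {
        long expire = System.currentTimeMillis() + timeoutMs;
        long left = timeoutMs;
        while (!condition.getAsBoolean() && left > 0) {
          monitor.wait(left);                          // releases the monitor; may wake spuriously
          left = expire - System.currentTimeMillis();  // recompute so wakeups cannot extend the deadline
        }
        if (!condition.getAsBoolean()) {
          throw new TimeoutException("Condition not met within " + timeoutMs + " ms");
        }
      }
    }
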
unknown (Lib) TransformerConfigurationException 0 0 3
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
private static ContentHandler getHtmlHandler(Writer writer) throws TransformerConfigurationException { SAXTransformerFactory factory = (SAXTransformerFactory) TransformerFactory.newInstance(); TransformerHandler handler = factory.newTransformerHandler(); handler.getTransformer().setOutputProperty(OutputKeys.METHOD, "html"); handler.setResult(new StreamResult(writer)); return new ContentHandlerDecorator(handler) { @Override public void startElement( String uri, String localName, String name, Attributes atts) throws SAXException { if (XHTMLContentHandler.XHTML.equals(uri)) { uri = null; } if (!"head".equals(localName)) { super.startElement(uri, localName, name, atts); } } @Override public void endElement(String uri, String localName, String name) throws SAXException { if (XHTMLContentHandler.XHTML.equals(uri)) { uri = null; } if (!"head".equals(localName)) { super.endElement(uri, localName, name); } } @Override public void startPrefixMapping(String prefix, String uri) {/*no op*/ } @Override public void endPrefixMapping(String prefix) {/*no op*/ } }; }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
private static ContentHandler getXmlContentHandler(Writer writer) throws TransformerConfigurationException { SAXTransformerFactory factory = (SAXTransformerFactory) TransformerFactory.newInstance(); TransformerHandler handler = factory.newTransformerHandler(); handler.getTransformer().setOutputProperty(OutputKeys.METHOD, "xml"); handler.setResult(new StreamResult(writer)); return handler; }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processUpdate(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException, FactoryConfigurationError, InstantiationException, IllegalAccessException, TransformerConfigurationException { AddUpdateCommand addCmd = null; SolrParams params = req.getParams(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.END_DOCUMENT: parser.close(); return; case XMLStreamConstants.START_ELEMENT: String currTag = parser.getLocalName(); if (currTag.equals(UpdateRequestHandler.ADD)) { log.trace("SolrCore.update(add)"); addCmd = new AddUpdateCommand(req); /* First look for commitWithin parameter on the request, will be overwritten for individual <add>'s */ addCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); addCmd.overwrite = params.getBool(UpdateParams.OVERWRITE, true); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.OVERWRITE.equals(attrName)) { addCmd.overwrite = StrUtils.parseBoolean(attrVal); } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { addCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("Unknown attribute id in add:" + attrName); } } } else if ("doc".equals(currTag)) { if(addCmd != null) { log.trace("adding doc..."); addCmd.clear(); addCmd.solrDoc = readDoc(parser); processor.processAdd(addCmd); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unexpected <doc> tag without an <add> tag surrounding it."); } } else if (UpdateRequestHandler.COMMIT.equals(currTag) || UpdateRequestHandler.OPTIMIZE.equals(currTag)) { log.trace("parsing " + currTag); CommitUpdateCommand cmd = new CommitUpdateCommand(req, UpdateRequestHandler.OPTIMIZE.equals(currTag)); ModifiableSolrParams mp = new ModifiableSolrParams(); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); mp.set(attrName, attrVal); } RequestHandlerUtils.validateCommitParams(mp); SolrParams p = SolrParams.wrapDefaults(mp, req.getParams()); /* default to the normal request params for commit options */ RequestHandlerUtils.updateCommit(cmd, p); processor.processCommit(cmd); } /* end commit */ else if (UpdateRequestHandler.ROLLBACK.equals(currTag)) { log.trace("parsing " + currTag); RollbackUpdateCommand cmd = new RollbackUpdateCommand(req); processor.processRollback(cmd); } /* end rollback */ else if (UpdateRequestHandler.DELETE.equals(currTag)) { log.trace("parsing delete"); processDelete(req, processor, parser); } /* end delete */ break; } } }
2
            
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/TikaEntityProcessor.java
catch (TransformerConfigurationException e) { wrapAndThrow(SEVERE, e, "Unable to create content handler"); }
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; }
1
            
// in core/src/java/org/apache/solr/util/xslt/TransformerProvider.java
catch(TransformerConfigurationException tce) { log.error(getClass().getName(), "getTransformer", tce); final IOException ioe = new IOException("newTransformer fails ( " + lastFilename + ")"); ioe.initCause(tce); throw ioe; }
0
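
The TransformerProvider handler above shows the pre-Java 6 idiom for converting a checked TransformerConfigurationException into an IOException: IOException only gained a cause-taking constructor in Java 6, so the cause is attached afterwards via initCause(). A sketch of the idiom (the method wrapper is an assumption; the JAXP calls are standard):

    import java.io.IOException;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerConfigurationException;
    import javax.xml.transform.TransformerFactory;

    // Chain the cause explicitly when the target exception predates cause-taking constructors.
    static Transformer newTransformerOrIOException() throws IOException {
      try {
        return TransformerFactory.newInstance().newTransformer();
      } catch (TransformerConfigurationException tce) {
        final IOException ioe = new IOException("newTransformer fails");
        ioe.initCause(tce); // same idiom as TransformerProvider above
        throw ioe;
      }
    }
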
unknown (Lib) TransformerException 1
            
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public Source resolve(String href, String base) throws TransformerException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, null, base, href); return (src == null) ? null : new SAXSource(src); } catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); } }
1
            
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); }
4
            
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
public void error(TransformerException e) throws TransformerException { throw e; }
// in solrj/src/java/org/apache/solr/common/util/XMLErrorLogger.java
public void fatalError(TransformerException e) throws TransformerException { throw e; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private Document copyDoc(Document document) throws TransformerException { TransformerFactory tfactory = TransformerFactory.newInstance(); Transformer tx = tfactory.newTransformer(); DOMSource source = new DOMSource(document); DOMResult result = new DOMResult(); tx.transform(source,result); return (Document)result.getNode(); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public URIResolver asURIResolver() { return new URIResolver() { public Source resolve(String href, String base) throws TransformerException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, null, base, href); return (src == null) ? null : new SAXSource(src); } catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); } } }; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public Source resolve(String href, String base) throws TransformerException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, null, base, href); return (src == null) ? null : new SAXSource(src); } catch (IOException ioe) { throw new TransformerException("Cannot resolve entity", ioe); } }
4
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathEntityProcessor.java
catch (TransformerException e) { if (ABORT.equals(onError)) { wrapAndThrow(SEVERE, e, "Exception in applying XSL Transformeation"); } else if (SKIP.equals(onError)) { wrapAndThrow(DataImportHandlerException.SKIP, e); } else { LOG.warn("Failed for url : " + s, e); rowIterator = Collections.EMPTY_LIST.iterator(); return; } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(TransformerException te) { final IOException ioe = new IOException("XSLT transformation error"); ioe.initCause(te); throw ioe; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
3
            
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch(TransformerException te) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, te.getMessage(), te); }
// in core/src/java/org/apache/solr/response/XSLTResponseWriter.java
catch(TransformerException te) { final IOException ioe = new IOException("XSLT transformation error"); ioe.initCause(te); throw ioe; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TransformerException e) { throw new SolrException(ErrorCode.SERVER_ERROR, "", e); }
2
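
asURIResolver above is a small adapter: it exposes the entity resolver through the javax.xml.transform.URIResolver interface and translates the only checked exception it can see (IOException) into the interface's TransformerException. A sketch of the shape (resolveToSource is a hypothetical stand-in for the real IO-backed lookup):

    import java.io.IOException;
    import javax.xml.transform.Source;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.URIResolver;

    // Hypothetical stand-in for the real IO-backed entity lookup.
    static Source resolveToSource(String href, String base) throws IOException {
      return null; // a real implementation would open the entity here
    }

    // Adapter: expose the IO-based resolver through URIResolver's checked-exception contract.
    static URIResolver asUriResolver() {
      return new URIResolver() {
        public Source resolve(String href, String base) throws TransformerException {
          try {
            return resolveToSource(href, base);
          } catch (IOException ioe) {
            throw new TransformerException("Cannot resolve entity", ioe); // same translation as SystemIdResolver
          }
        }
      };
    }
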
unknown (Lib) URISyntaxException 0 0 2
            
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
public URL getNormalizedURL(String url) throws MalformedURLException, URISyntaxException { return new URI(url).normalize().toURL(); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
URI resolveRelativeURI(String baseURI, String systemId) throws IOException,URISyntaxException { URI uri; /* special case for backwards compatibility: if relative systemId starts with "/" (we convert that to an absolute solrres:-URI) */ if (systemId.startsWith("/")) { uri = new URI(RESOURCE_LOADER_URI_SCHEME, RESOURCE_LOADER_AUTHORITY_ABSOLUTE, "/", null, null).resolve(systemId); } else { /* simply parse as URI */ uri = new URI(systemId); } /* do relative resolving */ if (baseURI != null ) { uri = new URI(baseURI).resolve(uri); } return uri; }
4
            
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); }
// in core/src/java/org/apache/solr/update/processor/URLClassifyProcessor.java
catch (URISyntaxException e) { log.warn("cannot get the normalized url for \"" + url + "\" due to " + e.getMessage()); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { log.warn("An URI systax problem occurred during resolving SystemId, falling back to default resolver", use); return null; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { throw new IllegalArgumentException("Invalid syntax of Solr Resource URI", use); }
2
            
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
catch (URISyntaxException e) { throw new SolrException( ErrorCode.FORBIDDEN, "Can not access configuration directory!"); }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (URISyntaxException use) { throw new IllegalArgumentException("Invalid syntax of Solr Resource URI", use); }
2
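
getNormalizedURL above leans on java.net.URI for canonicalization before converting back to a URL; the catch blocks then show both policies, fatal (ShowFileRequestHandler) and merely logged (URLClassifyProcessor). A sketch of the tolerant variant (returning null on bad input is an assumption, not Solr behavior):

    import java.net.MalformedURLException;
    import java.net.URI;
    import java.net.URISyntaxException;
    import java.net.URL;

    // Hypothetical tolerant wrapper: normalize when possible, otherwise signal "keep the raw value".
    static URL tryNormalize(String url) {
      try {
        return new URI(url).normalize().toURL(); // same call chain as getNormalizedURL above
      } catch (URISyntaxException | MalformedURLException e) {
        return null; // caller logs and falls back, as URLClassifyProcessor does
      }
    }
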
unknown (Lib) UnknownHostException 0 0 0 2
            
// in core/src/java/org/apache/solr/handler/admin/SystemInfoHandler.java
catch (UnknownHostException e) { /* default to null */ }
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (UnknownHostException e) { throw new RuntimeException(e); }
1
            
// in core/src/java/org/apache/solr/cloud/SolrZkServer.java
catch (UnknownHostException e) { throw new RuntimeException(e); }
1
unknown (Lib) UnsupportedEncodingException 0 0 3
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FileDataSource.java
protected Reader openStream(File file) throws FileNotFoundException, UnsupportedEncodingException { if (encoding == null) { return new InputStreamReader(new FileInputStream(file)); } else { return new InputStreamReader(new FileInputStream(file), encoding); } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/FieldReaderDataSource.java
private Reader getReader(Blob blob) throws SQLException, UnsupportedEncodingException { if (encoding == null) { return (new InputStreamReader(blob.getBinaryStream())); } else { return (new InputStreamReader(blob.getBinaryStream(), encoding)); } }
// in core/src/java/org/apache/solr/handler/admin/ShowFileRequestHandler.java
private void showFromZooKeeper(SolrQueryRequest req, SolrQueryResponse rsp, CoreContainer coreContainer) throws KeeperException, InterruptedException, UnsupportedEncodingException { String adminFile = null; SolrCore core = req.getCore(); SolrZkClient zkClient = coreContainer.getZkController().getZkClient(); final ZkSolrResourceLoader loader = (ZkSolrResourceLoader) core .getResourceLoader(); String confPath = loader.getCollectionZkPath(); String fname = req.getParams().get("file", null); if (fname == null) { adminFile = confPath; } else { fname = fname.replace('\\', '/'); /* normalize slashes */ if (hiddenFiles.contains(fname.toUpperCase(Locale.ENGLISH))) { throw new SolrException(ErrorCode.FORBIDDEN, "Can not access: " + fname); } if (fname.indexOf("..") >= 0) { throw new SolrException(ErrorCode.FORBIDDEN, "Invalid path: " + fname); } adminFile = confPath + "/" + fname; } /* Make sure the file exists, is readable and is not a hidden file */ if (!zkClient.exists(adminFile, true)) { throw new SolrException(ErrorCode.BAD_REQUEST, "Can not find: " + adminFile); } /* Show a directory listing */ List<String> children = zkClient.getChildren(adminFile, null, true); if (children.size() > 0) { NamedList<SimpleOrderedMap<Object>> files = new SimpleOrderedMap<SimpleOrderedMap<Object>>(); for (String f : children) { if (hiddenFiles.contains(f.toUpperCase(Locale.ENGLISH))) { continue; /* don't show 'hidden' files */ } if (f.startsWith(".")) { continue; /* skip hidden system files... */ } SimpleOrderedMap<Object> fileInfo = new SimpleOrderedMap<Object>(); files.add(f, fileInfo); List<String> fchildren = zkClient.getChildren(adminFile, null, true); if (fchildren.size() > 0) { fileInfo.add("directory", true); } else { /* TODO? content type */ fileInfo.add("size", f.length()); } /* TODO: ? fileInfo.add( "modified", new Date( f.lastModified() ) ); */ } rsp.add("files", files); } else { /* Include the file contents; the file logic depends on RawResponseWriter, so force its use. */ ModifiableSolrParams params = new ModifiableSolrParams(req.getParams()); params.set(CommonParams.WT, "raw"); req.setParams(params); ContentStreamBase content = new ContentStreamBase.StringStream( new String(zkClient.getData(adminFile, null, null, true), "UTF-8")); content.setContentType(req.getParams().get(USE_CONTENT_TYPE)); rsp.add(RawResponseWriter.CONTENT, content); } rsp.setHttpCaching(false); }
7
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (UnsupportedEncodingException e) { /* can't happen - UTF-8 */ throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/SolrParamResourceLoader.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
catch( UnsupportedEncodingException uex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, uex ); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in core/src/java/org/apache/solr/spelling/suggest/Suggester.java
catch (UnsupportedEncodingException e) { /* should not happen */ LOG.error("should not happen", e); }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (UnsupportedEncodingException e) { /* won't happen */ log.error("UTF-8 not supported", e); throw new RuntimeException(e); }
// in core/src/java/org/apache/solr/util/SimplePostTool.java
catch (UnsupportedEncodingException e) { fatal("Shouldn't happen: UTF-8 not supported?!?!?!"); }
5
            
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (UnsupportedEncodingException e) { /* can't happen - UTF-8 */ throw new RuntimeException(e); }
// in contrib/velocity/src/java/org/apache/solr/response/SolrParamResourceLoader.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
catch( UnsupportedEncodingException uex ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, uex ); }
// in core/src/java/org/apache/solr/servlet/cache/HttpCacheHeaderUtil.java
catch (UnsupportedEncodingException e) { throw new RuntimeException(e); /* may not happen */ }
// in core/src/java/org/apache/solr/update/processor/MD5Signature.java
catch (UnsupportedEncodingException e) { /* won't happen */ log.error("UTF-8 not supported", e); throw new RuntimeException(e); }
5
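
Every UnsupportedEncodingException handler above is a "cannot happen" branch: the JDK guarantees UTF-8 support, so the checked exception is converted to an unchecked RuntimeException (or logged as fatal). On Java 7+ the branch disappears entirely if the charset constant is used instead of its name; a sketch:

    import java.nio.charset.StandardCharsets;

    // With the charset constant there is no checked UnsupportedEncodingException to appease.
    static byte[] utf8Bytes(String s) {
      return s.getBytes(StandardCharsets.UTF_8); // unlike s.getBytes("UTF-8"), this cannot throw
    }
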
runtime (Lib) UnsupportedOperationException 48
            
// in solrj/src/java/org/apache/solr/common/SolrInputField.java
public void remove() { throw new UnsupportedOperationException(); }
// in solrj/src/java/org/apache/solr/common/util/NamedList.java
public void remove() { throw new UnsupportedOperationException(); }
// in solrj/src/java/org/apache/solr/common/util/IteratorChain.java
public void remove() { /* we just need this class to iterate in readonly mode */ throw new UnsupportedOperationException(); }
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public void clear() { throw new UnsupportedOperationException(); }
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public boolean containsValue(Object value) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Set<java.util.Map.Entry<String, Collection<Object>>> entrySet() {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public void putAll(Map<? extends String, ? extends Collection<Object>> t) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Collection<Object>> values() {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> put(String key, Collection<Object> value) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> remove(Object key) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public void clear() { throw new UnsupportedOperationException(); }
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public boolean containsValue(Object value) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Set<java.util.Map.Entry<String, Object>> entrySet() {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public void putAll(Map<? extends String, ? extends Object> t) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> values() {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> put(String key, Object value) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/common/SolrDocument.java
public Collection<Object> remove(Object key) {throw new UnsupportedOperationException();}
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public void setAllowCompression(boolean allowCompression) { if (httpClient instanceof DefaultHttpClient) { HttpClientUtil.setAllowCompression((DefaultHttpClient) httpClient, allowCompression); } else { throw new UnsupportedOperationException( "HttpClient instance was not of type DefaultHttpClient"); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public void setDefaultMaxConnectionsPerHost(int max) { if (internalClient) { HttpClientUtil.setMaxConnectionsPerHost(httpClient, max); } else { throw new UnsupportedOperationException( "Client was created outside of HttpSolrServer"); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/HttpSolrServer.java
public void setMaxTotalConnections(int max) { if (internalClient) { HttpClientUtil.setMaxConnections(httpClient, max); } else { throw new UnsupportedOperationException( "Client was created outside of HttpSolrServer"); } }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
public void remove() { throw new UnsupportedOperationException("Its read only mode..."); }
// in contrib/dataimporthandler-extras/src/java/org/apache/solr/handler/dataimport/MailEntityProcessor.java
public void remove() { throw new UnsupportedOperationException("Its read only mode..."); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SortedMapBackedCache.java
@Override public void remove() { throw new UnsupportedOperationException(); }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SolrEntityProcessor.java
@Override public void remove() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/schema/BCDIntField.java
@Override public ValueSource getValueSource(SchemaField field, QParser qparser) { throw new UnsupportedOperationException("ValueSource not implemented"); }
// in core/src/java/org/apache/solr/schema/LatLonType.java
@Override public IndexableField createField(SchemaField field, Object value, float boost) { throw new UnsupportedOperationException("LatLonType uses multiple fields. field=" + field.getName()); }
// in core/src/java/org/apache/solr/schema/PointType.java
@Override public IndexableField createField(SchemaField field, Object value, float boost) { throw new UnsupportedOperationException("PointType uses multiple fields. field=" + field.getName()); }
// in core/src/java/org/apache/solr/schema/ExternalFileField.java
@Override public void write(TextResponseWriter writer, String name, IndexableField f) throws IOException { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/schema/ExternalFileField.java
@Override public SortField getSortField(SchemaField field,boolean reverse) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/schema/AbstractSubTypeFieldType.java
@Override public Query getFieldQuery(QParser parser, SchemaField field, String externalVal) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/SortedIntDocSet.java
public void remove() { throw new UnsupportedOperationException("The remove operation is not supported by this Iterator."); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public int compare(int slot1, int slot2) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public void setBottom(int slot) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public int compareBottom(int doc) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public void copy(int slot, int doc) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public BytesRef value(int slot) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/MissingStringLastComparatorSource.java
@Override public int compareDocToValue(int doc, Comparable docValue) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/search/DocSlice.java
public void remove() { throw new UnsupportedOperationException("The remove operation is not supported by this Iterator."); }
// in core/src/java/org/apache/solr/logging/CircularList.java
@Override public void remove() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/spelling/SolrSpellChecker.java
protected float getAccuracy() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/spelling/SolrSpellChecker.java
protected StringDistance getStringDistance() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/spelling/PossibilityIterator.java
public void remove() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/core/AbstractSolrEventListener.java
public void postCommit() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/core/AbstractSolrEventListener.java
@Override public void postSoftCommit() { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/core/AbstractSolrEventListener.java
public void newSearcher(SolrIndexSearcher newSearcher, SolrIndexSearcher currentSearcher) { throw new UnsupportedOperationException(); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public void setAttribute(Attribute attribute) throws AttributeNotFoundException, InvalidAttributeValueException, MBeanException, ReflectionException { throw new UnsupportedOperationException("Operation not Supported"); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public AttributeList setAttributes(AttributeList attributes) { throw new UnsupportedOperationException("Operation not Supported"); }
// in core/src/java/org/apache/solr/core/JmxMonitoredMap.java
public Object invoke(String actionName, Object[] params, String[] signature) throws MBeanException, ReflectionException { throw new UnsupportedOperationException("Operation not Supported"); }
0 0 4
            
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
catch( UnsupportedOperationException e ) { LOG.warn( "XML parser doesn't support XInclude option" ); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); }
// in core/src/java/org/apache/solr/spelling/SolrSpellChecker.java
catch(UnsupportedOperationException uoe) { /* just use .5 as a default */ }
// in core/src/java/org/apache/solr/core/Config.java
catch(UnsupportedOperationException e) { log.warn(name + " XML parser doesn't support XInclude option"); }
1
            
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (UnsupportedOperationException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "XML parser doesn't support XInclude option", e); }
1
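
The 48 throw sites above are overwhelmingly read-only guards: iterators and Map views that implement only the accessor half of their interface and reject every mutator with UnsupportedOperationException. A compact sketch of the read-only iterator variant of that pattern:

    import java.util.Iterator;

    // Wrap any iterator so callers can read but never remove -- the shape behind
    // SolrDocument's map views and IteratorChain above.
    static <T> Iterator<T> readOnly(final Iterator<T> delegate) {
      return new Iterator<T>() {
        public boolean hasNext() { return delegate.hasNext(); }
        public T next() { return delegate.next(); }
        public void remove() { throw new UnsupportedOperationException("read-only iterator"); }
      };
    }
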
unknown (Lib) XMLStreamException 7
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected NamedList<Object> readNamedList( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } StringBuilder builder = new StringBuilder(); NamedList<Object> nl = new SimpleOrderedMap<Object>(); KnownType type = null; String name = null; /* just eat up the events... */ int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); /* reset the text */ type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } /** The name in a NamedList can actually be null if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } **/ if( !type.isLeaf ) { switch( type ) { case LST: nl.add( name, readNamedList( parser ) ); depth--; continue; case ARR: nl.add( name, readArray( parser ) ); depth--; continue; case RESULT: nl.add( name, readDocuments( parser ) ); depth--; continue; case DOC: nl.add( name, readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return nl; } /* System.out.println( "NL:ELEM:"+type+"::"+name+"::"+builder ); */ nl.add( name, type.read( builder.toString().trim() ) ); break; case XMLStreamConstants.SPACE: /* TODO? should this be trimmed? make sure it only gets one/two space? */ case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected List<Object> readArray( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"arr".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'arr', not: "+parser.getLocalName() ); } StringBuilder builder = new StringBuilder(); KnownType type = null; List<Object> vals = new ArrayList<Object>(); int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; KnownType t = KnownType.get( parser.getLocalName() ); if( t == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } if( type == null ) { type = t; } /*** actually, there is no rule that arrays need the same type else if( type != t && !(t == KnownType.NULL || type == KnownType.NULL)) { throw new RuntimeException( "arrays must have the same type! ("+type+"!="+t+") "+parser.getLocalName() ); } ***/ type = t; builder.setLength( 0 ); /* reset the text */ if( !type.isLeaf ) { switch( type ) { case LST: vals.add( readNamedList( parser ) ); depth--; continue; case ARR: vals.add( readArray( parser ) ); depth--; continue; case RESULT: vals.add( readDocuments( parser ) ); depth--; continue; case DOC: vals.add( readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return vals; /* the last element is itself */ } /* System.out.println( "ARR:"+type+"::"+builder ); */ Object val = type.read( builder.toString().trim() ); if( val == null && type != KnownType.NULL) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } vals.add( val ); break; case XMLStreamConstants.SPACE: /* TODO? should this be trimmed? make sure it only gets one/two space? */ case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocument readDocument( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"doc".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'lst', not: "+parser.getLocalName() ); } SolrDocument doc = new SolrDocument(); StringBuilder builder = new StringBuilder(); KnownType type = null; String name = null; /* just eat up the events... */ int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); /* reset the text */ type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } /* Handle multi-valued fields */ if( type == KnownType.ARR ) { for( Object val : readArray( parser ) ) { doc.addField( name, val ); } depth--; /* the array reading clears out the 'endElement' */ } else if( type == KnownType.LST ) { doc.addField( name, readNamedList( parser ) ); depth--; } else if( !type.isLeaf ) { System.out.println("nbot leaf!:" + type); throw new XMLStreamException( "must be value or array", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return doc; } /* System.out.println( "FIELD:"+type+"::"+name+"::"+builder ); */ Object val = type.read( builder.toString().trim() ); if( val == null ) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } doc.addField( name, val ); break; case XMLStreamConstants.SPACE: /* TODO? should this be trimmed? make sure it only gets one/two space? */ case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public Object resolveEntity(String publicId, String systemId, String baseURI, String namespace) throws XMLStreamException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, publicId, baseURI, systemId); return (src == null) ? null : src.getByteStream(); } catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); } }
1
            
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); }
12
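
The XMLResponseParser methods listed below all share one StAX pull-parsing skeleton: track element depth manually, reset a StringBuilder on each START_ELEMENT, accumulate CHARACTERS/CDATA, and return when depth goes negative. A stripped-down sketch of that skeleton, reading a single element's text content (error handling reduced; not the Solr code itself):

    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamException;
    import javax.xml.stream.XMLStreamReader;

    // Minimal version of the depth-tracking loop used by readNamedList/readArray/readDocument.
    static String readElementText(XMLStreamReader parser) throws XMLStreamException {
      StringBuilder text = new StringBuilder();
      int depth = 0;
      while (true) {
        switch (parser.next()) {
          case XMLStreamConstants.START_ELEMENT:
            depth++;            // nested element: descend
            text.setLength(0);  // reset accumulated text, as the Solr readers do
            break;
          case XMLStreamConstants.END_ELEMENT:
            if (--depth < 0) return text.toString(); // closed the element we started inside
            break;
          case XMLStreamConstants.SPACE:
          case XMLStreamConstants.CDATA:
          case XMLStreamConstants.CHARACTERS:
            text.append(parser.getText());
            break;
        }
      }
    }
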
            
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected NamedList<Object> readNamedList( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } StringBuilder builder = new StringBuilder(); NamedList<Object> nl = new SimpleOrderedMap<Object>(); KnownType type = null; String name = null; /* just eat up the events... */ int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); /* reset the text */ type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } /** The name in a NamedList can actually be null if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } **/ if( !type.isLeaf ) { switch( type ) { case LST: nl.add( name, readNamedList( parser ) ); depth--; continue; case ARR: nl.add( name, readArray( parser ) ); depth--; continue; case RESULT: nl.add( name, readDocuments( parser ) ); depth--; continue; case DOC: nl.add( name, readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return nl; } /* System.out.println( "NL:ELEM:"+type+"::"+name+"::"+builder ); */ nl.add( name, type.read( builder.toString().trim() ) ); break; case XMLStreamConstants.SPACE: /* TODO? should this be trimmed? make sure it only gets one/two space? */ case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected List<Object> readArray( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"arr".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'arr', not: "+parser.getLocalName() ); } StringBuilder builder = new StringBuilder(); KnownType type = null; List<Object> vals = new ArrayList<Object>(); int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; KnownType t = KnownType.get( parser.getLocalName() ); if( t == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } if( type == null ) { type = t; } /*** actually, there is no rule that arrays need the same type else if( type != t && !(t == KnownType.NULL || type == KnownType.NULL)) { throw new RuntimeException( "arrays must have the same type! ("+type+"!="+t+") "+parser.getLocalName() ); } ***/ type = t; builder.setLength( 0 ); /* reset the text */ if( !type.isLeaf ) { switch( type ) { case LST: vals.add( readNamedList( parser ) ); depth--; continue; case ARR: vals.add( readArray( parser ) ); depth--; continue; case RESULT: vals.add( readDocuments( parser ) ); depth--; continue; case DOC: vals.add( readDocument( parser ) ); depth--; continue; } throw new XMLStreamException( "branch element not handled!", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return vals; /* the last element is itself */ } /* System.out.println( "ARR:"+type+"::"+builder ); */ Object val = type.read( builder.toString().trim() ); if( val == null && type != KnownType.NULL) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } vals.add( val ); break; case XMLStreamConstants.SPACE: /* TODO? should this be trimmed? make sure it only gets one/two space? */ case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocumentList readDocuments( XMLStreamReader parser ) throws XMLStreamException { SolrDocumentList docs = new SolrDocumentList(); /* Parse the attributes */ for( int i=0; i<parser.getAttributeCount(); i++ ) { String n = parser.getAttributeLocalName( i ); String v = parser.getAttributeValue( i ); if( "numFound".equals( n ) ) { docs.setNumFound( Long.parseLong( v ) ); } else if( "start".equals( n ) ) { docs.setStart( Long.parseLong( v ) ); } else if( "maxScore".equals( n ) ) { docs.setMaxScore( Float.parseFloat( v ) ); } } /* Read through each document */ int event; while( true ) { event = parser.next(); if( XMLStreamConstants.START_ELEMENT == event ) { if( !"doc".equals( parser.getLocalName() ) ) { throw new RuntimeException( "should be doc! "+parser.getLocalName() + " :: " + parser.getLocation() ); } docs.add( readDocument( parser ) ); } else if ( XMLStreamConstants.END_ELEMENT == event ) { return docs; /* only happens once */ } } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
protected SolrDocument readDocument( XMLStreamReader parser ) throws XMLStreamException { if( XMLStreamConstants.START_ELEMENT != parser.getEventType() ) { throw new RuntimeException( "must be start element, not: "+parser.getEventType() ); } if( !"doc".equals( parser.getLocalName().toLowerCase(Locale.ENGLISH) ) ) { throw new RuntimeException( "must be 'lst', not: "+parser.getLocalName() ); } SolrDocument doc = new SolrDocument(); StringBuilder builder = new StringBuilder(); KnownType type = null; String name = null; /* just eat up the events... */ int depth = 0; while( true ) { switch (parser.next()) { case XMLStreamConstants.START_ELEMENT: depth++; builder.setLength( 0 ); /* reset the text */ type = KnownType.get( parser.getLocalName() ); if( type == null ) { throw new RuntimeException( "this must be known type! not: "+parser.getLocalName() ); } name = null; int cnt = parser.getAttributeCount(); for( int i=0; i<cnt; i++ ) { if( "name".equals( parser.getAttributeLocalName( i ) ) ) { name = parser.getAttributeValue( i ); break; } } if( name == null ) { throw new XMLStreamException( "requires 'name' attribute: "+parser.getLocalName(), parser.getLocation() ); } /* Handle multi-valued fields */ if( type == KnownType.ARR ) { for( Object val : readArray( parser ) ) { doc.addField( name, val ); } depth--; /* the array reading clears out the 'endElement' */ } else if( type == KnownType.LST ) { doc.addField( name, readNamedList( parser ) ); depth--; } else if( !type.isLeaf ) { System.out.println("nbot leaf!:" + type); throw new XMLStreamException( "must be value or array", parser.getLocation() ); } break; case XMLStreamConstants.END_ELEMENT: if( --depth < 0 ) { return doc; } /* System.out.println( "FIELD:"+type+"::"+name+"::"+builder ); */ Object val = type.read( builder.toString().trim() ); if( val == null ) { throw new XMLStreamException( "error reading value:"+type, parser.getLocation() ); } doc.addField( name, val ); break; case XMLStreamConstants.SPACE: /* TODO? should this be trimmed? make sure it only gets one/two space? */ case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: builder.append( parser.getText() ); break; } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void parse(XMLStreamReader parser, Handler handler, Map<String, Object> values, Stack<Set<String>> stack, // lists of values to purge boolean recordStarted ) throws IOException, XMLStreamException { Set<String> valuesAddedinThisFrame = null; if (isRecord) { // This Node is a match for an XPATH from a forEach attribute, // prepare for the clean up that will occur when the record // is emitted after its END_ELEMENT is matched recordStarted = true; valuesAddedinThisFrame = new HashSet<String>(); stack.push(valuesAddedinThisFrame); } else if (recordStarted) { // This node is a child of some parent which matched against forEach // attribute. Continue to add values to an existing record. valuesAddedinThisFrame = stack.peek(); } try { /* The input stream has deposited us at this Node in our tree of * interesting nodes. Depending on how this node is of interest, * process further tokens from the input stream and decide what * we do next */ if (attributes != null) { // we're interested in storing attributes from the input stream for (Node node : attributes) { String value = parser.getAttributeValue(null, node.name); if (value != null || (recordStarted && !isRecord)) { putText(values, value, node.fieldName, node.multiValued); valuesAddedinThisFrame.add(node.fieldName); } } } Set<Node> childrenFound = new HashSet<Node>(); int event = -1; int flattenedStarts=0; // our tag depth when flattening elements StringBuilder text = new StringBuilder(); while (true) { event = parser.next(); if (event == END_ELEMENT) { if (flattenedStarts > 0) flattenedStarts--; else { if (hasText && valuesAddedinThisFrame != null) { valuesAddedinThisFrame.add(fieldName); putText(values, text.toString(), fieldName, multiValued); } if (isRecord) handler.handle(getDeepCopy(values), forEachPath); if (childNodes != null && recordStarted && !isRecord && !childrenFound.containsAll(childNodes)) { // non-record nodes where we have not collected text for ALL // the child nodes. for (Node n : childNodes) { // For the multivalue child nodes where we could have, but // didn't, collect text. Push a null string into values. if (!childrenFound.contains(n)) n.putNulls(values); } } return; } } else if (hasText && (event==CDATA || event==CHARACTERS || event==SPACE)) { text.append(parser.getText()); } else if (event == START_ELEMENT) { if ( flatten ) flattenedStarts++; else handleStartElement(parser, childrenFound, handler, values, stack, recordStarted); } // END_DOCUMENT is least likely to appear and should be // last in if-then-else skip chain else if (event == END_DOCUMENT) return; } } finally { if ((isRecord || !recordStarted) && !stack.empty()) { Set<String> cleanThis = stack.pop(); if (cleanThis != null) { for (String fld : cleanThis) values.remove(fld); } } } }
// in contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/XPathRecordReader.java
private void handleStartElement(XMLStreamReader parser, Set<Node> childrenFound, Handler handler, Map<String, Object> values, Stack<Set<String>> stack, boolean recordStarted) throws IOException, XMLStreamException { Node n = getMatchingNode(parser,childNodes); Map<String, Object> decends=new HashMap<String, Object>(); if (n != null) { childrenFound.add(n); n.parse(parser, handler, values, stack, recordStarted); return; } // The stream has diverged from the tree of interesting elements, but // are there any wildCardNodes ... anywhere in our path from the root? Node dn = this; // checking our Node first! do { if (dn.wildCardNodes != null) { // Check to see if the stream's tag matches one of the "//" all // descendants type expressions for this node. n = getMatchingNode(parser, dn.wildCardNodes); if (n != null) { childrenFound.add(n); n.parse(parser, handler, values, stack, recordStarted); break; } // add the list of this node's wild descendants to the cache for (Node nn : dn.wildCardNodes) decends.put(nn.name, nn); } dn = dn.wildAncestor; // leap back along the tree toward root } while (dn != null); if (n == null) { // we have a START_ELEMENT which is not within the tree of // interesting nodes. Skip over the contents of this element // but recursively repeat the above for any START_ELEMENTs // found within this element. int count = 1; // we have had our first START_ELEMENT while (count != 0) { int token = parser.next(); if (token == START_ELEMENT) { Node nn = (Node) decends.get(parser.getLocalName()); if (nn != null) { // We have a //Node which matches the stream's parser.localName childrenFound.add(nn); // Parse the contents of this stream element nn.parse(parser, handler, values, stack, recordStarted); } else count++; } else if (token == END_ELEMENT) count--; } } }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
DocumentAnalysisRequest resolveAnalysisRequest(SolrQueryRequest req) throws IOException, XMLStreamException { DocumentAnalysisRequest request = new DocumentAnalysisRequest(); SolrParams params = req.getParams(); String query = params.get(AnalysisParams.QUERY, params.get(CommonParams.Q, null)); request.setQuery(query); boolean showMatch = params.getBool(AnalysisParams.SHOW_MATCH, false); request.setShowMatch(showMatch); ContentStream stream = extractSingleContentStream(req); InputStream is = null; XMLStreamReader parser = null; try { is = stream.getStream(); final String charset = ContentStreamBase.getCharsetFromContentType(stream.getContentType()); parser = (charset == null) ? inputFactory.createXMLStreamReader(is) : inputFactory.createXMLStreamReader(is, charset); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.END_DOCUMENT: { parser.close(); return request; } case XMLStreamConstants.START_ELEMENT: { String currTag = parser.getLocalName(); if ("doc".equals(currTag)) { log.trace("Reading doc..."); SolrInputDocument document = readDocument(parser, req.getSchema()); request.addDocument(document); } break; } } } } finally { if (parser != null) parser.close(); IOUtils.closeQuietly(is); } }
// in core/src/java/org/apache/solr/handler/DocumentAnalysisRequestHandler.java
SolrInputDocument readDocument(XMLStreamReader reader, IndexSchema schema) throws XMLStreamException { SolrInputDocument doc = new SolrInputDocument(); String uniqueKeyField = schema.getUniqueKeyField().getName(); StringBuilder text = new StringBuilder(); String fieldName = null; boolean hasId = false; while (true) { int event = reader.next(); switch (event) { // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(reader.getText()); break; case XMLStreamConstants.END_ELEMENT: if ("doc".equals(reader.getLocalName())) { if (!hasId) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "All documents must contain a unique key value: '" + doc.toString() + "'"); } return doc; } else if ("field".equals(reader.getLocalName())) { doc.addField(fieldName, text.toString(), DEFAULT_BOOST); if (uniqueKeyField.equals(fieldName)) { hasId = true; } } break; case XMLStreamConstants.START_ELEMENT: text.setLength(0); String localName = reader.getLocalName(); if (!"field".equals(localName)) { log.warn("unexpected XML tag doc/" + localName); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag doc/" + localName); } for (int i = 0; i < reader.getAttributeCount(); i++) { String attrName = reader.getAttributeLocalName(i); if ("name".equals(attrName)) { fieldName = reader.getAttributeValue(i); } } break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processUpdate(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException, FactoryConfigurationError, InstantiationException, IllegalAccessException, TransformerConfigurationException { AddUpdateCommand addCmd = null; SolrParams params = req.getParams(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.END_DOCUMENT: parser.close(); return; case XMLStreamConstants.START_ELEMENT: String currTag = parser.getLocalName(); if (currTag.equals(UpdateRequestHandler.ADD)) { log.trace("SolrCore.update(add)"); addCmd = new AddUpdateCommand(req); // First look for commitWithin parameter on the request, will be overwritten for individual <add>'s addCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); addCmd.overwrite = params.getBool(UpdateParams.OVERWRITE, true); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.OVERWRITE.equals(attrName)) { addCmd.overwrite = StrUtils.parseBoolean(attrVal); } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { addCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("Unknown attribute id in add:" + attrName); } } } else if ("doc".equals(currTag)) { if(addCmd != null) { log.trace("adding doc..."); addCmd.clear(); addCmd.solrDoc = readDoc(parser); processor.processAdd(addCmd); } else { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Unexpected <doc> tag without an <add> tag surrounding it."); } } else if (UpdateRequestHandler.COMMIT.equals(currTag) || UpdateRequestHandler.OPTIMIZE.equals(currTag)) { log.trace("parsing " + currTag); CommitUpdateCommand cmd = new CommitUpdateCommand(req, UpdateRequestHandler.OPTIMIZE.equals(currTag)); ModifiableSolrParams mp = new ModifiableSolrParams(); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); mp.set(attrName, attrVal); } RequestHandlerUtils.validateCommitParams(mp); SolrParams p = SolrParams.wrapDefaults(mp, req.getParams()); // default to the normal request params for commit options RequestHandlerUtils.updateCommit(cmd, p); processor.processCommit(cmd); } // end commit else if (UpdateRequestHandler.ROLLBACK.equals(currTag)) { log.trace("parsing " + currTag); RollbackUpdateCommand cmd = new RollbackUpdateCommand(req); processor.processRollback(cmd); } // end rollback else if (UpdateRequestHandler.DELETE.equals(currTag)) { log.trace("parsing delete"); processDelete(req, processor, parser); } // end delete break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
void processDelete(SolrQueryRequest req, UpdateRequestProcessor processor, XMLStreamReader parser) throws XMLStreamException, IOException { // Parse the command DeleteUpdateCommand deleteCmd = new DeleteUpdateCommand(req); // First look for commitWithin parameter on the request, will be overwritten for individual <delete>'s SolrParams params = req.getParams(); deleteCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1); for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if ("fromPending".equals(attrName)) { // deprecated } else if ("fromCommitted".equals(attrName)) { // deprecated } else if (UpdateRequestHandler.COMMIT_WITHIN.equals(attrName)) { deleteCmd.commitWithin = Integer.parseInt(attrVal); } else { log.warn("unexpected attribute delete/@" + attrName); } } StringBuilder text = new StringBuilder(); while (true) { int event = parser.next(); switch (event) { case XMLStreamConstants.START_ELEMENT: String mode = parser.getLocalName(); if (!("id".equals(mode) || "query".equals(mode))) { log.warn("unexpected XML tag /delete/" + mode); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + mode); } text.setLength(0); if ("id".equals(mode)) { for (int i = 0; i < parser.getAttributeCount(); i++) { String attrName = parser.getAttributeLocalName(i); String attrVal = parser.getAttributeValue(i); if (UpdateRequestHandler.VERSION.equals(attrName)) { deleteCmd.setVersion(Long.parseLong(attrVal)); } } } break; case XMLStreamConstants.END_ELEMENT: String currTag = parser.getLocalName(); if ("id".equals(currTag)) { deleteCmd.setId(text.toString()); } else if ("query".equals(currTag)) { deleteCmd.setQuery(text.toString()); } else if ("delete".equals(currTag)) { return; } else { log.warn("unexpected XML tag /delete/" + currTag); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag /delete/" + currTag); } processor.processDelete(deleteCmd); deleteCmd.clear(); break; // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(parser.getText()); break; } } }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
public SolrInputDocument readDoc(XMLStreamReader parser) throws XMLStreamException { SolrInputDocument doc = new SolrInputDocument(); String attrName = ""; for (int i = 0; i < parser.getAttributeCount(); i++) { attrName = parser.getAttributeLocalName(i); if ("boost".equals(attrName)) { doc.setDocumentBoost(Float.parseFloat(parser.getAttributeValue(i))); } else { log.warn("Unknown attribute doc/@" + attrName); } } StringBuilder text = new StringBuilder(); String name = null; float boost = 1.0f; boolean isNull = false; String update = null; while (true) { int event = parser.next(); switch (event) { // Add everything to the text case XMLStreamConstants.SPACE: case XMLStreamConstants.CDATA: case XMLStreamConstants.CHARACTERS: text.append(parser.getText()); break; case XMLStreamConstants.END_ELEMENT: if ("doc".equals(parser.getLocalName())) { return doc; } else if ("field".equals(parser.getLocalName())) { Object v = isNull ? null : text.toString(); if (update != null) { Map<String,Object> extendedValue = new HashMap<String,Object>(1); extendedValue.put(update, v); v = extendedValue; } doc.addField(name, v, boost); boost = 1.0f; } break; case XMLStreamConstants.START_ELEMENT: text.setLength(0); String localName = parser.getLocalName(); if (!"field".equals(localName)) { log.warn("unexpected XML tag doc/" + localName); throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "unexpected XML tag doc/" + localName); } boost = 1.0f; update = null; String attrVal = ""; for (int i = 0; i < parser.getAttributeCount(); i++) { attrName = parser.getAttributeLocalName(i); attrVal = parser.getAttributeValue(i); if ("name".equals(attrName)) { name = attrVal; } else if ("boost".equals(attrName)) { boost = Float.parseFloat(attrVal); } else if ("null".equals(attrName)) { isNull = StrUtils.parseBoolean(attrVal); } else if ("update".equals(attrName)) { update = attrVal; } else { log.warn("Unknown attribute doc/field/@" + attrName); } } break; } } }
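readDoc above encodes atomic updates compactly: when a <field> carries an update attribute, the parsed value is wrapped in a one-entry map from the operation name to the value before being added to the document. A hedged sketch of just that step (class and method names are illustrative, not Solr API):

import java.util.HashMap;
import java.util.Map;

final class AtomicUpdateEncoding {
  // How readDoc represents <field name="f" update="set">v</field>: the plain
  // value is boxed in a one-entry map keyed by the update operation, which
  // downstream update processing recognizes as an atomic-update instruction.
  static Object encode(String update, Object v) {
    if (update == null) {
      return v; // ordinary field value
    }
    Map<String, Object> extendedValue = new HashMap<String, Object>(1);
    extendedValue.put(update, v);
    return extendedValue;
  }
}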
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public XMLResolver asXMLResolver() { return new XMLResolver() { public Object resolveEntity(String publicId, String systemId, String baseURI, String namespace) throws XMLStreamException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, publicId, baseURI, systemId); return (src == null) ? null : src.getByteStream(); } catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); } } }; }
// in core/src/java/org/apache/solr/util/SystemIdResolver.java
public Object resolveEntity(String publicId, String systemId, String baseURI, String namespace) throws XMLStreamException { try { final InputSource src = SystemIdResolver.this.resolveEntity(null, publicId, baseURI, systemId); return (src == null) ? null : src.getByteStream(); } catch (IOException ioe) { throw new XMLStreamException("Cannot resolve entity", ioe); } }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in solrj/src/java/org/apache/solr/client/solrj/impl/XMLResponseParser.java
catch (XMLStreamException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "parsing error", e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
// in core/src/java/org/apache/solr/handler/loader/XMLLoader.java
catch (XMLStreamException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e.getMessage(), e); }
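All four catch sites above convert the checked XMLStreamException into the unchecked domain SolrException: SERVER_ERROR on the response-parsing (client library) side, BAD_REQUEST on the update-loading side where malformed XML is the caller's fault. The idiom, reduced to a hedged sketch (the helper class and its names are illustrative, not Solr API):

import javax.xml.stream.XMLStreamException;
import org.apache.solr.common.SolrException;

final class XmlErrorWrapping {
  // Functional shape of the work being wrapped; XMLStreamException is checked.
  interface XmlWork<T> { T run() throws XMLStreamException; }

  // Runs the work, converting the checked StAX exception into the domain
  // runtime exception with the caller-chosen fault code.
  static <T> T withXmlErrors(XmlWork<T> work, SolrException.ErrorCode code) {
    try {
      return work.run();
    } catch (XMLStreamException e) {
      throw new SolrException(code, e.getMessage(), e);
    }
  }
}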
unknown (Lib) XPathExpressionException 0 0 3
            
// in core/src/java/org/apache/solr/schema/IndexSchema.java
static SimilarityFactory readSimilarity(ResourceLoader loader, Node node) throws XPathExpressionException { if (node==null) { return null; } else { SimilarityFactory similarityFactory; final Object obj = loader.newInstance(((Element) node).getAttribute("class"), Object.class, "search.similarities."); if (obj instanceof SimilarityFactory) { // configure a factory, get a similarity back SolrParams params = SolrParams.toSolrParams(DOMUtil.childNodesToNamedList(node)); similarityFactory = (SimilarityFactory)obj; similarityFactory.init(params); } else { // just like always, assume it's a Similarity and get a ClassCastException - reasonable error handling similarityFactory = new SimilarityFactory() { @Override public Similarity getSimilarity() { return (Similarity) obj; } }; } return similarityFactory; } }
// in core/src/java/org/apache/solr/schema/FieldTypePluginLoader.java
private Analyzer readAnalyzer(Node node) throws XPathExpressionException { final SolrResourceLoader loader = schema.getResourceLoader(); // parent node used to be passed in as "fieldtype" // if (!fieldtype.hasChildNodes()) return null; // Node node = DOMUtil.getChild(fieldtype,"analyzer"); if (node == null) return null; NamedNodeMap attrs = node.getAttributes(); String analyzerName = DOMUtil.getAttr(attrs,"class"); if (analyzerName != null) { try { // No need to be core-aware as Analyzers are not in the core-aware list final Class<? extends Analyzer> clazz = loader.findClass(analyzerName, Analyzer.class); try { // first try to use a ctor with version parameter // (needed for many new Analyzers that have no default one anymore) Constructor<? extends Analyzer> cnstr = clazz.getConstructor(Version.class); final String matchVersionStr = DOMUtil.getAttr(attrs, LUCENE_MATCH_VERSION_PARAM); final Version luceneMatchVersion = (matchVersionStr == null) ? schema.getDefaultLuceneMatchVersion() : Config.parseLuceneVersionString(matchVersionStr); if (luceneMatchVersion == null) { throw new SolrException ( SolrException.ErrorCode.SERVER_ERROR, "Configuration Error: Analyzer '" + clazz.getName() + "' needs a 'luceneMatchVersion' parameter"); } return cnstr.newInstance(luceneMatchVersion); } catch (NoSuchMethodException nsme) { // otherwise use default ctor return clazz.newInstance(); } } catch (Exception e) { log.error("Cannot load analyzer: "+analyzerName, e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "Cannot load analyzer: "+analyzerName, e ); } } // Load the CharFilters final ArrayList<CharFilterFactory> charFilters = new ArrayList<CharFilterFactory>(); AbstractPluginLoader<CharFilterFactory> charFilterLoader = new AbstractPluginLoader<CharFilterFactory> ("[schema.xml] analyzer/charFilter", CharFilterFactory.class, false, false) { @Override protected void init(CharFilterFactory plugin, Node node) throws Exception { if( plugin != null ) { final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); charFilters.add( plugin ); } } @Override protected CharFilterFactory register(String name, CharFilterFactory plugin) { return null; // used for map registration } }; charFilterLoader.load( loader, (NodeList)xpath.evaluate("./charFilter", node, XPathConstants.NODESET) ); // Load the Tokenizer // Although an analyzer only allows a single Tokenizer, we load a list to make sure // the configuration is ok final ArrayList<TokenizerFactory> tokenizers = new ArrayList<TokenizerFactory>(1); AbstractPluginLoader<TokenizerFactory> tokenizerLoader = new AbstractPluginLoader<TokenizerFactory> ("[schema.xml] analyzer/tokenizer", TokenizerFactory.class, false, false) { @Override protected void init(TokenizerFactory plugin, Node node) throws Exception { if( !tokenizers.isEmpty() ) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR, "The schema defines multiple tokenizers for: "+node ); } final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); tokenizers.add( plugin ); } @Override protected TokenizerFactory register(String name, TokenizerFactory 
plugin) { return null; // used for map registration } }; tokenizerLoader.load( loader, (NodeList)xpath.evaluate("./tokenizer", node, XPathConstants.NODESET) ); // Make sure something was loaded if( tokenizers.isEmpty() ) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,"analyzer without class or tokenizer & filter list"); } // Load the Filters final ArrayList<TokenFilterFactory> filters = new ArrayList<TokenFilterFactory>(); AbstractPluginLoader<TokenFilterFactory> filterLoader = new AbstractPluginLoader<TokenFilterFactory>("[schema.xml] analyzer/filter", TokenFilterFactory.class, false, false) { @Override protected void init(TokenFilterFactory plugin, Node node) throws Exception { if( plugin != null ) { final Map<String,String> params = DOMUtil.toMapExcept(node.getAttributes(),"class"); String configuredVersion = params.remove(LUCENE_MATCH_VERSION_PARAM); plugin.setLuceneMatchVersion(parseConfiguredVersion(configuredVersion, plugin.getClass().getSimpleName())); plugin.init( params ); filters.add( plugin ); } } @Override protected TokenFilterFactory register(String name, TokenFilterFactory plugin) throws Exception { return null; // used for map registration } }; filterLoader.load( loader, (NodeList)xpath.evaluate("./filter", node, XPathConstants.NODESET) ); return new TokenizerChain(charFilters.toArray(new CharFilterFactory[charFilters.size()]), tokenizers.get(0), filters.toArray(new TokenFilterFactory[filters.size()])); }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private Properties readProperties(Config cfg, Node node) throws XPathExpressionException { XPath xpath = cfg.getXPath(); NodeList props = (NodeList) xpath.evaluate("property", node, XPathConstants.NODESET); Properties properties = new Properties(); for (int i=0; i<props.getLength(); i++) { Node prop = props.item(i); properties.setProperty(DOMUtil.getAttr(prop, "name"), DOMUtil.getAttr(prop, "value")); } return properties; }
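The throws XPathExpressionException signatures above all originate in the checked evaluate call of javax.xml.xpath; readProperties shows the typical shape. A minimal sketch (names are illustrative) of the call that forces the checked exception into these signatures, which the catch sites below then translate into SolrException:

import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathExpressionException;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

final class XPathSketch {
  // Selects the "property" children of a config node; any XPath.evaluate
  // call can throw the checked XPathExpressionException.
  static NodeList childProperties(XPath xpath, Node node) throws XPathExpressionException {
    return (NodeList) xpath.evaluate("property", node, XPathConstants.NODESET);
  }
}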
// in core/src/java/org/apache/solr/handler/component/QueryElevationComponent.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "query requires '<doc .../>' child"); }
// in core/src/java/org/apache/solr/schema/CurrencyField.java
catch (XPathExpressionException e) { throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Error parsing currency config.", e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + path +" for " + name,e); }
// in core/src/java/org/apache/solr/core/Config.java
catch (XPathExpressionException e) { SolrException.log(log,"Error in xpath",e); throw new SolrException( SolrException.ErrorCode.SERVER_ERROR,"Error in xpath:" + xstr + " for " + name,e); }
runtime (Domain) ZooKeeperException
public class ZooKeeperException extends SolrException {

  public ZooKeeperException(ErrorCode code, String msg, Throwable th) {
    super(code, msg, th);
  }
  
  public ZooKeeperException(ErrorCode code, String msg) {
    super(code, msg);
  }

}
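ZooKeeperException carries no state of its own; it exists to map checked ZooKeeper failures (KeeperException, InterruptedException) onto the SolrException hierarchy with an HTTP-style ErrorCode. The throw sites below repeat one idiom almost verbatim: restore the interrupt flag, log, and rethrow as the domain runtime type. A condensed sketch, assuming a SolrZkClient field (the class and readNode are illustrative):

import org.apache.solr.common.SolrException;
import org.apache.solr.common.cloud.SolrZkClient;
import org.apache.solr.common.cloud.ZooKeeperException;
import org.apache.zookeeper.KeeperException;

final class ZkAccessSketch {
  private final SolrZkClient zkClient;

  ZkAccessSketch(SolrZkClient zkClient) { this.zkClient = zkClient; }

  byte[] readNode(String path) {
    try {
      return zkClient.getData(path, null, null, true);
    } catch (KeeperException e) {
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt(); // restore the interrupted status
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
}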
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
@Override public void update(SolrZooKeeper zooKeeper) { SolrZooKeeper oldKeeper = keeper; keeper = zooKeeper; if (oldKeeper != null) { try { oldKeeper.close(); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void command() { try { ZkStateReader.this.createClusterStateWatchersAndUpdate(); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public synchronized void createClusterStateWatchersAndUpdate() throws KeeperException, InterruptedException { // We need to fetch the current cluster state and the set of live nodes synchronized (getUpdateLock()) { cmdExecutor.ensureExists(CLUSTER_STATE, zkClient); log.info("Updating cluster state from ZooKeeper... "); zkClient.exists(CLUSTER_STATE, new Watcher() { @Override public void process(WatchedEvent event) { log.info("A cluster state change has occurred"); try { // delayed approach // ZkStateReader.this.updateCloudState(false, false); synchronized (ZkStateReader.this.getUpdateLock()) { // remake watch final Watcher thisWatch = this; byte[] data = zkClient.getData(CLUSTER_STATE, thisWatch, null, true); CloudState clusterState = CloudState.load(data, ZkStateReader.this.cloudState.getLiveNodes()); // update volatile cloudState = clusterState; } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); return; } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public void close() { if (closeClient) { try { zkClient.close(); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } }
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
public List<ZkCoreNodeProps> getReplicaProps(String collection, String shardId, String thisNodeName, String coreName, String mustMatchStateFilter, String mustNotMatchStateFilter) { CloudState cloudState = this.cloudState; if (cloudState == null) { return null; } Map<String,Slice> slices = cloudState.getSlices(collection); if (slices == null) { throw new ZooKeeperException(ErrorCode.BAD_REQUEST, "Could not find collection in zk: " + collection + " " + cloudState.getCollections()); } Slice replicas = slices.get(shardId); if (replicas == null) { throw new ZooKeeperException(ErrorCode.BAD_REQUEST, "Could not find shardId in zk: " + shardId); } Map<String,ZkNodeProps> shardMap = replicas.getShards(); List<ZkCoreNodeProps> nodes = new ArrayList<ZkCoreNodeProps>(shardMap.size()); String filterNodeName = thisNodeName + "_" + coreName; for (Entry<String,ZkNodeProps> entry : shardMap.entrySet()) { ZkCoreNodeProps nodeProps = new ZkCoreNodeProps(entry.getValue()); String coreNodeName = nodeProps.getNodeName() + "_" + nodeProps.getCoreName(); if (cloudState.liveNodesContain(nodeProps.getNodeName()) && !coreNodeName.equals(filterNodeName)) { if (mustMatchStateFilter == null || mustMatchStateFilter.equals(nodeProps.getState())) { if (mustNotMatchStateFilter == null || !mustNotMatchStateFilter.equals(nodeProps.getState())) { nodes.add(nodeProps); } } } } if (nodes.size() == 0) { // no replicas - go local return null; } return nodes; }
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
public void connect() { if (zkStateReader == null) { synchronized (this) { if (zkStateReader == null) { try { ZkStateReader zk = new ZkStateReader(zkHost, zkConnectTimeout, zkClientTimeout); zk.createClusterStateWatchersAndUpdate(); zkStateReader = zk; } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (TimeoutException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } } } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public void run() { while (amILeader()) { LinkedList<CloudStateUpdateRequest> requests = new LinkedList<Overseer.CloudStateUpdateRequest>(); while (!fifo.isEmpty()) { // collect all queued requests CloudStateUpdateRequest req; req = fifo.poll(); if (req == null) { break; } requests.add(req); } if (requests.size() > 0) { // process updates synchronized (reader.getUpdateLock()) { try { reader.updateCloudState(true); CloudState cloudState = reader.getCloudState(); for (CloudStateUpdateRequest request : requests) { switch (request.operation) { case LeaderChange: cloudState = setShardLeader(cloudState, (String) request.args[0], (String) request.args[1], (String) request.args[2]); break; case StateChange: cloudState = updateState(cloudState, (String) request.args[0], (CoreState) request.args[1]); break; case CoreDeleted: cloudState = removeCore(cloudState, (String) request.args[0], (String) request.args[1]); break; } } log.info("Announcing new cluster state"); zkClient.setData(ZkStateReader.CLUSTER_STATE, ZkStateReader.toJSON(cloudState), true); } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { Thread.currentThread().interrupt(); return; } } } try { Thread.sleep(STATE_UPDATE_DELAY); } catch (InterruptedException e) { Thread.currentThread().interrupt(); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public void process(WatchedEvent event) { try { List<String> leaderNodes = zkClient.getChildren( ZkStateReader.getShardLeadersPath(collection, null), this, true); processLeaderNodesChanged(collection, leaderNodes); } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); } }
// in core/src/java/org/apache/solr/cloud/Overseer.java
@Override public void process(WatchedEvent event) { try { List<String> liveNodes = zkClient.getChildren( ZkStateReader.LIVE_NODES_ZKNODE, this, true); synchronized (nodeStateWatches) { processLiveNodesChanged(nodeStateWatches.keySet(), liveNodes); } } catch (KeeperException e) { if (e.code() == KeeperException.Code.SESSIONEXPIRED || e.code() == KeeperException.Code.CONNECTIONLOSS) { log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK"); return; } SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); } }
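The two Overseer watchers above, like the ZkStateReader watcher earlier, special-case SESSIONEXPIRED and CONNECTIONLOSS: a watch that fires while the session is gone logs a warning and returns (the reconnect path re-establishes state), while every other KeeperException is escalated as a ZooKeeperException. Extracted as a sketch (the helper name is illustrative):

import org.apache.zookeeper.KeeperException;

final class SessionLossGuard {
  // True for the two KeeperException codes the watchers tolerate silently;
  // all other codes are treated as genuine errors.
  static boolean isSessionLoss(KeeperException e) {
    return e.code() == KeeperException.Code.SESSIONEXPIRED
        || e.code() == KeeperException.Code.CONNECTIONLOSS;
  }
}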
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
@Override public String getConfigDir() { throw new ZooKeeperException( ErrorCode.SERVER_ERROR, "ZkSolrResourceLoader does not support getConfigDir() - likely, what you are trying to do is not supported in ZooKeeper mode"); }
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
@Override public String[] listConfigDir() { List<String> list; try { list = zkController.getZkClient().getChildren(collectionZkPath, null, true); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } return list.toArray(new String[0]); }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void command() { try { // we need to create all of our lost watches // seems we dont need to do this again... //Overseer.createClientNodes(zkClient, getNodeName()); ElectionContext context = new OverseerElectionContext(getNodeName(), zkClient, zkStateReader); overseerElector.joinElection(context); zkStateReader.createClusterStateWatchersAndUpdate(); List<CoreDescriptor> descriptors = registerOnReconnect .getCurrentDescriptors(); if (descriptors != null) { // before registering as live, make sure everyone is in a // down state for (CoreDescriptor descriptor : descriptors) { final String coreZkNodeName = getNodeName() + "_" + descriptor.getName(); try { publishAsDown(getBaseUrl(), descriptor, coreZkNodeName, descriptor.getName()); waitForLeaderToSeeDownState(descriptor, coreZkNodeName); } catch (Exception e) { SolrException.log(log, "", e); } } } // we have to register as live first to pick up docs in the buffer createEphemeralLiveNode(); // re register all descriptors if (descriptors != null) { for (CoreDescriptor descriptor : descriptors) { // TODO: we need to think carefully about what happens when it was // a leader that was expired - as well as what to do about leaders/overseers // with connection loss register(descriptor.getName(), descriptor, true); } } } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (Exception e) { SolrException.log(log, "", e); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "", e); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public void close() { try { zkClient.close(); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.warn("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public byte[] getConfigFileData(String zkConfigName, String fileName) throws KeeperException, InterruptedException { String zkPath = CONFIGS_ZKNODE + "/" + zkConfigName + "/" + fileName; byte[] bytes = zkClient.getData(zkPath, null, null, true); if (bytes == null) { log.error("Config file contains no data:" + zkPath); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Config file contains no data:" + zkPath); } return bytes; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private String getHostNameFromAddress(String addr) { Matcher m = URL_POST.matcher(addr); if (m.matches()) { return m.group(1); } else { log.error("Unrecognized host:" + addr); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Unrecognized host:" + addr); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void init() { try { // makes nodes zkNode cmdExecutor.ensureExists(ZkStateReader.LIVE_NODES_ZKNODE, zkClient); Overseer.createClientNodes(zkClient, getNodeName()); createEphemeralLiveNode(); cmdExecutor.ensureExists(ZkStateReader.COLLECTIONS_ZKNODE, zkClient); syncNodeState(); overseerElector = new LeaderElector(zkClient); ElectionContext context = new OverseerElectionContext(getNodeName(), zkClient, zkStateReader); overseerElector.setup(context); overseerElector.joinElection(context); zkStateReader.createClusterStateWatchersAndUpdate(); } catch (IOException e) { log.error("", e); throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, "Can't create ZooKeeperController", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String readConfigName(String collection) throws KeeperException, InterruptedException, IOException { String configName = null; String path = ZkStateReader.COLLECTIONS_ZKNODE + "/" + collection; if (log.isInfoEnabled()) { log.info("Load collection config from:" + path); } byte[] data = zkClient.getData(path, null, null, true); if(data != null) { ZkNodeProps props = ZkNodeProps.load(data); configName = props.get(CONFIGNAME_PROP); } if (configName != null && !zkClient.exists(CONFIGS_ZKNODE + "/" + configName, true)) { log.error("Specified config does not exist in ZooKeeper:" + configName); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Specified config does not exist in ZooKeeper:" + configName); } return configName; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
public String register(String coreName, final CoreDescriptor desc, boolean recoverReloadedCores) throws Exception { final String baseUrl = getBaseUrl(); final CloudDescriptor cloudDesc = desc.getCloudDescriptor(); final String collection = cloudDesc.getCollectionName(); final String coreZkNodeName = getNodeName() + "_" + coreName; String shardId = cloudDesc.getShardId(); Map<String,String> props = new HashMap<String,String>(); // we only put a subset of props into the leader node props.put(ZkStateReader.BASE_URL_PROP, baseUrl); props.put(ZkStateReader.CORE_NAME_PROP, coreName); props.put(ZkStateReader.NODE_NAME_PROP, getNodeName()); if (log.isInfoEnabled()) { log.info("Register shard - core:" + coreName + " address:" + baseUrl + " shardId:" + shardId); } ZkNodeProps leaderProps = new ZkNodeProps(props); try { joinElection(desc); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (IOException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } // rather than look in the cluster state file, we go straight to the zknodes // here, because on cluster restart there could be stale leader info in the // cluster state node that won't be updated for a moment String leaderUrl = getLeaderProps(collection, cloudDesc.getShardId()).getCoreUrl(); // now wait until our currently cloud state contains the latest leader String cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000); int tries = 0; while (!leaderUrl.equals(cloudStateLeader)) { if (tries == 60) { throw new SolrException(ErrorCode.SERVER_ERROR, "There is conflicting information about the leader of shard: " + cloudDesc.getShardId()); } Thread.sleep(1000); tries++; cloudStateLeader = zkStateReader.getLeaderUrl(collection, cloudDesc.getShardId(), 30000); } String ourUrl = ZkCoreNodeProps.getCoreUrl(baseUrl, coreName); log.info("We are " + ourUrl + " and leader is " + leaderUrl); boolean isLeader = leaderUrl.equals(ourUrl); SolrCore core = null; if (cc != null) { // CoreContainer only null in tests try { core = cc.getCore(desc.getName()); // recover from local transaction log and wait for it to complete before // going active // TODO: should this be moved to another thread? To recoveryStrat? // TODO: should this actually be done earlier, before (or as part of) // leader election perhaps? // TODO: if I'm the leader, ensure that a replica that is trying to recover waits until I'm // active (or don't make me the // leader until my local replay is done. UpdateLog ulog = core.getUpdateHandler().getUpdateLog(); if (!core.isReloaded() && ulog != null) { Future<UpdateLog.RecoveryInfo> recoveryFuture = core.getUpdateHandler() .getUpdateLog().recoverFromLog(); if (recoveryFuture != null) { recoveryFuture.get(); // NOTE: this could potentially block for // minutes or more! // TODO: public as recovering in the mean time? 
// TODO: in the future we could do peersync in parallel with recoverFromLog } else { log.info("No LogReplay needed for core="+core.getName() + " baseURL=" + baseUrl); } } boolean didRecovery = checkRecovery(coreName, desc, recoverReloadedCores, isLeader, cloudDesc, collection, coreZkNodeName, shardId, leaderProps, core, cc); if (!didRecovery) { publishAsActive(baseUrl, desc, coreZkNodeName, coreName); } } finally { if (core != null) { core.close(); } } } else { publishAsActive(baseUrl, desc, coreZkNodeName, coreName); } // make sure we have an updated cluster state right away zkStateReader.updateCloudState(true); return shardId; }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void getConfName(String collection, String collectionPath, Map<String,String> collectionProps) throws KeeperException, InterruptedException { // check for configName log.info("Looking for collection configName"); List<String> configNames = null; int retry = 1; int retryLimt = 6; for (; retry < retryLimt; retry++) { if (zkClient.exists(collectionPath, true)) { ZkNodeProps cProps = ZkNodeProps.load(zkClient.getData(collectionPath, null, null, true)); if (cProps.containsKey(CONFIGNAME_PROP)) { break; } } // if there is only one conf, use that try { configNames = zkClient.getChildren(CONFIGS_ZKNODE, null, true); } catch (NoNodeException e) { // just keep trying } if (configNames != null && configNames.size() == 1) { // no config set named, but there is only 1 - use it log.info("Only one config set found in zk - using it:" + configNames.get(0)); collectionProps.put(CONFIGNAME_PROP, configNames.get(0)); break; } if (configNames != null && configNames.contains(collection)) { log.info("Could not find explicit collection configName, but found config name matching collection name - using that set."); collectionProps.put(CONFIGNAME_PROP, collection); break; } log.info("Could not find collection configName - pausing for 3 seconds and trying again - try: " + retry); Thread.sleep(3000); } if (retry == retryLimt) { log.error("Could not find configName for collection " + collection); throw new ZooKeeperException( SolrException.ErrorCode.SERVER_ERROR, "Could not find configName for collection " + collection + " found:" + configNames); } }
// in core/src/java/org/apache/solr/cloud/ZkController.java
private void publishState() { final String nodePath = "/node_states/" + getNodeName(); long version; byte[] coreStatesData; synchronized (coreStates) { version = ++coreStatesVersion; coreStatesData = ZkStateReader.toJSON(coreStates.values()); } // if multiple threads are trying to publish state, make sure that we never write // an older version after a newer version. synchronized (coreStatesPublishLock) { try { if (version < coreStatesPublishedVersion) { log.info("Another thread already published a newer coreStates: ours="+version + " lastPublished=" + coreStatesPublishedVersion); } else { zkClient.setData(nodePath, coreStatesData, true); coreStatesPublishedVersion = version; // put it after so it won't be set if there's an exception } } catch (KeeperException e) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e); } } }
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
public int joinElection(ElectionContext context) throws KeeperException, InterruptedException, IOException { final String shardsElectZkPath = context.electionPath + LeaderElector.ELECTION_NODE; long sessionId = zkClient.getSolrZooKeeper().getSessionId(); String id = sessionId + "-" + context.id; String leaderSeqPath = null; boolean cont = true; int tries = 0; while (cont) { try { leaderSeqPath = zkClient.create(shardsElectZkPath + "/" + id + "-n_", null, CreateMode.EPHEMERAL_SEQUENTIAL, false); context.leaderSeqPath = leaderSeqPath; cont = false; } catch (ConnectionLossException e) { // we don't know if we made our node or not... List<String> entries = zkClient.getChildren(shardsElectZkPath, null, true); boolean foundId = false; for (String entry : entries) { String nodeId = getNodeId(entry); if (id.equals(nodeId)) { // we did create our node... foundId = true; break; } } if (!foundId) { throw e; } } catch (KeeperException.NoNodeException e) { // we must have failed in creating the election node - someone else must // be working on it, lets try again if (tries++ > 9) { throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } cont = true; Thread.sleep(50); } } int seq = getSeq(leaderSeqPath); checkIfIamLeader(seq, context, false); return seq; }
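joinElection above retries NoNodeException with a short pause and a fixed budget (another participant may still be creating the election node) before escalating to the domain exception. The retry shape, sketched under those assumptions (the helper class and the ZkAction stand-in for the zkClient.create call are illustrative):

import org.apache.solr.common.SolrException;
import org.apache.solr.common.cloud.ZooKeeperException;
import org.apache.zookeeper.KeeperException;

final class BoundedRetrySketch {
  interface ZkAction { void run() throws KeeperException, InterruptedException; }

  // Retries only NoNodeException, up to ten attempts 50 ms apart, then
  // escalates as the domain runtime exception, as joinElection does.
  static void createWithRetry(ZkAction createElectionNode) throws KeeperException, InterruptedException {
    int tries = 0;
    while (true) {
      try {
        createElectionNode.run();
        return;
      } catch (KeeperException.NoNodeException e) {
        if (tries++ > 9) {
          throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
        }
        Thread.sleep(50);
      }
    }
  }
}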
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private List<Node> setupRequest(int hash) { List<Node> nodes = null; // if we are in zk mode... if (zkEnabled) { // the leader is... // TODO: if there is no leader, wait and look again // TODO: we are reading the leader from zk every time - we should cache // this and watch for changes?? Just pull it from ZkController cluster state probably? String shardId = getShard(hash, collection, zkController.getCloudState()); // get the right shard based on the hash... try { // TODO: if we find out we cannot talk to zk anymore, we should probably realize we are not // a leader anymore - we shouldn't accept updates at all?? ZkCoreNodeProps leaderProps = new ZkCoreNodeProps(zkController.getZkStateReader().getLeaderProps( collection, shardId)); String leaderNodeName = leaderProps.getCoreNodeName(); String coreName = req.getCore().getName(); String coreNodeName = zkController.getNodeName() + "_" + coreName; isLeader = coreNodeName.equals(leaderNodeName); DistribPhase phase = DistribPhase.parseParam(req.getParams().get(DISTRIB_UPDATE_PARAM)); if (DistribPhase.FROMLEADER == phase) { // we are coming from the leader, just go local - add no urls forwardToLeader = false; } else if (isLeader) { // that means I want to forward onto my replicas... // so get the replicas... forwardToLeader = false; List<ZkCoreNodeProps> replicaProps = zkController.getZkStateReader() .getReplicaProps(collection, shardId, zkController.getNodeName(), coreName, null, ZkStateReader.DOWN); if (replicaProps != null) { nodes = new ArrayList<Node>(replicaProps.size()); for (ZkCoreNodeProps props : replicaProps) { nodes.add(new StdNode(props)); } } } else { // I need to forward onto the leader... nodes = new ArrayList<Node>(1); nodes.add(new RetryNode(leaderProps, zkController.getZkStateReader(), collection, shardId)); forwardToLeader = true; } } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } return nodes; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private List<Node> setupRequest() { List<Node> nodes = null; String shardId = cloudDesc.getShardId(); try { ZkCoreNodeProps leaderProps = new ZkCoreNodeProps(zkController.getZkStateReader().getLeaderProps( collection, shardId)); String leaderNodeName = leaderProps.getCoreNodeName(); String coreName = req.getCore().getName(); String coreNodeName = zkController.getNodeName() + "_" + coreName; isLeader = coreNodeName.equals(leaderNodeName); // TODO: what if we are no longer the leader? forwardToLeader = false; List<ZkCoreNodeProps> replicaProps = zkController.getZkStateReader() .getReplicaProps(collection, shardId, zkController.getNodeName(), coreName); if (replicaProps != null) { nodes = new ArrayList<Node>(replicaProps.size()); for (ZkCoreNodeProps props : replicaProps) { nodes.add(new StdNode(props)); } } } catch (InterruptedException e) { Thread.currentThread().interrupt(); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } return nodes; }
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
private List<Node> getCollectionUrls(SolrQueryRequest req, String collection, String shardZkNodeName) { CloudState cloudState = req.getCore().getCoreDescriptor() .getCoreContainer().getZkController().getCloudState(); List<Node> urls = new ArrayList<Node>(); Map<String,Slice> slices = cloudState.getSlices(collection); if (slices == null) { throw new ZooKeeperException(ErrorCode.BAD_REQUEST, "Could not find collection in zk: " + cloudState); } for (Map.Entry<String,Slice> sliceEntry : slices.entrySet()) { Slice replicas = slices.get(sliceEntry.getKey()); Map<String,ZkNodeProps> shardMap = replicas.getShards(); for (Entry<String,ZkNodeProps> entry : shardMap.entrySet()) { ZkCoreNodeProps nodeProps = new ZkCoreNodeProps(entry.getValue()); if (cloudState.liveNodesContain(nodeProps.getNodeName()) && !entry.getKey().equals(shardZkNodeName)) { urls.add(new StdNode(nodeProps)); } } } if (urls.size() == 0) { return null; } return urls; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
protected void initZooKeeper(String zkHost, int zkClientTimeout) { // if zkHost sys property is not set, we are not using ZooKeeper String zookeeperHost; if(zkHost == null) { zookeeperHost = System.getProperty("zkHost"); } else { zookeeperHost = zkHost; } String zkRun = System.getProperty("zkRun"); if (zkRun == null && zookeeperHost == null) return; // not in zk mode // zookeeper in quorum mode currently causes a failure when trying to // register log4j mbeans. See SOLR-2369 // TODO: remove after updating to an slf4j based zookeeper System.setProperty("zookeeper.jmx.log4j.disable", "true"); if (zkRun != null) { String zkDataHome = System.getProperty("zkServerDataDir", solrHome + "zoo_data"); String zkConfHome = System.getProperty("zkServerConfDir", solrHome); zkServer = new SolrZkServer(zkRun, zookeeperHost, zkDataHome, zkConfHome, hostPort); zkServer.parseConfig(); zkServer.start(); // set client from server config if not already set if (zookeeperHost == null) { zookeeperHost = zkServer.getClientString(); } } int zkClientConnectTimeout = 15000; if (zookeeperHost != null) { // we are ZooKeeper enabled try { // If this is an ensemble, allow for a long connect time for other servers to come up if (zkRun != null && zkServer.getServers().size() > 1) { zkClientConnectTimeout = 24 * 60 * 60 * 1000; // 1 day for embedded ensemble log.info("Zookeeper client=" + zookeeperHost + " Waiting for a quorum."); } else { log.info("Zookeeper client=" + zookeeperHost); } zkController = new ZkController(this, zookeeperHost, zkClientTimeout, zkClientConnectTimeout, host, hostPort, hostContext, new CurrentCoreDescriptorProvider() { @Override public List<CoreDescriptor> getCurrentDescriptors() { List<CoreDescriptor> descriptors = new ArrayList<CoreDescriptor>(getCoreNames().size()); for (SolrCore core : getCores()) { descriptors.add(core.getCoreDescriptor()); } return descriptors; } }); String confDir = System.getProperty("bootstrap_confdir"); if(confDir != null) { File dir = new File(confDir); if(!dir.isDirectory()) { throw new IllegalArgumentException("bootstrap_confdir must be a directory of configuration files"); } String confName = System.getProperty(ZkController.COLLECTION_PARAM_PREFIX+ZkController.CONFIGNAME_PROP, "configuration1"); zkController.uploadConfigDir(dir, confName); } boolean boostrapConf = Boolean.getBoolean("bootstrap_conf"); if(boostrapConf) { ZkController.bootstrapConf(zkController.getZkClient(), cfg, solrHome); } } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (TimeoutException e) { log.error("Could not connect to ZooKeeper", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (IOException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } }
// in core/src/java/org/apache/solr/core/CoreContainer.java
private void registerInZk(SolrCore core) { if (zkController != null) { try { zkController.register(core.getName(), core.getCoreDescriptor()); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (Exception e) { // if register fails, this is really bad - close the zkController to // minimize any damage we can cause zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN); SolrException.log(log, "", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public SolrCore create(CoreDescriptor dcore) throws ParserConfigurationException, IOException, SAXException { // Make the instanceDir relative to the cores instanceDir if not absolute File idir = new File(dcore.getInstanceDir()); if (!idir.isAbsolute()) { idir = new File(solrHome, dcore.getInstanceDir()); } String instanceDir = idir.getPath(); log.info("Creating SolrCore '{}' using instanceDir: {}", dcore.getName(), instanceDir); // Initialize the solr config SolrResourceLoader solrLoader = null; SolrConfig config = null; String zkConfigName = null; if(zkController == null) { solrLoader = new SolrResourceLoader(instanceDir, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties())); config = new SolrConfig(solrLoader, dcore.getConfigName(), null); } else { try { String collection = dcore.getCloudDescriptor().getCollectionName(); zkController.createCollectionZkNode(dcore.getCloudDescriptor()); zkConfigName = zkController.readConfigName(collection); if (zkConfigName == null) { log.error("Could not find config name for collection:" + collection); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "Could not find config name for collection:" + collection); } solrLoader = new ZkSolrResourceLoader(instanceDir, zkConfigName, libLoader, getCoreProps(instanceDir, dcore.getPropertiesName(),dcore.getCoreProperties()), zkController); config = getSolrConfigFromZk(zkConfigName, dcore.getConfigName(), solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } IndexSchema schema = null; if (indexSchemaCache != null) { if (zkController != null) { File schemaFile = new File(dcore.getSchemaName()); if (!schemaFile.isAbsolute()) { schemaFile = new File(solrLoader.getInstanceDir() + "conf" + File.separator + dcore.getSchemaName()); } if (schemaFile.exists()) { String key = schemaFile.getAbsolutePath() + ":" + new SimpleDateFormat("yyyyMMddHHmmss", Locale.US).format(new Date( schemaFile.lastModified())); schema = indexSchemaCache.get(key); if (schema == null) { log.info("creating new schema object for core: " + dcore.name); schema = new IndexSchema(config, dcore.getSchemaName(), null); indexSchemaCache.put(key, schema); } else { log.info("re-using schema object for core: " + dcore.name); } } } else { // TODO: handle caching from ZooKeeper - perhaps using ZooKeepers versioning // Don't like this cache though - how does it empty as last modified changes? } } if(schema == null){ if(zkController != null) { try { schema = getSchemaFromZk(zkConfigName, dcore.getSchemaName(), config, solrLoader); } catch (KeeperException e) { log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } catch (InterruptedException e) { // Restore the interrupted status Thread.currentThread().interrupt(); log.error("", e); throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e); } } else { schema = new IndexSchema(config, dcore.getSchemaName(), null); } } SolrCore core = new SolrCore(dcore.getName(), null, config, schema, dcore); if (zkController == null && core.getUpdateHandler().getUpdateLog() != null) { // always kick off recovery if we are in standalone mode. core.getUpdateHandler().getUpdateLog().recoverFromLog(); } return core; }
// in core/src/java/org/apache/solr/core/CoreContainer.java
public void reload(String name) throws ParserConfigurationException, IOException, SAXException {
  name = checkDefault(name);
  SolrCore core;
  synchronized (cores) {
    core = cores.get(name);
  }
  if (core == null)
    throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name);

  CoreDescriptor cd = core.getCoreDescriptor();
  File instanceDir = new File(cd.getInstanceDir());
  if (!instanceDir.isAbsolute()) {
    instanceDir = new File(getSolrHome(), cd.getInstanceDir());
  }
  log.info("Reloading SolrCore '{}' using instanceDir: {}", cd.getName(), instanceDir.getAbsolutePath());

  SolrResourceLoader solrLoader;
  if (zkController == null) {
    solrLoader = new SolrResourceLoader(instanceDir.getAbsolutePath(), libLoader,
        getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(), cd.getCoreProperties()));
  } else {
    try {
      String collection = cd.getCloudDescriptor().getCollectionName();
      zkController.createCollectionZkNode(cd.getCloudDescriptor());
      String zkConfigName = zkController.readConfigName(collection);
      if (zkConfigName == null) {
        log.error("Could not find config name for collection:" + collection);
        throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR,
            "Could not find config name for collection:" + collection);
      }
      solrLoader = new ZkSolrResourceLoader(instanceDir.getAbsolutePath(), zkConfigName, libLoader,
          getCoreProps(instanceDir.getAbsolutePath(), cd.getPropertiesName(), cd.getCoreProperties()), zkController);
    } catch (KeeperException e) {
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (InterruptedException e) {
      // Restore the interrupted status
      Thread.currentThread().interrupt();
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }

  SolrCore newCore = core.reload(solrLoader);
  // keep core to orig name link
  String origName = coreToOrigName.remove(core);
  if (origName != null) {
    coreToOrigName.put(newCore, origName);
  }
  register(name, newCore, false);
}
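Both methods above handle ZooKeeper failures with the same idiom: restore the interrupted status when the cause is an InterruptedException, log, and rethrow as an unchecked ZooKeeperException with ErrorCode.SERVER_ERROR. Below is a minimal sketch of that idiom factored into a standalone helper; the class name ZkCalls and its run method are hypothetical and not part of Solr.

import java.util.concurrent.Callable;

import org.apache.solr.common.SolrException;
import org.apache.solr.common.cloud.ZooKeeperException;
import org.apache.zookeeper.KeeperException;

// Hypothetical helper (not in Solr) distilling the wrap-and-rethrow idiom
// used by create() and reload() above.
public final class ZkCalls {
  private ZkCalls() {}

  public static <T> T run(Callable<T> zkAction) {
    try {
      return zkAction.call();
    } catch (InterruptedException e) {
      // Restore the interrupted status so callers can still observe it,
      // then surface the failure as an unchecked ZooKeeperException.
      Thread.currentThread().interrupt();
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (KeeperException e) {
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (Exception e) {
      // Callable.call() declares Exception; wrap anything else the same way.
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }
}

A call site would then read, for example, SolrConfig config = ZkCalls.run(() -> getSolrConfigFromZk(zkConfigName, dcore.getConfigName(), solrLoader)); which keeps the checked ZooKeeper exceptions out of the method signature.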
The 42 catch blocks below rethrow the caught exception as a ZooKeeperException:
// in solrj/src/java/org/apache/solr/common/cloud/SolrZkClient.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED
      || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED
      || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED
      || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/common/cloud/ZkStateReader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (InterruptedException e) {
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (KeeperException e) {
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (IOException e) {
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrServer.java
catch (TimeoutException e) {
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED
      || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED
      || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/Overseer.java
catch (KeeperException e) {
  if (e.code() == KeeperException.Code.SESSIONEXPIRED
      || e.code() == KeeperException.Code.CONNECTIONLOSS) {
    log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
    return;
  }
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkSolrResourceLoader.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (Exception e) {
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.warn("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) {
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (IOException e) {
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (KeeperException e) {
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e);
}
// in core/src/java/org/apache/solr/cloud/ZkController.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "could not publish node state", e);
}
// in core/src/java/org/apache/solr/cloud/LeaderElector.java
catch (KeeperException.NoNodeException e) {
  // we must have failed in creating the election node - someone else must
  // be working on it, lets try again
  if (tries++ > 9) {
    throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
  }
  cont = true;
  Thread.sleep(50);
}
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) {
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
catch (InterruptedException e) {
  Thread.currentThread().interrupt();
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (TimeoutException e) {
  log.error("Could not connect to ZooKeeper", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (IOException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (Exception e) {
  // if register fails, this is really bad - close the zkController to
  // minimize any damage we can cause
  zkController.publish(core.getCoreDescriptor(), ZkStateReader.DOWN);
  SolrException.log(log, "", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (KeeperException e) {
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
// in core/src/java/org/apache/solr/core/CoreContainer.java
catch (InterruptedException e) {
  // Restore the interrupted status
  Thread.currentThread().interrupt();
  log.error("", e);
  throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
}
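Two recovery strategies stand out in this listing. The watcher-style catches in ZkStateReader and Overseer deliberately tolerate SESSIONEXPIRED and CONNECTIONLOSS, since a watch may fire while the client is disconnected or shutting down; every other KeeperException is wrapped and rethrown. The LeaderElector catch shows the other strategy: a bounded retry with a 50 ms pause, giving up with a ZooKeeperException after roughly ten attempts. A self-contained sketch of the tolerant-watcher shape follows; the class TolerantWatcher and its refreshState method are illustrative only, not Solr code.

import org.apache.solr.common.SolrException;
import org.apache.solr.common.cloud.ZooKeeperException;
import org.apache.zookeeper.KeeperException;
import org.apache.zookeeper.WatchedEvent;
import org.apache.zookeeper.Watcher;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Illustrative watcher skeleton (not Solr code): expired sessions and lost
// connections are expected during disconnects or shutdown, so they are
// logged and ignored; anything else is a real fault and is rethrown unchecked.
public class TolerantWatcher implements Watcher {
  private static final Logger log = LoggerFactory.getLogger(TolerantWatcher.class);

  @Override
  public void process(WatchedEvent event) {
    try {
      refreshState();
    } catch (KeeperException e) {
      if (e.code() == KeeperException.Code.SESSIONEXPIRED
          || e.code() == KeeperException.Code.CONNECTIONLOSS) {
        log.warn("ZooKeeper watch triggered, but Solr cannot talk to ZK");
        return;
      }
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt(); // restore the interrupted status
      log.error("", e);
      throw new ZooKeeperException(SolrException.ErrorCode.SERVER_ERROR, "", e);
    }
  }

  // Placeholder for the actual getData()/getChildren() re-read of the watched node.
  private void refreshState() throws KeeperException, InterruptedException {
  }
}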

Miscellaneous Metrics

nF = Number of Finally 179
nTF = Number of Try-Finally (without catch) 111
Number of Methods with Finally (nMF) 160 / 7101 (2.3%)
Number of Finally with a Continue 0
Number of Finally with a Return 1
Number of Finally with a Throw 1
Number of Finally with a Break 0
Number of different exception types thrown 31
Number of Domain exception types thrown 8
Number of different exception types caught 61
Number of Domain exception types caught 5
Number of exception declarations in signatures 1837
Number of different exception types declared in method signatures 45
Number of library exception types declared in method signatures 41
Number of Domain exception types declared in method signatures 4
Number of Catch with a Continue 4
Number of Catch with a Return 79
Number of Catch with a Break 5
nbIf = Number of If 7169
nbFor = Number of For 1329
Number of Methods with an if 2331 / 7101
Number of Methods with a for 836 / 7101
Number of Methods starting with a try 122 / 7101 (1.7%)
Number of Expressions 82093
Number of Expressions in try 12417 (15.1%)
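One metric above deserves a remark: the single "Finally with a Return" (like the single "Finally with a Throw") is a classic trap, because a finally block that returns or throws silently discards any exception already propagating out of the try block. A generic illustration, not Solr code:

// Generic illustration (not Solr code) of why a return inside finally is
// worth auditing: the finally's return discards the in-flight exception.
public class FinallyReturnDemo {
  @SuppressWarnings("finally")
  static int risky() {
    try {
      throw new IllegalStateException("lost forever");
    } finally {
      return 42; // overrides the pending IllegalStateException
    }
  }

  public static void main(String[] args) {
    System.out.println(risky()); // prints 42; the exception never surfaces
  }
}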