Making two versions of Janino work together for JUnit tests with Maven

Problem description

I am working on a Spark program that uses Drools internally. The code works fine. Now I am writing test cases for the Drools code, and because of this bug https://issues.redhat.com/browse/DROOLS-329 I had to add the dependency

<dependency>
  <groupId>org.codehaus.janino</groupId>
  <artifactId>janino</artifactId>
  <version>2.5.16</version>
</dependency>

and this JVM argument to the Surefire configuration used for the JaCoCo test run:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <systemPropertyVariables>
      <jacoco-agent.destfile>${project.build.directory}/jacoco.exec</jacoco-agent.destfile>
    </systemPropertyVariables>
    <argLine>-Ddrools.dialect.java.compiler=JANINO</argLine>
  </configuration>
</plugin>
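
Note that Surefire's <argLine> replaces whatever other plugins put into the argLine property. If the JaCoCo agent is attached through the jacoco-maven-plugin prepare-agent goal (an assumption; the POM above only shows the offline destfile property), the two settings can be combined with Surefire's late property replacement, roughly like this sketch:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- @{argLine} is expanded at test time with the value set by jacoco:prepare-agent (assumed setup) -->
    <argLine>@{argLine} -Ddrools.dialect.java.compiler=JANINO</argLine>
  </configuration>
</plugin>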

After I added the Spark test cases, they started failing, because Spark depends on

<dependency>
  <groupId>org.codehaus.janino</groupId>
  <artifactId>janino</artifactId>
  <version>3.0.8</version>
</dependency>

So I commented out the Drools tests and the Janino 2.5.16 dependency in the POM, and now the Spark code works fine. How can we make Janino 2.5.16 and Janino 3.0.8 coexist during the Maven test build so that both the Drools tests and the Spark tests pass?
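
For context on what "coexist" would require: Maven resolves only one version of org.codehaus.janino:janino onto a module's test classpath, so the two jars cannot sit on the same flat classpath at once. One possible direction, shown purely as an untested sketch with a hypothetical profile id, is a profile that adds Janino 2.5.16 only for a dedicated Drools test run, while the default build keeps 3.0.8 for Spark:

<profiles>
  <profile>
    <!-- Hypothetical profile: activate it only for the Drools tests, e.g. mvn test -Pdrools-tests -->
    <id>drools-tests</id>
    <dependencies>
      <dependency>
        <groupId>org.codehaus.janino</groupId>
        <artifactId>janino</artifactId>
        <version>2.5.16</version>
        <scope>test</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>

The Spark tests would then run in the default invocation against 3.0.8, and the Drools tests in a second invocation with the profile active.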

Here is the Drools code I am using:

import org.kie.api.io.ResourceType;
import org.kie.api.runtime.StatelessKieSession;
import org.kie.internal.builder.KnowledgeBuilder;
import org.kie.internal.builder.KnowledgeBuilderFactory;
import org.kie.internal.io.ResourceFactory;

public static void applyRules(ResultObj obj) {
    try {
        // Compile the DRL resource from the classpath
        KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        kbuilder.add(ResourceFactory.newClassPathResource("rules/rules.drl", DroolsPreprocessor.class),
                ResourceType.DRL);
        if (kbuilder.hasErrors()) {
            System.err.println(kbuilder.getErrors().toString());
        }
        // Build a stateless session and fire the rules against the result object
        StatelessKieSession session = kbuilder.newKieBase().newStatelessKieSession();
        session.execute(obj);
        System.out.println("Reached the end ");
    } catch (Exception e) {
        e.printStackTrace();
    }
}

Here are the Drools dependencies:

<dependency>
  <groupId>org.drools</groupId>
  <artifactId>drools-decisiontables</artifactId>
  <version>7.7.0.Final</version>
</dependency>

<dependency>
  <groupId>org.drools</groupId>
  <artifactId>drools-core</artifactId>
  <version>7.7.0.Final</version>
</dependency>

<dependency>
  <groupId>org.drools</groupId>
  <artifactId>drools-compiler</artifactId>
  <version>7.7.0.Final</version>
</dependency>

With Janino 2.5.16 on the classpath, Spark is unhappy and throws the exception below, presumably because the old 2.5.16 jar, which does not contain org.codehaus.janino.InternalCompilerException, wins the version conflict:

Exception in thread "main" java.lang.NoClassDefFoundError: org/codehaus/janino/InternalCompilerException
    at org.apache.spark.sql.catalyst.expressions.codegen.JavaCode$.variable(javaCode.scala:63)
    at org.apache.spark.sql.catalyst.expressions.codegen.JavaCode$.isNullVariable(javaCode.scala:76)
    at org.apache.spark.sql.catalyst.expressions.Expression.$anonfun$genCode$3(Expression.scala:145)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.catalyst.expressions.Expression.genCode(Expression.scala:141)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext.$anonfun$generateExpressions$1(CodeGenerator.scala:1170)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
    at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
    at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
    at scala.collection.TraversableLike.map(TraversableLike.scala:238)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
    at scala.collection.AbstractTraversable.map(Traversable.scala:108)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext.generateExpressions(CodeGenerator.scala:1170)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.createCode(GenerateUnsafeProjection.scala:290)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:338)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.generate(GenerateUnsafeProjection.scala:327)
    at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.createCodeGeneratedObject(Projection.scala:123)
    at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.createCodeGeneratedObject(Projection.scala:119)
    at org.apache.spark.sql.catalyst.expressions.CodeGeneratorWithInterpretedFallback.createObject(CodeGeneratorWithInterpretedFallback.scala:52)
    at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:150)
    at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:143)
    at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:135)
    at org.apache.spark.sql.kafka010.KafkaRecordToRowConverter.<init>(KafkaRecordToRowConverter.scala:36)
    at org.apache.spark.sql.kafka010.KafkaRelation.<init>(KafkaRelation.scala:52)
    at org.apache.spark.sql.kafka010.KafkaSourceProvider.createRelation(KafkaSourceProvider.scala:150)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:339)
    at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:279)
    at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:268)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:268)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:203)

If I do not add the Janino 2.5.16 dependency, I instead get the following test error, which is the "wrong class format" failure from DROOLS-329 referenced above:

java.lang.RuntimeException: wrong class format
at org.drools.compiler.commons.jci.compilers.EclipseJavaCompiler$2.findType(EclipseJavaCompiler.java:302)
at org.drools.compiler.commons.jci.compilers.EclipseJavaCompiler$2.findType(EclipseJavaCompiler.java:255)
at org.eclipse.jdt.internal.compiler.lookup.LookupEnvironment.askForType(LookupEnvironment.java:174)
at org.eclipse.jdt.internal.compiler.lookup.PackageBinding.getTypeOrPackage(PackageBinding.java:214)
at org.eclipse.jdt.internal.compiler.lookup.CompilationUnitScope.findImport(CompilationUnitScope.java:466)
at org.eclipse.jdt.internal.compiler.lookup.CompilationUnitScope.findSingleImport(CompilationUnitScope.java:520)
at org.eclipse.jdt.internal.compiler.lookup.CompilationUnitScope.faultInImports(CompilationUnitScope.java:397)
at org.eclipse.jdt.internal.compiler.lookup.CompilationUnitScope.faultInTypes(CompilationUnitScope.java:445)
at org.eclipse.jdt.internal.compiler.Compiler.process(Compiler.java:860)
at org.eclipse.jdt.internal.compiler.Compiler.processCompiledUnits(Compiler.java:550)
at org.eclipse.jdt.internal.compiler.Compiler.compile(Compiler.java:462)
at org.eclipse.jdt.internal.compiler.Compiler.compile(Compiler.java:417)
at org.drools.compiler.commons.jci.compilers.EclipseJavaCompiler.compile(EclipseJavaCompiler.java:436)
at org.drools.compiler.commons.jci.compilers.AbstractJavaCompiler.compile(AbstractJavaCompiler.java:49)
at org.drools.compiler.rule.builder.dialect.java.JavaDialect.compileAll(JavaDialect.java:420)
at org.drools.compiler.compiler.DialectCompiletimeRegistry.compileAll(DialectCompiletimeRegistry.java:61)
at org.drools.compiler.compiler.PackageRegistry.compileAll(PackageRegistry.java:109)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.compileAll(KnowledgeBuilderImpl.java:1471)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.wireAllRules(KnowledgeBuilderImpl.java:950)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.addPackage(KnowledgeBuilderImpl.java:938)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.addPackageFromDrl(KnowledgeBuilderImpl.java:543)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.addKnowledgeResource(KnowledgeBuilderImpl.java:743)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.add(KnowledgeBuilderImpl.java:2288)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.add(KnowledgeBuilderImpl.java:2277)

Tags: java, maven, apache-spark, drools, janino
