Use Proguard for a Scala AWS Lambda

I have a question about using Proguard with a Scala AWS Lambda function. I created a very simple AWS Lambda function, which looks like this:

package example

import scala.collection.JavaConverters._
import com.amazonaws.services.lambda.runtime.events.S3Event
import com.amazonaws.services.lambda.runtime.Context

object Main extends App {

  def kinesisEventHandler(event: S3Event, context: Context): Unit = {
    val result = event.getRecords.asScala.map(m => m.getS3.getObject.getKey)
    println(result)
  }

}

I pulled in the following dependencies:

"com.amazonaws" % "aws-lambda-java-core" % "1.1.0"
"com.amazonaws" % "aws-lambda-java-events" % "1.3.0"

When I create a fat jar, it is 13 MB in size and works as expected as an AWS Lambda function (it only prints output for testing).
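For completeness, it is an ordinary sbt project; a minimal build.sbt along these lines (the sbt-assembly plugin is assumed here, since the jar above is named lambda-demo-assembly-1.0.jar, and the plugin version is only an example) produces that fat jar:

// project/plugins.sbt (assumed plugin coordinates/version):
// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

name := "lambda-demo"
version := "1.0"
scalaVersion := "2.12.1"

libraryDependencies ++= Seq(
  "com.amazonaws" % "aws-lambda-java-core"   % "1.1.0",
  "com.amazonaws" % "aws-lambda-java-events" % "1.3.0"
)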

13 MB is quite large, so I tried to shrink the jar with Proguard, but it doesn't work; I keep running into problems, and after two days I still don't know how to solve it.

This is my Proguard configuration:

-injars "/Users/x/x/x/AWS_Lambda/target/scala-2.12/lambda-demo-assembly-1.0.jar"
-libraryjars "/Users/x/x/x/AWS_Lambda/lib_managed/jars/org.scala-lang/scala-library/scala-library-2.12.1.jar"
-libraryjars "/Users/x/x/x/AWS_Lambda/lib_managed/jars/com.amazonaws/aws-lambda-java-core/aws-lambda-java-core-1.1.0.jar"
-libraryjars "/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/rt.jar"
-libraryjars "/Users/x/x/x/AWS_Lambda/lib_managed/jars/com.amazonaws/aws-java-sdk-s3/aws-java-sdk-s3-1.11.0.jar"
-libraryjars "/Users/x/x/x/AWS_Lambda/lib_managed/jars/com.amazonaws/aws-lambda-java-events/aws-lambda-java-events-1.3.0.jar"
-outjars "/Users/x/x/x/AWS_Lambda/target/scala-2.12/proguard/lambda-demo_2.12-1.0.jar"
-dontoptimize
-dontobfuscate
-dontnote
-dontwarn

-keepattributes SourceFile,LineNumberTable

# Preserve all annotations.

-keepattributes *Annotation*

# Preserve all public applications.

-keepclasseswithmembers public class * {
    public static void main(java.lang.String[]);
}

# Preserve some classes and class members that are accessed by means of
# introspection.

-keep class * implements org.xml.sax.EntityResolver

-keepclassmembers class * {
    ** MODULE$;
}

-keepclassmembernames class scala.concurrent.forkjoin.ForkJoinPool {
    long eventCount;
    int  workerCounts;
    int  runControl;
    scala.concurrent.forkjoin.ForkJoinPool$WaitQueueNode syncStack;
    scala.concurrent.forkjoin.ForkJoinPool$WaitQueueNode spareStack;
}

-keepclassmembernames class scala.concurrent.forkjoin.ForkJoinWorkerThread {
    int base;
    int sp;
    int runState;
}

-keepclassmembernames class scala.concurrent.forkjoin.ForkJoinTask {
    int status;
}

-keepclassmembernames class scala.concurrent.forkjoin.LinkedTransferQueue {
    scala.concurrent.forkjoin.LinkedTransferQueue$PaddedAtomicReference head;
    scala.concurrent.forkjoin.LinkedTransferQueue$PaddedAtomicReference tail;
    scala.concurrent.forkjoin.LinkedTransferQueue$PaddedAtomicReference cleanMe;
}

# Preserve some classes and class members that are accessed by means of
# introspection in the Scala compiler library, if it is processed as well.

#-keep class * implements jline.Completor
#-keep class * implements jline.Terminal

#-keep class scala.tools.nsc.Global

#-keepclasseswithmembers class * {
#    <init>(scala.tools.nsc.Global);
#}

#-keepclassmembers class * {
#    *** scala_repl_value();
#    *** scala_repl_result();
#}

# Preserve all native method names and the names of their classes.

-keepclasseswithmembernames,includedescriptorclasses class * {
    native <methods>;
}

# Preserve the special static methods that are required in all enumeration
# classes.

-keepclassmembers,allowoptimization enum * {
    public static **[] values();
    public static ** valueOf(java.lang.String);
}

# Explicitly preserve all serialization members. The Serializable interface
# is only a marker interface, so it wouldn't save them.
# You can comment this out if your application doesn't use serialization.
# If your code contains serializable classes that have to be backward
# compatible, please refer to the manual.

-keepclassmembers class * implements java.io.Serializable {
    static final long serialVersionUID;
    static final java.io.ObjectStreamField[] serialPersistentFields;
    private void writeObject(java.io.ObjectOutputStream);
    private void readObject(java.io.ObjectInputStream);
    java.lang.Object writeReplace();
    java.lang.Object readResolve();
}

# Your application may contain more items that need to be preserved;
# typically classes that are dynamically created using Class.forName:

# -keep public class mypackage.MyClass
# -keep public interface mypackage.MyInterface
# -keep public class * implements mypackage.MyInterface

-keep,includedescriptorclasses class example.** { *; }

-keepclassmembers class * {
    <init>(...);
}

When I run Proguard, my jar is very small (about 5 MB), but when I invoke the Lambda I get the following error:

"errorMessage": "java.lang.NoSuchMethodException: com.amazonaws.services.s3.event.S3EventNotification.parseJson(java.lang.String)",
"errorType": "lambdainternal.util.ReflectUtil$ReflectException"

I looked at the class, and Proguard had removed this method. When I change the configuration to keep it, I run into another problem in another class.
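For reference, a rule along these lines should in principle keep the event-model classes that the Lambda runtime loads reflectively (the package names are taken from the error above and from my imports), but keeping them wholesale gives back a lot of the size reduction:

-keep class com.amazonaws.services.lambda.runtime.events.** { *; }
-keep class com.amazonaws.services.s3.event.** { *; }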

Has anyone already used Proguard with a Scala AWS Lambda function and gotten a working setup, or does anyone know about this problem? Is there another good way to shrink the jar?

Best, 镨

Honestly, 13 MB is not that big. However, and I'm sure this will be considered heresy by Scala developers, I created an equivalent method in Java and it came in at over 7 MB. I didn't try Proguard on it; it might shrink it further.

It's the S3Event package you're using. If you look at what gets included because of that package, it pulls in a lot of extra stuff: SQS, SNS, Dynamo, and so on. Ultimately, that's the biggest part. I did a small test in which I eliminated all the libraries except aws-lambda-java-core and used JsonPath instead. That got my jar file down to 458 KB.

My code is below. I know it's not Scala, but maybe you can get some ideas from it. The key is to eliminate as many of the AWS libraries as possible. Of course, if you want to do anything in the Lambda besides print the keys, you'll need to bring in more AWS libraries, which will again put the size at around 7 MB.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.List;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestStreamHandler;
import com.jayway.jsonpath.JsonPath;


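// Stream-based handler: read the raw S3 event JSON from the input stream and
// extract the object keys with JsonPath, so aws-lambda-java-events is not needed.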
public class S3EventLambdaHandler implements RequestStreamHandler {
    public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context) {

        try {
            List<String> keys = JsonPath.read(inputStream, "$.Records[*].s3.object.key");

            for( String nextKey: keys )
                System.out.println(nextKey);
        }
        catch( IOException ioe ) {
            context.getLogger().log("caught IOException reading input stream");
        }
    }
}
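If you want to stay in Scala, a rough sketch of the same idea might look like the following (same two dependencies: aws-lambda-java-core plus Jayway's json-path); I haven't measured its jar size:

import java.io.{IOException, InputStream, OutputStream}

import scala.collection.JavaConverters._

import com.amazonaws.services.lambda.runtime.{Context, RequestStreamHandler}
import com.jayway.jsonpath.JsonPath

// Scala port of the handler above: parse the raw event JSON with JsonPath
// instead of depending on the aws-lambda-java-events object model.
class S3EventLambdaHandler extends RequestStreamHandler {
  override def handleRequest(input: InputStream, output: OutputStream, context: Context): Unit = {
    try {
      val keys = JsonPath.read[java.util.List[String]](input, "$.Records[*].s3.object.key")
      keys.asScala.foreach(println)
    } catch {
      case _: IOException =>
        context.getLogger.log("caught IOException reading input stream")
    }
  }
}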