[Solved] Jar conflict with Couchbase inside Spark Streaming job

Hi,

The Couchbase Java SDK 1.4.4 depends on Netty 3.5.5, but Spark uses Akka, which pulls in a different Netty version.
Does anyone have an idea how to solve this JAR conflict?

Below is the stack trace from Spark:
akka.actor.ActorSystemImpl - Uncaught fatal error from thread [spark-akka.actor.default-dispatcher-4] shutting down ActorSystem [spark]
java.lang.NoSuchMethodError: org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory.<init>(Ljava/util/concurrent/Executor;ILorg/jboss/netty/channel/socket/nio/WorkerPool;Lorg/jboss/netty/util/Timer;)V

Thanks

To fix this conflict with Spark, you'll need to exclude Netty from the Couchbase client dependency in your pom.xml:

<dependency>
  <groupId>com.couchbase.client</groupId>
  <artifactId>couchbase-client</artifactId>
  <version>1.4.4</version>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
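
If the job is built with sbt instead of Maven, the equivalent exclusion would look roughly like this (a minimal sketch; only the couchbase-client line is shown, the rest of the build definition is assumed):

// build.sbt: exclude the Netty 3.5.5 that couchbase-client pulls in,
// so that Spark/Akka's own Netty version wins on the classpath
libraryDependencies += "com.couchbase.client" % "couchbase-client" % "1.4.4" exclude("io.netty", "netty")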

Hi,
I am having the same issue with a MapReduce job. I am new to Java and Eclipse. How can I exclude it? Should I just remove it from the JAR references?

Thanks.

@draiwn Hey, I just wanted to mention that we now have an official Couchbase Spark Connector, which also solves these issues :slight_smile: http://developer.couchbase.com/documentation/server/4.0/connectors/spark-1.0/spark-intro.html
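
For reference, reading documents through the connector from Scala looks roughly like this (a minimal sketch based on the 1.0 documentation linked above; the node address, bucket name, and document IDs are placeholders):

import com.couchbase.client.java.document.JsonDocument
import com.couchbase.spark._
import org.apache.spark.{SparkConf, SparkContext}

object CouchbaseSparkExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("couchbaseExample")
      .set("com.couchbase.nodes", "127.0.0.1")   // Couchbase node to bootstrap from
      .set("com.couchbase.bucket.default", "")   // bucket name -> bucket password
    val sc = new SparkContext(conf)

    // Fetch documents by ID as an RDD and print them on the driver
    sc.couchbaseGet[JsonDocument](Seq("doc-id-1", "doc-id-2"))
      .collect()
      .foreach(println)
  }
}

Since the connector manages its own dependencies, the manual Netty exclusion shown above should no longer be needed once you depend on it.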