Importing a bucket into Hadoop with the Hadoop connector: auth failure

Hello,

I have just installed the Couchbase Hadoop connector 1.2 to use with my installation of hadoop-2.7.0 and sqoop-1.4.6.bin__hadoop-1.0.0. Couchbase Server is 3.0.2.

I tried the following command to import a bucket called test-bucket (SASL auth, no password) and got the following error:

sqoop import --connect http://192.168.1.219:8091 --password password --username test-bucket --table DUMP

ERROR binary.SASLStepOperationImpl: Error: Auth failure
WARN binary.BinaryMemcachedNodeImpl: Discarding partially completed op: SASL steps operation
WARN auth.AuthThread: Authentication failed to /192.168.1.219:11210, Status: {OperationStatus success=false: cancelled}
WARN auth.AuthThread: Authentication failed to /192.168.1.219:11210, Status: {OperationStatus success=false: Invalid arguments}

I tried with --password "" as well, but no success.
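(Sqoop's -P flag, which the tool itself recommends over putting the password on the command line, prompts for it interactively instead; a sketch with the same connect string:)

sqoop import --connect http://192.168.1.219:8091 --username test-bucket -P --table DUMP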

I searched the forum and found a possible reason here (the two options are sketched as commands right after the quote):

" When you connect to the cluster to do “normal operations” you have 2 options

  • connect without any security/password
  • connect with an SASL password (that you set at the bucket level). "
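In Sqoop terms, the two options would look roughly like this (a sketch reusing the connect string and bucket name from the commands in this thread; with this connector the bucket name doubles as the username, and the exact form accepted for an empty password may depend on the connector version):

# Option 1: bucket with no SASL password set
sqoop import --connect http://192.168.1.219:8091/pools --username test-bucket --password "" --table DUMP

# Option 2: bucket with a SASL password set at the bucket level
sqoop import --connect http://192.168.1.219:8091/pools --username test-bucket --password my_pwd --table DUMP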

So, I added a password to the bucket like this:

curl -X POST -u Administrator:password -d authType=sasl -d saslPassword=my_pwd http://192.168.1.219:8091/pools/default/buckets/test-bucket
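(To double-check that the change took effect, the bucket definition can be read back; this GET endpoint exists in the Couchbase 3.x REST API, and the response should now show authType=sasl:)

curl -s -u Administrator:password http://192.168.1.219:8091/pools/default/buckets/test-bucket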

Now it seems that authentication succeeds at first, but then fails again:

$ sqoop import --connect http://192.168.1.219:8091/pools --password my_pwd --username test-bucket --table DUMP
Warning: /opt/sqoop-1.4.6.bin__hadoop-1.0.0//../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /opt/sqoop-1.4.6.bin__hadoop-1.0.0//../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/sqoop-1.4.6.bin__hadoop-1.0.0//../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /opt/sqoop-1.4.6.bin__hadoop-1.0.0//../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
15/06/03 13:07:46 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
15/06/03 13:07:46 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/06/03 13:07:46 INFO tool.CodeGenTool: Beginning code generation
15/06/03 13:07:46 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop-2.7.0
Note: /tmp/sqoop-hadoop/compile/fa6005effbc577b536e2811153a019f1/DUMP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/06/03 13:07:47 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/fa6005effbc577b536e2811153a019f1/DUMP.jar
15/06/03 13:07:47 INFO mapreduce.ImportJobBase: Beginning import of DUMP
15/06/03 13:07:47 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/06/03 13:07:47 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/06/03 13:07:47 WARN util.Jars: No such class couchbase doesn't use a jdbc driver available.
15/06/03 13:07:47 INFO client.RMProxy: Connecting to ResourceManager at hadoop-1/192.168.1.216:8050
15/06/03 13:07:58 INFO Configuration.deprecation: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
15/06/03 13:07:58 INFO auth.AuthThread: Authenticated to /192.168.1.219:11210
15/06/03 13:07:58 INFO provider.BucketConfigurationProvider: Could bootstrap through carrier publication.
15/06/03 13:07:58 INFO client.CouchbaseConnection: Added {QA sa=centos-test.mydomain.com/192.168.1.219:11210, #Rops=0, #Wops=0, #iq=0, topRop=null, topWop=null, toWrite=0, interested=0} to connect queue
15/06/03 13:07:58 INFO client.CouchbaseClient: CouchbaseConnectionFactory{bucket='test-bucket', nodes=[http://192.168.1.219:8091/pools], order=RANDOM, opTimeout=2500, opQueue=16384, opQueueBlockTime=10000, obsPollInt=10, obsPollMax=500, obsTimeout=5000, viewConns=10, viewTimeout=75000, viewWorkers=1, configCheck=10, reconnectInt=1100, failureMode=Redistribute, hashAlgo=NATIVE_HASH, authWaitTime=2500}
15/06/03 13:07:58 INFO client.CouchbaseClient: viewmode property isn't defined. Setting viewmode to production mode
15/06/03 13:07:58 INFO client.CouchbaseConnection: Shut down Couchbase client
15/06/03 13:07:58 WARN auth.AuthThreadMonitor: Connection shutdown in progress - interrupting waiting authentication thread.
15/06/03 13:07:58 WARN auth.AuthThread: Authentication failed to centos-test.mydomain.com/192.168.1.219:11210, Status: {OperationStatus success=false: cancelled}
15/06/03 13:07:58 INFO client.ViewConnection: I/O reactor terminated
15/06/03 13:07:58 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1433238053665_0012
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:65)
at com.cloudera.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:36)
at com.couchbase.sqoop.mapreduce.db.CouchbaseInputFormat.getSplits(CouchbaseInputFormat.java:100)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:304)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:321)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:199)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
at com.couchbase.sqoop.manager.CouchbaseManager.importTable(CouchbaseManager.java:145)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

My question is:

Is this error related just to authentication, or to something else, given this exception:

Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected

Any help would be welcome.

Ok,

This time I tried sqoop-1.4.6.bin__hadoop-2.0.4-alpha, and the same command successfully imported the specified bucket into Hadoop. In hindsight that makes sense: org.apache.hadoop.mapreduce.JobContext was a class in Hadoop 1 but became an interface in Hadoop 2, so a Sqoop build compiled against Hadoop 1 throws IncompatibleClassChangeError on a Hadoop 2 cluster. The remaining messages are no longer errors, just warnings:

15/06/03 16:56:43 INFO mapreduce.ImportJobBase: Beginning import of DUMP
15/06/03 16:56:43 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/06/03 16:56:44 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/06/03 16:56:44 WARN util.Jars: No such class couchbase doesn't use a jdbc driver available.
15/06/03 16:56:44 INFO client.RMProxy: Connecting to ResourceManager at hadoop-1/192.168.1.216:8050
15/06/03 16:56:57 INFO Configuration.deprecation: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
15/06/03 16:56:57 INFO auth.AuthThread: Authenticated to /192.168.1.219:11210
15/06/03 16:56:58 INFO provider.BucketConfigurationProvider: Could bootstrap through carrier publication.
15/06/03 16:56:58 INFO client.CouchbaseConnection: Added {QA sa=centos-test.mydomain.com/192.168.1.219:11210, #Rops=0, #Wops=0, #iq=0, topRop=null, topWop=null, toWrite=0, interested=0} to connect queue
15/06/03 16:56:58 INFO client.CouchbaseClient: CouchbaseConnectionFactory{bucket='test-bucket', nodes=[http://192.168.1.219:8091/pools], order=RANDOM, opTimeout=2500, opQueue=16384, opQueueBlockTime=10000, obsPollInt=10, obsPollMax=500, obsTimeout=5000, viewConns=10, viewTimeout=75000, viewWorkers=1, configCheck=10, reconnectInt=1100, failureMode=Redistribute, hashAlgo=NATIVE_HASH, authWaitTime=2500}
15/06/03 16:56:58 INFO client.CouchbaseClient: viewmode property isn't defined. Setting viewmode to production mode
15/06/03 16:56:58 INFO client.CouchbaseConnection: Shut down Couchbase client
15/06/03 16:56:58 WARN auth.AuthThreadMonitor: Connection shutdown in progress - interrupting waiting authentication thread.
15/06/03 16:56:58 WARN auth.AuthThread: Authentication failed to centos-test.mydomain.com/192.168.1.219:11210, Status: {OperationStatus success=false: cancelled}
15/06/03 16:56:58 INFO client.ViewConnection: I/O reactor terminated

Should I still be worried about all those authentication-failure warnings, or can I just ignore them?

Regards,

SysAdmin

Hi,

Thank you for your post. Did the import work successfully, despite the warnings? We are in the process of updating this connector, and your feedback is incredibly valuable.

Thanks

Todd

Hello Todd,

Yes, the import worked successfully. The bucket was imported into HDFS as expected.
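For anyone following along, the output can be verified roughly like this (paths are from my setup; by default Sqoop writes one part file per map task to a directory named after the "table" under the user's HDFS home):

hdfs dfs -ls DUMP
hdfs dfs -cat DUMP/part-m-00000 | head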

Could you please tell us more about the update to this connector? Basically, we need something that imports buckets in a more "structured" way, something like the Couchdoop connector, which works with views. Are you working on something similar?

Regards,

SysAdmin