Sunday, 12 April 2015

Sqoop Import using remote java client






I am writing a remote Java client for a Sqoop (1.4.5) import from MySQL to HDFS (Hadoop 1.2.1).


This is my code:



import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.SqoopOptions;
import org.apache.sqoop.tool.ImportTool;

// Point the client at the remote NameNode and JobTracker
Configuration config = new Configuration();
config.set("fs.default.name", "hdfs://x.y.z.w:8020");
config.set("mapred.job.tracker", "x.y.z.w:9101");

// Configure the import: MySQL source table -> HDFS target directory
SqoopOptions options = new SqoopOptions(config);
options.setConnectString("jdbc:mysql://x.y.z.w:3306/testdb");
options.setUsername("user");
options.setPassword("password");
options.setTableName("test");
options.setTargetDir("/testOut");
options.setNumMappers(1);

int ret = new ImportTool().run(options);


I am getting the following error:


ERROR security.UserGroupInformation: PriviledgedActionException as:xxx cause:java.net.UnknownHostException: unknown host: xxxx
ERROR tool.ImportTool: Encountered IOException running import job: java.net.UnknownHostException: unknown host: xxxx
    at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:236)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1239)
    at org.apache.hadoop.ipc.Client.call(Client.java:1093)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at com.sun.proxy.$Proxy2.getProtocolVersion(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
    at com.sun.proxy.$Proxy2.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
    at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:103)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:942)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:119)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:179)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:97)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:381)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:454)
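As I understand it, `UnknownHostException: unknown host: xxxx` means my client JVM cannot resolve a hostname that came back from the cluster (often the cluster's internal hostname rather than the IP I configured). A minimal sketch I used to check resolution from the client machine; the hostname in `main` is just a placeholder for the name from the exception message:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class ResolveCheck {
    // Returns true if this JVM can resolve the given hostname to an address.
    static boolean resolves(String host) {
        try {
            InetAddress addr = InetAddress.getByName(host);
            System.out.println(host + " -> " + addr.getHostAddress());
            return true;
        } catch (UnknownHostException e) {
            System.out.println("cannot resolve: " + host);
            return false;
        }
    }

    public static void main(String[] args) {
        // Replace "xxxx" with the hostname shown in the exception message.
        resolves("xxxx");
    }
}
```

If this fails for the cluster's hostname, adding the mapping to the client's /etc/hosts (or DNS) would be the obvious thing to try.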


The hadoop logs show the following:


namenode log:


INFO org.apache.hadoop.ipc.Server: IPC Server listener on 8020: readAndProcess threw exception java.io.IOException: Connection reset by peer. Count of bytes read: 0 java.io.IOException: Connection reset by peer


jobtracker log:


INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9101: readAndProcess threw exception java.io.IOException: Connection reset by peer. Count of bytes read: 0 java.io.IOException: Connection reset by peer
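Since the daemons log "Connection reset by peer ... Count of bytes read: 0", the servers seem reachable but drop the connection immediately, which I gather can also happen on a Hadoop RPC version mismatch between the client jars and the cluster. A rough probe I wrote to separate "port unreachable" from "connection drops after connect" (host and ports are the placeholders from my setup):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    // Returns true if a plain TCP connection to host:port succeeds within timeoutMs.
    static boolean canConnect(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("namenode 8020:   " + canConnect("x.y.z.w", 8020, 3000));
        System.out.println("jobtracker 9101: " + canConnect("x.y.z.w", 9101, 3000));
    }
}
```

If both ports connect but the import still fails like above, mismatched Hadoop client jars on the Sqoop classpath would be my next suspect.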


Can anybody help with this?



asked 34 secs ago by mag






