While running a local Spark test program today, the following error appeared:
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at sun.nio.ch.Net.bind0(Native Method) ~[?:1.8.0_181]
    at sun.nio.ch.Net.bind(Net.java:433) ~[?:1.8.0_181]
    at sun.nio.ch.Net.bind(Net.java:425) ~[?:1.8.0_181]
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) ~[?:1.8.0_181]
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128) ~[learn.jar:?]
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558) ~[learn.jar:?]
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283) ~[learn.jar:?]
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501) ~[learn.jar:?]
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486) ~[learn.jar:?]
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989) ~[learn.jar:?]
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254) ~[learn.jar:?]
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:365) ~[learn.jar:?]
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163) ~[learn.jar:?]
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403) ~[learn.jar:?]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463) ~[learn.jar:?]
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) ~[learn.jar:?]
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138) ~[learn.jar:?]
    at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_181]
3833 [main] INFO o.a.s.SparkContext - Successfully stopped SparkContext
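The exception message itself points at one workaround: explicitly set the driver's bind address. A minimal sketch, assuming Spark 2.1+ (where spark.driver.bindAddress is available) and a purely local test run in which binding to 127.0.0.1 is acceptable:

```
# spark-defaults.conf (or pass each line as a --conf flag to spark-submit)
spark.driver.bindAddress   127.0.0.1
spark.driver.host          127.0.0.1
```

The fix described below (making the hostname resolvable) addresses the root cause instead, so the configuration above is only needed when editing /etc/hosts is not an option.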
The driver fails to bind because the machine's hostname does not resolve to a local address. First, run hostname to find out what the hostname is:
es@ambari:~$ hostname
ambari
Then edit /etc/hosts and add the following line:
127.0.0.1 ambari # replace ambari with your_hostname when adding this line
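To confirm the fix, you can reproduce the same lookup the driver performs when it binds. A small sketch (not part of the original post) using Python's standard socket module:

```python
import socket

# Resolve the local hostname the same way a service binding to it would.
hostname = socket.gethostname()
try:
    addr = socket.gethostbyname(hostname)
    print(f"{hostname} resolves to {addr}; the driver should be able to bind")
except socket.gaierror:
    print(f"{hostname} does not resolve; add it to /etc/hosts")
```

If the script still reports that the hostname does not resolve, re-check the /etc/hosts entry for typos.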
Reference: https://github.com/mattshma/bigdata/issues/107
Reposted from: http://hzjmb.baihongyu.com/