[Lxc-users] some questions about storm-0.8.0-SNAPSHOT

Jianbin Ma fengqing24 at gmail.com
Fri Jul 20 08:23:10 UTC 2012


Hi all,
  I am using Storm 0.8.0-SNAPSHOT, mainly because I want to try the
"Pluggable Scheduler" feature.
After reconfiguring storm.yaml and putting the jar that contains
DemoScheduler into $STORM_HOME/lib, I start Storm as usual.
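The storm.yaml change is roughly the following (a sketch; the fully
qualified class name is just how I happened to package the scheduler,
yours may differ):

    # tell nimbus to load the custom scheduler class from its classpath
    storm.scheduler: "storm.DemoScheduler"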
  But then something strange happens:
    1. After "storm kill topo_name", topo_name still shows up on the Storm UI
with status "killed", and I cannot submit a new topology under that name;
the error says topo_name is still active.
    2. After deleting everything under $STORM_HOME/storm-local, I resubmit
the topology under the name special-topology (this is the topology that
needs custom scheduling; the name is set inside DemoScheduler.jar in
$STORM_HOME/lib).
     In the nimbus terminal, the following output appears over and over:
"
Our special topology needs scheduling.
Found the special-spout.
We assigned executors:[backtype.storm.scheduler.ExecutorDetails at 54,
backtype.storm.scheduler.ExecutorDetails at 46,
backtype.storm.scheduler.ExecutorDetails at 62] to slot:
[aa2e7306-87ba-4507-b38e-56cef6136b79, 6701]
Our special topology needs scheduling.
Found the special-spout.
We assigned executors:[backtype.storm.scheduler.ExecutorDetails at 54,
backtype.storm.scheduler.ExecutorDetails at 46,
backtype.storm.scheduler.ExecutorDetails at 62] to slot:
[aa2e7306-87ba-4507-b38e-56cef6136b79, 6701]
Our special topology needs scheduling.
Found the special-spout.
We assigned executors:[backtype.storm.scheduler.ExecutorDetails at 54,
backtype.storm.scheduler.ExecutorDetails at 46,
backtype.storm.scheduler.ExecutorDetails at 62] to slot:
[aa2e7306-87ba-4507-b38e-56cef6136b79, 6701]
"
and the nimbus log shows:
"
2012-07-20 15:50:07 nimbus [INFO] Cleaning up mmm-1-1342766582
2012-07-20 15:50:17 nimbus [INFO] Executor special-topology-1-1342769790:[2 2] not alive
2012-07-20 15:50:17 nimbus [INFO] Executor special-topology-1-1342769790:[3 3] not alive
2012-07-20 15:50:17 nimbus [INFO] Executor special-topology-1-1342769790:[4 4] not alive
2012-07-20 15:50:17 nimbus [INFO] Executor special-topology-1-1342769790:[8 8] not alive
2012-07-20 15:50:17 nimbus [INFO] Executor special-topology-1-1342769790:[9 9] not alive
2012-07-20 15:50:17 nimbus [INFO] Executor special-topology-1-1342769790:[10 10] not alive
2012-07-20 15:49:47 nimbus [INFO] Setting new assignment for topology id special-topology-1-1342769790:
#backtype.storm.daemon.common.Assignment{:master-code-dir "/opt/software/storm-0.8.0-SNAPSHOT/storm-local/nimbus/stormdist/special-topology-1-1342769790",
 :node->host {"aa2e7306-87ba-4507-b38e-56cef6136b79" "cloud-slave-016"},
 :executor->node+port {[6 6] ["aa2e7306-87ba-4507-b38e-56cef6136b79" 6700], [5 5] ["aa2e7306-87ba-4507-b38e-56cef6136b79" 6700], [7 7] ["aa2e7306-87ba-4507-b38e-56cef6136b79" 6700]},
 :executor->start-time-secs {[5 5] 1342770587, [6 6] 1342770587, [7 7] 1342770587}}
2012-07-20 15:49:47 nimbus [INFO] Cleaning up mmm-1-1342766582
2012-07-20 15:49:57 nimbus [INFO] Executor special-topology-1-1342769790:[2 2] not alive
2012-07-20 15:49:57 nimbus [INFO] Executor special-topology-1-1342769790:[3 3] not alive
2012-07-20 15:49:57 nimbus [INFO] Executor special-topology-1-1342769790:[4 4] not alive
2012-07-20 15:49:57 nimbus [INFO] Executor special-topology-1-1342769790:[8 8] not alive
2012-07-20 15:49:57 nimbus [INFO] Executor special-topology-1-1342769790:[9 9] not alive
2012-07-20 15:49:57 nimbus [INFO] Executor special-topology-1-1342769790:[10 10] not alive
2012-07-20 15:49:57 nimbus [INFO] Executor special-topology-1-1342769790:[1 1] not alive

"
The topology name "mmm" in the id "mmm-1-1342766582" is the first topology I
submitted. So I wonder: could the mmm topology be interfering with the
special-topology topology? And what is the right way to run Storm
0.8.0-SNAPSHOT with the "Pluggable Scheduler" feature?
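For reference, the scheduler class inside DemoScheduler.jar does roughly the
following (a simplified sketch, not my exact code; the "special-topology" and
"special-spout" names match what I use, but the slot-picking logic here is only
an approximation of the demo scheduler I followed):

import java.util.List;
import java.util.Map;

import backtype.storm.scheduler.Cluster;
import backtype.storm.scheduler.EvenScheduler;
import backtype.storm.scheduler.ExecutorDetails;
import backtype.storm.scheduler.IScheduler;
import backtype.storm.scheduler.Topologies;
import backtype.storm.scheduler.TopologyDetails;
import backtype.storm.scheduler.WorkerSlot;

public class DemoScheduler implements IScheduler {
    // assumes IScheduler in this snapshot declares only schedule(Topologies, Cluster)
    public void schedule(Topologies topologies, Cluster cluster) {
        TopologyDetails topology = topologies.getByName("special-topology");
        if (topology != null && cluster.needsScheduling(topology)) {
            System.out.println("Our special topology needs scheduling.");
            // executors of the component named "special-spout" that still need a slot
            Map<String, List<ExecutorDetails>> componentToExecutors =
                    cluster.getNeedsSchedulingComponentToExecutors(topology);
            List<ExecutorDetails> executors = componentToExecutors.get("special-spout");
            if (executors != null) {
                System.out.println("Found the special-spout.");
                List<WorkerSlot> availableSlots = cluster.getAvailableSlots();
                if (!availableSlots.isEmpty()) {
                    // put all the special-spout executors onto the first free slot
                    WorkerSlot slot = availableSlots.get(0);
                    cluster.assign(slot, topology.getId(), executors);
                    System.out.println("We assigned executors:" + executors + " to slot: ["
                            + slot.getNodeId() + ", " + slot.getPort() + "]");
                }
            }
        }
        // hand everything else (e.g. the "mmm" topology) to the default even scheduler
        new EvenScheduler().schedule(topologies, cluster);
    }
}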
Any suggestions that would help me troubleshoot this would be much
appreciated.
Thanks

